While companies are still in the early days of preparing to comply, senior executives are thinking hard about whose shoulders should bear these new responsibilities. Meanwhile, the AI governance industry is developing at full speed, drawing on the experience of the General Data Protection Regulation a few years earlier, albeit with some notable differences.
— Compliance preparation —
Most companies’ compliance efforts are at an early stage, centered on working out their exact responsibilities under the AI Act’s taxonomy, namely whether they are an AI provider or a deployer, with all the legal obligations attached to each role.
The line between someone who places an AI system on the market and someone who merely uses it can be very thin. A deployer can easily turn into a provider by “substantially modifying” the system, sometimes without being fully aware of the legal implications.
That demarcation line may be even thinner for AI models, as downstream players would have to pick up some of the model providers’ duties simply by fine-tuning a model.
The concepts of substantial modification of AI systems and fine-tuning of AI models remain somewhat blurry. Still, they will be crucial to how organizations position themselves for compliance.
The second step is to map the AI systems used across the organization for different activities and classify them according to their level of risk. This can be challenging, especially for large organizations, where awareness of which tools are in use is often limited.
Companies’ preparation is, unsurprisingly, tending to track the AI Act’s timeline, notes Fritz Pieper, a partner at the law firm Taylor Wessing. They started by trying to determine whether they fall under the ban on prohibited practices that takes effect in February 2025.
The next milestone comes when the duties for AI model providers start applying in August 2025, which will ripple down the value chain to “fine-tuners.” All eyes will then turn to whether one’s AI system falls into the high-risk category, with all the legal uncertainty surrounding that classification.
— AI governance industry —
With new AI rules being written around the globe and investors punishing companies that lack an AI strategy, the professionalization of the AI governance space is giving rise to an industry in its own right.
While AI governance courses and professional certifications are proliferating, none is better positioned to own this space than the International Association of Privacy Professionals, or IAPP, whose membership has skyrocketed since the EU’s GDPR took effect.
For Joe Jones, IAPP’s director of research and insights, the current state of AI compliance and the GDPR’s early days share some strong similarities.
“Again, you ask around the company, who's using AI and what AI is there, and some people and some functions don't appreciate that they're using AI in the same way that they didn't appreciate that they were processing personal data,” he said.
Jones notes that while the GDPR mandates the appointment of specialized professionals in organizations, such as data protection officers, the responsibilities for AI governance are much more dispersed.
As a result, there is a huge demand for AI governance competencies across the board for issues such as legal compliance, security and product design. IAPP alone has already trained 10,000 professionals in this space.
According to an IAPP report*, 69 percent of chief privacy officers surveyed have acquired additional responsibility for AI governance, while only 37 percent are responsible for cybersecurity regulatory compliance and 20 percent for platform liability.
— Complementarity and tensions —
Companies tend to choose privacy officers to take on AI-related due diligence because they are reluctant to reinvent the wheel and prefer to use existing internal processes.
As a result, familiarity with GDPR requirements, such as the data protection impact assessment and transparency obligations, might come in handy for AI compliance. Still, the similarity between the two laws should not be exaggerated.
As Pieper of Taylor Wessing notes, the AI Act is, at its core, product safety legislation. This means that it comes with a whole set of governance structures and market monitoring tools with which data protection departments are unfamiliar.
The issue, however, may go beyond privacy teams’ unfamiliarity with product safety compliance: the underlying philosophies of the GDPR and the AI Act can conflict in some crucial respects.
“Privacy teams are familiar with the data minimization principle, which states that you need to collect as little data as you need and be cautious about sensitive data. By contrast, in the AI governance context, one is looking at collecting large samples of data that are representative of the population, including on sensitive information like race and sexuality,” IAPP’s Jones said.
— Playing the long game —
Another critical difference between the GDPR and the AI Act is that the former had a significant impact on all organizations processing personal data from the outset. By contrast, the AI law’s importance looks likely to grow with technological uptake.
In other words, whereas the GDPR has reached something of a plateau, the EU’s AI law will be much more of a living piece of legislation, because the technology will keep changing. The critical question is how closely EU regulators will manage to follow it.
“In terms of compliance, many companies will be waiting to hear the posture of the regulators. What's their philosophy? Are they here to enable? Are they here to enforce? Regulators will frame how compliance will occur,” IAPP’s Jones said.
The extent to which EU regulators can keep up with market developments and technological advances will be a decisive factor in how the AI Act is applied. In turn, how closely regulators keep their finger on the pulse will shape the challenge facing those tasked with keeping their companies compliant.
* Organizational Digital Governance Report 2024, IAPP, September 2024.