Three years in the making, the EU AI Act finally got approval from the Council of the European Union in May 2024. On 1 August 2024, it entered into force.
With strict rules on data quality, transparency, human oversight and accountability, and hefty fines for non-compliance, the EU AI Act is weighty and powerful. And like GDPR, it looks set to become a regulatory benchmark globally. Its extraterritorial reach means it applies to all providers and deployers placing AI systems on the EU market, no matter where they are based.
Definition of an AI system
AI system means a machine-based system designed to operate with varying levels of autonomy, that may exhibit adaptiveness after deployment and that, for explicit or implicit objectives, infers, from the input it receives, how to generate outputs such as predictions, content, recommendations, or decisions that can influence physical or virtual environments. (EU AI Act, Article 3)
How risky is your AI system according to the EU AI Act?
The EU AI Act categorises AI systems on a sliding scale: the higher the risk, the stricter the rules and penalties. EU AI Act compliance obligations depend on where the AI system sits within the Commission’s risk pyramid.
"Low or limited-risk AI systems have light compliance obligations. At the top end, banned AI systems, used for cognitive behavioural manipulation, social scoring, or biometric profiling, could incur fines up to €35m, or seven per cent of worldwide turnover, whichever is higher," explains Tina.
High-risk AI systems are the main focus of the EU AI Act, given their potential to significantly impact health, safety and individuals' fundamental rights. They include products already covered by existing safety regulations, like medical devices, which undergo third-party conformity assessment.
The intended purpose of the AI system also determines its classification as high or low risk. Christopher explains: "AI systems used to filter candidates for jobs, or to manage critical infrastructure, or to determine creditworthiness or eligibility for healthcare, are deemed high risk and must be registered in the Commission's database. But if you can prove your AI system does not materially influence decision making - perhaps it is used only to identify duplicate job applications - then it is not considered high risk."
Regulatory burden of EU AI Act compliance
The regulatory burden increases the closer the stakeholder is to the development of high-risk AI systems. At the top end are providers; at the bottom are deployers, though the Act has obligations for importers and distributors too.
AI provider
Role: Develops and markets the AI system under its own name or trademark, whether for payment or free of charge.
Obligations: Responsibilities include data quality (a first-ever legal stipulation), risk management, technical documentation and record keeping, human oversight with scope to intervene, and cybersecurity. The EU AI Act goes further than product liability law, as providers' obligations continue for the lifecycle of the product.
Penalties: Fines for breaches are up to €15m, or three per cent of worldwide turnover, whichever is higher. For small and medium-sized enterprises, the lower sanction applies.
AI deployer
Role: Uses high-risk AI systems under its authority in a professional context.
Employer requirements: Ensures users - natural persons, in the language of the Act - are AI literate, with the competency, training and education to undertake their tasks. Individuals must be informed when they are subject to a high-risk AI system.
Monitoring and reporting: Continuous monitoring and record keeping. Reports risks and serious incidents to the provider and the relevant market surveillance authority. Retains automatically generated logs for at least six months, unless otherwise required by law, including GDPR.
Impact assessment: Assesses the impact on fundamental rights if an AI system is used in an employment context, or in determining creditworthiness (except fraud detection) or for health/life insurance risk assessments and pricing.
Watch out for possible shifts in roles and regulatory obligations. An importer, distributor or deployer can become a provider of high-risk AI systems if they:
Add their name or trademark to an already-marketed high-risk AI system
Substantially modify a high-risk AI system
Change the intended purpose of a low-risk AI system, making it high risk
What European AI regulations mean for general-purpose AI models
In a last-minute addition, prompted by their rapid and massive integration into everyday life, the EU AI Act now sets out obligations for providers of general-purpose AI models.
"There is quite some discretion on the side of general-purpose AI models," Christopher admits. "It's not as stringent as it could have been. Providers must only put in place a policy to demonstrate that they respect copyright law and publish a sufficiently detailed summary of the data used to train the AI model."
Large general-purpose models, such as GPT-4 and LLaMA, which are trained using very large amounts of computing power, are deemed to present systemic risk and face more rigorous cybersecurity obligations and penalties under European AI regulation.
Preparing for EU AI Act compliance
Substantive obligations will be phased in over three years, but provisions for banned AI systems become effective within six months. In the meantime:
Make an inventory of your AI systems and applications
Identify where your AI system fits within the EU risk pyramid
Understand your role as a provider, importer, distributor or deployer, and the obligations that apply
Consider the intended purpose of your AI system
Ensure relevant documentation, risk reporting and compliance frameworks evidence your performance against criteria defined by the Act
Register high-risk AI systems on the EU database
Support employee AI literacy
Be alert to other regulations, like GDPR and the Digital Operational Resilience Act, which are not superseded by the EU AI Act
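For teams that track their AI portfolio in software, the checklist above can be sketched as a simple triage over an AI-system inventory. This is an illustrative sketch only; the class names, risk labels and example systems are assumptions chosen for demonstration, not terms defined by the Act, and real classification requires legal assessment of each system's intended purpose.

```python
from dataclasses import dataclass
from enum import Enum

class RiskTier(Enum):
    """Illustrative labels for the Commission's risk pyramid."""
    PROHIBITED = "prohibited"   # banned practices, e.g. social scoring
    HIGH = "high"               # e.g. recruitment filtering, creditworthiness
    LIMITED = "limited"         # light transparency obligations
    MINIMAL = "minimal"

class Role(Enum):
    PROVIDER = "provider"
    IMPORTER = "importer"
    DISTRIBUTOR = "distributor"
    DEPLOYER = "deployer"

@dataclass
class AISystem:
    name: str
    intended_purpose: str
    risk_tier: RiskTier
    role: Role

def triage(systems):
    """Surface the systems needing earliest attention: prohibited uses
    (effective within six months), then high-risk systems that must be
    registered in the Commission's database."""
    prohibited = [s for s in systems if s.risk_tier is RiskTier.PROHIBITED]
    to_register = [s for s in systems if s.risk_tier is RiskTier.HIGH]
    return prohibited, to_register

# Hypothetical inventory entries for demonstration
inventory = [
    AISystem("cv-screener", "filter job candidates", RiskTier.HIGH, Role.DEPLOYER),
    AISystem("faq-chatbot", "customer support answers", RiskTier.LIMITED, Role.DEPLOYER),
]
banned, register = triage(inventory)
```

Here `banned` is empty and `register` contains only the recruitment tool, reflecting the Act's focus: most obligations concentrate on the high-risk tier.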