The EU Artificial Intelligence Act [1], adopted earlier this year after difficult and tense negotiations, sets out the world's first sweeping rules to govern AI, especially powerful systems like OpenAI's ChatGPT.
Although the rules were first proposed in 2021, they took on greater urgency when ChatGPT burst onto the scene in 2022, showing generative AI's human-like ability to churn out eloquent text within seconds. Other examples of generative AI include DALL-E and Midjourney, which can generate images in nearly any style from a simple prompt in everyday language.
"With our artificial intelligence act, we create new guardrails not only to protect people and their interests, but also to give business and innovators clear rules and certainty," European Commission President Ursula von der Leyen said.
Companies will have to comply by 2026, but rules covering AI models like ChatGPT will apply 12 months after the law enters into force.
Strict bans on using AI for predictive policing based on profiling, and on systems that use biometric information to infer an individual's race, religion or sexual orientation, will apply six months after the law enters into force.
The law, known as the "AI Act", takes a risk-based approach: if a system is high-risk, a company has a stricter set of obligations to fulfil to protect citizens' rights. The higher the risk to Europeans' health or rights, the greater the company's obligations to protect individuals from harm.
Extended geographic scope
"The geographic scope of the AI Act is very broad, so organisations with any connections to the EU in their business or customer base will need an AI governance programme in place to identify and comply with their obligations," said Marcus Evans, partner at law firm Norton Rose Fulbright.
Companies in violation of the rules on banned practices or data obligations face fines of up to seven percent of worldwide annual revenue.
In May, the EU established an "AI Office" of tech experts, lawyers and economists to ensure compliance with the new law.