It's Coming - How To Prepare For The EU AI Act

February 7, 2024

The pre-final text of this upcoming EU law was endorsed by all 27 EU Member States on 2 February 2024. It has now passed to the European Parliament's Internal Market and Civil Liberties Committees for adoption, followed by a plenary vote provisionally scheduled for 10 - 11 April 2024. So, what do you need to be thinking about?

What's the rush?

The timeline for the adoption of the EU AI Act is tight, but is considered necessary in view of the upcoming European Parliament elections in June. We can therefore expect the Act to arrive fairly soon, so it is worth reflecting on what it might mean for businesses in terms of their contractual obligations and expectations.

What is the EU AI Act?

It's worth remembering that the aim of the AI Act is to set a standard for the provision of AI systems across Europe, though many expect it to go further and set a standard across the world. It remains to be seen whether that proves true but, if you are a UK business which already provides AI systems in Europe, you will need to ensure compliance with this new law.

The Act takes a risk-based approach to different AI systems: some systems are banned outright, others are considered "high risk", and others are treated as general-purpose AI systems.

What contractual obligations need to be considered?

While some AI systems are banned entirely, barring narrow exceptions, the EU AI Act imposes specific obligations on the providers and deployers of so-called high-risk AI systems, including testing, documentation, transparency, and notification duties. When it comes to high-risk AI systems, your contractual obligations and expectations will therefore differ depending on whether you are a "seller" or "buyer" of an AI system.

If you are a provider, there is no doubt that your customers will expect you to be able to evidence compliance and to give contractual commitments in respect of that compliance.

If you are a provider and have used an external third party to help bring the AI system to market, the legal requirements for the high-risk AI system in question should be included in any development agreement you have with that third party, so as to minimise risk as much as possible and limit liability where possible.

Will your customer contracts need to be updated?

The users of AI systems have obligations under the draft EU AI Act. It may therefore be prudent to reflect appropriate liability provisions in your contractual agreements with the users of your AI systems. This will need to be balanced against what is considered appropriate and tailored to the particular AI system and its output. As with any contract, risk will need to be apportioned depending on the relationship, and will need to be negotiated.

Using third parties in AI development

If you are a business that integrates an externally developed AI solution into your own software products, you must still be able to produce documentation and logs. Any contract should therefore be drafted so that the AI developer provides support with any claims if needed. Be aware that some information about the AI system may contain business secrets of the developing company, which may lead to resistance from AI developers in contract negotiations, so this should be addressed so that all eventualities are covered.

Ultimately, if you are working with third parties to develop your AI systems, you will want contractual assurances from the third party of support and assistance in responding to queries and claims under any AI-related legislation, including the EU AI Act.

AI developers will naturally want to limit their risk to clients where possible and to cover only what is required under the law. However, in our view, it is in the interest of both parties to find a way forward that creates legal certainty and avoids claims from users of AI systems.

Things to think about

If you are providing AI systems to customers and users in the EU, or are considering doing so in the future, you will want to review the EU AI Act in detail to make sure you can allocate risk appropriately in your contracts, both with third parties who may be assisting you in developing AI systems and with your customers.

We can expect customers to want reassurance that businesses have complied with all relevant laws and regulations when paying for an AI service. We expect compliance requirements for AI systems to become market standard in the same way that GDPR compliance has, so it is worth considering this now and making it a priority across the business.
