Model contractual clauses for the public procurement of AI systems in light of the EU AI Act have been published.  They are based on the requirements and obligations of the EU AI Act, in particular those for high-risk AI systems.  They are not an official EU document and are currently in draft for discussion purposes only. However, they are of use to anyone considering procuring AI.  For those in the UK (including public organisations that are subject to separate procurement obligations), these model clauses nevertheless provide a useful indication of how buyers of AI systems may seek to comply with the EU AI Act.

Here we summarise the key points of the clauses.

Is the AI system high-risk or low-risk?

There are various aspects to navigating the EU AI Act - we summarise the key ones in our flowchart here (as at Summer 2023).

One question relevant to the model clauses is what risk level an AI system has at the time of contracting and what level it will, or could, have in the future (assuming the AI system isn't prohibited). Take two examples.

First, a provider of an AI system (including a public authority which develops an AI system, or has one developed, with a view to placing it on the market or putting it into service under its own name) must consider whether the AI system poses a significant risk:

  • if the AI system is considered high-risk, or if it is considered not high-risk but a National Supervisory Authority or the EU AI Act determines that it has been misclassified, the AI system is treated as high-risk.  In that case the provider must meet the obligations for high-risk AI systems under the AI Act;
  • if the AI system is not considered high-risk, the EU AI Act sets out voluntary requirements.

Second, an AI system may go from being low-risk to high-risk if a substantial modification is made. For example, the intended purpose of the system may change, or unplanned changes may occur which go beyond predetermined changes.

In either scenario, the provider needs to consider how the EU AI Act applies or could apply and, consequently, how the contract needs to accommodate those obligations and risks.  Contracting parties should be aware of all options and consider what is appropriate for them, rather than starting with (for example) the low-risk obligations as a default.

In addition, those contracting for AI should consider how the EU AI Act applies to their counter-party and any third party, as the contract may need to reflect obligations both upstream and downstream in the AI value chain, including any specific requirements in the Act about obligations in third-party contracts.

The role of further guidance, technical standards and state of the art

The EU AI Act anticipates that the EU Commission will publish technical guidance to assist providers and deployers on how to develop and use AI systems, and that European Standardisation Organisations will publish technical standards explaining how to implement the AI Act's obligations. Further, there are occasions where the EU AI Act sets obligations which will take into account the state of the art.

Consequently, the model clauses are designed to refer out to that guidance, those standards and the state of the art.  There is therefore a question as to what that will look like - particularly given the speed at which the state of the art is developing - and whether those obligations could be (and need to be) met.

It is not uncommon for contracts to use terms which have a degree of flexibility; think, for example, of contractual obligations to do something in a reasonable way. However, what is required, and what the parties understand, may be less clear where guidance and standards are not yet published (or, when published, may be open to interpretation), in particular with respect to AI given its highly specialised and technical nature.

Public organisations may not have the in-house technical or legal expertise required to determine whether the AI system being procured is suitable for their needs.  They will need to consider carefully who they engage and when, and how they will monitor performance of the contract, including any development and deployment of the AI system.

Limitations on the model AI clauses

The Commission stresses that:

The EU model contractual AI clauses contain provisions specific to AI Systems and on matters covered by the proposed AI Act, thus excluding other obligations or requirements that may arise under relevant applicable legislation such as the General Data Protection Regulation. Furthermore, these EU model contractual AI clauses do not comprise a full contractual arrangement. They need to be customized to each specific contractual context. For example, EU model contractual AI clauses do not contain any conditions concerning intellectual property, acceptance, payment, delivery times, applicable law or liability. The EU model contractual AI clauses are drafted in such a way that they can be attached as a schedule to an agreement in which such matters have already been laid down.

Next steps

The clauses are likely to require change as a result of further negotiations of the EU AI Act and as guidance and technical standards are published in respect of the Act. They may also need to be read in light of other model clauses being developed, such as those by the (UK) Society for Computers and Law.

If you would like to discuss how current or future regulations impact what you do with AI, please contact Lucy Pegler, Tom Whittaker, David Varney, Martin Cook or any other member of our Technology team.