A recent survey of over 100 AI startups and 15 venture capital firms in the EU has shone a light on potential issues raised by the EU AI Act, including:
- potentially many more AI systems (33%+) will be classed as high-risk, and therefore subject to the EU AI Act's obligations, than the EU Commission assumed in its impact assessment of the AI Act (5-15%);
- the obligations on high-risk AI systems (and their providers) are seen as a 'significant challenge... in terms of technical and organisational complexity and compliance cost'. Around half of the startups expect compliance costs to be in the range anticipated by the EU (160k-330k EUR), but 19% expect them to be higher;
- some AI startups (16%) will consider stopping AI development or relocating outside the EU (however, the EU AI Act applies to AI systems which are placed on or put into service in the EU market, so it is not limited to organisations located in the EU);
- 50% of startups surveyed are concerned that the AI Act will 'slow down AI innovation in Europe';
- most of the VCs expect the AI Act to reduce the competitiveness of European AI startups, and anticipate a shift in where and on what VCs focus their investments (9/15 say they would focus on low-risk AI systems, although notably 2/15 say they would focus on high-risk startups).
The survey comes at an interesting time from a UK perspective, as we await the UK government's White Paper on proposals to regulate AI and the House of Commons Science and Technology Select Committee has launched an inquiry into the 'Governance of artificial intelligence (AI)'. Of particular interest is whether startups developing or deploying AI in the UK will see the kinds of 'support or assistance' those surveyed would like for meeting the requirements and obligations of the AI Act and, potentially by extension, any UK AI regulation. These include:
- best practice methods and templates;
- case studies of how other companies apply the AI Act;
- training & education offers;
- reliable (binding) Q&A or references for specific questions;
- direct communication with policy-makers.
If you would like to discuss how current or future regulations impact what you do with AI, please contact Tom Whittaker or Brian Wong.
Policy Recommendations
1. Keep European competitiveness at the centre of the discussions.
2. Reduce the number of high-risk cases: narrow the high-risk criteria to get closer to the anticipated 5%-15% of affected AI systems.
3. General-purpose (GP) AI: consider the role of startups as GP AI providers in light of the GP AI obligations.
4. Foresee bottlenecks and systematically debottleneck them (e.g. in the area of third-party conformity assessments) so as not to further slow down innovation.
5. Conceptualise regulatory sandboxes as drivers of innovation in a protected but attractive environment.
6. Update the Coordinated Plan and take the needs of European AI startups into account, specifically in areas considered very difficult or costly, and reduce the cost of compliance.