Health Tech developers, manufacturers and distributors need to understand how impending AI law and regulation will interact with their current regulatory obligations. 

As explained in our previous update on Health Tech and AI regulation, the EU and UK have taken a different approach to regulating AI, with the EU’s AI Act providing a cross-sector set of requirements, and the UK looking to existing regulators to implement AI requirements specific to their sector. 

We set out an update below on how both approaches are being implemented in practice, with a focus on medical devices, and what this means for Health Tech businesses.

Update on UK approach

The MHRA first published its roadmap for the regulation of software and AI as a medical device in October 2022. To help developers and adopters of AI and digital technologies in health and social care understand their regulatory obligations, UK regulators launched the Artificial Intelligence and Digital Regulations Service in March 2023. This was closely followed by the UK’s framework for regulating AI (also known as ‘the White Paper’) (see our flowchart to navigate the UK’s position). 

The MHRA published an updated roadmap for reform in January 2024, which included timescales for publishing new guidance relating to software and AI as a medical device (SaMD):

  • Guidance on good machine learning practice for medical device development, and on best practice for the development and deployment of AI as a medical device, expected in early 2024.
  • The AI Airlock regulatory sandbox, due in mid-2024.
  • Guidance on data-driven SaMD research, development and governance, expected in late 2024.

The MHRA intends to implement broader regulatory changes to the medical device regime (which will include requirements for SaMD) via three separate statutory instruments, in the following order:

  • Post Market Surveillance regulations, to bring forward improvements to proactive safety monitoring and put measures in place to ensure higher-risk devices are subject to additional post-market surveillance (draft expected before Parliament in mid-2024).
  • Future Core Regulations, to implement the core requirements of the new MHRA regulatory framework for medical devices (draft expected before Parliament in early 2025).
  • Future Enhancement Regulations, to detail further requirements, fees and expenses (it is not clear how soon after the Future Core Regulations these are expected).

AI is regulated by the MHRA where it is classified as a medical device. Standalone software which satisfies the definition of a medical device is likely to fall within the lower-risk Class I or Class IIa categories under the current UK regulations. Under the future regulations, the MHRA has stated that it intends to adopt the IMDRF risk categorisations for software as a medical device, which means Health Tech businesses will need to re-assess how their products would be classified under the IMDRF approach to ensure compliance with future UK regulations. 

The MHRA has stated that it intends to align the UK medical device regulations with the updated EU regime under the MDR and IVDR. The MHRA has also made clear that, as an independent sovereign regulator, it will depart from the EU regime where changes for the UK are required. What is not clear is whether the principle of aligning with the EU regulatory framework for medical devices will extend to the regulation of AI and SaMD in the UK, such that the MHRA carries across principles from the EU AI Act (and associated guidance) into the equivalent UK provisions. 

Update on EU approach

Health Tech developers, manufacturers and distributors whose products include AI systems will need to understand the requirements of the EU AI Act if their product is used in the EU (regardless of where they are based).

Under the EU AI Act, AI systems which qualify as medical devices and require a conformity assessment under the EU Medical Devices Regulation (“MDR”) or In Vitro Diagnostic Medical Devices Regulation (“IVDR”), will be classed as high-risk AI systems. 

In practice, this means that any AI system, whether as a standalone software product or as a safety component of another product, which is a Class IIa, IIb or III medical device under MDR will be a high-risk AI system. Such medical devices will be subject to both requirements in the MDR/IVDR and the EU AI Act. 

The requirements for high-risk AI systems under the EU AI Act include:

  • Establishing risk management systems throughout the AI system’s lifecycle
  • Conducting data governance, ensuring that training, validation and testing datasets are relevant, sufficiently representative and, to the best extent possible, free of errors and complete according to the intended purpose
  • Drafting technical documentation to demonstrate compliance
  • Designing systems to allow automatic record keeping
  • Ensuring transparency and providing instructions to deployers to enable their compliance
  • Designing systems to allow human oversight
  • Achieving required levels of accuracy, robustness and cybersecurity

While many requirements under the EU AI Act cover similar areas to the MDR/IVDR requirements, there are additional obligations, and companies will need to understand how these will impact product design, the conformity assessment process and ongoing monitoring. 

The EU AI Act addresses harmonisation with the requirements under the MDR/IVDR in several areas, including:

  • Conformity Assessments: medical devices covered by the MDR/IVDR will not need a separate conformity assessment to show compliance with the EU AI Act. Instead, MDR/IVDR notified bodies will be responsible for ensuring compliance with both regimes. 
  • Technical Documentation: a single set of technical documentation should be used to cover the requirements of the MDR/IVDR and the EU AI Act.
  • Medical Device manufacturing: manufacturers of medical devices incorporating AI systems will have the same obligations to comply with the EU AI Act as the provider of the AI system.  
  • Market surveillance: the EU AI Act states that the AI regulator will perform market surveillance for compliance with the EU AI Act, a different approach from that taken for the conformityety assessment. Derogations from this are allowed, so it remains to be seen exactly how market surveillance will be enforced for medical devices using AI systems. 

Health Tech AI systems are used in a wide range of settings other than as medical devices, from the medicinal product lifecycle to administrative support and a wide range of products which do not qualify as medical devices. The European Medicines Agency (EMA) published a reflection paper in July 2023 on the use of AI in the medicinal product lifecycle. Even if an AI system is not considered high-risk, other obligations may apply under the EU AI Act such as ensuring minimum levels of transparency to persons affected by AI systems.

Data protection

Another key area for Health Tech businesses using AI systems will be ensuring compliance with data protection obligations and AI regulation. 

We will shortly be publishing a post about data protection risks when procuring, developing and deploying AI systems as part of our What you need to know about AI series. 

For those providing AI systems for the healthcare sector, there will be other factors to consider; see our briefing, Health Tech Series: Health Tech and Personal Data, for more information.

Next steps 

Further guidance explaining how AI regulation will impact the Health Tech and Life Sciences sector is due to be published by regulators in the UK and in the EU. In addition to the MHRA roadmap mentioned above, further detail is likely to come in the MHRA’s response to a letter from Government which requested that the MHRA publish its strategy to implement the AI White Paper no later than 30 April 2024. For their part, EU regulators have published a Workplan for 2023–2028, setting out their strategy for maximising the benefit of AI in health and life sciences, including ensuring compliance with the EU AI Act.

Health Tech companies need to plan now to ensure continuity of supply of current products and to prevent regulatory requirements from hampering future innovation. 


If you have any questions or would otherwise like to discuss any of the issues raised in this article, please contact Rory Trust or another member of our Healthcare Team. For the latest updates on AI law, regulation, and governance, see our AI: Burges Salmon blog.