
The EU AI Act and Medical Devices: Navigating High-Risk Compliance

The EU AI Act entered into force in August 2024 and will apply gradually over the coming years. The AI Act establishes a risk classification system that regulates AI systems according to their potential risk, banning systems that pose an unacceptable risk and regulating high-, medium- and low-risk systems to ensure safety, performance, transparency and accountability.

The AI Act is sector-agnostic and therefore also applies to AI-driven medical devices and diagnostics, imaging systems, decision support systems and other digital health technologies that are already regulated under the Medical Devices and In Vitro Diagnostic Regulations (MDR/IVDR).

AI systems used for medical purposes, so-called ‘medical device AI’ (“MDAI”), will qualify as high-risk AI systems under the AI Act if the MDAI is a medical device in itself or a safety component of a device, and the MDAI is required to undergo a third-party conformity assessment by a notified body. This means that MDR class IIa, IIb and III devices and IVDR class A-D devices will normally be classified as high-risk for purposes of the AI Act (but not necessarily under the MDR/IVDR).

This high-risk classification imposes compliance requirements under the AI Act that go beyond the MDR/IVDR. These additional requirements are mainly AI-specific (such as data quality, data governance, record-keeping, transparency, accountability and human oversight) but also touch on risk management, conformity assessment, post-market surveillance and vigilance. Despite this AI-specific focus, there is a risk of overlapping and/or inconsistent rules, in particular because the MDR/IVDR already governs MDAI through comprehensive requirements for safety, performance and clinical data.

Medtech and digital health companies will now need to comply with this dual legal framework, as the AI Act applies in parallel to the MDR/IVDR. Fortunately, the AI Act allows companies to incorporate the AI-specific requirements into the documentation, processes and procedures they have already established under the MDR/IVDR. Companies can therefore demonstrate compliance with the AI Act by adapting and extending their existing quality management system, technical documentation and other MDR/IVDR procedures. A single conformity assessment is also possible where the notified body is designated under both the AI Act and the MDR/IVDR.

The regulatory burden of meeting the AI Act requirements will nonetheless remain high, and companies are advised to take the following actionable steps:

  1. Organize a cross-functional team consisting of legal, regulatory affairs, quality, engineering and privacy.
  2. Perform a gap analysis of current MDAI compliance under the MDR/IVDR versus the AI Act requirements.
  3. Reach out to your notified body to learn about its accreditation under the AI Act and its expectations regarding the AI Act requirements.
  4. Future-proof new MDAI and MDAI under development to anticipate compliance with the AI Act requirements when they apply from August 2027 onwards.
  5. Keep an eye out for new European and international standards (e.g., on quality management or data quality), as conformity with these standards creates a presumption of conformity with the AI Act.
  6. Review the June 2025 EU guidance regarding the interplay between the AI Act and MDR/IVDR.  

Tags

ai act, medical devices, digital health, eu, compliance, regulatory, health care & life sciences, emerging technologies