EU guidance details how technical and clinical standards converge for AI in medtech
Published on 30th June 2025
FAQ explains research exemption, dual conformity assessment and clinical data obligations for devices and diagnostics

The EU is steadily shaping the rules for artificial intelligence (AI) in medical technology, with new guidance designed to shed light on the evolving landscape. A question and answer (Q&A) document, which has been released jointly by the Medical Device Coordination Group (MDCG) and the Artificial Intelligence Board (AIB), explores how the recent AI Act dovetails with both the Medical Devices Regulation (MDR) and the In Vitro Diagnostic Medical Devices Regulation (IVDR).
The Q&A publication is set to become a go-to resource for medtech and digital health businesses as they navigate the dual regulatory path now required for medical device artificial intelligence (MDAI). For those developing or rolling out AI-driven solutions in healthcare, this is one of the first comprehensive guides to making sense of the new compliance environment.
Applicability and qualification
The AI Act's provisions on high-risk AI cover systems intended for medical purposes if they either act as a safety component of a medical device or are themselves a medical device, and are subject to third-party conformity assessment by a notified body under the MDR or IVDR. This means that most higher-risk medical devices and IVDs incorporating AI – MDR Class IIa, IIb and III, and IVDR Class B, C and D – are automatically considered “high-risk” AI systems under the AI Act, while lower-risk devices – MDR Class I devices that are not sterile, do not have a measuring function and are not reusable surgical instruments, as well as in-house devices – are not.
Notably, the AI Act's high-risk classification does not alter the risk class of the device under MDR or IVDR; it simply triggers additional AI-specific requirements. The guidance further provides a table mapping device classes to high-risk AI system status, ensuring manufacturers can determine whether their products fall within the dual regulatory scope.
Research exemption for AI in clinical studies
The joint MDCG-AIB guidance offers a perspective on how the AI Act’s exemptions for research and real-world testing apply to high-risk MDAI during clinical investigations and performance studies.
Under the AI Act, activities involving research, testing or development of AI systems before they are placed on the market or put into service are generally outside the regulation’s scope. However, this exclusion does not cover testing in real-world conditions, which falls under article 60 of the AI Act (entitled "Testing of high-risk AI systems in real world conditions outside AI regulatory sandboxes").
For high-risk MDAI, the guidance confirms that clinical investigations under the MDR and performance studies under the IVDR typically qualify as “testing in real world conditions” under article 60. This means such testing is permitted before market placement, provided it follows the applicable requirements of the MDR or IVDR. In practical terms, manufacturers can conduct these studies under the MDR/IVDR frameworks without applying the full set of AI Act obligations at this stage. The MDCG explicitly references the AI Act’s provision that this approach is “without prejudice to Union or national law” on testing for medical devices or diagnostics.
This exemption is not unconditional. Real-world testing should comply with MDR and IVDR requirements, including those for study design, ethical review, informed consent and data protection. Article 60 introduces additional safeguards – such as the need for a testing plan, prior approval by market surveillance authorities, registration, and protections for vulnerable subjects. However, the guidance states that for MDAI, these requirements should be interpreted within the existing oversight structures of the MDR and IVDR, which are currently being reviewed as part of a targeted evaluation at EU level.
Once a high-risk MDAI is placed on the market or put into service, the landscape changes: the research and real-world testing exemptions cease to apply, and manufacturers must ensure full compliance with the MDR, IVDR and the AI Act. This includes meeting all requirements for conformity assessment, technical documentation and post-market monitoring.
Dual conformity assessment and technical documentation
For high-risk MDAI, the guidance confirms that the MDR, IVDR and the AI Act apply in parallel, each bringing its own set of requirements to the conformity assessment process. The starting point is that the conformity assessment route is determined by the device’s classification under the MDR or IVDR. For most high-risk MDAI, this means a notified body will be involved, reviewing the quality management system and technical documentation, and carrying out inspections to ensure compliance with all applicable requirements.
Crucially, the AI Act introduces additional obligations for high-risk MDAI – such as requirements on risk management, data governance, transparency, and human oversight – which must also be addressed as part of the conformity assessment. The guidance clarifies that these AI Act requirements are to be considered as part of the MDR/IVDR conformity assessment process leading to the CE-marking of conformity.
This also means manufacturers should ensure their technical documentation demonstrates compliance with both frameworks. To that end, they are encouraged to integrate the information and evidence required by the AI Act into their existing MDR and IVDR technical documentation, making use of the flexibility provided by article 8 of the AI Act, entitled "Compliance with the requirements". This approach is intended to reduce duplication and administrative burden, while still ensuring that all relevant obligations are met. The technical documentation should therefore cover not only the device’s design, intended purpose and clinical evidence (as required by the MDR and IVDR), but also AI-specific aspects such as bias mitigation strategies, data quality and representativeness, and transparency measures.
Clinical data requirements
The MDCG and the AIB provide detail on the clinical data requirements for MDAI. Under the MDR and IVDR, manufacturers are required to generate robust clinical evidence through clinical or performance evaluation, using data that is representative of the intended patient population and use environment.
The AI Act adds further requirements for high-risk MDAI, mandating that training, validation and testing datasets be relevant, sufficiently representative, as error free as possible and complete. The AI Act distinguishes between training data (used to fit the AI model's parameters), validation data (used to tune and evaluate the model during development) and testing data (used for final, independent confirmation of performance).
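For readers working on the data-pipeline side of MDAI development, the AI Act's distinction between training, validation and testing data corresponds to the familiar three-way dataset split used in machine learning practice. A minimal, purely illustrative sketch (real MDAI pipelines would of course use clinical datasets, stratification and documented data governance rather than a simple random shuffle):

```python
import random

def split_dataset(records, train_frac=0.7, val_frac=0.15, seed=42):
    """Shuffle records and split them into training, validation and
    testing sets.

    Training data fits the model's parameters, validation data is used
    for tuning and interim evaluation, and testing data provides a
    final, held-out performance check.
    """
    rng = random.Random(seed)          # fixed seed for reproducibility
    shuffled = records[:]
    rng.shuffle(shuffled)
    n = len(shuffled)
    n_train = int(n * train_frac)
    n_val = int(n * val_frac)
    train = shuffled[:n_train]
    val = shuffled[n_train:n_train + n_val]
    test = shuffled[n_train + n_val:]  # remainder becomes the hold-out set
    return train, val, test

train, val, test = split_dataset(list(range(100)))
print(len(train), len(val), len(test))  # 70 15 15
```

The split proportions and seed shown here are arbitrary choices for illustration; the AI Act does not prescribe any particular split, only that each dataset be relevant, sufficiently representative and appropriately documented.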
Both regimes require manufacturers to implement data governance and bias mitigation strategies, with the AI Act explicitly requiring documentation of procedures to detect, prevent and mitigate unwanted bias. These combined requirements reinforce the need for high-quality, well-documented data pipelines and ongoing post-market monitoring to ensure continued safety and performance.
Osborne Clarke comment
The EU’s recent guidance offers some progress in clarifying the regulatory landscape for AI in medtech and diagnostics, providing a degree of certainty for manufacturers, deployers and digital health businesses operating in or entering the EU market.
It confirms which higher-risk medical devices and IVDs with AI components will be subject to dual compliance under the MDR, IVDR and the AI Act, and it outlines options to help streamline conformity assessment and technical documentation.
The sections addressing the research exemption for investigational AI use, as well as the requirements for clinical data and data governance, reflect an attempt to balance innovation with safety. Companies may wish to take these elements into account as they start to develop or review their technical documentation and data management practices for MDAI.
Additional EU guidance, harmonised standards and implementing legislation are set to follow as the AI regulatory environment continues to evolve. It is hoped that these will help address a number of open questions and areas where further detail would be valuable for life sciences businesses navigating the dual regulatory framework, particularly during the research and pre-market phases.