A new CE marking for European healthcare: when and why?

Published on 8th May 2024

The EU AI Act's new CE-marking regime for high-risk systems has implications for life sciences and healthcare businesses

The CE ("Conformité Européenne") marking of conformity is typically required to demonstrate a product's compliance with applicable European Union legislation. It is a pillar of the EU's new legislative framework (NLF) for industrial products, of which artificial intelligence (AI) systems and the EU AI Act form a part.

The regulation requires that high-risk AI systems bear the CE marking to indicate their conformity with the EU AI Act, so that they can move freely within the internal market like other products such as personal protective equipment, in vitro diagnostics and medical devices.

Affixing the CE marking is a provider's obligation under the regulation, although other operators, such as distributors and importers, must verify that an AI system bears the CE marking where one is required.

A digital CE marking must be used for high-risk AI systems provided digitally. A condition for digital marking is that it can be accessed easily via the interface through which the system is used, or via an easily accessible machine-readable code or other electronic means.

For high-risk embedded AI systems that are regulated as industrial products, a physical CE marking should be affixed (and may be complemented by a digital CE marking).

In any event - and in line with the NLF legislation - the CE marking for high-risk AI systems must be affixed visibly, legibly and indelibly. The new regulation recognises that this is not always possible or warranted on account of the nature of the high-risk AI system and, therefore, permits a CE marking to be affixed, as appropriate, to the packaging or to the accompanying documentation.

Conformity assessment procedures

Prior to affixing the CE marking, providers of high-risk AI systems must conduct a conformity assessment, which is the process of demonstrating whether the applicable requirements set out in the regulation have been fulfilled. However, this does not necessarily mean that providers of high-risk AI systems must hire an external, third-party conformity assessment body - for example, a notified body - to conduct the procedure.

It is the EU's intention to limit, at least to the appropriate extent and in the initial phase of the regulation, the scope of application of third-party conformity assessment for high-risk AI systems. As a result, third-party assessments are required in only two cases: where the high-risk AI system is regulated as an industrial product, and where it is used in certain areas described in the regulation, such as biometrics.

Where a high-risk AI system is an industrial product - or is intended to be used as a safety component of such a product - it is already regulated under the NLF. As a reminder, the regulatory requirements specific to high-risk AI systems only apply if the industrial product is required to undergo a third-party conformity assessment under the applicable NLF legislation.

In this first case, the notified body responsible for a high-risk AI system that is also an NLF industrial product, or that serves as a safety component of one - for example, the notified body of a medical device software tool - may also assess the AI system's conformity with the EU AI Act.

The notified body must fulfil certain requirements: notably, its compliance with the EU AI Act's conditions on independence, professional integrity and sufficient internal competence must have been assessed. This assessment must occur in the context of the notification procedure under the applicable EU NLF legislation; for example, the EU medical devices regulation.

Third-party assessments are also required when the high-risk AI system is used in the areas listed in point 1 of Annex III to the regulation, such as biometrics. Examples include remote biometric identification systems, AI systems intended to be used for biometric categorisation and AI systems intended to be used for emotion recognition. Such systems have multiple applications in life sciences, including by pharmaceutical and medical device businesses.

In this second case, the requirement to involve a notified body is transitional and not automatic. It is meant to apply, for instance, for as long as applicable harmonised standards do not exist and common specifications are not available, or for as long as the provider does not fully apply them.

Internal control v notified body

Where the assessment by a notified body is not required, providers of high-risk AI systems may follow a conformity assessment procedure based on a so-called "internal control".

This internal control process is threefold. First, the provider verifies that its established quality management system (QMS) complies with the EU AI Act's requirements. Second, it examines the information contained in the high-risk AI system's technical documentation to assess compliance with the regulation's relevant essential requirements. Third, it verifies that the design, the development process and the post-market monitoring of the high-risk AI system are consistent with that technical documentation (TD).

The procedure involving a notified body, on the other hand, is more intricate and stringent for providers. Both the QMS and the TD must be assessed by the provider's notified body, which is also in charge of surveillance of the approved QMS. The notified body may conduct periodic audits and may require additional tests of the AI system. Where the AI system conforms with the requirements applicable to high-risk AI systems, the notified body must issue a TD assessment certificate. Where the AI system does not meet the requirements relating to the data used to train it, the notified body may require that the system be retrained.

Where a notified body has been involved, the CE marking is followed by the identification number of that notified body. The identification number must also be indicated in any promotional material mentioning that the high-risk AI system fulfils the requirements for CE marking.

Osborne Clarke comment

Life sciences businesses that do not have CE-marked products in their portfolio will have to familiarise themselves with the new requirements applicable to high-risk AI systems when they act as providers. Those who are already familiar with CE-marking requirements - for example, medical device manufacturers - will have to navigate a new set of rules specifically designed for AI-powered technologies.

The rules will supplement existing conformity assessment procedures under current EU legislation, and navigating the interplay between prevailing and new EU requirements will require considerable effort. That effort is likely to become particularly significant when a notified body is in play.

Pharmaceutical or medical device businesses that only use or deploy high-risk AI systems, without acting as provider, are not responsible for CE-marking high-risk products, but they will be required - among other things - to familiarise themselves with the instructions for use accompanying each high-risk AI system they deploy. This is to ensure that appropriate technical and organisational measures are taken to use the systems as required by the EU AI Act. Multiple additional requirements apply to these businesses under the regulation.

Over the coming months, this Insight series on the EU AI Act will explore its implications for businesses active across life sciences and healthcare. Coverage will include AI supply chains, product logistics, research and development, SMEs, compliance monitoring, liability, and more. The next Insight in the series will focus on healthcare AI and low-risk systems.


* This article is current as of the date of its publication and does not necessarily reflect the present state of the law or relevant regulation.
