New requirements of the Notified Bodies
AI-based software, just like classical software, is subject to the European Medical Device Regulation 2017/745 (MDR) if it is to be placed on the market in Europe as a medical device. In simple terms, manufacturers of medical device software must primarily fulfil the general obligations of Art. 10 MDR as well as the General Safety and Performance Requirements (GSPRs) of Annex I MDR, whose Section 17 contains the software-specific requirements.
Since the MDR does not contain any AI-specific requirements, manufacturers are left uncertain about what additional documentation Notified Bodies expect to see as part of the conformity assessment, and what additional effort this requires.
New question catalogue of the German Notified Bodies
In 2020, the Interest Group of Notified Bodies for Medical Devices in Germany published the second version of its question catalogue “Artificial Intelligence in Medical Devices”. Although the question catalogue is not legally binding and has not yet been incorporated into the relevant guidance of the European Medical Device Coordination Group (MDCG), we believe it is an important guideline for preparing MDR-compliant documentation. Essentially, the formulated requirements fall into two categories:
- additional requirements for regulatory fields such as manufacturer information, usability, information security, clinical evaluation/follow-up, risk management, quality management and post-market surveillance, for which standards and/or MDCG guidance documents already exist.
- new requirements in the area of software lifecycle processes, for which the relevant standards IEC 62304 and IEC 82304-1 do not provide concrete guidance in their current form.
How can the requirements be implemented?
The additional requirements of the first category can easily be integrated into the manufacturer's existing processes. Post-market surveillance shall serve as an example here. In the associated plan according to ISO/TR 20416, the manufacturer specifies the data sources for post-market surveillance and how their results are assessed. For AI-based software, the common sources should be supplemented by “performance monitoring in the operation of AI-based software” and “data quality monitoring in the operation of AI-based software”. These should be analysed quantitatively and with regard to possible trends.
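As a minimal sketch of what such a quantitative and trend-based analysis could look like, the following Python snippet checks a hypothetical monthly performance metric (here: sensitivity of a deployed model) against an acceptance floor and a maximum tolerated decline per period. The metric name, thresholds and values are illustrative assumptions, not requirements from the question catalogue.

```python
from statistics import mean

def trend_slope(values):
    """Least-squares slope of a metric series over equally spaced periods."""
    n = len(values)
    xs = range(n)
    x_bar, y_bar = mean(xs), mean(values)
    num = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, values))
    den = sum((x - x_bar) ** 2 for x in xs)
    return num / den

def assess_post_market_metric(values, floor, max_decline_per_period):
    """Flag a monitored metric if it falls below the acceptance floor or
    shows a downward trend steeper than the tolerated decline per period."""
    findings = []
    if min(values) < floor:
        findings.append("below acceptance floor")
    if trend_slope(values) < -max_decline_per_period:
        findings.append("negative trend")
    return findings

# Hypothetical monthly sensitivity of a deployed AI model
monthly_sensitivity = [0.94, 0.93, 0.93, 0.91, 0.90, 0.88]
print(assess_post_market_metric(monthly_sensitivity,
                                floor=0.90,
                                max_decline_per_period=0.005))
```

A finding from such a check would then feed back into the manufacturer's post-market surveillance and risk management processes, for example as a trigger for a root-cause analysis.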
The second category mainly contains new requirements for AI-based software in the fields of data management and model development. The established software lifecycle processes (software development, software release, software maintenance and software decommissioning) can still be applied, but they need to be extended by the additional process “AI model”, as we propose in the figure below.
The new process “AI Model” also introduces new documents, namely “Data Management AI Model” and “Development AI Model”. And don’t worry: for both categories, it is possible to integrate the tools already in use into the manufacturer’s processes.
Certifiability of AI-based software
We are also repeatedly asked whether continuously learning AI-based software (dynamic AI) can be CE-certified at all. The Notified Bodies take a clear position on this in the question catalogue: “In principle [dynamic AI] is not certifiable, as the system must be verified and validated (among other things, the functionality must be validated on the basis of the intended use)”. There may also be restrictions on certifiability for a static AI if it is a “black box system” (cf. Art. 22 and 35 of the General Data Protection Regulation (GDPR) as well as No. 17.2 in Annex I MDR). In that case, according to the question catalogue, it is a case-by-case decision by the Notified Body involved.
The MDCG’s schedule includes the guidance “Artificial Intelligence under MDR/IVDR framework”, unfortunately with an unspecified publication date. Furthermore, the EU Commission published its proposal for an “Artificial Intelligence Act” on 21 April 2021. Here are the main points:
– Aim: To provide a high level of protection of health, safety and fundamental rights and to ensure the free movement of AI-based goods and services.
– Classification as high-risk products: AI-based medical devices and AI systems intended to be used for, or to establish a priority for, the shipment of […] medical supplies (Art. 6)
– Requirements for high-risk devices: quality of data sets, technical documentation and records, transparency and provision of information to users, human oversight, and robustness, accuracy and cybersecurity.
It remains to be seen what the final version of this new European regulation will look like and in what way it will affect the medical device sector.