Guidance on Artificial Intelligence research
Regulatory framework:
Artificial Intelligence (AI) is the use of digital technology to create systems capable of performing tasks commonly thought to require human intelligence.
The use of software, and of AI specifically, across health and social care is becoming increasingly prominent. Software and AI with a medical purpose can have a considerable impact on health services and are regulated as medical devices, to ensure they are safe and function as intended and to protect patients.
Note 1: consult the guidance on crafting an intended purpose for Software as a Medical Device (SaMD).
In 2021, the Medicines and Healthcare products Regulatory Agency (MHRA) released a Roadmap for Software and AI as a Medical Device to provide a vision of future changes, but as it stands there is no further guidance or regulation. As such, the Medical devices: software applications (apps) guidance should be used as the regulatory guide for determining which software applications are considered medical devices and their respective regulatory pathway.
Note 2: the International Medical Device Regulators Forum (IMDRF) aims to harmonize and converge international medical device regulations across different authorities, e.g., the United States of America, United Kingdom, European Union, Australia and Canada. The IMDRF has published guidance on Software as a Medical Device (SaMD): Clinical Evaluation, which provides useful insights on clinical verification and validation requirements.
Additionally, the U.S. Food and Drug Administration (FDA), Health Canada and the United Kingdom's MHRA have jointly identified ten guiding principles of Good Machine Learning Practice to help promote safe, effective and high-quality medical devices that use AI and machine learning. The MHRA, FDA and Health Canada have also jointly identified five guiding principles for the use of Predetermined Change Control Plans (PCCPs); these principles are intended to ensure that future guidance on PCCPs aligns internationally across these jurisdictions. Please consult the MHRA's available information on Software and Artificial Intelligence (AI) as a Medical Device.
Note 3: if developing an AI algorithm, you will need to assess the diversity of the data used to develop it and take steps to reduce potential bias. Consider carrying out an AI Impact Assessment (AIA); consult the AIA guidance from the Ada Lovelace Institute and publications discussing this issue, such as Health Care AI Systems Are Biased (2020), Geographic Distribution of US Cohorts Used to Train Deep Learning Algorithms (2020) and Algorithmic fairness in artificial intelligence for medicine and healthcare (2023), among others.
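To make the data-diversity step above more concrete, the short Python sketch below shows one way a first-pass check of subgroup representation and subgroup performance might be carried out. It is a rough illustration only, not part of any regulatory requirement or of the Ada Lovelace Institute's AIA, and the dataset and column names it assumes (sex, ethnicity, label, prediction) are hypothetical.

```python
# Illustrative first-pass checks ahead of an AI Impact Assessment.
# The dataset and column names ("sex", "ethnicity", "label", "prediction")
# are hypothetical; adapt them to the development data actually in use.
import pandas as pd


def diversity_summary(df: pd.DataFrame, group_cols: list[str]) -> pd.DataFrame:
    """Count and share of the development dataset held by each subgroup."""
    counts = df.groupby(group_cols).size().rename("n").reset_index()
    counts["share"] = counts["n"] / len(df)
    return counts.sort_values("share", ascending=False)


def per_group_accuracy(df: pd.DataFrame, group_col: str) -> pd.Series:
    """Accuracy of the model's predictions within each subgroup."""
    return (df["prediction"] == df["label"]).groupby(df[group_col]).mean()


# Example usage with a hypothetical development dataset:
# df = pd.read_csv("development_set.csv")
# print(diversity_summary(df, ["sex", "ethnicity"]))
# print(per_group_accuracy(df, "ethnicity"))
```

Summary tables of this kind can help document where subgroups are under-represented or where performance gaps exist, and can be included as supporting evidence in the AIA before mitigation steps are chosen.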
Note 4: The MHRA AI Airlock is intended to launch on 23rd July 2024, following the project webinar. It is a regulatory sandbox process that aims to offer a unique and safe learning space for manufacturers to work with regulators and other parties to explore new, cutting-edge solutions to the challenges of regulating AI as a medical device.
Note 5: The MHRA have published a policy paper on 'The Impact of AI on the regulation of medical products'.
Application in Healthcare/NHS:
If working on AI for healthcare, the NHS AI Lab provides guidance on AI development, case studies and challenges faced by researchers. A linked resource published in September 2020 contains the expected reporting guidelines for clinical trial protocols involving AI interventions (Guidelines for clinical trial protocols for interventions involving artificial intelligence: the SPIRIT-AI extension).
Furthermore, the NHS AI and Digital Regulations Service for health and social care provides comprehensive information for developers on the regulation of both medical and non-medical devices. If the digital technology is intended for use in the NHS, it will need to meet the safety standards set by NHS Digital; a useful guide can be consulted on the NHS Digital website.
Contact details for more information and/or advice:
If you have any questions related to clinical investigations of medical devices, do not hesitate to contact ACCORD at enquiries@accord.scot.