Spurred, in part, by the COVID-19 pandemic and the need for new ways to reach patients at home, 2023 saw a boom in digital technologies and healthcare solutions: one-stop-shop telemedicine platforms, app-based remote patient monitoring, direct-to-consumer online pharmacies, software-based medical devices, and artificial intelligence/machine learning to bolster the delivery of telehealth services. Then came a robust government response. In the EU and UK, regulatory bodies grappled with the introduction of machine learning, AI, and other software into healthcare services through, for example, new guidance from the EU Medical Device Coordination Group and the UK Medicines and Healthcare products Regulatory Agency on software medical devices, the EU’s AI Act and the UK government’s AI White Paper, the European Medicines Agency’s reflection paper on the use of AI in the product lifecycle, the EU-U.S. Data Privacy Framework and the equivalent UK-U.S. data bridge, and the European Health Data Space.
We call this the “Race to Regulate.” This push-pull dynamic between digital health innovation and government regulation is key to evaluating regulatory risks in today’s shifting legal landscape. This digest aims to keep pace with these changes and to provide an overview of the key guidelines and developments as the landscape evolves. As we come to the end of 2023 and publish our latest Digest, join us on December 13 as we unpack pivotal moments in the 2023 Race to Regulate and discuss what’s next for virtual and digital health.
Regulatory Updates
The MDCG Adopts Guidance on Medical Device Software Intended To Work in Combination With Hardware or Hardware Components. On October 18, the Medical Device Coordination Group (MDCG) adopted guidance on Medical Device Software (MDSW) intended to work in combination with hardware or hardware components that generate or provide input data to the software. For example, MDSW downloaded to or available on wearables (e.g., smartwatches or augmented reality goggles) may achieve its intended purpose by receiving and analyzing data provided by hardware or a hardware component (e.g., a camera or optical sensors). The guidance clarifies how to identify whether the hardware or hardware component is regulated as a medical device or as an accessory to a medical device, and how to comply with the respective regulatory requirements, by setting out the qualification criteria and appropriate regulatory pathway for the hardware. It also sets out the three regulatory options for manufacturers of such products:
- The hardware or hardware component is placed on the market as an accessory to an MDSW.
- The hardware or hardware component is placed on the market as a medical device, either (1) as part of a system, (2) in combination with another medical device, or (3) as an integral part of a medical device.
- The hardware or hardware component is an integral part of a general consumer product or wearable digital product that has no intended medical purpose and is neither a medical device nor an accessory to a medical device.
UK Regulatory Sandbox Coming Soon. On October 30, the MHRA announced that it aims to launch the “AI-Airlock” in April 2024. The AI-Airlock will be a novel regulatory sandbox that will allow developers of software and AI medical devices to test their products in a safe environment, generate robust evidence for regulatory submissions, and address any challenges with evaluating a technology’s safety and efficacy. The sandbox will be monitored by the MHRA and will take a collaborative approach involving regulators, developers, academia, and the NHS. The hope is that the AI-Airlock will ultimately give patients faster access to new technologies.
UK Government Announces Large Investments in Innovative Technologies. On October 3, the UK Department of Health and Social Care (DHSC) announced a £30 million investment to support the rollout of innovative technologies for the NHS. According to the DHSC, this investment will help ease the pressures on the NHS this winter and could include expanding virtual wards, investing in wearable medical devices for use by patients at home to aid the diagnosis and management of chronic conditions, and investing in diagnostic imaging technologies. On October 29, the Prime Minister announced the launch of a £100 million investment in AI in healthcare, particularly in areas such as dementia, mental health, and oncology. Finally, on October 30, the DHSC announced £21 million of funding to deploy AI tools to speed up the diagnosis and treatment of lung cancer.
CPI Report Reveals Challenges and Opportunities for UK MedTech. On October 23, the UK Centre for Process Innovation (CPI) published two reports calling for an urgent MedTech industrial strategy to avoid the UK falling behind in the rapidly growing HealthTech sector. The first report, written in collaboration with the Association of British HealthTech Industries, is titled “Challenges and Opportunities for UK HealthTech Manufacturing Scale Up.” It highlights that many companies may be moving from the UK to other countries to benefit from more competitive pricing and more flexible manufacturing processes. The second report, titled “An Action Plan: Driving Growth of the UK Digital Health Industry,” maps the changes that may be needed for the UK to maximize its global potential in the digital health market.
Digital Transformations for Health Lab Launched During World Health Summit. On October 16, the Digital Transformations for Health Lab (DTH-Lab) was launched during the World Health Summit. The DTH-Lab is a global consortium that will implement the report of the Lancet and Financial Times Commission on Governing Health Futures 2030. The report sets out four actions to address health inequalities and promote public health in the era of digitalization:
- Recognize digital technologies as determinants of health.
- Build a governance architecture that creates trust in digital health.
- Develop a new approach to the collection and use of health data based on data solidarity.
- Invest in digitally transformed health systems.
These recommendations will now be implemented by the DTH-Lab, which will explore how digital and AI transformations can improve health and well-being and strengthen citizenship and empowerment.
WHO Publishes Guidance on Regulatory Principles Applicable To Use of AI in Health. On October 18, the World Health Organization (WHO) published guidance on “Regulatory considerations on artificial intelligence for health.” The publication aims to outline key principles that governments and regulatory authorities can follow to develop new guidance or adapt existing guidance on AI at national or regional levels. The new guidance emphasizes the importance of establishing AI systems’ safety and effectiveness, rapidly making appropriate systems available to those who need them, and fostering dialogue among stakeholders, including developers, regulators, manufacturers, health workers, and patients. It outlines six areas for regulation of AI for health: transparency and documentation; risk management; validating data and being clear about intended use; data quality; privacy and data protection; and collaboration between relevant bodies and individuals.
G7 Leaders Agree on Guiding Principles and Voluntary Code of Conduct for AI Developers. On October 30, G7 leaders agreed on International Guiding Principles on Artificial Intelligence and a voluntary Code of Conduct for AI developers under the Hiroshima AI process. These principles and the voluntary Code of Conduct will complement, at an international level, the legally binding rules that the EU co-legislators are currently finalizing under the EU AI Act. The aim of the Code of Conduct and the Guiding Principles is to promote safe and trustworthy AI. As discussed in our September Digest, the voluntary Code of Conduct will provide practical guidance and attempt to create a non-binding rulebook for AI developers. Both documents will be reviewed and updated as necessary, including through multistakeholder consultations, to ensure they remain fit for purpose and responsive to this rapidly evolving technology.
Privacy Updates
GC Dismisses Request for Interim Relief Sought Against the EU-U.S. Data Privacy Framework. On October 12, the European General Court dismissed the application for interim measures lodged by a French member of the European Parliament, Philippe Latombe, to suspend the application of the EU-U.S. Data Privacy Framework (the Data Bridge), which was discussed in the October Digest. The General Court dismissed the application on the grounds that the urgency required for the adoption of such measures had not been demonstrated. Accordingly, the Data Bridge remains fully applicable for the time being. However, the dismissal has been appealed, and it is not yet clear when the appeal will be determined. The outcome of these pending proceedings is relevant not only for entities relying on the Data Bridge, but also for those relying on the UK-U.S. Data Bridge, which, as discussed in our October Digest, is an extension of the EU Data Bridge.
Updated Code of Practice for Operators and Developers of Apps. On October 13, the UK’s Department for Science, Innovation and Technology published an updated version of the code of practice for app store operators and app developers (Code). The Code was first published on December 9, 2022, with the aim of setting out minimum security and privacy requirements for apps to protect users. As mentioned in our January Digest, the Code applies to all apps, including health-related apps. Some of the changes include:
- Principle 2.7: Instead of the previous requirement that developers should provide users with a mechanism to delete locally held data, developers need only provide a mechanism for users to request deletion of their personal data.
- Principles 3.1 and 3.3.1: The vulnerability disclosure process, which the developer must create and maintain for every app, must be accessible within the app store.
- Principle 8.1: The reporting process for personal data breaches has been clarified such that the operator must inform the developer, and the developer informs other relevant stakeholders.
Operators and developers were initially granted nine months to implement the Code, but based on feedback that some provisions required clarification and that certain barriers to implementation existed, this has been extended by a further nine months. Operators and developers should now comply with the Code by June 2024.
Opinion From the EDPS on the AI Act. On October 23, the European Data Protection Supervisor (EDPS) adopted Opinion 44/2023 on the EC proposal for the AI Act in the light of legislative developments. Details on the AI Act can be found in our Advisories here and here. The EDPS sets out a number of recommended changes to the proposal. These include:
- Broadening the scope of the AI Act (e.g., to high-risk AI systems existing prior to its application date)
- Introducing explicit prohibitions on certain uses of AI systems (e.g., using AI to infer emotions, except for health or research purposes)
- Introducing additional specifications for high-risk AI systems
- Clarifying elements for cross-border cases involving AI systems (e.g., definition of national territory)
- Clarifying the tasks, duties, and powers of the authorities involved in the implementation of the AI Act, including those of the EDPS
Product Liability Updates
European Parliament Adopts Negotiating Position on the New EU Product Liability Directive. On October 18, the European Parliament (EP) adopted its negotiating mandate on the European Commission’s (EC) proposal for the revised Product Liability Directive (PLD), as discussed in our November 2022 Digest. The EP’s proposed revisions to the PLD are set out in a report dated October 12, 2023. Key changes include clarification that the PLD will not apply to free and open-source software, an extension of the limitation period to 30 years for latent defects, and clarification that the liability of economic operators that make substantial modifications to a product should be limited to the modified part of the product only. The European Council’s negotiating position was published in June 2023 (discussed in our July Digest), and so on October 23, the EC, European Council, and EP began trialogue negotiations to agree on the final text of the PLD. The next trialogue is likely to take place in December 2023.
Statement From Industry on the Proposed EU Product Liability Directive. On October 23, the European Federation of Pharmaceutical Industries and Associations, MedTech Europe, and others published an industry statement calling for “a major rethink” of the EC’s proposal for a revised PLD. The industry argues that, as currently proposed, the PLD is unbalanced and too consumer-friendly. For example, industry notes that the current draft disproportionately shifts the burden of proof onto defendants and could lead to abusive disclosure exercises. The industry also calls for compensation thresholds to be reintroduced and for further investigation into the effects of including software in the strict liability regime. Overall, the industry is concerned that the PLD would lead to an increase in litigation, a reduction in innovation, and much greater uncertainty for businesses.
Opinion From the EDPS on the AI Liability Directive. On October 11, the EDPS adopted Opinion 42/2023 on the EC proposals for the revised PLD and the AI Liability Directive. The proposed AI Liability Directive aims to ensure that victims of damage caused by AI can obtain protection equivalent to that available to victims of damage caused by other products. The EDPS fully endorses this aim and sets out a number of recommended changes to the proposal. These include:
- Ensure individuals that suffer damage caused by AI systems produced or used by EU institutions enjoy the same protection as if the damage were caused by AI systems produced or used by private entities or national authorities.
- Extend the disclosure of evidence mechanism and the rebuttable presumption of a causal link to all AI systems, not just those defined as “high-risk.”
- State that the proposal is without prejudice to the EU GDPR, such that individuals can obtain redress through different avenues.