Developments in product liability law are always potentially significant for pharmaceutical and medical device manufacturers.

On 13 March 2024, the European Parliament adopted new EU consumer protection legislation to repeal and replace the EU Product Liability Directive 85/374/EEC, which has been in force for almost 40 years. Once the new legislation has been approved by the Council of the EU it will become law, and is likely to come into force in around mid-2026. The intention is for EU consumers to have easier access to compensation for harm caused by defective products.

The International Comparative Legal Guide (ICLG) on Product Liability Laws and Regulations 2024 is now available, and we have prepared:

Continue Reading Implications of the New EU Product Liability Directive

The UK Medicines and Healthcare products Regulatory Agency (MHRA) has published its strategic approach to artificial intelligence (AI). The publication is in response to a request from the Secretaries of State for Science, Innovation and Technology (DSIT) and for Health and Social Care (DHSC) dated 1 February 2024, in which the MHRA was asked to provide details of the steps it is taking in accordance with the principles and expectations of the Government’s pro-innovation approach set out in the white paper published in 2023. Further information is set out in our previous post.

The strategy provides information on how the MHRA views the risks and opportunities of AI from three perspectives:

  • MHRA as a regulator of AI products
  • MHRA as a public service organisation delivering time-critical decisions
  • MHRA as an organisation that makes evidence-based decisions that impact on public and patient safety, where that evidence is often supplied by third parties

The document is likely to be of particular interest to manufacturers of AI as a medical device (AIaMD), as it sets out in detail current and proposed regulations and guidance, and the areas where these are likely to be tightened. Following the launch of the strategic approach, the government also published details of the AI Airlock regulatory sandbox, discussed in another post.

With a raft of measures relating to AI being published and additional measures expected in the next couple of years, pharmaceutical and medical device companies operating in the UK need to continually review how they will be impacted and respond appropriately.

Continue Reading MHRA sets out its AI regulatory strategy

On 19 July 2023, the European Medicines Agency (EMA) published a draft Reflection paper on the use of artificial intelligence (AI) in the lifecycle of medicines (the Paper). The Paper recognises the value of this technology as part of the digital transformation within healthcare, and acknowledges its increasing use and potential to “support the acquisition, transformation, analysis, and interpretation of data within the medicinal product lifecycle”, provided of course it is “used correctly”.

The Paper reflects EMA’s early experience with and considerations on the use of AI, and gives a sense of how EMA expects applicants and holders of marketing authorisations to use AI and machine learning (ML) tools. The EMA has made clear that the use of AI should comply with existing rules on data requirements as applicable to the particular function that the AI is undertaking. It is clear that any data generated by AI/ML will be closely scrutinised by the EMA, and a risk-based approach should be taken depending on the AI functionality and the use for which the data is generated.

The Paper is open for consultation until 31 December 2023. EMA also plans to hold a workshop on 20-21 November 2023 to further discuss the draft Paper, and intends to use the feedback from the public consultation to finalise the Paper and produce future detailed guidance. Our summary below sets out the key takeaways and issues arising in the Paper.

Continue Reading EMA publishes first draft of reflection paper on the use of AI in the medicinal product lifecycle

On June 14, 2023, an overwhelming majority of the European Parliament (Parliament) voted to pass the Artificial Intelligence Act (AI Act), marking another major step toward the legislation becoming law. As we previously reported, the AI Act regulates artificial intelligence (AI) systems according to risk level and imposes highly prescriptive requirements on systems considered to be high-risk. The AI Act has a broad extraterritorial scope, sweeping into its purview providers and deployers of AI systems regardless of whether they are established in the EU. Businesses serving the EU market and selling AI-derived products or deploying AI systems in their operations should continue preparing for compliance.

Now, the Parliament, Council, and Commission have embarked on the trilogue, a negotiation among the three bodies to arrive at a final version for ratification by the Parliament and Council. They aim for ratification before the end of 2023 with the AI Act to come into force two (or possibly three) years later.

In our recent advisory, we summarize the major changes introduced by the Parliament and guide businesses on preparing for compliance with the substantial new mandates the legislation will impose.

Continue Reading European Parliament Adopts Its Version of AI Act

Welcome to the first installment of Arnold & Porter’s Virtual and Digital Health Digest. This inaugural edition covers September and October highlights across the virtual and digital health space. This newsletter focuses on key virtual and digital health and telehealth-related developments in the United States, United Kingdom, and European Union in the healthcare, regulatory, privacy, and corporate transactions space.
Continue Reading Virtual and Digital Health Digest

The MHRA is continuing to publish details on how software and AI medical devices will be regulated in the UK post-Brexit, with the aim of making the UK an attractive place to launch such products. The MHRA’s recent updates to its ‘Software and AI as a Medical Device Change Programme’ (the Change Programme) intend to “deliver bold steps to provide a regulatory framework that provides a high degree of protection for patients and public, but also makes sure that the UK is recognised globally as a home of responsible innovation for medical device software looking towards a global market”.

The MHRA has also recently announced it will extend the period during which EU CE marks on medical devices (including for software) will be accepted on the UK market, until July 2024.

We set out an overview of these updates below.

Continue Reading Latest on software and AI devices from the MHRA

There is currently no specific legislation in the UK that governs AI, or its use in healthcare. Instead, a number of general-purpose laws apply. These laws, such as the rules on data protection and medical devices, have to be adapted to specific AI technologies and uses. They sometimes overlap, which can cause confusion for businesses trying to identify the relevant requirements that have to be met, or to reconcile potentially conflicting provisions.

As a step towards a clearer, more coherent approach, on 18 July, the UK government published a policy paper on regulating AI in the UK. The government proposes to establish a pro-innovation framework of principles for regulating AI, while leaving regulatory authorities discretion over how the principles apply in their respective sectors. The government intends the framework to be “proportionate, light-touch and forward-looking” to ensure that it can keep pace with developments in these technologies, and so that it can “support responsible innovation in AI – unleashing the full potential of new technologies, while keeping people safe and secure”. This balance is aimed at ensuring that the UK is at the forefront of such developments.

The government’s proposal is broadly in line with the MHRA’s current approach to the regulation of AI. In the MHRA’s response to the consultation on the medical devices regime in the UK post-Brexit, it announced similarly broad-brush plans for regulating AI-enabled medical devices. In particular, no definition of AI as a medical device (AIaMD) will be included in the new UK legislation, and the regime is unlikely to set out specific legal requirements beyond those being considered for software as a medical device. Instead, the MHRA intends to publish guidance that clinical performance evaluation methods should be used for assessing safety and meeting essential requirements of AIaMD, and has also published the Software and AI as a Medical Device Change Programme to provide a regulatory framework with a high degree of protection for patients and the public.

Continue Reading UK Policy Paper on regulation of AI

On 7 April 2020, the European Medicines Agency (EMA) issued a Notice to sponsors on validation and qualification of computerised systems used in clinical trials (Notice). This Notice was developed by the EMA’s GCP Inspectors Working Group (IWG) and the Committee for Medicinal Products for Human Use (CHMP) to highlight for clinical trial sponsors the legal and regulatory requirements which apply to software tools used in the conduct of clinical trials.

In addition, the EMA updated the Answers to Questions 8 and 9 of the Agency’s Q&A on Good Clinical Practice (GCP) (GCP Q&A) in line with the Notice.

Continue Reading EMA’s Notice on validation and qualification of software tools used in clinical trials

Data-driven technologies, particularly artificial intelligence and other complex algorithms, have the potential to enhance patient care and catalyse medical breakthroughs. However, these technologies are heavily reliant on data, which poses challenges in ensuring that patient information is handled in a safe, secure and legally compliant way.

In response to early issues with the deployment of artificial intelligence and other algorithmic tools in healthcare, on 5 September 2018 the UK Department of Health & Social Care (DH) published an Initial Code of Conduct for Developers and Suppliers of Data-driven Health and Care Technology (the Code). The Code is not legally binding but aims to raise standards by establishing best practices.

Continue Reading UK guidance for developers of health care software and technologies

On 28 June, the Advocate General of the Court of Justice of the European Union gave his opinion in the SNITEM and Philips France case against France. In this case, the Conseil d’Etat in France asked whether a particular software program intended to be used by doctors to support prescribing decisions falls within the definition of a medical device under Directive 93/42/EEC (the Medical Devices Directive).

Continue Reading Advocate General’s opinion on software as medical devices