Welcome to the latest installment of Arnold & Porter’s Virtual and Digital Health Digest. This digest covers key virtual and digital health regulatory and public policy developments during May and early June 2025 from the United Kingdom and European Union.
There has been a lot of focus on AI this month. The European Commission has launched a consultation on high-risk AI systems, a category that includes medical devices, making it highly relevant for digital health companies. The European Medicines Agency has published a workplan on data and AI use, which sets out how the European Medicines Regulatory Network plans to leverage large volumes of regulatory and health data to support regulatory decision-making for better medicines. International guidance has also been published on the use of AI in pharmacovigilance. However, there has also been controversy: as the UK Data (Use and Access) Bill continues through the parliamentary process, disagreement persists over its treatment of copyright-protected material in the development of AI systems. As AI use expands and authorities seek to put in place legislation and guidance that keeps pace with the speed of development, expect this focus to continue.
Regulatory Updates
European Commission Launches Public Consultation on High-Risk AI Systems. The consultation will collect practical examples and seek to clarify issues relating to high-risk AI systems, a category that includes medical devices. This feedback will be taken into account in the upcoming European Commission guidelines, which will focus on the classification of high-risk AI systems and on the requirements and obligations that apply to high-risk AI systems and to operators across the supply chain. The consultation also asks whether the list of high-risk use cases needs to be amended. The consultation is open until July 18, 2025.
European Medicines Agency and Heads of Medicines Agencies Publish 2025-2028 Workplan on Data and AI Use. The workplan was developed for the EU Network Data Steering Group, an advisory group tasked with maximizing data interoperability, access to data, and the use of AI within the European Medicines Regulatory Network (i.e., the EU national competent authorities, the European Medicines Agency, and the European Commission). The workplan outlines targeted actions across six areas:
- Strategy and governance (e.g., developing a new data strategy)
- Data analytics (e.g., launching a pilot on clinical study data)
- AI (e.g., developing a framework for AI tool sharing and collaboration)
- Data interoperability (e.g., developing a data catalog for critical data assets)
- Stakeholder engagement and change management (e.g., developing a data change management plan)
- Guidance and international initiatives (e.g., developing guidance on AI use in medicinal products)
Draft Guidance Published on Best Practices for Using AI in Pharmacovigilance. The Council for International Organizations of Medical Sciences, which represents the international biomedical scientific community, has published draft guidance setting out six guiding principles that should be considered by pharmacovigilance (PV) departments or organizations developing AI solutions for PV. The six principles are: (1) a risk-based approach, (2) human oversight, (3) validity and robustness, (4) transparency, (5) data privacy, and (6) governance and accountability. The report proposes best practices for integrating and implementing AI within PV to ensure AI is used ethically and reliably. Feedback on the draft guidance can be submitted until June 6, 2025.
Privacy and Cybersecurity Updates
European Commission Proposes Simplified EU General Data Protection Regulation (GDPR) Obligation for Small- and Medium-Sized Enterprises (SMEs) and Small Mid-Cap Companies (SMCs). The proposal uses the existing EU definition of SMEs (i.e., companies with fewer than 250 employees and either an annual turnover of under €50 million or a total balance sheet below €43 million), and defines SMCs as organizations that do not meet the SME definition but fall within size thresholds approximately three times those of SMEs. The proposal limits the GDPR obligation for data controllers and processors to maintain records of processing activities (ROPA) for SMEs and SMCs to cases where processing is likely to pose a high risk to individuals. However, processing special categories of data, such as health data, may involve a high risk, so life sciences SMEs and SMCs may not be exempt from the ROPA obligation. The proposal now needs to be adopted by the European Parliament and the Council of the European Union, which may further amend the GDPR. You can read more in our May 2025 BioSlice Blog.
UK Government Publishes Code of Practice for Software Vendors (Code). In our April 2025 digest, we reported that the UK government had published its response to its call for views on the Code; the Code was published in its final form on May 7, 2025. The Code, although voluntary, outlines the government’s expectations for the security and resilience of organizations’ software through 14 principles across four themes, including secure design and development, secure deployment and maintenance, and communication with customers. The principles are seen as fundamental and achievable for organizations of any size across different sectors. The Code aims to support both vendors and users by establishing a minimum level of software security and resilience across the market, thereby reducing the occurrence of supply chain attacks and other issues.
IP Updates
UK Court Provides Guidance on Lawful Reverse Engineering Versus Contractual Breach. On March 10, 2025, the High Court handed down judgment in IBM United Kingdom Ltd v LzLabs GmbH and others [2025] EWHC 532 (TCC), which included a claim for breach of a software licensing agreement between IBM and a subsidiary of LzLabs, and a further claim of conspiracy to develop software via reverse engineering of the licensed software.
In reaching its decision in favor of IBM, the court:
- Set out how reverse engineering restrictions in licensing agreements should be construed so as to be compatible with UK copyright law, and are therefore enforceable
- Considered the meaning and scope of the contractually agreed restrictions on reverse engineering in light of the defendants’ conduct
- Delineated the narrow scope of the statutory exceptions permitting reverse engineering for the purposes of achieving interoperability and of observing, studying, and testing computer programs with a view to determining the ideas and elements that underpin them
The court’s analysis in this case provides important guidance on lawful reverse engineering and other key takeaways for developers of proprietary software as a medical device, manufacturers of connected medical devices, and licensees of software with medical applications more generally.
The UK’s Data (Use and Access) Bill Sparks AI Copyright Controversy. UK information law reform is nearing the final stages of the parliamentary process through the Data (Use and Access) Bill (DUA Bill), which, among other things, seeks to facilitate lawful data sharing across industry sectors with the aim of supporting innovation. See our May 2025 Advisory for information regarding the likely impact on UK data protection compliance for businesses.
Most recently, the House of Lords has raised concerns about the DUA Bill’s treatment of copyright-protected material in the development of AI systems and its failure to require AI developers to seek consent or disclose information regarding the copyright-protected text and data used in pre-training, training, and fine-tuning AI models. The UK government argues that adding such restrictions to the DUA Bill could stifle AI development and harm the UK’s competitive standing in global technology. As the bill progresses toward Royal Assent, the tension between enabling AI innovation and protecting intellectual property remains unresolved. Digital health organizations developing or deploying AI systems should monitor developments to ensure appropriate compliance.