Editor's Note
This article was published in the October 2024 issue of Pharmaceutical Technology® Europe.
The EMA and HMA have published principles and recommendations on the use of large language models (LLMs), which regulatory staff are increasingly using in their daily tasks.
On 5 Sept. 2024, the European Medicines Agency (EMA) and the Heads of Medicines Agencies (HMA) published high-level principles and recommendations for staff across the European Medicines Regulatory Network (EMRN) regarding the use of generative artificial intelligence (AI), namely large language models (LLMs) (1).
LLMs are widely used by regulatory staff in a range of daily tasks, including education and tutoring, language translation, searching and summarizing information, writing assistance, and coding support (2). However, LLMs may produce variable or inaccurate responses, and they can pose data security risks relating to confidential information, data protection, and privacy (2). The guidelines and principles therefore lay the foundation for the safe and responsible use of LLMs so that users “avoid pitfalls and risks” (1).
The EMA/HMA’s guiding principles aim to ensure that users understand the capabilities and limitations of LLMs and apply them safely and responsibly in their regulatory work (1).
Overall, the guiding principles provide a working framework to ensure LLMs are used safely and ethically and are effectively implemented in regulatory science and for medicines regulatory activities.
The guiding principles represent one of the deliverables from the HMA/EMA Multi-annual AI Workplan 2023–2028, published in December 2023 (3). The workplan guides EMA and the EMRN in their use of AI, maximizing the benefits while managing the risks and facilitating information sharing. The first version of the Multi-annual AI Workplan focuses on four critical dimensions—guidance, policy and product support; tools and technologies; collaboration and change management; and experimentation—and will be regularly updated under the oversight of the HMA-EMA Big Data Steering Group (BDSG) (3).
Over the coming months and years, EMA’s Committee for Medicinal Products for Human Use (CHMP) Methodology Working Party (MWP) will help to develop the AI Guidance in Medicines Lifecycle document, which will include domain-specific guidance on pharmacovigilance and other areas. From mid-2024, work was initiated on preparations for the European Artificial Intelligence Act (AI Act), which came into force in August 2024, and an observatory will be created to monitor the impact of AI and the emergence of novel systems and approaches (3,4).
The EMRN will survey its data analysis capabilities, including its use of AI, and the results will inform collaborative efforts to enhance the analytics capability of the EMRN while ensuring all parties comply with data protection legislation. In addition, a Network Tools policy will be published to help foster collaboration, integration, and reusability of tools and models within the EMRN (3).
The EMRN will continue to work with partners on AI on an international scale, including the International Coalition of Medicines Regulatory Authorities and other agencies in the European Union, as well as with academia and the devices sector. Furthermore, the European Specialised Expert Community of the EMA MWP will establish a Special Interest Area on AI to provide a forum for collaboration and knowledge sharing, and the EU-Network Training Centre will provide a framework and platform to upskill staff on AI and data analytics and support the development and delivery of the BDSG Data Science Curriculum (3).
EMA and HMA recognize that AI is a rapidly moving field, and it is essential for regulatory personnel to be provided with a structured approach to integrating these tools into their daily tasks while safeguarding information and maintaining trust. EMA is leading by example and has promptly responded to the implementation of the AI Act, which aims to ensure that AI developed and used in the EU is trustworthy and that there are sufficient safeguards in place to protect people’s fundamental rights. Pharma should take heed, as the AI Act will impact all industries, and it will be essential that companies work closely with the AI Office and other relevant authorities to ensure the safe deployment of AI solutions that will benefit society and drive future innovation in the European pharmaceutical industry (5).
Cheryl Barton, PhD, is founder and director of PharmaVision, Pharmavision.co.uk.
Pharmaceutical Technology® Europe
Vol. 36, No. 9
October 2024
Pages: 8–9
When referring to this article, please cite it as Barton, C. Responding to the Increased Use of Generative AI. Pharmaceutical Technology Europe 2024, 36 (9), 8–9.