Responding to the Increased Use of Generative AI


European regulators, EMA and HMA, have published principles and recommendations on the use of LLMs, which are increasingly being used for daily tasks.

Map of Europe | Image credit: ©beugdesign - Stock.adobe.com

On 5 Sept. 2024, the European Medicines Agency (EMA) and the Heads of Medicines Agencies (HMA) published high-level principles and recommendations for staff across the European Medicines Regulatory Network (EMRN) regarding the use of generative artificial intelligence (AI), namely large language models (LLMs) (1).

Editor's Note

This article was published in the October 2024 issue of Pharmaceutical Technology® Europe.

LLMs are widely used by regulatory staff for various daily tasks, including education and tutoring, language translation, searching for and summarizing information, writing assistance, and coding support (2). However, LLMs may produce variable and inaccurate responses, and they can present data security risks relating to confidential information, data protection, and privacy (2). The guidelines and principles therefore lay the foundation for the safe and responsible use of LLMs so that users “avoid pitfalls and risks” (1).

Guiding principles

The EMA/HMA’s guiding principles aim to ensure that users do the following:

  • take appropriate measures to ensure safe input of data
  • apply critical thinking and cross-check outputs
  • continuously learn how to use LLMs effectively
  • know who to consult when facing concerns and report issues (2).

In addition, the organizational principles aim to:

  • define governance that supports safe and responsible use, specifying permitted use cases, providing training, and monitoring risks
  • help users maximize value from LLMs
  • collaborate and share experiences (2).

Overall, the guiding principles provide a working framework to ensure LLMs are used safely and ethically and are effectively implemented in regulatory science and for medicines regulatory activities.

HMA-EMA multi-annual AI workplan 2023–2028

The guiding principles represent one of the deliverables from the HMA/EMA Multi-annual AI Workplan 2023–2028, published in December 2023 (3). This workplan guides EMA and EMRN in their use of AI, maximizing the benefits while managing the risks and facilitating information sharing. The first version of the Multi-annual AI Workplan focuses on four critical dimensions—guidance, policy and product support; tools and technologies; collaboration and change management; and experimentation—and will be regularly updated under the oversight of the HMA-EMA Big Data Steering Group (BDSG) (3).

Over the coming months and years, EMA’s Committee for Medicinal Products for Human Use (CHMP) Methodology Working Party (MWP) will help to develop the AI Guidance in Medicines Lifecycle document, which will include domain-specific guidance on pharmacovigilance and other areas. Work was initiated in mid-2024 on preparations for the European Artificial Intelligence Act (AI Act), which came into force in August 2024, and an observatory will be created to monitor the impact of AI and the emergence of novel systems and approaches (3,4).

The EMRN will survey its data analysis capabilities, including its use of AI, and the results will inform collaborative efforts to enhance the analytics capability of the EMRN while ensuring all parties comply with data protection legislation. In addition, a Network Tools policy will be published to help foster collaboration, integration, and reusability of tools and models within the EMRN (3).

The EMRN will continue to work with international partners on AI, including the International Coalition of Medicines Regulatory Authorities and other agencies in the European Union, as well as with academia and the devices sector. Furthermore, the European Specialised Expert Community of the EMA MWP will establish a Special Interest Area on AI to provide a forum for collaboration and knowledge sharing, and the EU-Network Training Centre will provide a framework and platform to upskill staff on AI and data analytics and support the development and delivery of the BDSG Data Science Curriculum (3).

EMA embracing AI and leading by example

EMA and HMA recognize that AI is a rapidly moving field and that it is essential to provide regulatory personnel with a structured approach to integrating these tools into their daily tasks while safeguarding information and maintaining trust. EMA is leading by example and has promptly responded to the implementation of the AI Act, which aims to ensure that AI developed and used in the EU is trustworthy and that sufficient safeguards are in place to protect people’s fundamental rights. Pharma should take heed, as the AI Act will impact all industries, and it will be essential for companies to work closely with the AI Office and other relevant authorities to ensure the safe deployment of AI solutions that benefit society and drive future innovation in the European pharmaceutical industry (5).

References

  1. EMA. Harnessing AI in Medicines Regulation: Use of Large Language Models (LLMs). Press Release. EMA.europa.eu, 11 Sept. 2024.
  2. EMA. Guiding Principles on the Use of Large Language Models in Regulatory Science and for Medicines Regulatory Activities. Press Release. EMA.europa.eu, 5 Sept. 2024.
  3. EMA-HMA. Multi-Annual Artificial Intelligence Workplan 2023–2028: HMA-EMA Joint Big Data Steering Group. Version 1, November 2023.
  4. European Commission. European Artificial Intelligence Act Comes into Force. Press Release. EC.europa.eu, 1 Aug. 2024.
  5. Barton, C. European Artificial Intelligence Act Comes into Force. Pharm. Tech. Eur. 2024, 36 (8), 9–10.

About the author

Cheryl Barton, PhD, is founder and director of PharmaVision, Pharmavision.co.uk.

Article details

Pharmaceutical Technology® Europe
Vol. 36, No. 9
October 2024
Pages: 8–9

Citation

When referring to this article, please cite it as Barton, C. Responding to the Increased Use of Generative AI. Pharmaceutical Technology Europe 2024, 36 (9), 8–9.
