EMA and FDA on Process Validation

Article

Pharmaceutical Technology, 01-02-2013, Volume 37, Issue 1

Siegfried Schmitt, a principal consultant with PAREXEL, discusses the EMA's guideline on process validation and how it compares with FDA's process validation guidance.

Q. Since FDA published its Guidance for Industry, Process Validation: General Principles and Practices, in 2011, the EMA has published its draft Guideline on Process Validation. Are these two guidelines aligned?

A. Though there are differences in terminology, the underlying principles are the same. The overarching principle is the lifecycle approach to validation: once sufficient knowledge has been established about the product and process (i.e., the process qualification phase), the validation batches can be manufactured. What is new is that all subsequent (commercial) batches are considered verification batches. Validation is thus an ongoing process that continues until product discontinuation.

Q. So are three batches still acceptable to demonstrate initial validation?

A. In Europe, this is pretty much the accepted standard. FDA, on the other hand, would like to see a rationale for the number of batches. This is not to say that three batches are unacceptable, but scientific reasoning, or at least historical data (from previous validation campaigns), should be provided.

Q. Can you tell us more about knowledge and process understanding?

" >
SPOTLIGHT EVENT

IVT’s 5th Annual Validation Week EU


RELATED ARTICLES

More in GMPs/Validation

A. Over the past decade or two, companies were encouraged to file merely the bare minimum of information in their applications. As a result, process and product understanding that might have supported process and product quality improvements was not explored. The agencies have since changed their stance: they not only encourage, but actually expect, industry to establish, maintain, and continually improve its knowledge base. Furthermore, the expectation is that much of this understanding is also filed in submissions.

By gaining a wider and deeper process understanding, industry can achieve more flexibility in its manufacturing processes, because the working ranges can be expanded accordingly. Such enhanced flexibility typically results in significantly fewer deviations from the registered process and/or fewer changes to the registration (variations in the EU).
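To make the flexibility argument concrete, here is a minimal Python sketch; the parameter, both ranges, and the batch data are hypothetical illustrations, not taken from any guideline. It simply counts how many historical batches would have been deviations under a narrowly registered range versus a wider, knowledge-based one.

```python
# Hypothetical example: granulation temperature (deg C) for eight batches.
# Both ranges are invented for illustration only.
registered_range = (68.0, 72.0)  # narrow range as originally filed
expanded_range = (65.0, 75.0)    # wider range supported by process understanding

batch_temps = [69.2, 71.8, 72.4, 70.1, 67.5, 73.0, 70.8, 69.9]

def deviations(values, low, high):
    """Count batches whose parameter fell outside the allowed range."""
    return sum(1 for v in values if not low <= v <= high)

print("Deviations against registered range:", deviations(batch_temps, *registered_range))  # 3
print("Deviations against expanded range:", deviations(batch_temps, *expanded_range))      # 0
```

The same manufacturing history produces three deviations under the narrow range and none under the expanded one, which is the flexibility argument in miniature.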

Q. Is continuous process verification (CPV) not something industry is doing already?

A. Companies have to prepare annual product quality (APQ) reports, which detail the number of batches produced in the period, abnormal events such as deviations and recalls, and the validation status of processes and analytical methods. The information for these reports is gathered retrospectively at year end; for some batches, nearly a year after their date of manufacture. This is what industry has been doing, but it is not the regulators' intent with regard to CPV.

Continuous verification requires real-time or "near-to-the-event" analysis of data and information. In principle, verification of the validated state should be performed before manufacture of any new batch. Such an approach requires, first, readily available data and information and, second, statistical methods for their analysis. Ready data availability can only reasonably be achieved where the data are provided in electronic format. Imagine having to manually transpose data from the batch record into spreadsheets or similar; that would not be feasible in most manufacturing environments.
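As a minimal sketch of what such a pre-batch verification could look like, assume batch results are already exported electronically; the file name, column name, and three-sigma limits below are illustrative assumptions, not regulatory requirements.

```python
import csv
import statistics

def load_results(path, column):
    """Read one quality attribute from a CSV export of an electronic
    batch record system (the file layout here is a hypothetical example)."""
    with open(path, newline="") as f:
        return [float(row[column]) for row in csv.DictReader(f)]

def control_limits(history, k=3.0):
    """Shewhart-style limits from historical batches: mean +/- k sigma."""
    mean = statistics.mean(history)
    sigma = statistics.stdev(history)
    return mean - k * sigma, mean + k * sigma

# Before releasing the next batch for manufacture, confirm the most recent
# results still sit inside the limits established by earlier batches.
results = load_results("batch_results.csv", "assay_percent")
low, high = control_limits(results[:-3])
in_control = all(low <= v <= high for v in results[-3:])
print("Proceed with next batch" if in_control else "Investigate before manufacture")
```

Because the data arrive electronically, a check like this can run in minutes before each batch, rather than months later in an annual report.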

Q. You mention statistical analyses; can you elaborate?

A. Though data have always been analysed to determine whether they met their specifications, they were often not investigated for trends. If data are not available in real time, or trend analysis is not performed near the event, much of its value is lost and opportunities for corrective action will have been missed. The regulatory agencies now expect industry to trend data using widely accepted statistical rules and methods, which allow out-of-trend events to be detected and analysed. The purpose is to prevent out-of-specification events from occurring. Such statistical methods have long been available; their use in this context, however, is not yet widespread.
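One widely accepted family of such rules are the Western Electric run rules for control charts. The Python sketch below applies two of them to a made-up series of assay results; the data, centre line, and sigma are assumptions for illustration, and the rule set a company adopts is its own choice rather than anything the guidelines prescribe.

```python
def western_electric_flags(values, mean, sigma):
    """Flag out-of-trend points using two of the Western Electric rules.

    Rule 1: a single point beyond +/- 3 sigma of the centre line.
    Rule 4: eight consecutive points on the same side of the centre line.
    """
    flags = []
    for i, v in enumerate(values):
        if abs(v - mean) > 3 * sigma:
            flags.append((i, "Rule 1: point beyond 3 sigma"))
        if i >= 7:
            window = values[i - 7 : i + 1]
            if all(x > mean for x in window) or all(x < mean for x in window):
                flags.append((i, "Rule 4: eight points on one side of centre line"))
    return flags

# Hypothetical assay results (%) for consecutive batches. Every point is
# well within a 100 +/- 1.5 specification, yet the sustained run above the
# centre line is an out-of-trend signal worth investigating.
history = [99.8, 100.1, 100.1, 100.3, 100.4, 100.2, 100.5, 100.6, 100.4, 100.7]
for index, rule in western_electric_flags(history, mean=100.0, sigma=0.5):
    print(f"Batch {index}: {rule}")
```

This is exactly the distinction drawn above: all of these results are in specification, but the trend rule fires, giving an opportunity for corrective action before an out-of-specification result ever occurs.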
