As new process validation guidelines emerge, industry needs to reinvent how it releases product.
At the 3rd Annual Pharmaceutical Technology Conference, held in Philadelphia in August, Brian K. Nunnally, associate director of process validation at Wyeth (Madison, NJ), and John McConnell, principal of the Australian consulting firm Wysowl, tried to hammer the "go/no-go" mindset out of the brains of the pharmaceutical industry members in attendance. Their talks were based on the principles within the US Food and Drug Administration's draft guidance, Process Validation: General Principles and Practices, issued in November 2008, and on the quality-by-design and process-analytical-technology approaches cascading over industry.

Currently, industry tends to wait for a failure before it takes action; pharma is the only industry that operates this way, said McConnell. Instead of studying the out-of-specification (OOS) point, "we need to study where the process began to go out of whack—usually about 15 batches before the actual OOS point occurred," he explained. "Meeting spec should not be the end point, but rather a beginning expectation. Our job is to reduce variability way beyond spec."
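To make that idea concrete, the sketch below is a hypothetical simulation, not data or a method from either speaker: an individuals control chart built from a stable run of batches flags a slow drift several batches before any result actually exceeds the (made-up) specification limit.

```python
# Hypothetical illustration only: simulated batch results, generic 3-sigma limits.
import numpy as np

rng = np.random.default_rng(0)

# Stable period: 30 batches near 100 with SD 1; then a slow upward drift begins.
stable = rng.normal(100.0, 1.0, 30)
drift = 100.0 + 0.3 * np.arange(1, 21) + rng.normal(0.0, 1.0, 20)
results = np.concatenate([stable, drift])

upper_spec = 105.0                        # hypothetical specification limit
mean, sd = stable.mean(), stable.std(ddof=1)
ucl = mean + 3 * sd                       # 3-sigma control limit from the stable period

out_of_control = np.flatnonzero(results > ucl)
out_of_spec = np.flatnonzero(results > upper_spec)

# In a typical run, the chart signals the drift several batches before the
# first out-of-spec result appears.
print("first out-of-control signal at batch:",
      out_of_control[0] + 1 if out_of_control.size else None)
print("first out-of-spec result at batch:   ",
      out_of_spec[0] + 1 if out_of_spec.size else None)
```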
Nunnally added that when the final guidance comes out, industry will need to understand the sources of its product variation and control that variation in a manner commensurate with the risks involved. Industry can no longer say a batch is "good" or "not good" based on end-product testing. "Proper control charting can help us see OOS in a different way ... by looking at it in three forms: process, sample, and analytical."
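One common way to separate a result into those three forms (offered here only as an illustration with made-up numbers, not as Nunnally's own analysis) is a nested variance-components calculation across batches, samples per batch, and replicate assays per sample:

```python
# Hypothetical balanced nested design; the true variances below are assumptions
# chosen only to show the arithmetic.
import numpy as np

rng = np.random.default_rng(1)
b, s, r = 30, 2, 2                          # batches, samples per batch, assays per sample
sigma_p, sigma_s, sigma_a = 2.0, 1.0, 0.5   # assumed process, sampling, analytical SDs

batch_eff = rng.normal(0, sigma_p, (b, 1, 1))
sample_eff = rng.normal(0, sigma_s, (b, s, 1))
assay_err = rng.normal(0, sigma_a, (b, s, r))
y = 100.0 + batch_eff + sample_eff + assay_err       # results, shape (b, s, r)

grand = y.mean()
batch_means = y.mean(axis=(1, 2))
sample_means = y.mean(axis=2)

# Mean squares for the nested ANOVA.
ms_batch = s * r * ((batch_means - grand) ** 2).sum() / (b - 1)
ms_sample = r * ((sample_means - batch_means[:, None]) ** 2).sum() / (b * (s - 1))
ms_error = ((y - sample_means[:, :, None]) ** 2).sum() / (b * s * (r - 1))

# Method-of-moments estimates of the three variance components.
var_analytical = ms_error
var_sampling = max((ms_sample - ms_error) / r, 0.0)
var_process = max((ms_batch - ms_sample) / (s * r), 0.0)

print(f"process variance    ~ {var_process:.2f} (true {sigma_p**2:.2f})")
print(f"sampling variance   ~ {var_sampling:.2f} (true {sigma_s**2:.2f})")
print(f"analytical variance ~ {var_analytical:.2f} (true {sigma_a**2:.2f})")
```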
Both speakers reiterated that the benefit of reducing variability is increased capability. The challenge, however, is changing industry's mindset. "Learning is easy," said McConnell. "Unlearning is the hard part."
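As a rough, hypothetical illustration of that benefit, halving the standard deviation of a process that barely meets its specification limits roughly doubles its capability index; all numbers below are invented for the example.

```python
def cpk(mean, sd, lsl, usl):
    """Capability index: distance from the process mean to the nearer
    specification limit, in units of three standard deviations."""
    return min(usl - mean, mean - lsl) / (3 * sd)

lsl, usl, mean = 90.0, 110.0, 101.0           # hypothetical spec limits and process mean
print(cpk(mean, sd=3.0, lsl=lsl, usl=usl))    # ~1.0: barely capable
print(cpk(mean, sd=1.5, lsl=lsl, usl=usl))    # ~2.0: far more margin to spec
```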
Angie Drakulich is the managing editor of Pharmaceutical Technology.