Want Faster and Cheaper with Higher Quality? Get it Right the First Time


Weak or faulty quality systems can hurt a company at every stage of a product’s lifecycle.

In development, poor quality clinical data can imperil a regulatory submission and force companies to redo expensive trials, delaying product launches. In production, a failure to comply with current good manufacturing practice (cGMP) can lead to production stoppages, regulatory sanctions and fines, product recalls, and shortages of needed medicines.

The financial consequences of substandard products range from erosion of market share (which benefits competitors) to loss of investor confidence (slashing share prices and market capitalization) and reputational damage that often cannot be repaired.

Why, then, don't companies do things right the first time? Pressure from a variety of sources, both internal and external, may cause them to take shortcuts that result in time-consuming do-overs or expensive remediation projects.

Internal pressures can include the demand to stick to production schedules no matter what (despite the engineering or supplier issues that invariably arise) or the drive to achieve personal performance goals. Imagine a production operator working in a 4 °C cleanroom who gets the news at 2 AM, just before his shift ends, that his relief operator is ill and won't be coming in. Halting or slowing production would blow the schedule, so the operator, tired and cranky, takes shortcuts to get to the next step. This could compromise quality.

External pressures can also work against quality. Supply-chain service providers, whose businesses are hurt by fluctuations in demand or schedules, may press manufacturers to push ahead despite problems. Marketing firms have their own timelines that they press manufacturers to meet, and investors demand that deadlines and forecasts be met or exceeded.

The way to resist these pressures is to make data quality the flywheel that drives the business. A flywheel needs lots of force to start it going. Once moving, however, it’s difficult to stop. Likewise, an enterprise-wide commitment to data quality builds momentum toward predictability, problem detection, accountability, risk mitigation, and continuous improvement.

First, you need to get things rolling.


Build a right-first-time environment

The first question to ask is whether product quality is being consistently achieved with your current processes. To answer it, a company needs to assess its critical processes and critical process parameters, understand how they affect the key quality attributes of its products, and determine whether those attributes are consistently met. At the end of such an assessment, a company should know whether or not it has a right-first-time (RFT) quality culture. If the answer is no, there's work to do.
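One widely used way to quantify whether a critical process parameter is "consistently met" is a process capability index such as Cpk. The Python sketch below is illustrative only; the parameter, specification limits, and readings are hypothetical:

```python
import statistics

def cpk(measurements, lower_limit, upper_limit):
    """Process capability index: how comfortably a critical process
    parameter sits inside its specification limits (>= 1.33 is a
    common rule of thumb for a capable process)."""
    mean = statistics.mean(measurements)
    stdev = statistics.stdev(measurements)
    return min(upper_limit - mean, mean - lower_limit) / (3 * stdev)

# Hypothetical example: bioreactor pH readings against spec limits of 6.8-7.2.
ph_readings = [7.01, 6.98, 7.03, 7.00, 6.97, 7.02, 6.99, 7.01]
print(f"Cpk = {cpk(ph_readings, 6.8, 7.2):.2f}")
```

A parameter that scores well below the rule-of-thumb threshold is not being consistently met, and that process is a candidate for the project-by-project work described below.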

An RFT environment does not need to be implemented all at once and probably can’t be. It must be built project by project.

As a starting point, pick a project with high value-added potential, such as data integrity (DI). The objective of an initial DI project could be to ensure that all electronic data collection systems comply with FDA 21 Code of Federal Regulations (CFR) Part 11 (1) and that they are designed to prevent and detect DI issues.

Track your company's adherence to RFT by leveraging technology. New software can greatly enhance a company's ability to track quality metrics. Automating production equipment and laboratory instrumentation can decrease the chance of error and quickly catch the errors that do occur. For example, software can capture an audit trail of electronic edits made to batch records during production. Likewise, hand-held scanners can let operators gather information about batch components from bar codes, entering accurate data into the system faster. These tools increase a company's ability to produce batches RFT.
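As a rough illustration of how these pieces fit together, the Python sketch below combines a bar-code check against a batch's approved components with an audit trail of record edits. All class and field names are hypothetical, not drawn from any particular system:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class AuditEntry:
    """Who changed what, when, and why: the core of an audit trail."""
    user: str
    field_name: str
    old_value: str
    new_value: str
    reason: str
    timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

class BatchRecord:
    def __init__(self, batch_id, approved_barcodes):
        self.batch_id = batch_id
        self.approved_barcodes = approved_barcodes  # bill of materials for this batch
        self.fields = {}
        self.audit_trail = []

    def scan_component(self, user, barcode):
        """Accept a scanned component only if it belongs to this batch."""
        if barcode not in self.approved_barcodes:
            raise ValueError(f"{barcode} is not approved for batch {self.batch_id}")
        self.audit_trail.append(
            AuditEntry(user, "component", "", barcode, "bar-code scan"))

    def edit_field(self, user, field_name, new_value, reason):
        """Record every edit rather than overwriting silently."""
        old_value = self.fields.get(field_name, "")
        self.fields[field_name] = new_value
        self.audit_trail.append(
            AuditEntry(user, field_name, old_value, new_value, reason))
```

In a real system, the audit trail would be append-only and time-stamped by the server, in line with Part 11's expectations for secure, computer-generated audit trails.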

Make production predictable

Innovation is required in development, where it can and should be challenged, but it's rarely helpful in production. To ensure consistent quality, eliminate on-the-fly innovation from production processes.

For example, the production of monoclonal antibodies (mAbs) involves 15–20 critical steps. If all (or even a few) of them are based on new, unproven processes, the chance of error compounds across the sequence. In contrast, if a series of well-understood, validated methodologies is linked together in sequence, a company can tick all the boxes for controls and be confident it is conforming to requirements. The financial consequences of mismanaging mAb production are staggering, even for large, profitable companies: it can cost as much as $25 million in materials alone to produce one batch of product, and it takes several batches just to establish consistency. Operational failures are not an option.
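A simple calculation shows how quickly per-step error compounds. Assuming, purely for illustration, that each critical step succeeds independently with the same probability:

```python
# Illustrative only: assumes independent steps with equal success rates.
for per_step_success in (0.99, 0.95):
    for steps in (15, 20):
        rft = per_step_success ** steps
        print(f"{steps} steps at {per_step_success:.0%} each -> "
              f"{rft:.1%} chance of a right-first-time batch")
```

Even at 99% per-step reliability, roughly one batch in five fails somewhere across 20 steps, which is why every unproven step added to the chain matters.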

Substantial efficiency gains may be worth the time and expense of validating a new process, but minor improvements generally are not.


Keep quality systems simple

Sometimes, larger companies don't bother to retire portions of quality systems that are no longer working and instead keep adding more standard operating procedures (SOPs). After all, why trim anything out if you aren't getting any FDA Form 483 observations (2)?

The result can be a bloated, redundant quality system. Layers of quality checks and balances can drag on production without adding value. That kind of environment encourages people to develop shortcuts and workarounds to meet deadlines, leading to mistakes and batch failures.

To keep the quality system lean, focused, and efficient, companies should focus on:

  • Data integrity: Implement processes and systems that gather all the data required to ensure quality, and make sure those systems can both prevent fraud and falsification and flag innocent omissions and mistakes. Generating poor-quality data is a cost you can't afford.

  • Quality metrics: Collect, track, and analyze metrics that foster predictability, detection, accountability, and risk mitigation; a minimal example follows this list. This will help create a culture of continuous improvement. Don't over-collect data by piling on metrics that aren't critical or interpretable.
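As a minimal illustration of the metrics bullet above, the Python sketch below computes an RFT rate and flags batches with incomplete data. The record fields are hypothetical stand-ins for what an electronic batch record system would provide:

```python
# Hypothetical batch records; a real system would pull these from an
# electronic batch record (EBR) database.
batches = [
    {"id": "B-101", "deviations": 0, "complete": True},
    {"id": "B-102", "deviations": 2, "complete": True},
    {"id": "B-103", "deviations": 0, "complete": False},  # gap -> data-integrity flag
    {"id": "B-104", "deviations": 0, "complete": True},
]

rft_batches = [b for b in batches if b["deviations"] == 0 and b["complete"]]
incomplete = [b["id"] for b in batches if not b["complete"]]

print(f"Right-first-time rate: {len(rft_batches) / len(batches):.0%}")  # 50%
print(f"Flagged for incomplete data: {incomplete}")                     # ['B-103']
```

Tracked over time, a metric this simple makes trends visible: a falling RFT rate or a rising count of incomplete records is an early warning, not a surprise at the next inspection.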


Don’t fight Murphy’s Law 

Problems always arise; you can count on it. RFT costs are a one-time investment that prepares you for this inevitability. Remediation, in contrast, is a recurring and unpredictable expense that obliterates operating-expense forecasts, is not necessarily tax friendly, and can inflict lost market share and reputational harm. Investors don't thrill to remediation either.

Many companies overestimate the cost of quality (an investment that has to be justified just once) and underestimate the cost of remediation (an expense that requires continual apologies, especially when there are cost overruns). They mistakenly conclude that remediation will be more cost-effective. While a clinical trial can sometimes be redone or a batch remade, the costs can be prohibitive, and the resulting delays or recalls can decrease market share and damage public confidence in the company. And while lawyers and third-party contractors have become very good at fire-fighting, their fees inflate budgets further. Finally, timelines for remediation projects almost always creep, which raises real costs and causes delays.

A structural commitment to data quality, organized around the proper collection, analysis, and application of meaningful metrics and maintained by systems and procedures that safeguard the integrity of data, is the only clear route to regulatory compliance and competitive advantage. It also improves a business's balance sheet.

Doing things right the first time will always take less time, cost less money, and result in a higher quality product than doing things over or fixing mistakes. As a bonus, it avoids the potential secondary costs of failure such as fines and loss of reputation.


References

1. FDA, Guidance for Industry: Part 11, Electronic Records; Electronic Signatures-Scope and Application (2003).

2. FDA, Inspectional Observations and Citations, FDA.gov (2016).
