Improving Technology Transfer

Article
Pharmaceutical Technology, 11-01-2012, Volume 2012 Supplement, Issue 6

This article focuses on the growing need for effective data management in the life-sciences industry, especially among smaller pharmaceutical manufacturers.

As the life-sciences industry amasses ever-increasing volumes of manufacturing data, the need grows for better data management, especially among smaller biopharmaceutical manufacturers. Like their larger counterparts, small companies can strategically leverage data with the transformational power of process intelligence, defined as the technology and systems needed to design, commercialize, and sustain robust manufacturing processes that give predictable, high-quality outcomes cost-effectively, based on scientific process understanding.

Current approaches to data management

Traditional approaches to data management attempt to provide process intelligence, but they are neither efficient nor effective. Proceeding "the way we've always done it" is no longer enough, because three significant industry trends are working against life-sciences manufacturers today.

First, across the range of small and large companies, most new processes are operated in existing captive or contract manufacturing facilities. Gone are the days when companies could raise capital to build a plant that produces only one product. Given today's available manufacturing capacity, products typically have to perform within the constraints of existing facilities, so new processes must be designed from the outset to fit those facilities.

Second, the best resources for addressing problems that arise in commercial process scale-up and manufacturing, or for identifying improvements in those processes, are often the experts who developed the manufacturing processes in the first place. The ideal team to get process variability under control quickly would include those who designed the process and best understand its components. These teams need easy access to all the data, from the early process-design stages through commercial operations, and a collaborative environment in which to work with it.

The third, related trend is a change in the knowledge base. The people with the deepest process knowledge are usually concentrated at a few sites, rather than at every site. Today's process experts have responsibility for multiple sites and face challenges gathering reliable data from remote sites to apply their expertise (e.g., investigative analysis, predictive modeling, or batch and campaign comparisons).

Whether we refer to boundaries in the geographical sense or to the "organizational boundaries" that exist around departments and manufacturing sites, we need better methods of crossing "borders" to improve collaboration and technology transfer and to reduce associated risks that threaten product batches, consumer confidence, and ultimately company reputations. These very real trends are shaping the business landscape today and will continue, so manufacturers need to adjust accordingly by incorporating better process-intelligence approaches and tools across their organizations.

Along with trends driving process-intelligence challenges and increased outsourcing, regulatory guidance is encouraging new approaches to achieve the desired state of process understanding and validation. FDA defines process validation as the collection and evaluation of data, from the process design stage throughout production, which establishes scientific evidence that a process is capable of consistently delivering quality products (1). FDA's focus in Stage 3 of its validation guideline states that, "Ongoing assurance is gained during routine production that the process remains in a state of control" (1).

An organization's size typically corresponds to the maturity of the data infrastructure that supports its process control and validation. Smaller companies have fewer of the legacy data systems (e.g., LIMS, enterprise resource planning) typically found at larger manufacturers, and they tend to rely more on paper-based records and spreadsheets for data capture and reporting. Challenges and risks arise when these companies must access, aggregate, and analyze data for FDA approvals, annual product reviews (APRs), campaign and batch reports, investigations, and proactive analysis.

As financial models in the life sciences evolve, a growing number of new "small companies" are funded by investors (or sometimes by larger pharmaceutical companies) to identify and evaluate new products quickly. Smaller companies can work efficiently by outsourcing drug development, clinical trials, and early-stage manufacturing, but they typically use traditional data-management methods that involve mountains of paper and spreadsheets. With manufacturing data dispersed across contractors and geographies, these smaller firms often struggle with technology transfer and with controlling process variability.

New companies have the opportunity to overcome such challenges from the start by using newer process-intelligence-based approaches. These systems can add efficiencies, reduce risks, and deliver better results (i.e., commercialized products and profits) to investors and healthcare consumers.

Evolving approaches to data management

As noted, the traditional approach to data management involves manually collecting data from storage sources across manufacturing networks and supply chains, including paper batch records. Data is used, for example, to produce a weekly trends report for each product. Using Excel to compile and organize large amounts of data covering important process variables, such as raw materials and in-process parameters organized in batch and genealogy context, can result in a state of "spreadsheet madness" and its related disease, "data warehouse madness," especially when the data and reports must be retrieved months or years later to prepare a regulatory filing.
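To make the contrast concrete, the short Python sketch below shows how the same batch-and-genealogy context could be assembled programmatically rather than by hand. The data, column names, and sources here are hypothetical; a real platform would pull from live systems rather than in-memory tables.

```python
import pandas as pd

# Hypothetical raw-material genealogy: which lot fed which batch
genealogy = pd.DataFrame({
    "batch_id": ["B001", "B002", "B003"],
    "raw_material_lot": ["RM-17", "RM-17", "RM-18"],
})

# Hypothetical in-process parameters recorded per batch
in_process = pd.DataFrame({
    "batch_id": ["B001", "B002", "B003"],
    "bioreactor_ph": [7.02, 6.95, 7.10],
    "final_titer_g_l": [4.8, 4.6, 5.1],
})

# Join the sources once, in batch context, instead of copying
# values between spreadsheets by hand
batch_context = genealogy.merge(in_process, on="batch_id")

# A "weekly trends report" then reduces to a reproducible query
print(batch_context.groupby("raw_material_lot")["final_titer_g_l"].mean())
```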

Process intelligence, by the previously stated definition, has much in common with quality-by-design (QbD) principles. It requires that product and process performance characteristics be scientifically designed to meet specific objectives, not merely derived empirically from the performance of test batches. Control of quality is designed into the process using scientific process understanding, so the desired outcomes are achieved reproducibly despite variability in process inputs.

According to FDA guidance, QbD is achieved in a manufacturing process when all of the following hold: critical sources of variability are identified and explained; variability is minimized and managed by the process (instead of by specifications and release criteria at the end of the process); and product-quality attributes are reliably predictable (2).

The business benefits of QbD include measurable supply-chain improvements from raw materials to end product. When a process is variable, for example, excess inventory must be stored to ensure that a manufacturer has plenty of raw materials available and final products in stock to meet customer demand in case an unpredicted event in the process threatens a stock-out. When process variability is reduced, there is a corresponding reduction in the raw-material inventory required at the start of the process and less stockpiled inventory at the end of the supply chain. The result is less cost and significant gains. A McKinsey report from 2010 looked at the business benefits of QbD components (3). The report notes that lower cost of goods sold through greater supply-chain reliability and predictability accounts for $15–25 billion of the total $20–30 billion of increased profit the industry can expect when QbD is fully implemented (3).
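As a back-of-the-envelope illustration of that inventory effect, the sketch below applies the textbook safety-stock relation (safety stock = z × sigma × sqrt(lead time)), in which stock held against uncertainty scales linearly with variability. The service level, lead time, and variability figures are hypothetical and are not drawn from the McKinsey analysis.

```python
from math import sqrt

# Textbook safety-stock relation: inventory held against variability
# scales linearly with the standard deviation of supply.
# All numbers are illustrative, not from the cited report.
z = 1.65                 # service-level factor (~95% service level)
lead_time_weeks = 4
sigma_before = 120.0     # weekly output std dev (units), variable process
sigma_after = 60.0       # after process variability is halved

def safety_stock(z_factor, sigma, lead_time):
    return z_factor * sigma * sqrt(lead_time)

print(safety_stock(z, sigma_before, lead_time_weeks))  # 396.0 units
print(safety_stock(z, sigma_after, lead_time_weeks))   # 198.0 units
```

Halving process variability halves the safety stock required at the same service level, which is the supply-chain mechanism behind the lower cost of goods sold described above.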

For process intelligence to serve as an effective path to QbD, with process variability known and minimized, it must go beyond the way technology transfer has traditionally been done and thereby eliminate the "madness." This requires a new approach with improved collaboration built on scientific process understanding, which leads to reduced risks and more successful technology transfer (4). The industry has a long history of underestimating the complexity, difficulty, and time required for successful technology transfer (4).

As Figure 1 illustrates, successful technology transfer requires collaboration among local and contract process development teams, manufacturing, and quality operations to produce safe and efficacious products. Successful technology transfer has many facets: science-based procedures and specifications, clear process descriptions and protocols, robust assays and methods transfer, effective vendor selection, risk management, solid contracts, project management, training and communication, and so forth.

Figure 1: Successful technology transfer requires collaboration among various teams.

Another crucial requirement of the new process-intelligence approach is leveraging technology, specifically a self-service platform for data access, contextualization, analysis, and reporting that reduces risks and improves collaboration and regulatory compliance. A process-intelligence platform can provide a "layer" above all the relevant data sources that simplifies matters by letting data be accessed from a desktop view matching the way users naturally think about where their data comes from. This is a key requirement for productive data analysis and for more intensive, real-time collaboration. The approach allows teams to collaborate at a much deeper level than is normally seen within organizations or between sponsors and CMOs. All parties benefit from a two-way, interactive platform where expertise can be applied to technology transfer, process support, and risk reduction using all relevant process and quality data.
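One way to picture such a "layer" is as a common interface implemented by an adapter for each underlying source, so users query one batch-context view regardless of where the data lives. The Python sketch below is a minimal illustration under that assumption; the class and field names are hypothetical and do not describe any particular commercial platform.

```python
from abc import ABC, abstractmethod

class ProcessDataSource(ABC):
    """Uniform interface the 'layer' exposes over disparate sources."""
    @abstractmethod
    def fetch(self, batch_id: str) -> dict: ...

class PaperRecordSource(ProcessDataSource):
    # Stands in for data captured from paper batch records
    def __init__(self, records: dict):
        self.records = records
    def fetch(self, batch_id: str) -> dict:
        return self.records.get(batch_id, {})

class HistorianSource(ProcessDataSource):
    # Stands in for an automated process historian
    def __init__(self, tags: dict):
        self.tags = tags
    def fetch(self, batch_id: str) -> dict:
        return self.tags.get(batch_id, {})

def batch_view(batch_id: str, sources: list[ProcessDataSource]) -> dict:
    """Merge every source into one batch-context record for the user."""
    merged: dict = {"batch_id": batch_id}
    for source in sources:
        merged.update(source.fetch(batch_id))
    return merged

paper = PaperRecordSource({"B001": {"operator": "A. Chen", "visual_check": "pass"}})
historian = HistorianSource({"B001": {"peak_temp_c": 37.2}})
print(batch_view("B001", [paper, historian]))
```

The design choice is that users and analytics code depend only on the interface, so adding a new site's or CMO's system means writing one more adapter, not reworking every report.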

Whether using data for modeling to predict process outcomes or examining data to investigate batch or supply chain problems, teams save time compared with digging through spreadsheets, and leverage expertise across sites. A platform approach provides automated data contextualization for observational and investigational analytics, along with access to all types of data, and delivers value for non-programmers and non-statisticians who need to collaborate with their more analytics-savvy team members.

Such an approach can be institutionalized from the beginning at smaller start-up companies or retrofitted in larger organizations and CMOs. The checklist for a supporting process-intelligence platform includes the following criteria:

  • Self-service access to data from multiple disparate sources

  • Flexible, accurate capture of data from paper records

  • Automatic contextualization for specific types of analysis

  • Working with continuous and discrete data together

  • Domain-specific observational and investigational analytics

  • Automated analysis and reporting (e.g., batch reports, APRs), as sketched below.
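As an example of the last item, automated trending of the kind a platform might run against incoming batch results can be as simple as checking each new value against three-sigma control limits computed from historical batches. The Python sketch below is a minimal, hypothetical illustration of that rule, not a description of any specific product.

```python
from statistics import mean, stdev

def control_limits(history):
    """Three-sigma limits from historical in-control batches."""
    mu, sigma = mean(history), stdev(history)
    return mu - 3 * sigma, mu + 3 * sigma

def check_batch(value, history):
    """Flag a new batch result that trends outside the limits."""
    low, high = control_limits(history)
    return "ALERT" if not (low <= value <= high) else "ok"

# Hypothetical titer history (g/L) from released batches
history = [4.8, 5.0, 4.9, 5.1, 4.7, 5.0, 4.9]
print(check_batch(4.95, history))  # ok
print(check_batch(3.9, history))   # ALERT: outside three-sigma limits
```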

Change-management considerations

There are clear sociopolitical and change-management challenges in applying a new approach to an organization. Introducing a shift from "the way it's always been done" requires that internal teams and CMOs understand how existing systems will be tapped to meet high-level goals, such as Six Sigma, technology transfer, APRs, and other regulatory requirements. Demonstrating that the new approach also solves day-to-day challenges helps build that support.

A process-intelligence platform approach can simplify CMO buy-in. It can provide contracted sites with web-based, on-screen forms for entering data, rather than having them send spreadsheets over nonsecure Internet links to provide sponsors with data and reports. It can also address the data-confidentiality obligations of CMOs by ensuring that only the data owned by a specific sponsor is available to that sponsor.
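That confidentiality obligation amounts to row-level scoping: every query a sponsor runs is filtered to the batches that sponsor owns. The Python sketch below illustrates the idea with hypothetical records; a production system would enforce this in the database and authentication layers rather than in application code.

```python
# Hypothetical CMO-side batch records owned by different sponsors
BATCH_RECORDS = [
    {"sponsor": "SponsorA", "batch_id": "A-001", "yield_pct": 92.1},
    {"sponsor": "SponsorA", "batch_id": "A-002", "yield_pct": 90.4},
    {"sponsor": "SponsorB", "batch_id": "B-101", "yield_pct": 88.7},
]

def records_for(sponsor: str) -> list[dict]:
    """Row-level filter: each sponsor sees only its own batches."""
    return [r for r in BATCH_RECORDS if r["sponsor"] == sponsor]

print(records_for("SponsorA"))  # two SponsorA rows, no SponsorB data
```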

CMOs, like any type of organization, may be resistant to change or added requirements. Most CMOs have a contract with a sponsor and may be delivering on specific measurable-result promises within a particular cost structure that was agreed to in the contract. To incorporate a mutually beneficial system for collaboration and continuous improvement, sponsors need to demonstrate value to CMOs' businesses, including:

  • Automated trending and alerting

  • Faster chemistry, manufacturing, and controls (CMC) preparation and approval

  • Shorter time to market

  • Higher yield and quality

  • Lower process variability

  • Acceptable process economics

  • Access to supporting data and institutionalized knowledge regardless of geographic location.

CMOs add value to sponsor relationships by allowing self-serve, on-demand access to designated process parameters with easy methods to capture paper-based data. This value cuts down on workloads, risks of errors, and time delays associated with sponsor requests for data. Reports like those shown in Figures 2 and 3 can be generated and automated.

Figure 2: Example of automated output for dashboard display.

FDA has stated that it hopes continued process performance verification will become a lifestyle that is consistent across an entire manufacturing network, including CMOs. "As a result of the 'trend toward outsourcing,' FDA is paying closer attention to contract relationships," said FDA Office of Compliance Director Richard Friedman at the PDA/FDA Joint Regulatory Meeting in September 2011, in Washington, DC. "Sponsors should expect to hear questions during inspections about how their companies are making sure that their CMOs are actually being monitored" (5).

Figure 3: Example of automated dashboard.

A new approach to process intelligence, relying on a platform for collaboration based on scientific process understanding to reduce risks and improve compliance, can ultimately lead to the desired state of technology-transfer excellence and achievement of QbD goals. Manufacturing, quality, and process-development teams can work together across geographic boundaries to minimize capital costs for investors and risks to healthcare consumers, while delivering business benefits that include minimized variability, reliably predictable product-quality attributes, and positive supply-chain impacts.

Justin Neway, PhD, is vice-president and chief science officer at Aegis Analytical Corporation, 1380 Forest Park Circle, Suite 200, Lafayette, CO 80026, tel. 303.926.0317, jneway@aegiscorp.com, www.aegiscorp.com.

This article is based on "Reducing Technology Transfer Risks Using a Process Intelligence Platform That Spans Organizations and Geographies," presented at the CBI PharmTech Biomanufacturing Partnerships Conference in July 2012.

References

1. FDA, Guidance for Industry: Process Validation: General Principles and Practices (Rockville, MD, Jan. 2011).

2. FDA, Guidance for Industry: PAT — A Framework for Innovative Pharmaceutical Development, Manufacturing, and Quality Assurance (Rockville, MD, Sept. 2004).

3. CDER, Meeting of the Advisory Committee for Pharmaceutical Sciences and Clinical Pharmacology (FDA Briefing, July 27, 2011), p. 31.

4. A. Webb et al., Pharm. Eng. 30 (4) (2010).

5. International Pharmaceutical Quality 1 (4) (2010).
