Defining critical parameters and processing the large quantities of data that result can be a challenge.
Using a design-of-experiment (DoE) approach allows a formulation development scientist to examine the formulation design space in a statistically significant manner from the beginning of experimentation. In addition, a DoE-driven approach meets the International Conference on Harmonization Q8 guidelines for multivariate design space and the subsequent justification of formulation components when developing a drug product, according to Steven R. LaBrenz, scientific director, pharmaceutical development and manufacturing sciences with Janssen Research & Development. At the same time, DoE results depend on the selected critical parameters and the sample material used in the studies. In addition, a large amount of data is generated. Consequently, reaping the benefits requires a significant commitment to DoE, including the training of researchers in study design and data-processing techniques.
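To make the idea of a multivariate design space concrete, a full factorial design enumerates every combination of factor levels so that each combination becomes one experimental run. The sketch below is illustrative only; the factor names and levels (pH, buffer concentration, polysorbate 80 level) are assumptions, not values from any study discussed here.

```python
from itertools import product

# Hypothetical formulation factors and levels for a 2 x 3 x 2 full
# factorial design (names and values are illustrative only).
factors = {
    "pH": [5.5, 7.0],
    "buffer_mM": [10, 25, 50],
    "polysorbate80_pct": [0.01, 0.05],
}

# Enumerate every combination of factor levels: each row is one run.
runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]

print(len(runs))  # 2 * 3 * 2 = 12 runs
for run in runs[:3]:
    print(run)
```

The run count grows multiplicatively with factors and levels, which is exactly why the article's sources stress balancing statistical coverage against material and time constraints.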
The benefits of DoE
One of the major advantages of the DoE approach is the ability to study the effects of multiple excipients and their interactions in a formulation at the same time, unlike in the traditional one-factor-at-a-time approach, which is inherently flawed and biased, according to Mark Yang, director of fill/finish development for Genzyme. Phuong Nguyen, a senior scientist with Millennium Pharmaceuticals (The Takeda Oncology Company), agrees with this approach. “The quantitative models for these multiple quality attributes can then be used for global optimization of the formulation, leading to more robust formulation development,” she says. Nguyen also notes that depending on the data that are generated from an experimental design, it is sometimes possible to gain some mechanistic understanding of protein stabilization or behavior.
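The main-effect and interaction estimates that distinguish DoE from one-factor-at-a-time work can be computed directly from a two-level factorial with the standard contrast formulas. The sketch below uses invented assay numbers ("monomer_pct" as a stand-in quality attribute) purely to show the arithmetic; it is not data from any of the studies discussed.

```python
# Estimating main effects and the interaction effect from a 2^2
# factorial study of a single quality attribute. Coded levels:
# -1 = low, +1 = high. All response values are invented.
runs = [
    # (factor_A, factor_B, monomer_pct)
    (-1, -1, 97.0),
    (+1, -1, 98.2),
    (-1, +1, 95.5),
    (+1, +1, 99.1),
]

n = len(runs)
# Each effect is the contrast of responses divided by half the run count.
effect_A = sum(a * y for a, b, y in runs) / (n / 2)
effect_B = sum(b * y for a, b, y in runs) / (n / 2)
effect_AB = sum(a * b * y for a, b, y in runs) / (n / 2)

print(effect_A, effect_B, effect_AB)
```

A nonzero interaction term (effect_AB) is precisely the information a one-factor-at-a-time study cannot recover, because it never varies both factors together.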
In addition, because multiple variables can be explored at once, fewer formulation studies are required and, therefore, formulation development is more efficient. An added benefit of this efficiency, Yang notes, is the reduced quantity of material needed for formulation development, which can be important early in the development of a biopharmaceutical product given the limited availability of the biologic API. Further, a DoE approach can be particularly useful for combination products that have device attributes that must be considered during formulation development, says Nguyen.
The challenges of DoE
While there clearly are numerous advantages to using a DoE approach in the formulation development of biopharmaceuticals, it is important to bear in mind that there are also limitations to consider with this method.
Two main challenges, LaBrenz asserts, are that adequately sampling the design space is often difficult and that factorial designs generate an overwhelming amount of data. Yang agrees that the typical subset of critical quality attributes of the formulated product assayed in a DoE study may not be comprehensive enough to cover some of the quality changes that can occur. Nguyen also warns that the models used for DoE formulation studies may include statistically significant parameters that are of no practical importance, such as when the resulting change across an input parameter's range is within the error of the method or is not large enough to have a real effect.
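Nguyen's distinction between statistical and practical significance can be screened for by comparing each estimated effect against the method's known variability. The effect sizes and the 0.5% assay-error threshold below are invented for illustration.

```python
# Flag effects whose magnitude falls within the assay's error band and
# therefore may not be practically meaningful even if a model reports
# them as statistically significant. All numbers are illustrative.
method_error = 0.5  # assumed assay variability, in % monomer

effects = {"pH": 2.4, "buffer_mM": 0.3, "pH x buffer_mM": 0.1}

meaningful = {name: abs(size) > method_error for name, size in effects.items()}
for name, flag in meaningful.items():
    print(f"{name}: practically meaningful = {flag}")
```

Effects smaller than the method error would be dropped from the model or, at minimum, not used to justify formulation decisions.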
In addition, Yang notes that the API material used in DoE studies comes from one or possibly a few selected primary production lots that are not necessarily completely representative of later production lots. “As a result, the conclusions drawn based on DoE formulation studies can be biased to some degree,” he says.
LaBrenz believes that the challenges of a limited design space and the management of large quantities of data can be ameliorated with the adoption of high-throughput screening (HTS) techniques that provide both full design space coverage and the ability to collect data in spreadsheets and, eventually, databases.
Even with the use of HTS, however, implementation of DoE for formulation development can be time consuming and resource intensive and can translate into more extensive experimentation than planned, according to Nguyen. “It is a challenge to balance the number of experiments required for a statistically robust DoE study with the available supplies, personnel, resources, and time,” she says. In addition, user training is a must, according to Yang. “DoE study design and data processing requires an understanding of statistics theory and models, which can be challenging for many new users,” he comments.
Obtaining relevant data
In most DoE formulation studies, data related to the critical quality attributes and the stability of the API, such as the protein melting temperature, secondary or tertiary structure, or other biochemical/biophysical characteristics, are generated. The ultimate goal for formulation development is to maintain the API in a stable state with respect to both structure and function during processing and storage, according to Yang. Nguyen points out that all of the previously mentioned parameters may serve as indicators for a stable state.
It is also important to reduce the data to single-value results as often as possible, such as the percent monomer or aggregate as determined using size exclusion chromatography-high performance liquid chromatography (SEC-HPLC), or the percent purity. For spectral data from characterization assays, Janssen calculates the Center of Spectral Mass Wavelength (or Mean Center Mass), which allows the creation of a relative stability profile for a given spectrum-based assay. “This single-point, continuous, data-point strategy is readily testable using most statistical methods,” LaBrenz explains. In addition, Nguyen notes that the generated stability data can be reduced down to a reaction rate constant or an output parameter used to develop statistical models. “Overall,” she summarizes, “data on quality attributes are generated that will provide an understanding of how the excipient or formulation within the design space affects these attributes.”
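The center-of-spectral-mass reduction LaBrenz describes is the intensity-weighted mean wavelength of a spectrum, which collapses an entire scan into one continuous number. A minimal sketch, using an invented fluorescence emission spectrum:

```python
# Reduce a spectrum to a single value: the center of spectral mass,
# sum(intensity * wavelength) / sum(intensity). The wavelengths and
# intensities below are invented for illustration.
wavelengths = [330.0, 335.0, 340.0, 345.0, 350.0]  # nm
intensities = [0.2, 0.6, 1.0, 0.7, 0.3]

csm = sum(i * w for i, w in zip(intensities, wavelengths)) / sum(intensities)
print(round(csm, 2))  # intensity-weighted mean emission wavelength, nm
```

Tracking this one number across formulations and time points yields the relative stability profile the article describes, without carrying the full spectrum through the statistical analysis.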
Because DoE studies lead to the production of large quantities of data, it is important to ensure that all of the data that are collected are actually relevant. “A thorough understanding of the API and desired formulation characteristics is necessary before initiating a DoE study,” says Yang. “With such knowledge, a scientist can make sound scientific judgments and consider the potential impact of new factors or excipients when designing a DoE study.”
LaBrenz adds that developing the proper experimental design space before setting up DoE experiments ensures that the parameters critical for the stability of an API are identified. More specifically, conducting a screening DoE to determine which product quality attributes (and the corresponding output parameters used to represent them) are relevant in the design space is helpful, according to Nguyen. Accelerated stability studies can also be performed to determine relevant output parameters.
Another key to generating appropriate and relevant data is to make sure that normal-distributed data, obtained either directly or after transformation, are used. If there is a major process change, it is necessary to evaluate and understand any impacts on the protein and the formulation, according to Yang, which may mean conducting comparability or bridging studies.
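A quick way to check whether a transformation has moved skewed data toward normality is to compare sample skewness before and after. The sketch below computes skewness from scratch and applies a log transform; the data values are illustrative, standing in for a right-skewed quantity such as an aggregation rate.

```python
import math

# Sample (population-style) skewness: mean cubed z-score.
def skewness(xs):
    n = len(xs)
    mean = sum(xs) / n
    sd = (sum((x - mean) ** 2 for x in xs) / n) ** 0.5
    return sum(((x - mean) / sd) ** 3 for x in xs) / n

# Invented right-skewed data, e.g. aggregation rates across formulations.
data = [0.8, 1.1, 1.4, 2.0, 3.5, 6.0, 11.0]

raw_skew = skewness(data)
log_skew = skewness([math.log(x) for x in data])
print(round(raw_skew, 2), round(log_skew, 2))  # transform reduces skew
```

If the transformed skewness is close to zero, the log-transformed values are the better candidates for the normality-assuming statistical models used in DoE analysis.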
Managing the data
For DoE studies to be useful, the generated data must not only be relevant, but manageable. There are different approaches to addressing this problem.
The solution for LaBrenz is to use an open, read-only database that can be shared between multiple groups, such as preformulation, formulation, and analytical development groups, so that each unit can focus on particular aspects of interest in a DoE. “While a formulation scientist needs to know where the most stable space for an API resides, an analytical chemist may be more interested in the variability and reproducibility of results, both of which can be obtained from a well-designed DoE,” he says.
Nguyen remarks that using software to automate data collection and extraction of the relevant output parameters for statistical analysis is important. She also stresses that having an understanding of which output parameters to collect and limiting the excipients examined to a subset of approved parenteral excipients can reduce the amount of data that must be analyzed.
Training on data processing is also crucial, regardless of what approach is used. LaBrenz uses many different tools depending on the level of complexity required for a given project, and often the choice of software is determined by collaborators. “It isn’t the software package that actually matters. It is all of the extra training to develop my applied statistics skills, including six-sigma training and seminars, that helps me on the job,” he states.
About the Author
Cynthia A. Challener is a contributing editor to BioPharm International.