Continuous manufacturing poses its own unique challenges to real-time monitoring of product- and process-related impurities.
Continuous bioprocessing provides numerous benefits, from high product quality and consistency to smaller physical footprints, increased flexibility, and potentially lower capital and operating expenses. These benefits are greatest, however, when real-time monitoring of process parameters and product quality, including detection and quantitation of residual impurities, is performed. While advances in analytical methods and automation are occurring, practical implementation of real-time monitoring of product- and process-related impurities has yet to be achieved.
“In an ideal world,” asserts Byron Kneller, senior director for analytical and formulation development with AGC Biologics, “we would have simple, rapid, physicochemical tests for residual impurities.”
Unfortunately, there are different types of residual impurities. They can be classified as product- and process-related, and as chemical impurities (typically small-molecule process additives such as isopropyl β-D-1-thiogalactopyranoside, antifoams, antibiotics, etc.) or biological impurities from the host cell or processing steps (host-cell proteins [HCP], host-cell DNA, residual Protein-A, etc.).
“Technologies that utilize mass spectrometry (MS) capabilities are, at this point, the seemingly best choice for residual impurity detection, characterization, and/or quantitation when coupled with liquid chromatography (LC) systems,” observes Amit Katiyar, director of analytical and formulation sciences for Thermo Fisher Scientific. Multi-attribute methods (MAM) capable of detecting, quantifying, and characterizing multiple product quality attributes are currently being developed.
A triple-quadrupole system would be best suited for quantitative analysis of small molecules/peptides, with quadrupole time-of-flight systems more appropriate for the characterization and semi-quantitative analysis of larger molecules, according to Charles Heise, senior staff scientist for bioprocess strategy and development at Fujifilm Diosynth Biotechnologies.
An amalgamation of both technologies into a single quantitative measurement covering a wide mass range would be an ultimate solution, he adds. A series of standards could be used to generate standard curves for target impurities, with software handling the identification and quantification of peaks. Aspects of high-throughput proteomics analysis may also be applicable to the continuous manufacturing environment.
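To make the standard-curve approach concrete, the sketch below fits a linear calibration of peak area against known concentrations of a target impurity and back-calculates an unknown. All values are invented for illustration; real LC–MS quantitation would add weighting, replicate injections, and acceptance criteria.

```python
# Illustrative standard-curve quantitation (hypothetical values).
# Fits peak area vs. known concentration for a target impurity,
# then back-calculates the concentration of an unknown sample.
import numpy as np

# Calibration standards: concentration (ng/mL) and observed peak area
conc_std = np.array([1.0, 5.0, 10.0, 50.0, 100.0])
area_std = np.array([210.0, 1050.0, 2120.0, 10400.0, 20900.0])

# Least-squares linear fit: area = slope * conc + intercept
slope, intercept = np.polyfit(conc_std, area_std, 1)

def quantify(peak_area: float) -> float:
    """Back-calculate impurity concentration from a measured peak area."""
    return (peak_area - intercept) / slope

# Quantify an unknown sample from its measured peak area
unknown_area = 6300.0
print(f"Estimated impurity: {quantify(unknown_area):.1f} ng/mL")
```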
Other analytical techniques Heise highlights include in-line spectroscopic measurements such as Raman, Fourier-transform infrared (FTIR), ultraviolet/visible (UV/Vis), and refractive index methods that can give quantitative as well as qualitative, real-time analysis. “The particular technology used will depend on the impurity, its expected concentration range, and whether monitoring is performed during steady-state conditions,” he observes. New methods or advances in current technologies may also become available that allow in-line measurements using nuclear magnetic resonance (NMR) imaging, sensor arrays, or LC systems.
The final option, Heise says, is to do no process monitoring and rely solely on the quality-by-design data generated during process development to define the operating ranges and edges of failure.
“In practice, the choice of method may be constrained by the availability of the instrumentation, turn-around time, analyst training, and potential interference from the complex matrices that may be involved in the downstream purification,” comments Ian Parsons, director of analytical development for biologics at Charles River Laboratories. His company has focused primarily on the use of high-performance LC (HPLC) and the enzyme-linked immunosorbent assay (ELISA), with MS and gas chromatography (GC) also used frequently.
“The pharmaceutical industry is still heavily utilizing traditional separation techniques such as size-exclusion, ion-exchange, and reversed-phase chromatography in addition to immunoassays such as ELISA and electrophoresis as the primary means for quantifying product- and process-related impurities,” notes Katiyar.
Tests for process-related impurities often require specialized reagents, such as anti-HCP antibodies or natural products like limulus amebocyte lysate, and can be more complex, adds Kneller. In addition, many impurities (e.g., HCPs) are heterogeneous and thus not amenable to detection and quantitation using simple tests.
In practice for continuous processes, in-line measurements that are confounded (e.g., spectroscopy) or that track secondary parameters (e.g., off-gas, pH, basic physical properties) are the most user friendly, according to Heise. The accuracy of residual impurity detection, however, is sacrificed for real-time control, either because of the heterogeneous nature of the signal or because only a secondary response is being detected.
He also notes that off-line LC–MS, where mass windows can be limited by the buffer solution and scanning rates, or LC with evaporative light-scattering detection (ELSD), is typically used for small molecules, but these methods are not sensitive enough, and can take too long to run, for large-molecule manufacturing.
These approaches could potentially be replaced with sensor technologies of equivalent modality, according to Heise, such as the Octet system from ForteBio, which relies on biolayer interferometry. Similarly, he says that ELISAs used for sensitive quantitation of biological impurities, which have slow response times (material quarantining is needed), could be replaced with equivalent ‘dip-and-read’ methods.
The new technologies (e.g., MAM, MS for HCP analysis, automation for DNA, HCP, and Protein A residuals) are just starting to be introduced in the process development space, according to Katiyar. “The transition from exploratory to fully validated methods will be slower for such methods in a cGMP environment where method performance, equipment validation, and data-integrity measures need to be at the highest level. There will also be a significant cost aspect of not only reconfiguring current laboratories to accommodate the equipment, but also for training and/or hiring new personnel with expertise in these new technologies,” he says.
At the process level, identifying the residual impurities present and the appropriate analytical assays, with the required detection limits and sensitivity ranges, to monitor them across the process is critical, stresses Dan Pettit, senior staff scientist for analytical development at Fujifilm Diosynth Biotechnologies. At the operations level, he notes that identifying suitably equipped labs to develop and validate methods for GMP analysis is vital because of the heterogeneity of process impurities and the consequent analytical issues they pose.
Heterogeneous process-related impurities, specifically HCPs, are often quantified in early-phase projects using generic kits that may result in key impurities going undetected during purification process development. “These impurities have the potential to cause problems at production scale,” Kneller says. Current ELISA techniques lack the ability to identify specific immunogenic proteins, thereby making the process development aspect less strategic in design, agrees Michael Farris, scientific manager of analytical and formulation sciences with Thermo Fisher Scientific. “Building toxicology and immunogenicity databases for various HCPs and other process impurities would help fine-tune process development and, in itself, provide a deeper insight into process performance and robustness,” he asserts.
Identifying/developing suitably sensitive methods is also challenging, according to Pettit. “Typically, ppm or ppb levels of analyte in a complex mixture must be detected, and therefore, isolation/derivatization of the analyte may be required. These activities tend to be prohibitively expensive to outsource,” he observes.
In addition, isolation of impurities prior to quantitation may introduce artifacts or cause the generation of various altered states during the isolation process. For example, physical manipulation of samples prior to the detection of multiple analytes may alter the HCP antigen profile and relative abundance of the analytes themselves, according to Farris. Additional studies to understand the stability of isolated product(s) may therefore also be needed.
Other process impurities such as Long R3 IGF-1, which is a component of some cell-culture processes, have a tendency to adhere to certain plastics, thereby making accurate quantitation more difficult, Farris adds. Sampling instructions must be clearly defined so as to not artificially deplete the analyte during sampling and/or storage prior to analysis.
Parsons agrees that the main challenges center on developing methods of sufficient sensitivity that also minimize product- and matrix-related interference and thereby achieve sufficient recovery of the analytes. In numerous instances, Farris points out, quantitation and/or characterization of impurities is hindered by the API and/or the matrix composition.
“Immunoassays, such as ELISA, used for quantitation of host-cell proteins and leachables like Protein A are sensitive to extreme pH, high salt concentrations, and certain detergents. The typical means for overcoming the signal suppression or interference associated with their presence is to dilute the sample, which acts to decrease the sensitivity of the method to maintain acceptable precision and accuracy,” Farris explains. As a consequence, sensitivity constraints imposed by matrix/API interference must be a point of consideration when demonstrating that a method is fit for use for process characterization and/or process validation activities.
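The cost of the dilution strategy Farris describes can be expressed with simple arithmetic: diluting a sample D-fold to suppress matrix interference raises the method's effective limit of quantitation in the original sample by the same factor D. A minimal sketch, with all numbers hypothetical:

```python
# Effective sensitivity after dilution (all numbers hypothetical).
# Diluting a sample D-fold to suppress matrix/API interference raises
# the effective limit of quantitation (LOQ) in the original sample D-fold.
method_loq_ng_ml = 2.0    # assay LOQ measured in the diluted sample
dilution_factor = 50      # dilution needed to suppress matrix effects
spec_limit_ng_ml = 50.0   # hypothetical acceptance limit for the impurity

effective_loq = method_loq_ng_ml * dilution_factor
print(f"Effective LOQ in the original sample: {effective_loq:.0f} ng/mL")

if effective_loq > spec_limit_ng_ml:
    print("At this dilution, the method cannot demonstrate compliance.")
```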
For products with small total batch volumes, the quantities of material required for analytical method development and sample analysis can also present a challenge, Parsons notes. “If method sensitivity is also an issue, an approach can be taken to spike an aliquot of upstream material with a higher known concentration of analyte and then demonstrate fold-removal of the analyte across the downstream purification step(s),” he comments.
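As a rough illustration of the spike-and-removal approach Parsons outlines, the sketch below computes fold-removal and the corresponding log reduction value (LRV) across a single purification step. The concentrations are hypothetical, and a rigorous study would work from mass balance (load and eluate volumes), not concentrations alone.

```python
# Spike-and-removal study across one purification step (hypothetical data).
# An upstream aliquot is spiked with a known analyte concentration, and
# fold-removal plus log reduction value (LRV) are computed across the step.
# Simplification: assumes equal load and eluate volumes (no mass balance).
import math

spiked_load_ng_ml = 10000.0   # analyte concentration spiked into the load
eluate_ng_ml = 8.0            # analyte measured in the step eluate

fold_removal = spiked_load_ng_ml / eluate_ng_ml
lrv = math.log10(fold_removal)
print(f"Fold-removal: {fold_removal:.0f}x  (LRV = {lrv:.1f})")
```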
For continuous processes, the turnaround time of typical assays is a key issue, according to Kneller. “The challenge is to increase our testing throughput, decrease our reliance on heterogeneous reagents, and to find ways to either simplify the method types used or engineer robust, easy-to-use systems for more complicated analyses (e.g., mass spectrometry),” he says.
Real-time monitoring of representative material for modeling the residence time distribution in continuous processes is also required for process control, according to Pettit. “The range of appropriate analytical sensors for continuous monitoring of product-specific critical quality attributes and impurities is limited. Additionally, sensor drift over the operating time, any ongoing sensor calibration needs, and system suitability testing for critical quality attribute monitoring during continuous operation must be resolved,” he adds. “Here again, sensor technology could be the way to go (e.g., online Octet dip-and-read-type systems); however, the technology is not sufficiently developed yet. The final technology offerings need to be diverse, robust, accurate, reliable, and rapid,” he concludes.
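As a point of reference for the residence-time-distribution (RTD) modeling Pettit mentions, the sketch below uses the idealized exit-age distribution of a single well-mixed stage, E(t) = (1/τ)·exp(−t/τ), to estimate how long a disturbance persists. The mean residence time is an assumed value; a real continuous train would need stage-wise or experimentally measured RTDs.

```python
# Idealized residence time distribution for a single well-mixed stage
# (CSTR approximation); a deliberate simplification of the RTD modeling
# described above, with an assumed mean residence time.
import numpy as np

tau_h = 4.0                       # mean residence time, hours (assumed)
t = np.linspace(0.0, 24.0, 7)     # time points, hours

# Exit-age distribution E(t) and the fraction of a pulse disturbance
# still resident in the system at time t
E = np.exp(-t / tau_h) / tau_h
fraction_remaining = np.exp(-t / tau_h)

for ti, ei, fr in zip(t, E, fraction_remaining):
    print(f"t = {ti:4.1f} h: E = {ei:.3f}/h, {fr:6.1%} of disturbance remains")
```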
For off-line analysis of residuals from a continuous process, Parsons notes that methods for aliquot/sample withdrawal are needed. In addition, the sampling frequency should be sufficient to provide a statistical sampling of the material flowing through the process in order to account for any transient variation that may occur while avoiding significant depletion of the process flow.
The analytical testing strategy associated with continuous processes can, however, be more demanding in terms of the number of sampling time points if in-line or on-line monitoring is not possible, according to Farris. “Process analytical technologies have generally been geared more toward product quality, cell-culture viability, and growth than active monitoring of process impurities. If a deeper understanding of any particular impurity is required, more complicated assays will likely be required,” he observes.
When moving from batch to continuous processing, it is likely that the same analytical methods for residual impurities can be used, but potentially in higher-throughput versions. Offline analysis using LC, GC, ELISA, and MS should be possible as long as an appropriate sampling approach can be implemented, Parsons observes.
The throughput capability and time-to-result for the analytical method employed are also important aspects if at some point the impurity is deemed process critical, according to Darshini Shah, senior scientist and group lead for downstream process development at Thermo Fisher Scientific. Automation of methods (e.g., liquid-handling or use of systems such as the Octet for HCP analysis) can provide higher throughput both in terms of overall speed and analyst effort, according to Kneller.
The analytical target profile must also be evaluated to ensure methods are capable of monitoring fluctuating analyte levels throughout continuous processing. “Retention rates or biomass removal may result in periods of correspondingly lower impurity levels, and the analytical method must be sensitive enough and support a large enough dynamic range to be able to actively monitor the process range without the need to constantly repeat analyses to better target the operating range of the assay,” Shah explains.
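Shah's point about dynamic range reduces to a simple comparison that can be codified as an analytical target profile check; the ranges below are hypothetical.

```python
# Check that an assay's dynamic range covers expected process swings
# (analytical target profile check; all numbers hypothetical).
assay_lloq, assay_uloq = 5.0, 500.0      # assay quantitation range, ng/mL
process_min, process_max = 12.0, 350.0   # expected in-process impurity range

if assay_lloq <= process_min and process_max <= assay_uloq:
    print("Process range fits within the assay dynamic range; no repeat-dilution loop.")
else:
    print("Range mismatch: samples will need repeat analysis at adjusted dilutions.")
```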
In addition, the typical approach in batch processing involves analysis of a small discrete sample number, with reference standards bracketing the samples and system suitability tests performed prior to analysis, according to Heise. Rotating through multiple identical analyzers would allow continuous monitoring, but the solution would be costly. “Validation of both system suitability tests and the analytical methods for continuous control/monitoring will need additional work to demonstrate that results do not require bracketing with reference standards and that the detection methods do not drift or get poisoned over time when operating for 90+ days,” he says. Failure mode mitigation strategies will also need to be developed, for example with respect to system suitability failure.
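One simple way to watch for the drift and detector poisoning Heise warns about over 90+ day campaigns is to trend a recurring check standard. The sketch below flags a sustained slope; the data, action limit, and 30-day normalization are all invented for illustration, and a validated program would use formal control charting.

```python
# Simple drift check on a recurring system-suitability standard during a
# long continuous campaign (illustrative only; a validated program would
# apply formal control charting with qualified acceptance criteria).
import numpy as np

# Daily response (%) of the same check standard (hypothetical, 14 days)
check_std = np.array([100.2, 99.8, 100.5, 99.9, 100.1, 99.6, 99.2,
                      98.9, 98.5, 98.8, 98.1, 97.9, 97.5, 97.2])
days = np.arange(len(check_std))

# Linear trend: a sustained negative slope flags detector drift/poisoning
slope, _ = np.polyfit(days, check_std, 1)
drift_per_30d = slope * 30

if abs(drift_per_30d) > 2.0:   # hypothetical action limit: 2% per 30 days
    print(f"Drift alarm: {drift_per_30d:+.1f}% per 30 days; investigate/recalibrate.")
else:
    print(f"Drift within limits: {drift_per_30d:+.1f}% per 30 days.")
```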
In addition, Parsons points out that for in-line and on-line analysis, the inherent requirements of sensitivity and specificity for analysis of process residuals suggest that many of the current and potential online methods have limitations and may be best suited to providing supportive general information rather than specific quantitation of residuals. “Offline analysis using traditional validatable techniques might still be required for specific and sensitive quantitation of the process residuals. Nevertheless, if a continuous process can be shown (perhaps by an online monitoring method) to be robust, then reduced sampling of the continuous process for quantitation of residuals may be justifiable,” he notes.
In fact, because continuous processes are expected to operate under steady-state conditions, control through predictive modeling or by exception should be possible, Heise adds. “Multivariate analysis during development may identify simple in-line monitoring techniques that can control the process to ensure residual impurities do not pass through the whole process. However, the dynamics of the process will be different at the outset until the steady state is reached. A testing strategy defining the frequency of measurement pre- and post-steady state will be required to identify when process equilibrium has been reached and when it begins to fail,” he observes.
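A testing strategy of the kind Heise describes needs an operational definition of steady state. One common heuristic, sketched below with an assumed window size and tolerances, declares equilibrium when a rolling window of the monitored signal shows both a flat trend and low relative variation.

```python
# Rolling-window steady-state check for a monitored signal; one simple
# way to formalize a pre-/post-steady-state testing strategy.
# Window size and tolerances are assumptions chosen for illustration.
import numpy as np

def at_steady_state(signal, window=10, slope_tol=0.01, rsd_tol=0.02):
    """Declare steady state when the recent trend is flat and variation low."""
    recent = np.asarray(signal[-window:], dtype=float)
    if len(recent) < window:
        return False
    slope, _ = np.polyfit(np.arange(window), recent, 1)
    rsd = recent.std() / recent.mean()  # relative standard deviation
    return abs(slope) < slope_tol * recent.mean() and rsd < rsd_tol

# Example: an impurity signal settling toward a plateau
trace = [5.0, 3.2, 2.1, 1.6, 1.35, 1.30, 1.28, 1.31, 1.29, 1.30,
         1.29, 1.31, 1.30, 1.28]
print("Steady state reached:", at_steady_state(trace))
```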
“Ideally,” Heise continues, “we don’t really want to be testing anything real-time and on-line. Instead, we want to have confidence that the process will deliver consistently over a defined time scale.”
Significant development is still needed before on-line/in-line analysis of residual impurities will be widely implemented. Shah is not aware of any on-line/in-line tools capable of monitoring residual impurities for either quantitation or characterization. “Offline analysis is the predominant means of analysis, and the only on-line/in-line tools are related to product quality aspects,” Shah says.
Current solutions exist for automated sampling for at-line analytics where process changes occur over a longer time span than the analysis time, as in the case of mammalian bioreactor metabolite analysis, according to Pettit.
On-line process analytical techniques such as near-infrared spectroscopy (NIR) or mid-infrared spectroscopy (mid-IR) and UV may provide some general information on clearance of residuals but are unlikely to be specific or sensitive enough to enable detailed quantitation of analytes, according to Parsons.
Parsons does note that low-field NMR is a potentially promising technique for on-line analysis of process residuals. “The system is simple to operate, potentially applicable to both batch and continuous processes, and could be performed by stop-flow analysis with direct connection of the flow path to the downstream purification process,” he says. Attractive features of NMR in this regard are its inherently quantitative response factor and its relatively high resolving power, which affords higher specificity than a technique such as UV. Drawbacks include the barriers to implementing a conceptually complex analytical technique and limitations on sensitivity.
BioPharm International, Vol. 32, No. 7, July 2019, Pages 28–31
When referring to this article, please cite it as C. Challener, “Analysis of Residual Impurities in Continuous Manufacturing,” BioPharm International 32 (7) 28–31 (2019).