Biopharma Analysis Benefits from New Technology and Methods

Publication
Article
Pharmaceutical Technology, 02-02-2020
Volume 44
Issue 2
Pages: 16–21

Analytical solutions are improving for raw material testing, drug product release, process development, and more.


Effective analytical methods are essential for the successful development and commercialization of both small- and large-molecule drug substances and drug products. As the complexity of both biologic and chemical drug substances increases, analytical methods must evolve as well.

“Faster, more efficient techniques will give companies an advantage as their products move through the pipeline,” asserts Robin Spivey, director of analytical research and development, Cambrex High Point. Techniques that are more sensitive and more accurate will, she says, better position a company for regulatory acceptance as long as they are willing to help pioneer the techniques. In addition, such companies will be seen as being at the forefront of the industry.

Some of the most noteworthy advances in analytical methods involve the application of mass spectrometry (MS) for process development and product release of both biologics and synthetic drugs, the enhancement of chromatographic techniques, particularly liquid chromatography (LC), microcrystal electron diffraction, and techniques designed for use as process analytical technology (PAT).

For biopharmaceuticals, MS was initially limited to use for protein characterization to provide supplemental information for regulatory filings, according to Amit Katiyar, director of analytical and formulation development for bioprocess sciences at Thermo Fisher Scientific. Process release/stability testing continues to largely depend on conventional analytical methods such as LC, capillary gel electrophoresis (CGE), imaged capillary isoelectric focusing (iCIEF), and enzyme-linked immunosorbent assays (ELISA) due to their simplicity and wide adoption in quality control (QC) labs.

Inclusion of biosimilars, complex non-monoclonal antibody proteins (e.g., fusion proteins), bispecifics, and combination products in the product pipeline, however, is presenting challenges due to the inability to gain a thorough understanding of these molecules using platform methods. “Most of the time, platform methods may not be able to provide the information required to develop and commercialize complex biomolecules. In these cases, MS-based methods are being used for process development and as identity and release/stability indicating methods,” Katiyar observes.

In addition to using peptide-mapping principles in multi-attribute methods (MAMs), major biopharmaceutical companies are now using MS-based identity methods to release biologic drug substances and drug products. “This approach will provide the opportunity to gather more information on the performance of MS instruments in QC labs that can then be used for implementing MS technology for process development, release, and stability testing,” says Katiyar. The current approach for regulatory filing, he adds, is to use a combined package of conventional methods and MS methods to gain more confidence from health authorities and be able to present a future case for submissions based only on MS data.

For Da Ren, process development scientific director at Amgen, MAM is probably the most important emerging analytical technology that has been used in process development and release and stability testing of therapeutic proteins. “MAM is an LC/MS-based peptide mapping assay. Unlike profile-based conventional analytical assays, which focus on whole or partial proteins, MAM can identify and quantify protein changes at the amino acid level and can provide more accurate information on product quality related attributes,” he explains. Notably, MAM is capable of replacing four conventional assays including hydrophilic interaction liquid chromatography for glycan profiling, cation exchange chromatography for charge variant analysis, reduced capillary electrophoresis-sodium dodecyl sulfate for clipped variant analysis, and ELISA for protein identification, according to Ren.
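The attribute-level quantification Ren describes can be illustrated with a simple calculation. In a MAM workflow, the relative abundance of a modified peptide form is typically computed from extracted-ion chromatogram peak areas. The function below is a hypothetical sketch of that arithmetic, not Amgen's implementation; the peak areas are invented for illustration.

```python
# Hypothetical illustration of how a multi-attribute method (MAM)
# quantifies a product quality attribute: the relative abundance of a
# modified peptide is computed from extracted-ion chromatogram peak areas.

def percent_modified(modified_area: float, unmodified_area: float) -> float:
    """Relative abundance (%) of a modified peptide form."""
    total = modified_area + unmodified_area
    if total == 0:
        raise ValueError("no signal observed for this peptide")
    return 100.0 * modified_area / total

# Illustrative peak areas (arbitrary units) for a deamidated peptide
print(percent_modified(modified_area=2.5e6, unmodified_area=47.5e6))  # 5.0
```

Because each attribute is tied to a specific peptide, the same calculation can be repeated per site, which is what allows MAM to report changes at the amino acid level rather than for the whole protein.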

In the case of small-molecule drug development and commercialization, MS detection systems are no longer considered just research tools and are becoming more widely used for routine QC testing, for example, for determining extremely low-level impurities such as genotoxic and potential genotoxic impurities, according to Geoff Carr, director of analytical development in Canada with Thermo Fisher Scientific.

“These advances are very likely in response to new regulatory guidelines issued by agencies such as FDA and the European Medicines Agency, but also as a result of specific problems that have occurred in the industry, such as recent concerns regarding observations of N-nitrosamine residues in sartans,” Carr explains.

Regulations drive analytical advances

Chromatographic-based analytical procedures continue to be the most widely used technologies for small-molecule API and drug product analyses, and most advances are likely in this area. For the most part, procedures based on high-performance and ultra-high-performance liquid chromatography (HPLC/UHPLC) coupled with ultraviolet detection are most widely used, according to Carr.

Notable advances in this area, Carr says, include improvements in HPLC/UHPLC column packing chemistries designed to provide better resolution between sample components and greater robustness, such as resistance to particularly high or low pH conditions for mobile phases.

Advances in crystallography are also important for small-molecule API analysis. Microcrystal electron diffraction (micro-ED), a cryo-electron microscopy technique, for instance, can be used for structure determination by electron diffraction using micron-sized crystals. “Typically, millimeter-sized single crystals are necessary to achieve unambiguous determination of the absolute structure of an API using single-crystal X-ray crystallography,” explains Heewon Lee, director of analytical research and quality systems in chemical development US for Boehringer Ingelheim Pharmaceuticals.

Micro-ED enables the same level of structure determination using micrometer-sized crystals, eliminating the need to grow larger single crystals, which can be challenging. “This capability is very useful for process development, as understanding impurity formation is a critical step to optimize processes,” Lee says.

Other advances include applications such as near infrared and Raman spectroscopies in support of process analytical technology to enable monitoring of manufacturing processes and the quality of resulting products on-line. “Developments in these areas are in response to regulatory encouragement for the application of quality-by-design approaches for manufacturing drug products,” Carr notes. He adds that this type of technology is more likely to be applied by companies that manufacture their own products and less likely in a contract manufacturing environment.


Efficiency gains for workflows

Changes in analytical workflows have the potential to impact productivity and efficiency but may also create challenges depending on the nature of the modifications. These changes may also originate as the result of new technology or new processes and approaches.

As an example of the former, Lee points to material identification using Raman spectroscopy as a technique that has impacted analytical workflows associated with small-molecule API manufacturing. “This technique is now mature, and several companies have launched products that are user-friendly and GMP [good manufacturing practice]-compliant. Benefits are gained because this method can be used to identify raw materials, intermediates, and APIs in the process area. QC personnel can then release batches based on the Raman data acquired, streamlining the analytical workflow,” she explains.

For biologics, using MAM through process development and release and stability testing is a revolutionary analytical workflow, according to Ren. “The continuous monitoring and control of product quality attributes at the amino acid level during product and process characterization as well as release and stability testing enhances the understanding of biotherapeutic products and processes,” he asserts.

One driver leading to changes in analytical workflows is the desire to achieve greater efficiencies and thereby reduce operating costs, according to Carr. One approach that many pharma companies have taken, he notes, is to implement operational excellence initiatives within laboratory operations.

Regulatory pressures for improvements in the scientific understanding and quality of drug product is also leading to an evolution in analytical workflows. “We are seeing increasing guidelines focused on analytical development, such as a [Brazilian Health Regulatory Agency] ANVISA guideline on conducting forced degradation studies that is very demanding,” Carr observes.

Harmonizing method development

There are also initiatives within the International Council for Harmonization (ICH) and the US Pharmacopeial Convention (USP) focused on replacing the traditional development-validation-transfer approach to management of analytical procedures to one based on lifecycle management. “This approach,” asserts Carr, “requires rational development/validation coupled with a process of performance monitoring that leads to continuous improvements.”

These changes relate to an important development noted by Katiyar; the adoption of phase-appropriate analytical workflows combined with increasing harmonization of workflows across sites and with external partners, such as contract research and development organizations.

Phase-appropriate development and manufacturing can help pharma companies better handle their expanding product pipelines by enabling the supply of safe clinical materials while maintaining flexibility in manufacturing operations, according to Katiyar. He does note, however, that while ICH guidelines are defined for late-stage development, there is a lack of clarity on the qualification of analytical methods for early phase development.

“Harmonization using a platform approach to method development and qualification for early and late-stage projects and method validation for late-stage projects, as well as forced-degradation, research comparability, and formal comparability studies and specification setting has provided the opportunity to engage process development, analytical development, operations, quality, and regulatory organizations earlier in the development cycle,” Katiyar says. “The harmonized approach creates a well-defined roadmap for process development and operations teams to execute the program timelines in an efficient manner,” he adds.

The details of each harmonization approach vary from company to company. In some cases, the harmonized templates may contain specific experiments and acceptance criteria to ensure a successful platform approach, according to Katiyar. In other cases, high-level harmonization guidance provides flexibility during development.

“Pre-approved generic templates for draft methods, qualification plans, and forced degradation/comparability protocols result in higher productivity due to time savings resulting from a shorter review process for every program,” Katiyar observes. While there may be a need to modify the templates as reflected by the development data, he notes that such changes can be incorporated in the molecule-specific methods/plans, and protocols and subsequent changes can be reviewed by appropriate stakeholders for quality and regulatory compliance.

Automation improves sample prep

Some of the most important advances in sample preparation tools include increasing application of automation and robotics. Quality-by-design (QbD) approaches to analytical testing can often lead to multiple sampling and testing to achieve a more accurate assessment of the total batch rather than testing one or two samples per batch.

An example given by Carr is stratified sampling for solid oral dosage forms, whereby samples are taken at approximately 20 time points during tablet compression or capsule filling and three units from each time point are tested for drug content. “In circumstances such as this one, there are huge benefits in having automation available in the lab for the preparation of the analytical samples from the 60 individual tablets that require testing,” he explains.
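The stratified-sampling arithmetic Carr describes can be sketched as follows. The counts match his example (20 time points, three units each); the function names and the RSD summary are illustrative assumptions, not taken from any guidance document.

```python
# Sketch of a stratified sampling plan for a solid oral dosage form:
# ~20 time points across the compression/filling run, 3 units per point.
from statistics import mean, stdev

TIME_POINTS = 20
UNITS_PER_POINT = 3

def total_units() -> int:
    """Number of individual tablets/capsules to prepare and test."""
    return TIME_POINTS * UNITS_PER_POINT

def batch_rsd(assay_results: list[float]) -> float:
    """Percent relative standard deviation across all sampled units."""
    return 100.0 * stdev(assay_results) / mean(assay_results)

print(total_units())  # 60 samples per batch, hence the case for automation
```

Sixty individual sample preparations per batch is what makes manual workup impractical and automated preparation attractive.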

Automation of sample preparation for low-throughput methods is also critical to improve turn-around times to support process development activities, adds Katiyar. In general, he notes that automation of all in-process methods for biologics (including size-exclusion chromatography, CGE, iCIEF, N-glycan content, and residual host-cell protein, DNA, and Protein A) to support process development activities is crucial for meeting fast-to-first-in-human/quick-to-clinic timelines.


In the field of biologics sample preparation, Process Development Scientific Director Jill Crouse-Zeineddini at Amgen sees acoustic droplet ejection for potency assays as an important advance. Acoustic droplet ejection uses acoustic energy instead of pipette tips to transfer a fixed amount of liquid sample from source to destination plates with excellent accuracy and precision, she explains. “The significance of this technology resides in its superb dispensing performance at a very low sample volume. This technology performs direct dilutions instead of serial dilutions and prepares each dose independently, improving assay precision and throughput,” Crouse-Zeineddini observes.
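The precision benefit of direct over serial dilution that Crouse-Zeineddini notes can be illustrated with a rough error-propagation sketch. It assumes each liquid transfer contributes an independent relative error (CV) and that relative variances add across sequential transfers; this is a textbook simplification for illustration, not a model of any specific dispenser.

```python
# Rough error-propagation comparison of serial vs. direct dilution.
# Assumption: each transfer carries an independent relative error (CV),
# and relative variances add across sequential dilution steps.
import math

def serial_dilution_cv(cv_per_transfer: float, n_steps: int) -> float:
    """Compounded CV after n sequential transfers down a dilution series."""
    return math.sqrt(n_steps) * cv_per_transfer

def direct_dilution_cv(cv_per_transfer: float) -> float:
    """Each dose is prepared independently with a single transfer."""
    return cv_per_transfer

cv = 0.02  # assume 2% relative error per transfer
print(f"serial, 8 steps: {serial_dilution_cv(cv, 8):.3f}")  # 0.057
print(f"direct:          {direct_dilution_cv(cv):.3f}")     # 0.020
```

Under these assumptions, the last point of an eight-step serial dilution carries nearly three times the relative error of an independently prepared dose, which is why preparing each dose directly improves assay precision.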

Robust aseptic sampling and automated sample preparation for the purification, desalting, and digestion of protein samples, meanwhile, enables many different product quality analyses. “This technology not only significantly improves operational efficiency, but also eliminates potential contamination and mistakes during manual sample handling,” states Gang Xue, process development scientific director at Amgen.

Another important point, according to Carr, concerns the reliability of the sample preparation procedure. “This issue is not a new one, but it is becoming more apparent as we apply QbD approaches to our analytical procedures. While the greatest emphasis has been applied to chromatographic parameters, we now realize that the sample preparation stage is at least as important and also needs to be developed using QbD,” he comments.

For Lee, a specific technology development that has improved sample preparation is once again material identification by Raman spectroscopy for small-molecule drug substances. “Using this technique simplifies sample preparation, because it allows identification of compounds without any physical contact with the sample. Depending on the container material, it is even possible to acquire Raman data through the container without the need to withdraw a sample,” she says.

More developments on the horizon

Such capability for biologic drug substances has yet to be developed, however, and simplified identity methods to support release and establishing post-shipment identity of bulk drug substance are still required, according to Katiyar. Currently, peptide mapping and binding ELISA are used as identity methods, but they have long turnaround times. Raman spectroscopy has been evaluated for biologics, but it has not yet been adopted by the industry for release of drug substances and drug products. “Simplification using scan-based methods with better specificity and faster turnaround times would be highly beneficial for biopharmaceuticals,” he says.

When integrated with analytical instruments, aseptic sampling and automated sample preparation has the potential to move in-process and product release testing from offline QC labs to the manufacturing floor, either in-line or online, according to Xue. In addition to enabling real-time monitoring of not only cell growth, but also the critical quality attributes of therapeutic proteins themselves, the technology is beneficial for providing much more granular insights into the conventional batch process and products in-flight, he notes. “More importantly,” Xue states, “it could in the future be crucial for lot definition, process variation detection, and material segregation as required for continuous bioprocessing.”

For small molecules, Lee would like to see the widespread adoption of X-ray fluorescence (XRF) for metals testing because it would also reduce turnaround times. “XRF is a powerful method for detecting metals used as catalysts in API manufacturing. Compared to inductively coupled plasma-MS, XRF does not require sample dissolution and digestion, facilitating easier sample preparation and faster turnaround times,” she explains.

Carr, meanwhile, expects to see increasing use of LC–MS for routine analytical testing. “This technology is widely applied in chemical drug development labs for various purposes and is also used for biopharmaceutical analytical testing, but less for release testing of products and for testing stability samples. The technology has advanced considerably over recent years, and while these instruments were previously only applied in R&D, they have now become highly suitable for use in routine testing labs,” he remarks.

Short timelines create challenges

There are a number of challenges to the adoption of advances in analytical techniques, some of which vary according to the development phase. Adhering to compressed program timelines is the key challenge to adopting new methods in early-stage development, according to Katiyar. “Fast-to-first-in-human (FIH)/quick-to-clinic program timelines have been introduced in almost every pharmaceutical organization to provide clinical material for Phase I studies, and these timelines have shrunk from 18 months to less than 12 months during the past five years,” he says.

The shorter timelines are met by relying on platform approaches developed based on knowledge generated over years with multiple molecules. “For new molecules that fit the platform methods, there is no scientific justification to explore new technologies,” Katiyar states. When working in a lab that is operating in a high-efficiency environment, there is often resistance to the introduction of new methods and approaches due to concerns about meeting delivery targets, agrees Carr.

Once programs move to late-phase development, organizations are hesitant to introduce any change in the control strategy unless it is absolutely needed. This reluctance is particularly strong if a filing has been made to a regulatory agency and/or if significant data have been collected using the older technique, according to Spivey.

“To be adopted for measuring product quality, the performance of new analytical methods must be equivalent to or better than the methods they replace, and there must be clear evidence that they are reliable and robust across a wide range of operating spaces,” states John Harrahy, director of process development in pivotal attribute sciences with Amgen. The adoption of new technology in the middle of a program, adds Katiyar, requires significant effort to develop the new method, perform bridging studies, requalify the method, perform technology transfer (if outsourced), perform retrospective testing, and define new specifications. Bridging studies cost the sponsor additional money and time, and there is always the risk that a bridging study may show that the methods or techniques are not comparable, adds Spivey.

There is also often a reluctance on the part of drug companies to be the first to make a submission to FDA with a new technique due to the possibility of the validity of the technique being questioned, Spivey notes. “They don’t want the burden of having to defend the technique to FDA or other regulatory agencies,” she says. There can be some risk with introducing new technologies that have had limited regulatory exposure, adds Harrahy, particularly considering the different regulatory expectations and change control requirements from different regulatory authorities worldwide.

“With that said,” Harrahy comments, “evaluating innovative technologies is a vital component to ensuring product quality and value to patients, and the ultimate risk of not evaluating new technologies greatly outweighs remaining stagnant.”

The ideal solution, Katiyar argues, is to explore new technologies as part of improvement initiatives without associating them with any programs. This approach provides the flexibility to explore new technologies without putting the program timelines at risk. “Once proof of concept is established and the method is ready to be adopted, a platform approach can be used to implement the new technology,” he comments.

Senior leadership in large organizations, according to Katiyar, must provide guidance to their teams to push innovation without risking program timelines. In addition, it is also important to apply thorough training practices to ensure that scientists really understand the new approaches, says Carr. Continuity of data must also be addressed. “Trend analysis is a widely used tool for monitoring pharmaceutical product quality, and the introduction of new and ‘better’ methods may be perceived to interfere with this trending process,” Carr observes, even though it is more important to apply continuous improvement and accept possible breaks in trends.


Ways to facilitate adoption

In addition to evaluating new analytical methods separately from specific drug development programs, there are several other strategies that can be used to facilitate the adoption of advances in analytical techniques.

The best strategy for adopting a new analytical method in a quality setting, according to Harrahy, is to start with the end in mind. Does the proposed method fit the analytical target profile? Is the method sufficiently capable for the product or products that it will measure? Does the methodology require modification to the available GMP/QC environment?

“The robustness, reliability, and value of introducing any new method must be clearly demonstrated, which is often best accomplished by taking a staged approach: determining the method operable design space in a development laboratory, piloting the method in a development/phase-appropriate setting to monitor ‘real-world’ method capability, performing bridging studies vs. the older method, staging its implementation in QC, and continuously monitoring method performance,” he says. In addition, for regulatory acceptance of novel technologies, early partnered engagement with health authorities is strongly recommended, for example, participating in FDA’s emerging technology platform when the new technology has the potential to improve product quality.

The most important strategy, agrees Spivey, is to provide ample data demonstrating that new methods are reliable and robust and that there is little or no risk to implementing the technique in a regulated environment. Advances that offer significant advantage over corresponding currently accepted techniques will also have greater likelihood for acceptance. However, Spivey stresses that the advantage would need to be significant enough to be worth the time and money needed for it to be implemented. “Ideally,” she says, “the owner of the technique would perform some preliminary legwork with the regulatory agencies demonstrating the capabilities of the technique. The sponsor would then have some assurance that the agencies would accept their data and make it a less risky approach for them.”

Another approach, depending on the nature of the old and new/improved methods, is to run both in parallel for a period of time in order to develop an understanding of how their performance and the resulting data compare, Carr suggests.

For Lee, the key to new analytical method adoption is the sharing of use cases between pharmaceutical companies combined with the publication of white papers and communication with regulatory authorities. Katiyar agrees that sharing knowledge and gaining the feedback of peers and regulatory authorities in a timely manner is essential. “Peer-reviewed publications, conference presentations, and Biophorum Operations Group-like forums are the best places to share information and exchange ideas to improve and adopt new technologies on a global scale,” he comments.

All stakeholders must collaborate

That information sharing should occur between all stakeholders, including contract research, development, and manufacturing organizations, testing laboratories, biopharmaceutical companies, regulatory authorities, and instrument/equipment vendors.

“Innovators and service providers need to be open to new ideas and be willing to invest the time and money to implement new techniques. Service providers also, rather than waiting for clients to request a technique before investing in it, should advocate for the use of new methods with their clients,” Spivey asserts. In addition, Katiyar believes innovator companies working with service providers should form an external working group to share new methods and technology to eliminate knowledge gaps caused during technology transfer of methods. “Most of the time,” he remarks, “innovator companies are not willing to share new methods and technologies and thus delay the adoption of new technologies throughout the pharmaceutical industry.”

Regulators also need to be open to new ideas and willing to work with pharmaceutical companies to ensure that new methods and techniques are acceptable for use in a regulated environment, according to Spivey. It is important for pharma companies and regulatory authorities to remember they have a common goal in identifying new methods and technologies for monitoring and quantifying critical quality attributes that may impact the safety and efficacy of the molecule throughout the lifecycle of the program, adds Katiyar. He points to MAMs as an example where health authorities have accepted data packages consisting of results obtained using conventional approaches supplemented by those obtained using MS-based approaches.

Instrument/equipment vendors, meanwhile, should be prepared to demonstrate that a new technique is sufficiently better than the currently accepted technique to be worth investing in and worth any potential regulatory risks, asserts Spivey. The dilemma here, according to Carr, is how stakeholders all link together.

“If a new analytical technology comes up, it will not be accepted by industry/regulators unless the equipment that is required to use it becomes widely available. Maintenance, qualification, and repair services must also be widely available and reliable. Typically, however, a vendor will not establish this level of availability unless there is a level of confidence that sales targets will be achieved. I think that this is the area where conferences, exhibitions, and publications provide a really valuable platform to get the information from innovators and suppliers circulated to end users,” he says.


Citation

When referring to this article, please cite it as C. Challener, “Biopharma Analysis Benefits from New Technology and Methods,” Pharmaceutical Technology 44 (2) 2020, pp. 16–21.
