Analytics Advances for Optimizing Downstream Processes

Pharmaceutical Technology, December 2021 Issue
Volume 45
Issue 12
Pages: 30-32

Simple, inexpensive real-time analytics are urgently needed for high-value products.

Downstream processing operations have a direct impact on biopharma product quality and cost. It is not surprising, therefore, that optimization of downstream biologics purification steps focuses on increasing productivity and reducing cost while maintaining the highest possible quality. Two specific areas receiving attention, according to Martin Vollmer, biopharma program manager in the Lifescience Analysis Group at Agilent Technologies, are continuous manufacturing with automation and integration of the associated analytics, and the adoption of single-use technologies, including a shift from traditional column chromatography to membrane-based techniques.

Capture chromatography, adds Darren Verlenden, head of bioprocessing, MilliporeSigma, has been an early target for improvement due to the high cost of goods associated with this operation. Adoption of advanced capture methods, such as multi-column capture, has been slower than expected, however, due to complexity and regulatory concerns, he observes. The focus has therefore shifted to flow-through polishing. “Analytics in this space are needed to move from monitoring critical process parameters to monitoring critical quality attributes, with in-line or at-line aggregate analysis seen as an important need,” Verlenden says.

Indeed, the biopharmaceutical industry continues to express interest in technologies that enable process control while improving process development times and quality. There is particular need for advances downstream due to the dramatically higher productivity achieved in upstream manufacturing today, according to Phil Vanek, chief technology officer at Gamma Biosciences.

“Increasing interest in automation and continuous manufacturing and the advent of manufacturing strategies for advanced therapies (e.g., cell and gene) are compelling more chained or continuous manufacturing methods to allow for more walk-away operations,” Vanek continues. “These approaches not only improve cost efficiency, but when properly implemented can improve product quality and reduce risk through closed operations and increased use of automation,” he asserts.

High-throughput potential

Full integration of analytical technologies helps to provide real-time answers, reduce costs, avoid costly failures, and make processes much more efficient, notes Vollmer. “Online and inline process analytical technologies will provide continuous insight into the process and into the quality of the drug substance,” he states.

Overall, therefore, the incorporation of rapid, high-throughput analytics enables more informed decision making in process development, resulting in improved processes, Verlenden summarizes. However, for high-throughput development and analytics to also reduce development timelines, he cautions that novel approaches to data collection and processing are needed to overcome today's inefficient, largely manual paradigm.
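
As a rough illustration of what moving beyond that manual paradigm can look like, the minimal Python sketch below consolidates per-sample, at-line result exports into a single table for process-development review. The file layout and column names (run_id, aggregate_pct) are assumptions made for illustration, not a specific vendor format.

```python
# Minimal sketch (hypothetical file layout): consolidating per-sample, at-line
# analytical exports into one tidy table so process-development decisions
# are not made from scattered spreadsheets. Column names are assumptions.
from pathlib import Path
import pandas as pd

def consolidate_atline_results(result_dir: str) -> pd.DataFrame:
    """Collect CSV exports (one per sample) into a single summary table."""
    frames = []
    for csv_file in Path(result_dir).glob("*.csv"):
        df = pd.read_csv(csv_file)
        df["source_file"] = csv_file.name  # keep provenance for traceability
        frames.append(df)
    combined = pd.concat(frames, ignore_index=True)
    # Example: summarize a hypothetical 'aggregate_pct' attribute per run
    return combined.groupby("run_id", as_index=False)["aggregate_pct"].mean()

# usage: summary = consolidate_atline_results("./atline_exports")
```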

Advanced control is lagging for a variety of reasons, including the differing control architectures across the diverse equipment deployed in manufacturing today. Even so, Jonathan Hartmann, president and CEO of Nirrin Technologies, believes that with new technologies and the market demand for integrated control in operations, the control aspects needed for full automation will quickly catch up.

In addition, Hartmann observes that for next-generation therapies in particular, process analytics promise to improve not only their therapeutic potential and potency, which is often predicated on the manufacturing process itself, but also the ability to assure their safety through process consistency and control, as well as their cost effectiveness through automation and simplified manufacturing.

Existing tech has limitations

With Chinese hamster ovary (CHO) cell productivity routinely exceeding titers of 10 g/L, Vanek notes that many biologics manufacturers are looking to address knock-on effects such as column fouling and protein aggregation. “The ability to continuously monitor downstream unit operations and operate under conditions that avoid product yield losses, especially after affinity and polishing steps when the value of the product is at its highest, is of extreme importance,” he comments.

Particularly for advanced therapies, Vanek says that real-time monitoring and control of critical quality attributes, such as full capsid adeno-associated virus productivity, can have a positive and significant impact on product potency, cost, and safety.

Most analytical technologies employed today, however, are still used in an offline manner, and results are not available in real time, which causes delays. Samples are removed from the stream and submitted for testing with results produced hours or days later.

“With those techniques, immediate reaction on fractionation or in optimizing yield and recovery is not possible; the approach today can be compared to performing a post-mortem analysis,” Vollmer explains. “Such delays preclude decision making to inform the next round of experiments and slow overall development,” adds Verlenden.

Any reduction in this timeline should speed development, according to Verlenden. “As biopharma moves to connected and continuous processing, the ability to monitor critical quality attributes in real (or near-real) time will further enable optimization and streamline key decision points,” he states. Vollmer agrees that the goal must be to get results when there is still time to take action in response. He notes that connecting liquid chromatography (LC) or spectroscopy online or inline will provide this capability.

“Using advanced analytical approaches including machine learning,” adds Vanek, “will deliver data that can identify process pitfalls and opportunities for improvement addressing quality and yield, as well as set the stage for predictive analytics that could allow users to intervene or abort processes that are likely to produce out of specification results.” Additionally, real-time data collection and analysis can be coupled to feedback controllers to enable automation and hands-free process management in the future, contends Vanek.
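
The sketch below is a simplified illustration of the kind of predictive check Vanek describes, not his or any vendor's method: a basic machine-learning classifier trained on hypothetical historical in-process measurements estimates the risk that a run will end out of specification, and that score could then be handed to an operator or a feedback controller. All features, thresholds, and data here are synthetic assumptions.

```python
# Illustrative sketch only: a simple early-warning model that flags runs
# likely to end out of specification, based on hypothetical in-process
# features. Feature names, data, and thresholds are assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Hypothetical historical data: [capture eluate aggregate %, step yield %,
# host-cell protein (ppm)] and a label: 1 = batch finished out of spec.
X_hist = rng.normal(loc=[2.0, 85.0, 300.0], scale=[0.8, 5.0, 120.0], size=(200, 3))
y_hist = (X_hist[:, 0] > 2.8).astype(int)  # toy rule standing in for real labels

model = make_pipeline(StandardScaler(), LogisticRegression())
model.fit(X_hist, y_hist)

def oos_risk(inprocess_measurement):
    """Return the predicted probability that this run ends out of spec."""
    return float(model.predict_proba([inprocess_measurement])[0, 1])

# A controller or an operator could use this score to intervene or abort:
if oos_risk([3.1, 82.0, 450.0]) > 0.7:
    print("High out-of-spec risk: flag run for intervention")
```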

While there are some inline sensor technologies available on the market today, Vanek laments that they can be costly, slow, and/or complex, and often measure surrogate events in the process that are then indirectly correlated to the desired metric. It is important, stresses Hartmann, to consider the physical time needed to not only capture data, but also analyze and integrate the results of that analysis into a real-time control instruction set, to allow for process automation.

Digital innovation and analytical advances now essential and enabling

The desire for faster and more flexible analytical solutions for downstream processing is driving innovation. New “Industry 4.0” digital capabilities will be essential before fully automated feedback loops can be leveraged, according to Vollmer.

One of the key challenges of existing downstream-processing analytics, Vanek observes, is the need to sample product and analyze the material offline, as many of the online technologies measure only surrogate analytes or cannot process the data in real time. “As new applications are developed, or existing methodologies are being adapted to online measurements, it becomes easier to realize improvements in performance. In doing so, integration of the data into manufacturing execution systems and process chaining for continuous manufacturing, together with increased automation, become enabled,” he explains.

Advances in connected and continuous manufacturing, in particular, are increasing expectations for rapid decision making, agrees Verlenden. “Success in this area cannot be fully realized without an advancement in analytics,” he observes. New robust analytical equipment fully integrated into the process and specifically designed for this purpose helps to facilitate adoption as well, Vollmer agrees.

Regulatory expectations are also a key driver, given expanded data-package requirements, Verlenden says. Design-of-experiment approaches, for instance, require additional analytical data in downstream-processing applications.

“In essence,” Verlenden states, “improving analytics will allow for shorter development cycles to enable manufacturers to accelerate speed to market and reduce costs, while in the future, these analytics will enable the transition to process control based on critical quality attributes.”

Implementation challenges still exist

As with the adoption of any new technology, the conservative nature of the biopharmaceutical industry has led to slower-than-desired action with respect to the implementation of more advanced analytics that can facilitate downstream process optimization. “Our industry tends to conservatively adopt new technologies, and this continues to slow down adoption,” says Verlenden.

There are other challenges as well. “Inertia within the industry to switch, primarily caused by regulatory constraints and the high cost of implementation, is a key hurdle,” Vollmer notes. In addition, as analytical technologies move from established offline methods to at-line or inline methods, there is likely to be a period of decreased efficiency where, for instance, methods are run both offline and at-line or inline, according to Verlenden.

“While this duplication will be a necessary step to realize the future potential of at-line and inline analytical testing, it will require investment and acceptance of this intermediate inefficiency,” Verlenden comments. Ultimately, however, Vollmer believes that the pay-off will be lower-cost production and better drug quality.

Vanek is clear that implementing new technologies into regulated manufacturing environments can introduce many challenges, from data compatibility across legacy systems to methods re-validation and even respecifying acceptance criteria for a product—all of which can be regulatory headaches or worse. “To make it worth the investment from a cost and risk perspective, adopters of new technology have to have confidence that the method will be reproducible, reliable, and scalable, with the added information providing significant advantages over existing methods,” he explains.

Some movement is occurring

Advances, while not numerous, are being introduced and demonstrating their benefits. Some downstream process innovations are being spurred by upstream process improvements in conventional biologics manufacturing, such as productivity increases and continuous manufacturing, according to Vanek. The demand is also increasing, he notes, due to the broader array of biologics being manufactured, including plasmids, RNA, and viral vectors; the physical nature of these materials makes purification with conventional downstream methods challenging.

“This diversity of products is one key element driving innovation. Real-time analysis using rapid methods and their integration into existing workflows is another innovation driver. Analysis that addresses not only product quantity, but simultaneously product quality, can be a real time- and cost-saver in the steps towards releasing a therapeutic product,” Vanek adds.

Vollmer points to online LC, new Raman spectroscopy solutions, and new near-infrared (NIR) instruments and LC/capillary electrophoresis (CE)-based analyzers that target very specific single-attribute applications.

Implementation of Raman spectroscopy for bioreactor feedback control, Verlenden notes, is contributing to increased efficiency, productivity, and quality through inline analysis, chemometric analysis, and feedback control. One specific example is MilliporeSigma’s Procellics Raman Analyzer with Bio4C PAT Raman Software, which currently enables upstream scientists to deliver harvest material with more highly controlled quality attributes that should, in turn, reduce downstream purification challenges.

“Eventually this technology should be useful for streamlining future adoption of Raman and other spectroscopic techniques for monitoring critical quality attributes such as concentration, aggregates, and formulation composition,” Verlenden believes.
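
To make the chemometric step concrete, the following minimal sketch builds a partial least squares (PLS) calibration that maps spectra to an analyte concentration, which is the general pattern behind Raman-based monitoring of attributes such as concentration. The spectra and reference values here are synthetic placeholders, not data from the Procellics system or any real instrument.

```python
# Hedged sketch of a chemometric calibration: a PLS model mapping spectra to
# an analyte concentration. All spectra and reference values are synthetic.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)

# Synthetic calibration set: 100 spectra x 500 channels, with a
# concentration-dependent peak plus noise standing in for real Raman data.
concentration = rng.uniform(0.5, 10.0, size=100)          # e.g., g/L
channels = np.linspace(0, 1, 500)
peak = np.exp(-((channels - 0.4) ** 2) / 0.001)
spectra = concentration[:, None] * peak + rng.normal(0, 0.05, size=(100, 500))

X_train, X_test, y_train, y_test = train_test_split(
    spectra, concentration, random_state=0
)
pls = PLSRegression(n_components=3)
pls.fit(X_train, y_train)

print("Calibration R^2 on held-out spectra:", round(pls.score(X_test, y_test), 3))
# In a feedback-control setting, pls.predict(new_spectrum) would supply the
# concentration estimate passed on to the controller.
```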

More advances on the way

Other technologies are under development with the goal of moving analytics from offline to at-line and online and enabling the monitoring of not just critical process parameters but also critical quality attributes.

“Process control based on critical quality attributes will provide additional degrees of freedom that are not possible today,” observes Verlenden. “For example, if aggregate levels could be monitored in real time, an excursion early in the process that today would require rework or a deviation could be corrected downstream within identified parameters to meet process purity goals. This type of approach would represent a fundamental change in how we develop and manufacture biologics,” he adds.
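
A toy illustration of the control logic Verlenden outlines is sketched below: an early in-line aggregate reading adjusts a downstream polishing setpoint, but only within a pre-identified operating range. The parameter names, numbers, and proportional rule are hypothetical assumptions, not a validated control strategy.

```python
# Hypothetical illustration of correcting an early aggregate excursion
# downstream, within identified parameters. All values are assumptions.

VALIDATED_LOAD_RANGE = (100.0, 250.0)   # g protein per L resin, assumed design space
NOMINAL_LOAD = 200.0
TARGET_AGGREGATE_PCT = 1.0

def polishing_load_setpoint(measured_aggregate_pct: float) -> float:
    """Reduce polishing column loading when the upstream aggregate level runs high."""
    # Simple proportional adjustment; in practice the relationship would come
    # from process characterization / design-of-experiment models.
    excess = max(0.0, measured_aggregate_pct - TARGET_AGGREGATE_PCT)
    proposed = NOMINAL_LOAD * (1.0 - 0.15 * excess)
    low, high = VALIDATED_LOAD_RANGE
    return min(max(proposed, low), high)   # never leave the identified parameter range

print(polishing_load_setpoint(1.8))  # e.g., load reduced to offset the excursion
```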

Agilent is focused on the development of online-LC solutions because this technology offers significant versatility, according to Vollmer. “Online LC is a very promising technology since it has the power to analyze multiple process- and product-related attributes. With LC, it is possible to measure a multitude of parameters and get high-quality data from the analysis. LC is also a technology widely used and well-known in the pharma industry, and it can be connected to a variety of different detectors that provide different angles of insight,” he explains.

The key to successful online LC, Vollmer stresses, is to develop analytical instrumentation that seamlessly plugs into the overall process software environment.
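
One way such a plug-in might look in practice is sketched below, assuming the process software exposes a simple HTTP interface for in-process results; the endpoint, payload fields, and export format are hypothetical and do not represent Agilent's or any other vendor's actual integration.

```python
# Sketch of pushing an online-LC result into a process data layer, under the
# assumption of an HTTP endpoint. URL, payload fields, and the JSON export
# format read by parse_lc_export() are all hypothetical.
import json
from datetime import datetime, timezone

import requests

HISTORIAN_URL = "https://process-historian.example.local/api/cqa"  # assumed endpoint

def parse_lc_export(path: str) -> dict:
    """Read one online-LC result (assumed to be exported as a small JSON file)."""
    with open(path) as f:
        return json.load(f)  # e.g., {"monomer_pct": 98.7, "aggregate_pct": 1.1}

def publish_lc_result(path: str, unit_operation: str) -> None:
    """Push the latest chromatographic result to the process software environment."""
    payload = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "unit_operation": unit_operation,
        "results": parse_lc_export(path),
    }
    resp = requests.post(HISTORIAN_URL, json=payload, timeout=10)
    resp.raise_for_status()  # surface integration failures immediately

# usage: publish_lc_result("lc_run_0421.json", unit_operation="polishing")
```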

Nirrin Technologies, meanwhile, utilizes the strengths of NIR, including rapid, reliable results with a very high dynamic range, by combining new sensor deployment designs with novel laser technology, contends Hartmann.

Hoping for even more

While many of these developments are still in the early stages, scientists involved in the development of analytical solutions for downstream processing continue to set their sights well beyond what might be possible in the near term.

For instance, Vollmer would like to see the introduction of online LC-mass spectrometry (MS) technologies for downstream processing applications. “Online LC/MS would provide an additional layer of insight into the process,” he says. First, however, MS systems must be simplified and made more user-friendly so that operators working on downstream processing lines, who may not be analytical experts, can operate these platforms with ease.

Citation

When referring to this article, please cite it as C. Challener, “Analytics Advances for Optimizing Downstream Processes,” Pharmaceutical Technology 45 (12) 30–32 (2021).
