Design Space Characterization and Risk Assessment Through Mechanistic Modeling

Pharmaceutical Technology, November 2, 2018
Volume 42
Issue 11
Pages: 46–49

Mechanistic process and product modeling turns data into knowledge that can be extrapolated with known confidence for design space characterization and risk assessment.


Mechanistic modeling draws on well-established methodologies that have ensured quality, safety, and efficiency in other industries for decades. Though newer to the pharmaceutical industry, mechanistic models have seen increased use in R&D in recent years to strengthen the science (physics and chemistry) behind quality-by-design (QbD) practices, thereby increasing R&D efficiency and requiring fewer experiments to characterize a process, independent of the number of critical process parameters. This approach turns data into knowledge that can be extrapolated to other scales and/or operating conditions with known confidence, and it enables the construction of probabilistic design spaces to characterize robustness.

Mechanistic process and product modeling

To date, the majority of use cases for mechanistic models in the pharmaceutical industry either focus on one or a few process operations within drug substance or drug product manufacture, or they aim to quantify or predict product performance through mechanistic in-vitro or in-vivo modeling. Some of these applications include flowsheet modeling, in which several manufacturing operation models are integrated into a single process model to capture interactive effects across unit operations. These flowsheet models typically address either drug substance or drug product manufacture independently, enabling mechanistic-model-based design of experiments, process design, and optimization.

Less common, however, is the integration of these subsystem models in a holistic, systems-based approach to process and product design. As a result of successful applications of mechanistic component models in R&D, this concept of end-to-end process and product modeling is gaining traction at several large pharmaceutical companies, with the aim to reduce the number of iterations between product and process design and accelerate development timelines while maintaining high product quality.

Once a model of an integrated pharmaceutical process is available, it can be used to analyze the process performance under nominal conditions, its ability to cope with “abnormal” conditions and disturbances (e.g., raw material variability), and the effects of changes in the operating policy for, and/or the design of, individual equipment items. From the mathematical point of view, all of these correspond to process simulation calculations where the user specifies all the process inputs, and the model equations are solved to determine all the relevant outputs and, in particular, the key performance indicators (KPIs) of the individual steps and of the overall process. 

These single-point calculations can answer targeted questions about the process design space. Does an operating point yield a quality product, provided that the sources of variability are well controlled? What is the expected value of a quality metric for a given set of operating conditions?
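To make the idea concrete, the following sketch shows what such a single-point calculation looks like in code. It is a toy two-step flowsheet (granulation followed by tableting) written in Python; the equations, parameter names, and numbers are invented for illustration and are not taken from any specific modeling tool.

```python
# A minimal, illustrative "single-point" flowsheet simulation in Python.
# Two toy unit-operation models (granulation, then tableting) are chained;
# all equations, parameter names, and numbers are invented for illustration.

def granulation(binder_ratio, impeller_speed):
    """Toy granulation model: returns a mean granule size (micrometers)."""
    return 150.0 + 400.0 * binder_ratio + 0.05 * impeller_speed

def tableting(granule_size, compaction_force):
    """Toy tableting model: returns tablet tensile strength (MPa)."""
    return 0.002 * compaction_force * (1.0 - granule_size / 2000.0)

def simulate(binder_ratio, impeller_speed, compaction_force):
    """Single-point simulation: fixed inputs in, deterministic KPIs out."""
    size = granulation(binder_ratio, impeller_speed)
    strength = tableting(size, compaction_force)
    return {"granule_size_um": size, "tensile_strength_MPa": strength}

# One operating point, one set of KPI values.
print(simulate(binder_ratio=0.12, impeller_speed=300.0, compaction_force=900.0))
```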

In reality, the operating point can vary, and the sources of common-cause variability are not always negligible. As a result, “single-point” predictions of process KPIs (including critical quality attributes [CQAs]) computed via isolated process simulations are often of limited value. Of more interest may be the probability distributions of these KPIs, as well as the manner in which uncertainty in process KPIs can be attributed to the uncertainty and variability of individual inputs. The key question then becomes: What is the probability of quality throughout the operational space of the process? The same mechanistic model used for single-point calculations can be used to answer these questions through sensitivity and uncertainty analyses.
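As a rough illustration, the sketch below reuses the toy simulate() function from the previous example and propagates assumed input variability through it by Monte Carlo sampling, yielding a KPI distribution rather than a single value; the input distributions are hypothetical.

```python
# Monte Carlo propagation of assumed input variability through the toy
# simulate() function above; the result is a KPI distribution, not a number.
import numpy as np

rng = np.random.default_rng(seed=0)
n = 10_000

# Common-cause variability around a nominal operating point (assumed values).
binder_ratio = rng.normal(0.12, 0.005, n)
impeller_speed = rng.normal(300.0, 10.0, n)
compaction_force = rng.normal(900.0, 30.0, n)

strength = np.array([
    simulate(b, s, f)["tensile_strength_MPa"]
    for b, s, f in zip(binder_ratio, impeller_speed, compaction_force)
])

print("mean tensile strength:", strength.mean())
print("5th-95th percentile range:", np.percentile(strength, [5, 95]))
```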

Risk management through sensitivity analysis

The concept of the design space, defined by the International Council for Harmonization (ICH) Q8(R1) as “the multidimensional combination and interaction of input variables (e.g., material attributes) and process parameters that have been demonstrated to provide assurance of quality” (1), is central to current thinking regarding the degree of flexibility afforded to pharmaceutical process operation within a regulatory framework. In principle, process simulation could be used to determine whether a given operating point falls within the design space. Such a single-point calculation, however, may be unreliable in view of the uncertainties mentioned previously; in reality, one can only determine the probability that a particular point is within the design space. Moreover, establishing the shape and size of the entire design space of any non-trivial process using isolated process simulations is impractical, and one may need to resort to more complex types of calculations.
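Continuing the illustrative Monte Carlo sketch above, the probability that an operating point lies within the design space can be estimated as the fraction of sampled runs whose CQA meets its limits; the specification limits used here are hypothetical.

```python
# The probability that this operating point is "in the design space" can be
# estimated as the fraction of Monte Carlo runs (from the sketch above) whose
# CQA meets hypothetical specification limits.
spec_lo, spec_hi = 1.5, 2.5   # hypothetical tensile-strength limits, MPa

in_spec = (strength >= spec_lo) & (strength <= spec_hi)
print(f"Estimated probability of meeting the CQA limits: {in_spec.mean():.3f}")
```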

The use of calibrated mechanistic models enables a step-change improvement in the risk assessment and mitigation at the heart of QbD. By performing virtual designs of experiments (DoEs) with mechanistic models, one can assess the individual and combined effects of 10–20 factors, as opposed to the three or four factors typically seen in physical DoEs. As shown in Figure 1, a global analysis can be used to quantify and predict the combined effects of uncertainty and variability in process, product, and patient parameters on probability distributions of risk-related CQAs. By globally assessing uncertainty and variability, risk-mitigation strategies can be designed to target and reduce the highest-impact sources of variability.

Figure 1: Global sensitivity and uncertainty analysis methodology wherein a mechanistic model describes a system of interest, and the key performance indicators (KPIs) are quantified as probabilistic distributions based on the multivariate combination of design/operational decisions, sources of common cause variability, and model uncertainty. Courtesy of the authors.
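As a rough sketch of such a virtual DoE, the snippet below enumerates a two-level factorial over a few factors of the toy model introduced earlier; the factor names and levels are illustrative assumptions.

```python
# A virtual two-level factorial DoE evaluated with the toy model; factor
# names and levels are illustrative. With a mechanistic model, the design can
# be extended to many more factors, or replaced by space-filling designs,
# without additional laboratory experiments.
from itertools import product

factors = {
    "binder_ratio":     (0.10, 0.14),
    "impeller_speed":   (250.0, 350.0),
    "compaction_force": (800.0, 1000.0),
}

runs = []
for levels in product(*factors.values()):
    settings = dict(zip(factors.keys(), levels))
    runs.append({**settings, **simulate(**settings)})

for run in runs:   # 2**3 = 8 virtual experiments
    print(run)
```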

Another dimension of complexity is added by the uncertainty that is inherent both in some process inputs (e.g., levels of raw material impurities) and in the underlying processing step models. Formal mathematical techniques can be used to perform elementary effects analysis and identify which of the input factors have a non-negligible effect on at least one of the output responses in order to eliminate unimportant factors from further consideration at an early stage. Uncertainty analysis is next performed to determine the probability distribution of output responses in order to quantify the impact of uncertainty and variability. If the resulting probability distribution of the output response is unacceptable, then a model-based global sensitivity analysis (GSA) can be performed to compute global sensitivity indices of each response with respect to each factor and apportion variability of outputs to that of inputs, thus enabling targeted risk-management actions. 
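One possible way to implement the sensitivity-analysis step, assuming the open-source SALib package and the toy simulate() function from the earlier sketches, is outlined below; it is an illustration of the general workflow rather than a description of any specific commercial tool.

```python
# Sobol global sensitivity analysis of the toy model with the SALib package;
# factor bounds are illustrative. Morris elementary-effects screening follows
# the same sample-then-analyze pattern via SALib's morris modules.
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

problem = {
    "num_vars": 3,
    "names": ["binder_ratio", "impeller_speed", "compaction_force"],
    "bounds": [[0.10, 0.14], [250.0, 350.0], [800.0, 1000.0]],
}

# Sample the factor space, then run the model once per sample.
X = saltelli.sample(problem, 1024)
Y = np.array([simulate(*x)["tensile_strength_MPa"] for x in X])

# First-order (S1) and total (ST) indices apportion output variance to inputs,
# pointing risk mitigation at the highest-impact sources of variability.
Si = sobol.analyze(problem, Y)
for name, s1, st in zip(problem["names"], Si["S1"], Si["ST"]):
    print(f"{name:18s}  S1 = {s1:5.2f}   ST = {st:5.2f}")
```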

 

Early adopters

Though the approach is novel, several early adopters have successfully demonstrated design-space characterization through sensitivity analysis of integrated mechanistic process and product performance models.

Gavin Reynolds (AstraZeneca) performed design-space characterization and risk analysis on a dry granulation and tableting process, as shown in Figure 2 (2,3). In this study, the product CQAs included dissolution behavior and tablet hardness, and the material flowability during processing was considered as an additional design constraint. Through the flowsheet model, the effects of the roller compaction process parameters (roll force and gap width), milling screen size, number of blender revolutions, and tablet compaction force were explored, along with the roller compactor’s side-seal leakage, which is a source of common cause variability. A sensitivity analysis was performed to identify the multivariate design space within which the product quality and manufacturability targets were met. Further, Reynolds evaluated the risk associated with changing to a different lubricant grade, overlaying the resulting design spaces for the two lubricant grades to identify the operational area of lowest risk.

Figure 2: An integrated model in gPROMS FormulatedProducts (Process Systems Enterprise) of a roller compaction and tableting process with in-vitro dissolution testing (top), and design space characterization with respect to a change in grade of excipient (bottom) (3). Figure is courtesy of AstraZeneca.

A second example of early adoption is found in the work presented by Marta Moreno-Benito (Pfizer) et al. (4). Moreno-Benito developed an end-to-end model spanning drug substance manufacture, drug product manufacture, and, finally, in-vitro performance testing. The key objectives of the work were to predict the impact of design decisions on quality attributes of the product and to quantitatively identify the design space around the process, using GSA to analyze the system holistically. Results showed that API crystal size, driven mainly by the API milling process parameters, largely influenced the dissolution behavior despite the many processing steps between milling and dissolution testing.

Future perspectives

While mechanistic modeling has already taken hold in the pharmaceutical industry, the next phase of adoption will be driven toward holistic, integrated approaches that achieve product quality and minimize risk through probabilistic input and output spaces across the drug manufacturing process. The adoption of these tools will rely on building confidence in the mechanistic models and in the tools used for their analysis, along with the development of the skillsets to use them effectively. The early use cases of this methodology are promising and demonstrate a changing mindset in the approaches for design space characterization and risk assessment.

References

  1. ICH, ICH Harmonised Tripartite Guideline: Pharmaceutical Development Q8(R1) (Geneva, Switzerland, November 2008).
  2. E. Gavi and G.K. Reynolds, Computers and Chemical Engineering 22, 130-140 (2014).
  3. G.K. Reynolds, Digital design of drug product: Application of global system analysis to a tablet manufacturing process [Video webinar] (2016) www.psenterprise.com/events/webinars/2016/16-10-digital-design-of-drug-product.
  4. M. Moreno-Benito, et al., Solid drug product and process design using multi-scale interconnected flowsheet modelling and global system analysis, Poster session at AAPS Annual Meeting & Exposition (San Diego, CA, 2017).

Article Details


Citation: 

When referring to this article, please cite it as D. Barrasso and S. Bermingham, "Design Space Characterization and Risk Assessment Through Mechanistic Modeling," Pharmaceutical Technology 42 (11) 2018.

About the authors

Dana Barrasso is senior consultant at Process Systems Enterprise, Inc., 3 Wing Drive, Cedar Knolls, NJ, d.barrasso@psenterprise.com. Sean Bermingham is head of Formulated Products at Process Systems Enterprise, Ltd., 26-28 Hammersmith Grove, Hammersmith, London, UK, s.bermingham@psenterprise.com.
