The use of smart tools in early drug discovery can have an impact on downstream phases of drug development.
Digitalization, artificial intelligence (AI), and machine learning (ML) are among the major tools the bio/pharmaceutical industry is using today to drive significant gains in optimization (e.g., reducing costs, shortening timelines, streamlining processes, and improving product quality). Importantly, these tools are being used early in the drug development cycle, from R&D/drug discovery through clinical-phase development, and then onward into commercial manufacture. A closer look at the implementation of these tools can shed light on the importance of smart drug development in the early discovery phase. For instance, how does the use of “smart” tools affect the development process going into the clinical phases?
The existing drug development paradigm is based on moving through several stages, sequentially, says Jo Varshney, PhD, founder and CEO of VeriSIM Life, a US-based company specializing in the use of advanced AI and ML techniques for drug discovery and development. Drug development moves from the identification of a disease or condition target to validating a safe and effective drug candidate in human clinical trials, she notes.
“This paradigm suffers from tremendous waste and cost. Passing from one stage-gate to the next is a siloed process where data [are] often not shared and poor proxies for human biology—especially animals—define success,” Varshney remarks.
Smart drug development re-imagines the traditional process by enabling design and validation early in discovery through better models of human biology and more comprehensive data. This allows the process to be “parallelized,” so that steps such as dosing or formulation can be considered in conjunction with hit-to-lead selection, Varshney notes. “Smart drug development quickly identifies dead-end outcomes so you can course correct before investing in heavy experimentation, which saves time and costs, getting better drugs to patients faster,” she emphasizes.
“‘Smart’ drug development has always been with us,” says Grant Wishart, PhD, senior director, Small Molecule Drug Discovery and Logica Lead, Charles River Laboratories. Wishart points out that the bio/pharma industry has generated knowledge from past experiences and developed and leveraged new technologies to bring new medicines and safer medicines to patients for increasingly complex diseases.
“From rational approaches in the 1960s pioneered by the Scottish pharmacologist, Sir James Black, to computational structure-based design emerging in the 1980s—but perhaps not fulfilling its potential until the early 2000s—there have been many ‘smart’ enhancements to drug discovery and development,” Wishart explains. “Each of these has influenced and advanced what we do to a greater or lesser extent.”
Wishart also points out that organizations continuously strive for “smarter” drug development to reduce discovery timelines and increase certainty in clinical progression. “Today we are at the intersection of advances in data generation, automation, and advanced computation, which come together to offer potential step changes in how we discover and develop drugs of the future,” he remarks.
Digitalization is a major trend in the bio/pharma industry today. Advances in digitalization and other tools have allowed the industry to organize and connect data sets as well as set up more efficient workflows, says Wishart. Smart tools such as digitalization have also allowed the industry to connect disparate data sets from data management systems and instruments across the drug development continuum.
According to Wishart, this connectivity helps the industry manage information within individual programs. It also helps in the aggregation of data to progress ML/AI modeling toward better prediction and assistive tools, he says.
“By doing this, we also have the opportunity to reduce the overall timelines for researchers to match data sets shared from various partners across the ecosystem, including academia, CROs [contract research organizations], and CDMOs [contract development and manufacturing organizations],” Wishart states. “Today, scientists spend a tremendous amount of time doing this, and we envision a world where there is strong interoperability of data between organizations across all phases of drug discovery.”
Meanwhile, Varshney emphasizes that smart drug development is dependent on digitalization. She explains that in-silico-based approaches, whether labeled “Model Informed Drug Development” or “digital twin,” bridge the gap between human biological processes and our understanding of those processes by leveraging tremendous amounts of data from existing scientific resources and custom-developed assays.
“Digitalization enables the conversion of analog data for the purposes of complex analysis and machine learning. This opens the door to a high-powered computational understanding of biological processes and predictions about how drugs will interact with them,” Varshney states. She further adds that digitalization also reduces the “black box” of animal testing, in which experiments often succeed only to fail inexplicably in humans.
Tools such as AI, automation, ML, and digitalization are already being utilized for bio/pharmaceutical manufacturing processes, but how are they being leveraged in the R&D and drug discovery/drug screening phases?
According to Varshney, one use for AI and ML is generating compound hits for a given target, a process that typically screens billions of potential molecules for structures that can bind to the target. Another use she points out is the prediction of various characteristics of one molecule or many (e.g., thousands of) molecules, including mechanism of action in the body, toxicity, and more—across different doses, routes of administration, and formulations. “These predictions can then be used to rank compounds for potential success and failure well before clinical trials, reducing the financial risk associated with developing a drug candidate,” she adds.
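To make that concrete, the following is a minimal, illustrative sketch of such a ranking workflow, assuming RDKit and scikit-learn are available: a simple model trained on a handful of hypothetical measured potencies scores and ranks a small screening library. The molecules, property values, and model choice are placeholders for illustration, not VeriSIM Life’s methodology.

```python
# Illustrative only: rank candidate molecules by a predicted property.
# Molecules, measured values, and model choice are hypothetical placeholders.
import numpy as np
from rdkit import Chem, DataStructs
from rdkit.Chem import AllChem
from sklearn.ensemble import RandomForestRegressor

def fingerprint(smiles: str, n_bits: int = 2048) -> np.ndarray:
    """Morgan (circular) fingerprint as a numpy array."""
    mol = Chem.MolFromSmiles(smiles)
    fp = AllChem.GetMorganFingerprintAsBitVect(mol, radius=2, nBits=n_bits)
    arr = np.zeros((n_bits,))
    DataStructs.ConvertToNumpyArray(fp, arr)
    return arr

# Hypothetical training set: SMILES strings with measured potency (pIC50).
train_smiles = ["CCO", "c1ccccc1O", "CC(=O)Oc1ccccc1C(=O)O", "CCN(CC)CC"]
train_pic50 = [4.2, 5.1, 6.3, 4.8]
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(np.stack([fingerprint(s) for s in train_smiles]), train_pic50)

# Hypothetical screening library: predict potency and rank best-first.
library = ["CCOC(=O)c1ccccc1", "c1ccc2[nH]ccc2c1", "CC(C)Cc1ccc(cc1)C(C)C(=O)O"]
scores = model.predict(np.stack([fingerprint(s) for s in library]))
for smi, score in sorted(zip(library, scores), key=lambda t: t[1], reverse=True):
    print(f"{smi}\tpredicted pIC50 ~ {score:.2f}")
```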
AI and ML techniques can also be used to understand how drug candidate efficacy may vary in different patient populations, Varshney continues. These techniques can be used to inform trial design, dosing, and combination therapies, among other things. “Generating predictive patient stratification analysis in this way can improve clinical success and ultimately move us closer to precision medicine care,” she states.
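A rough sketch of what predictive patient stratification can look like computationally is shown below, assuming scikit-learn is available: patients described by a few features are clustered into subgroups that could then inform trial design or dosing. The features, values, and cluster count are invented for illustration and are not VeriSIM Life’s approach.

```python
# Illustrative only: group patients into subpopulations from simple features
# (all values and feature choices are hypothetical).
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

# Columns: age, body weight (kg), baseline biomarker level, renal clearance proxy.
patients = np.array([
    [34, 70, 1.2, 95],
    [61, 82, 3.4, 60],
    [47, 65, 1.1, 88],
    [72, 90, 3.8, 45],
    [29, 58, 0.9, 99],
    [66, 77, 3.1, 52],
])

X = StandardScaler().fit_transform(patients)   # put features on a common scale
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

for cluster in np.unique(labels):
    members = patients[labels == cluster]
    print(f"Subgroup {cluster}: n={len(members)}, mean age={members[:, 0].mean():.0f}, "
          f"mean biomarker={members[:, 2].mean():.1f}")
```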
Wishart notes that the industry is seeing huge interest in applying AI/ML and automation to all steps of the drug discovery process and across all modalities. In target discovery, for instance, huge volumes of data are being linked to select disease-modifying targets. Meanwhile, preclinical and clinical data are being used to select the patient populations for which a drug is beneficial. The area of greatest interest to Wishart, however, is early discovery for small molecules, where generative models are used to design compounds whose scores are optimized against predictive models for parameters of interest, such as on-target potency; early absorption, distribution, metabolism, and excretion (ADME); and selectivity. Using generative and predictive models in this way allows better selection of compounds for synthesis and testing.
“Success is highly dependent on the predictive utility of such models. At a minimum, models must be able to remove the compounds with the lowest probability of meeting the desired profile, although expectations are typically much higher where we are striving for quantitative predictions of in-vitro data and extending our modeling efforts to pharmacokinetics, efficacy, and safety,” Wishart says.
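As a toy illustration of combining such predictive models into a single profile score and removing the weakest candidates, consider the sketch below; the parameter names, weights, and cutoff are assumptions made for illustration rather than any organization’s actual scoring scheme.

```python
# Illustrative only: combine model predictions for several parameters into a
# profile score and discard the least promising compounds. All numbers are
# hypothetical stand-ins for outputs of trained predictive models.
from dataclasses import dataclass

@dataclass
class Prediction:
    compound_id: str
    potency_pic50: float      # predicted on-target potency
    solubility_logs: float    # predicted aqueous solubility (logS)
    clearance_prob_ok: float  # probability that metabolic clearance is acceptable
    selectivity_fold: float   # predicted fold selectivity vs. an off-target

def profile_score(p: Prediction) -> float:
    """Weighted sum of normalized criteria; higher means a better profile match."""
    return (0.4 * min(p.potency_pic50 / 8.0, 1.0)
            + 0.2 * min(max((p.solubility_logs + 6.0) / 6.0, 0.0), 1.0)
            + 0.2 * p.clearance_prob_ok
            + 0.2 * min(p.selectivity_fold / 100.0, 1.0))

candidates = [
    Prediction("CPD-001", 7.9, -3.5, 0.85, 120.0),
    Prediction("CPD-002", 6.1, -5.9, 0.40, 15.0),
    Prediction("CPD-003", 8.4, -4.2, 0.70, 60.0),
]

# Keep only candidates above a (hypothetical) minimum score, ranked best-first.
shortlist = sorted((c for c in candidates if profile_score(c) >= 0.6),
                   key=profile_score, reverse=True)
for c in shortlist:
    print(c.compound_id, round(profile_score(c), 2))
```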
For example, Charles River, with its partner, Valo Health, created an end-to-end AI-driven approach to drug discovery—in this case small-molecule discovery—called Logica, Wishart says. This approach combines computational AI and experimental approaches to hit identification and optimization. “Ligand-based computational models are used to trawl vast areas of chemical space for potential hit series. If there [are] no literature data to develop the computational models for the target, then Logica simply leverages experimental high-throughput screening (HTS) and DNA-encoded library (DEL) screening to generate that initial data and to fuel hit finding and model building,” he explains.
In this scenario, hit compounds are optimized to advanceable leads and, ultimately, to clinical candidates through AI/ML-guided approaches to de-risk candidate compounds and ensure that only the best candidates progress. “The Logica concept results in fewer design cycles, fewer compounds synthesized and tested, and enhanced decision-making on progression,” says Wishart.
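The general idea behind a ligand-based screen can be sketched, assuming RDKit is available, as a fingerprint-similarity search against known actives; this is a generic illustration of the technique, not the Logica implementation, and the molecules and similarity cutoff are placeholders.

```python
# Illustrative only: generic ligand-based screen by fingerprint similarity to
# known actives (not the Logica implementation; molecules are placeholders).
from rdkit import Chem, DataStructs
from rdkit.Chem import AllChem

def fp(smiles: str):
    return AllChem.GetMorganFingerprintAsBitVect(Chem.MolFromSmiles(smiles), 2, nBits=2048)

known_actives = [fp(s) for s in ["CC(=O)Oc1ccccc1C(=O)O", "c1ccc2[nH]ccc2c1"]]
library = ["CC(=O)Oc1ccccc1", "c1ccc2occc2c1", "CCCCCCCC", "O=C(O)c1ccccc1O"]

hits = []
for smi in library:
    candidate = fp(smi)
    best = max(DataStructs.TanimotoSimilarity(candidate, ref) for ref in known_actives)
    if best >= 0.3:                      # hypothetical similarity cutoff
        hits.append((smi, round(best, 2)))

print(sorted(hits, key=lambda t: t[1], reverse=True))
```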
“The opportunity to integrate the ‘design’ and ‘analyze’ components of the design-make-test-analyze (DMTA) cycle with advances in synthetic route prediction and automation in synthesis seems tantalizingly close,” Wishart continues. He notes that one can envision a near-future state where drug discovery experts drive parts of the drug discovery trajectory through almost closed-loop systems comprising AI/ML-enhanced designs and automated synthesis, purification, and testing.
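Schematically, such an almost closed-loop DMTA workflow can be thought of as the skeleton below, where the stage functions are placeholders standing in for generative design, automated synthesis and purification, assays, and model retraining.

```python
# Illustrative skeleton of a design-make-test-analyze (DMTA) loop. The stage
# functions are placeholders, not real design, synthesis, or assay systems.
import random

random.seed(0)

def design(model_bias: float, n: int = 5):
    """'Design': propose candidate IDs with a score influenced by the current model."""
    return [(f"cand-{random.randint(1000, 9999)}", model_bias + random.random()) for _ in range(n)]

def make_and_test(candidate_id: str) -> float:
    """'Make' + 'test': stand-in for synthesis and assay; returns a measured activity."""
    return random.gauss(mu=0.5, sigma=0.2)

def analyze(results) -> float:
    """'Analyze': update the model; here, simply the mean measured activity."""
    return sum(value for _, value in results) / len(results)

model_bias = 0.0
for cycle in range(3):                                   # three DMTA iterations
    proposals = design(model_bias)
    top = sorted(proposals, key=lambda t: t[1], reverse=True)[:2]   # pick best designs
    results = [(cid, make_and_test(cid)) for cid, _ in top]
    model_bias = analyze(results)                        # feed measurements back into design
    print(f"cycle {cycle}: tested {[c for c, _ in results]}, model bias -> {model_bias:.2f}")
```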
“The DMTA cycle for small-molecule discovery is, of course, a very specific area of drug discovery, and there are also many technology pioneers applying similar principles in the design, optimization, and development of other modalities, including antibodies and peptides,” Wishart states.
In enhancing the early R&D and drug discovery stages, the industry is seeking faster ways to generate data, gain insights, and make decisions so that therapies reach patients as fast as possible, emphasizes Wishart. One way to gain speed is through the appropriate management of preclinical data. “The better we manage our preclinical data within organizations and across organizations, the higher probability we will have to make this [speed-to-patient] happen,” Wishart says. “By designing digital systems that simplify the collection, organization, visualization, and sharing of [these] data, we contribute to the progress toward smarter drug discovery. This starts with strong digital architecture, data standards, and connectivity of tool sets.”
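One small, concrete way such data standards and connectivity show up in practice is agreement on a shared record format for assay results, so that partners can exchange them without manual matching. The fields below are a hypothetical minimal schema, not an industry standard.

```python
# Illustrative only: a hypothetical shared record format for exchanging assay
# results between partner organizations (fields are assumptions, not a standard).
from dataclasses import dataclass, asdict
import json

@dataclass
class AssayRecord:
    compound_id: str        # registry identifier agreed across partners
    assay_id: str           # e.g., a target-binding or ADME assay code
    value: float            # measured result
    unit: str               # units for the value
    lab: str                # originating organization
    protocol_version: str   # allows results to be compared like-for-like

record = AssayRecord("CPD-001", "KINASE-X-IC50", 42.0, "nM", "CRO-A", "v2.1")
print(json.dumps(asdict(record), indent=2))   # serialized for transfer between systems
```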
It helps that, today, various organizations are working together to connect the pieces and leverage the advances in these tool sets, including AI, ML, and automation. “We see ourselves as a key player in the ecosystem of these technologies,” Wishart states.
“Hybrid AI and model-informed predictions of drug behavior inform the design of preclinical testing to be both prescriptive of testing and orthogonally confirming of testing,” adds Varshney. For example, she explains, AI-based predictions can guide which rat experiments to conduct and how to conduct them; they can also explain discrepancies and variabilities in observations, which can elucidate important disease pathways.
“Smart drug development is all about accelerating therapies to patients while reducing unnecessary research and development,” Varshney says. To that end, introducing techniques such as predictive analytics early in drug candidate discovery can eliminate dead-end compounds much more quickly. Narrowing the field with such techniques, before performing animal model tests that often result in false positives, can save millions of dollars in testing and assays, she asserts. Meanwhile, closing the early “translational gap” can increase the number of successful drug candidates entering clinical trials.
“Drug discovery is extremely difficult and is often a high-risk, high-reward industry,” Wishart states. “Costs for drug development are high, and lengthy timelines are accompanied by high failure rates throughout the process. Therefore, elevating the probability of success through enhanced knowledge, technology, and automation is of the utmost importance,” he says.
While the clinical phases are known to be the most costly and lengthy part of drug development, the actions taken in the early discovery phase can and will have a significant impact, Wishart stresses. Selecting the right target with the greatest certainty of modulating the disease in humans is imperative, and reducing timelines in the hit-finding and optimization phases can benefit both the patient (e.g., faster delivery of the therapeutic) and the organization (e.g., additional patent life).
There is also potential for de-risking preclinical candidates during the optimization phases through the development of advanced safety and pharmacokinetics prediction models, Wishart adds. “Taken together, it is clear that ‘smarter’ drug discovery in the early stages can have a major impact on downstream success, with the ultimate goal being to develop medicines for unmet medical needs and to get these medicines to patients as quickly and as safely as possible,” he says.
Among the misconceptions that persist around the “smart” technology evolution is the assertion that AI is unreliable. Varshney observes that many scientists believe AI and ML technology can never replace traditional scientific methods and, therefore, cannot be trusted to provide value in drug discovery and development. “Much of this mistrust is due to the [lack of] explainability found with many systems. But explainability is now being addressed by many algorithms and within individual technology platforms,” she explains.
For example, Varshney points to a hybrid AI technology utilized by VeriSIM Life (the company’s Translational Index technology), which scores the potential of a compound to succeed in the clinic. The technology also provides an itemized breakdown of the score’s components, which gives insights on what is working, what needs improvement, and recommendations on how things can be done differently for better outcomes. Insights generated may include weighted measures of parameters such as the drug’s properties, animal study outcomes, and first-in-human doses.
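To make the idea of an itemized, weighted score concrete, here is a toy breakdown in the spirit of what Varshney describes; the components, weights, and values are invented and do not represent VeriSIM Life’s Translational Index algorithm.

```python
# Illustrative only: a toy weighted score with an itemized component breakdown
# (components, weights, and values are hypothetical, not the Translational Index).
components = {
    # name: (weight, normalized component score in [0, 1])
    "drug_properties":      (0.30, 0.82),
    "animal_study_signal":  (0.40, 0.64),
    "first_in_human_dose":  (0.30, 0.71),
}

total = sum(weight * score for weight, score in components.values())
print(f"overall score: {total:.2f}")
for name, (weight, score) in components.items():
    print(f"  {name}: contributes {weight * score:.2f} (weight {weight}, component score {score})")
```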
Meanwhile, Wishart notes that there are several points to consider around the use of these tools and technologies. For one, there is unlikely to be a magic “smart” solution that will elegantly transform drug discovery into a simple process. Drug discovery is much more complex than that, and, as the industry has seen in the past, there will continue to be evolution and progress in the way new drugs are discovered and developed, largely driven by the emergence of new and exciting technologies.
“Perhaps the biggest challenges are twofold,” Wishart expounds. “First of all, managing our own expectations, both positive and negative, to truly understand where new developments will have the biggest impact. Secondly, understanding what needs to change to enable that impact—this can encompass co-technologies or deeper cultural changes necessary to embed such developments.” For instance, rethinking the role of AI/ML in early drug discovery, rather than simply using those tools to bolster current workflows, can create workflows that compete with traditional, tried-and-tested approaches. This conflict can lead to a challenging environment in which the impact of the AI/ML approaches is not realized, Wishart cautions.
Feliza Mirasol is the science editor for Pharmaceutical Technology.
When referring to this article, please cite it as Mirasol, F. Using Smart Tools for Smart Development. Pharmaceutical Technology 2023, 47 (5), 16–20.