Applying Automation Hardware and Software to Create Efficiency in Pharmaceutical Analysis Using Enhanced and Traditional Techniques for Drug Characterization


Note: symposium cancelled by the organizer.

Thursday, May 5, 2022, 1:00 pm EDT
Organizer: Stacey Marden, AstraZeneca

This symposium will focus on new strategies for laboratories to reduce costs and keep the lab running efficiently, with presentations focused on automation, non-destructive techniques, and innovative designs that adapt to the ever-changing analytical landscape. Particular focus will be on speeding up analysis in toxicology laboratories, explosives analysis, and using automation to reduce the potential for human error.


Presentation 1
Enhancing High Throughput Analysis Using Automation and Machine Learning
Christopher Welch, ICASE (Indiana Consortium for Analytical Science and Engineering)
High throughput analysis (HTA) is an increasingly important component of automated workflows in the pharmaceutical industry and in other areas of research where high throughput experimentation is carried out. In recent years there has been growing interest in applying artificial intelligence and machine learning to enhance and enable HTA workflows. In this presentation, we describe several recent projects that provide a glimpse into the new opportunities coming from this union of automation, analytical chemistry, and data science, including the development of new approaches for improved chiral chromatographic method development and chemical structure elucidation via mass spectrometry. Recent work in this area from the Center for Bioanalytic Metrology (CBM), an NSF-funded Industry-University Cooperative Research Center, will be highlighted.
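The abstract does not detail the underlying methods, but a minimal sketch of the kind of machine-learning-assisted chiral method screening it describes might look like the following. The descriptors, column labels, and synthetic training data are illustrative assumptions, not the presenter's actual workflow.

```python
# Hypothetical sketch: predicting which chiral stationary phase is likely to
# resolve a new analyte from simple molecular descriptors, so the most
# promising column screens run first. All data here are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Synthetic training set: rows = analytes, columns = descriptors
# (e.g., molecular weight, logP, aromatic ring count, H-bond donors).
X = rng.normal(size=(200, 4))
# Labels = column that historically gave baseline resolution (assumed).
columns = np.array(["AD-H", "OD-H", "AS-H", "OJ-H"])
y = columns[rng.integers(0, 4, size=200)]

model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X, y)

# Rank candidate columns for a new analyte by predicted probability.
new_analyte = rng.normal(size=(1, 4))
ranked = sorted(zip(model.classes_, model.predict_proba(new_analyte)[0]),
                key=lambda pair: -pair[1])
for column, prob in ranked:
    print(f"{column}: {prob:.2f}")
```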


Presentation 2
Accelerating the Development of Drug Formulation Design with Modern High Throughput Experimentation Analytical Techniques
Matthew Bahr, GlaxoSmithKline
Advances in high throughput automation – in both platforms and analytical instrumentation – have driven growth in drug discovery and product development experimentation. Initially, automation platforms limited their focus to aqueous and organic solubility studies. As capabilities have improved, additional functionality has been introduced, providing more opportunities to characterize drug molecules from early in the candidate selection phase through product development. The scientists at GSK’s high throughput experimentation (HTE) lab in Collegeville, PA, have deployed a variety of automation equipment to drive their investigations. These systems include Unchained Labs CM3 robotic platforms, Mettler Toledo Quantos and Chemspeed powder dispensing platforms, and Hamilton and Andrew liquid dispensing platforms. As focus has intensified on preparing long-acting injectable (LAI) drug products, the HTE lab has developed unique protocols for determining partition coefficients, viscosity measurements, version and form analysis, formulation stability, and excipient compatibility. Because these data can be produced with small amounts of precious drug substance, HTE enables critical-path decisions to be made earlier in the process, with benefits that include bringing new molecules to the patient sooner and at lower cost. This presentation will focus on several HT experimental designs and how they are applied to LAI formulations. Specifically, we will discuss the measurement of distribution coefficients, viscosity, and design of experiments for LAI pre-formulation studies. The results of this work inform final formulation designs with appropriate excipient concentrations, which ultimately produce our clinical trial materials.
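As background for the distribution coefficient screens mentioned above, the underlying shake-flask calculation typically reduces to a log ratio of phase concentrations. The function and values below are an illustrative sketch under that assumption, not GSK's published protocol.

```python
# Minimal sketch of the shake-flask logD calculation behind a distribution
# coefficient screen. Concentration values are illustrative assumptions.
import math

def log_d(conc_organic_uM: float, conc_aqueous_uM: float) -> float:
    """log10 ratio of analyte concentration in the organic phase
    (e.g., octanol) to the buffered aqueous phase at a fixed pH."""
    if conc_organic_uM <= 0 or conc_aqueous_uM <= 0:
        raise ValueError("Concentrations must be positive.")
    return math.log10(conc_organic_uM / conc_aqueous_uM)

# Example: post-partition LC peak areas converted to concentrations.
print(f"logD(7.4) = {log_d(85.0, 15.0):.2f}")  # ~0.75
```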


Presentation 3
Design, Execution and Analysis of High Throughput Excipient Compatibility Studies: Assessing Chemical and Physical Stability Risks to Accelerate Drug Product Development
Timothy Rhodes, Merck
At any point in time, there is more we would like to accomplish than there is time to do it. This is certainly the case in pharmaceutical development. We’ve had a number of successes in applying automation to disrupt this norm, but it doesn’t really change the dilemma; it just moves the bar. The development and application of automation to sample preparation has lowered the barrier to generating a wider array of samples, shifting the bottleneck toward analysis. The subsequent application of automation to higher throughput analytical instruments and their associated methods has moved the bar yet again. The present barrier is rapid, automated analysis of large data sets, oftentimes multivariate in nature. We are making headway in chipping away at this challenge. The final step in this exercise is translating these outcomes into informed decision making downstream in product development and using them to inform rational study design for the next program. In this work, we’ll explore these concepts through the design, execution, and analysis of high throughput excipient compatibility studies that investigate both chemical and physical stability risks in early product development. Study design considerations include not only empirically differentiating between low- and high-risk combinations but also correlating observations with material and chemical properties that can be used to infer potential mechanisms. Execution considerations include minimizing material needs, streamlining sample preparation, method selection, and analytical data acquisition. Analysis considerations include organizing, parsing, and pre-processing data; incorporating models to rank order risks; and providing data exploration and visualization tools that create an interactive environment to gain insights and surface deeper questions from the data.
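To make the rank-ordering analysis step concrete, a minimal sketch is shown below. The excipient names, degradant values, stress condition, and risk thresholds are hypothetical assumptions, not values from the study described.

```python
# Hypothetical sketch of the analysis stage: collect total degradant growth
# per API-excipient combination from a stressed HT study and rank-order
# compatibility risk. Columns, values, and thresholds are illustrative.
import pandas as pd

results = pd.DataFrame({
    "excipient":          ["lactose", "MgSt", "MCC", "povidone"],
    "pct_degradants_t0":  [0.05, 0.04, 0.05, 0.06],
    "pct_degradants_4wk": [0.92, 0.18, 0.07, 1.45],  # e.g., 40 C / 75% RH
})

results["growth"] = (results["pct_degradants_4wk"]
                     - results["pct_degradants_t0"])
# Simple empirical banding into low/medium/high risk (assumed cutoffs).
results["risk"] = pd.cut(results["growth"],
                         bins=[-1, 0.2, 0.5, 100],
                         labels=["low", "medium", "high"])
print(results.sort_values("growth", ascending=False))
```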


Presentation 4
Finding Signal in the Noise: Visualizing Big Data to Design Better Drugs
Dennis Leung, Genentech
The discovery and development of new drugs has become increasingly challenging. The number of new drug candidates that fail in clinical development has increased, and the number of new drugs reaching the market has decreased in recent years. One factor contributing to the high attrition rate is unexpected behavior in the clinic, for example unforeseen toxicity or lack of efficacy. This has been further compounded by increasingly challenging new disease targets occupying historically “undruggable” space. The interaction between drug molecules and these undruggable targets can be difficult to predict and optimize. As a result, it is critical to understand the properties of potential drug molecules as well as their behavior toward these targets. In parallel with these challenges, there has been a concomitant increase in the development and application of new high throughput tools for rapidly assessing and characterizing large numbers of drug molecules. While this can significantly enhance exploration of the drug molecule space, it can also result in an overwhelming amount of data that must be carefully parsed and interpreted. While the quantity of data can be useful, it is crucial to find the quality “signal in the noise” and to rapidly extract meaningful conclusions from Big Data in order to better influence drug discovery and development. This presentation will focus on identifying key correlations between chemical properties obtained from high throughput approaches and meaningful study endpoints (e.g., bioperformance), resulting in improved strategies for optimizing drug molecules.
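One hedged illustration of the correlation-finding idea described above: rank-correlating high throughput property measurements against a bioperformance endpoint. The property names, endpoint model, and data below are synthetic assumptions for demonstration only.

```python
# Illustrative "signal in the noise" sketch: which HT-measured property
# actually tracks an in vivo endpoint? All data here are synthetic.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
n = 60  # hypothetical drug candidates

props = pd.DataFrame({
    "solubility_uM": rng.lognormal(3, 1, n),
    "logD": rng.normal(2, 1, n),
    "permeability": rng.lognormal(0, 0.5, n),
})
# Synthetic endpoint loosely driven by solubility and permeability.
endpoint = (0.6 * np.log(props["solubility_uM"])
            + 0.4 * np.log(props["permeability"])
            + rng.normal(0, 0.5, n))

# Spearman rank correlation is robust to skewed HT distributions.
corr = props.corrwith(endpoint, method="spearman").sort_values(ascending=False)
print(corr)
```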


Presentation 5
Solution Phase Stress Testing of Drug Candidates and Solid Dosage Form Drug Products – Best Practices
Stacey Marden, AstraZeneca
Forced degradation of pharmaceuticals is a critical component of drug development, used to understand a compound’s innate stability and to develop stability-indicating methods. This presentation will examine best practices to aid in selecting candidates that can be developed as stable dosage forms, including a perspective on automating these assays. The presentation will also touch on stress testing during formulation development. We will present findings from a recent publication in which members from nine pharmaceutical companies and the Brazilian regulatory agency ANVISA collected stressed-stability data from 62 solid drug products to determine which stress conditions produced degradation products also observed in formal stability studies (1). The presentation will conclude with a perspective on the future of stress testing in the pharmaceutical industry, including the use of automation and artificial intelligence.
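For context on how stressed-stability data are commonly related to formal storage conditions, the sketch below applies first-order kinetics with Arrhenius temperature scaling. The rate constant and activation energy are illustrative assumptions and do not come from the cited study (1).

```python
# Hedged sketch of a common forced-degradation calculation: first-order loss
# under a stress condition, extrapolated to milder storage temperature via
# the Arrhenius equation. Numerical values below are assumed, not measured.
import math

R = 8.314          # gas constant, J/(mol*K)
EA = 100_000.0     # assumed activation energy, J/mol
K_STRESS = 0.05    # assumed first-order rate constant at 70 C, 1/day

def k_at(temp_c: float, k_ref: float, temp_ref_c: float,
         ea: float = EA) -> float:
    """Arrhenius scaling of a first-order rate constant to a new temperature."""
    t_ref, t_new = temp_ref_c + 273.15, temp_c + 273.15
    return k_ref * math.exp(-ea / R * (1 / t_new - 1 / t_ref))

k_25 = k_at(25.0, K_STRESS, 70.0)
# Percent remaining after 24 months (~730 days) under first-order kinetics.
remaining = 100 * math.exp(-k_25 * 730)
print(f"k(25 C) = {k_25:.2e} /day; {remaining:.1f}% remaining at 24 months")
```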

 