Significant progress has been made in applying artificial intelligence (AI) and machine learning (ML) to drug development. At a recent panel discussion called “Biopharma AI Scorecard,” held at the BioCentury-BayHelix East-West Summit, experts from industry and venture capital discussed how AI and ML technologies are being used to design therapeutics and enable new drug discovery efforts. The panelists, who included Michelle Chen, PhD, CBO of Insilico Medicine; Tom Chittenden, PhD, CSO and President of R&D of BioAI Health; Aaron Arvey, Director of Machine Learning at Third Rock Ventures; Gevorg Grigoryan, PhD, Cofounder and CTO of Generate Biomedicines; and Kiersten Stead, PhD, Managing Partner of DCVC Bio, also discussed the molecular underpinnings that drive cellular behavior and dictate phenotype, and the ways deep technology and the life sciences intersect across the drug discovery continuum. Karen Tkach Tuzman, Senior Editor of BioCentury, served as moderator.
When asked how AI and ML are affecting drug development, Stead noted that the development of COVID-19 antibodies relied on AI/ML systems to keep pace with viral mutations. Added Chen: “Pfizer capitalized on AI to accelerate the drug development path to Paxlovid.” At Insilico Medicine, said Chen, the company’s Chemistry42 platform is an automated machine learning system that designs small-molecule drugs using generative adversarial networks.
“Most drug discovery scientists are looking for a needle in the haystack in a high throughput compound library screening,” Chen said. “Chemistry42 enables the design of de novo needles.”
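Chemistry42’s internals are proprietary, but the generative-adversarial idea Chen described can be illustrated with a toy sketch: a generator proposes candidate molecules (here reduced to small descriptor vectors) while a discriminator learns to tell them apart from known drug-like molecules. Everything below, from the descriptor representation to the network sizes, is a hypothetical simplification, not Insilico’s implementation.

```python
# Toy GAN for de novo molecule proposal (conceptual sketch only; not Chemistry42).
# Real generative chemistry systems work on molecular graphs or SMILES strings;
# here each "molecule" is just a small vector of normalized descriptors
# so the adversarial training loop stays easy to follow.
import torch
import torch.nn as nn

N_DESC = 8      # hypothetical number of normalized descriptors per molecule
N_LATENT = 16   # size of the random noise vector the generator starts from

generator = nn.Sequential(
    nn.Linear(N_LATENT, 64), nn.ReLU(),
    nn.Linear(64, N_DESC), nn.Tanh(),   # outputs scaled descriptor vectors
)
discriminator = nn.Sequential(
    nn.Linear(N_DESC, 64), nn.ReLU(),
    nn.Linear(64, 1),                   # real-vs-generated logit
)

opt_g = torch.optim.Adam(generator.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()

# Stand-in for a curated library of known molecules (random data here).
real_library = torch.randn(1024, N_DESC).clamp(-1, 1)

for step in range(200):
    real = real_library[torch.randint(0, 1024, (64,))]
    fake = generator(torch.randn(64, N_LATENT))

    # Discriminator: label real molecules 1, generated molecules 0.
    d_loss = bce(discriminator(real), torch.ones(64, 1)) + \
             bce(discriminator(fake.detach()), torch.zeros(64, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Generator: try to make generated molecules look real to the discriminator.
    g_loss = bce(discriminator(fake), torch.ones(64, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
```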
The company’s Pharma.AI platform identified a small molecule targeting fibrosis associated with aging and age-related diseases. “One small molecule that was developed is currently in Phase 1 clinical trials and has shown an IC50 at the nanomolar level, with excellent preclinical efficacy and safety,” Chen said. “The team was able to go from the initial hypothesis to target identification to molecule design to preclinical candidate selection in less than 18 months.”
Better Models and Better Data
The human body is a complex system. AI and ML tools can model multiple levels of biological and chemical interaction and make accurate predictions. As these systems are fed large quantities of data, they learn to recognize patterns and underlying principles.
As AI has evolved, noted Stead, companies’ focus has shifted too. Smaller companies are concentrating on data-driven clinical assets, she said.
“In addition to focusing on small molecule discovery, researchers are solving specific problems relating to ADME [Absorption, Distribution, Metabolism and Excretion] and looking at chemical scaffolds using AI applications that are purpose-built,” she added.
ProteinMPNN, a machine learning algorithm for designing protein sequences, is another development that has enabled much higher sequence recovery on native protein backbones, said Arvey.
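The figure of merit Arvey refers to, sequence recovery, is simply the fraction of backbone positions at which a designed amino acid matches the native one. Below is a minimal illustration of that metric only, not of ProteinMPNN itself; the example sequences are invented.

```python
def sequence_recovery(native: str, designed: str) -> float:
    """Fraction of positions where the designed sequence matches the native one."""
    if len(native) != len(designed) or not native:
        raise ValueError("sequences must be non-empty and of equal length")
    matches = sum(a == b for a, b in zip(native, designed))
    return matches / len(native)

# Toy example: 6 of 8 residues recovered -> 0.75
print(sequence_recovery("MKTAYIAK", "MKTGYIAR"))
```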
For AI to be truly effective, high-quality data is essential, said Chen.
“Trained AI needs a good quality data set because of the underlying challenge – AI does not know what it does not know.” She noted that Insilico Medicine’s Chemistry42 system relies on a library of over 1 billion small molecules, many with known properties, combined with in vitro and in vivo data, to predict molecular features.
“The AI was trained using this data, resulting in over 30 AI modules covering differing properties such as novelty, diversity, affinity, pharmacokinetic, and ADME features,” Chen said. “Hence it is a robust technology, and novel when it comes to unsupervised learning. Medicinal and computational chemists were trained to look at the features in a realistic manner. Combining AI capabilities with the experience of drug developers is the key to success in handling purpose-built, as well as publicly available data.”
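As a rough illustration of how multiple property “modules” can be folded into a single score for a candidate molecule, the sketch below uses RDKit to compute a few standard descriptors and blends them with arbitrary weights. The chosen descriptors, target values, and weights are placeholders for illustration, not Insilico’s 30 modules.

```python
# Simplified multi-property scoring of candidate molecules (illustrative only).
# Real platforms combine dozens of learned models (affinity, ADME, novelty, ...);
# here three cheap RDKit descriptors stand in, with made-up weights.
from rdkit import Chem
from rdkit.Chem import Descriptors, QED

def score_candidate(smiles: str) -> float:
    mol = Chem.MolFromSmiles(smiles)
    if mol is None:
        return 0.0                                                 # unparsable SMILES scores zero
    drug_likeness = QED.qed(mol)                                   # 0..1 composite drug-likeness
    logp_penalty = abs(Descriptors.MolLogP(mol) - 2.5) / 5.0       # prefer logP near 2.5
    weight_penalty = max(0.0, Descriptors.MolWt(mol) - 500) / 500  # prefer MW <= 500
    return drug_likeness - 0.5 * logp_penalty - 0.5 * weight_penalty

candidates = ["CCO", "CC(=O)Oc1ccccc1C(=O)O", "c1ccccc1"]  # ethanol, aspirin, benzene
ranked = sorted(candidates, key=score_candidate, reverse=True)
print(ranked)
```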
Chittenden echoed the need for purpose-built data sets. “Especially at the molecular level for drug discovery, there are many hidden technical biases,” he said. “Data needs to be harmonized to remove these biases. Hence, we must generate our own data sets and use publicly available data to validate identified signals. This was done with The Cancer Genome Atlas which is highly annotated and curated. However, our AI algorithms were able to identify errors in annotation as well as uncover biases in these data sets.”
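Chittenden did not detail BioAI Health’s harmonization pipeline, but one of the simplest forms of the idea he describes is per-batch standardization: centering and scaling each measurement within its technical batch so that batch-level shifts do not masquerade as biology. A minimal sketch under that assumption:

```python
import numpy as np

def harmonize_by_batch(values: np.ndarray, batches: np.ndarray) -> np.ndarray:
    """Z-score each measurement within its own technical batch.

    This removes simple additive/multiplicative batch shifts; real pipelines
    use richer models (e.g., empirical-Bayes batch correction).
    """
    out = np.empty_like(values, dtype=float)
    for b in np.unique(batches):
        mask = batches == b
        mu, sigma = values[mask].mean(), values[mask].std()
        out[mask] = (values[mask] - mu) / (sigma if sigma > 0 else 1.0)
    return out

# Toy example: two batches with different baselines collapse onto one scale.
expr = np.array([10.0, 11.0, 9.5, 100.0, 101.0, 99.5])
batch = np.array([0, 0, 0, 1, 1, 1])
print(harmonize_by_batch(expr, batch))
```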
Publicly available data can be variable in quality, Chen stressed. That’s why, she noted, Insilico Medicine has 40 curators to annotate and perform quality control to ensure that the data is clean. “AI and automation technologies should be considered as partners and not a replacement for people,” she said. “It is not about random data going into a black box.”
The conversation also included the ways in which AI is being utilized to improve clinical trial design, with Chen highlighting Insilico’s latest tool, InClinico, which can predict the likelihood of success for clinical trials based on a number of parameters.
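InClinico’s model and features are not described here in any detail, but the general shape of such a predictor can be sketched as a supervised classifier over trial-level features. Everything below, including the feature names, the synthetic labels, and the choice of logistic regression, is a placeholder, not Insilico’s tool.

```python
# Toy clinical-trial outcome classifier (illustrative; not InClinico).
# Features and labels are synthetic; a real system would draw on curated
# target, indication, sponsor, and trial-design data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500
X = np.column_stack([
    rng.integers(1, 4, n),        # hypothetical: trial phase (1-3)
    rng.uniform(0, 1, n),         # hypothetical: target genetic-evidence score
    rng.integers(20, 2000, n),    # hypothetical: planned enrollment
])
# Synthetic labels loosely tied to the genetic-evidence feature.
y = (X[:, 1] + rng.normal(0, 0.3, n) > 0.6).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("held-out accuracy:", model.score(X_test, y_test))
print("P(success) for a new trial:", model.predict_proba([[2, 0.8, 300]])[0, 1])
```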
Ultimately, the panelists emphasized the significant inroads AI and ML have made in improving the drug discovery process and how rapidly the technology is evolving. The future holds further opportunity to apply AI and ML capabilities to other aspects of the process, they noted, including drug manufacturing, supply chain logistics, clinical trials, pharmacovigilance, and adverse event monitoring.