Researchers from the University of Bonn have uncovered unexpected findings about how artificial intelligence (AI) systems used in pharmaceutical research operate. Rather than learning complex chemical interactions, these AI models primarily rely on recalling existing data.
Decoding AI in pharmaceutical research
A team led by Prof. Dr. Jürgen Bajorath, a chemoinformatics expert at the University of Bonn, has made significant strides in deciphering how artificial intelligence (AI) systems used in the pharmaceutical industry arrive at their predictions. Despite rapid advances, AI often operates as an obscure 'black box' whose internal workings and decision-making mechanisms are hard to understand. The technique developed by the team unravels some of these intricacies, offering a glimpse into the models' operational mechanisms.
In a surprising revelation, the team discovered that AI models used in drug research rely primarily on recalling existing data rather than learning and understanding intricate chemical interactions. The finding, recently published in Nature Machine Intelligence, indicates that these systems predict the effectiveness of drugs largely by drawing on data they have already seen, rather than by learning new information about chemical interactions as previously thought. This insight into how AI functions in drug research could prompt a re-evaluation of how it is utilized in the pharmaceutical industry.
Graph neural networks in drug discovery research
The study focuses on graph neural networks (GNNs), a machine learning architecture increasingly used in drug discovery research. GNNs are used to predict the binding strength of a molecule to a target protein, a crucial factor in drug effectiveness. Using a method called 'EdgeSHAPer', the researchers analyzed how GNNs generate their predictions, essentially providing a look inside the 'black box' of AI.
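To make the idea behind EdgeSHAPer concrete: it attributes a GNN's prediction to individual edges (bonds) of the molecular graph using Shapley values from game theory. The sketch below is a heavily simplified, hypothetical illustration, not the authors' implementation: it replaces the trained GNN with a cheap toy scoring function and approximates per-edge Shapley values by Monte-Carlo sampling of edge orderings.

```python
import random

# Toy molecular graph: edges (bonds) between atom indices 0..4.
# Made-up example; a real GNN would consume learned atom/bond features.
EDGES = [(0, 1), (1, 2), (2, 3), (3, 0), (2, 4)]

def predict(active_edges):
    """Stand-in for a trained GNN's affinity prediction.
    Here the 'prediction' simply counts atoms connected to atom 0,
    so it is cheap to evaluate on every edge subset."""
    reachable = {0}
    changed = True
    while changed:
        changed = False
        for a, b in active_edges:
            if a in reachable and b not in reachable:
                reachable.add(b); changed = True
            elif b in reachable and a not in reachable:
                reachable.add(a); changed = True
    return len(reachable)

def edge_shapley(edges, n_samples=2000, seed=0):
    """Monte-Carlo Shapley estimate: average marginal contribution
    of each edge over random orderings of the full edge set."""
    rng = random.Random(seed)
    phi = {e: 0.0 for e in edges}
    for _ in range(n_samples):
        order = edges[:]
        rng.shuffle(order)
        present = []
        prev = predict(present)
        for e in order:
            present.append(e)
            cur = predict(present)
            phi[e] += cur - prev   # credit this edge with the change
            prev = cur
    return {e: v / n_samples for e, v in phi.items()}

importances = edge_shapley(EDGES)
for edge, value in sorted(importances.items(), key=lambda kv: -kv[1]):
    print(edge, round(value, 3))
```

Because each sampled ordering telescopes from the empty graph to the full one, the estimated edge importances always sum exactly to the difference between the full-graph and empty-graph predictions, which is the "efficiency" property that makes Shapley-based attributions interpretable.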
The study also challenges the efficacy of AI in drug discovery. It suggests that the predictions made by AI models are often overrated, since forecasts of similar quality can be achieved with basic chemical knowledge and simpler methods. This could prompt a shift back towards simpler, more traditional approaches in drug discovery research. However, the researchers note that AI still holds potential: certain models showed signs of learning more interactions as the potency of the test compounds increased.
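One classic example of such a simpler method is a nearest-neighbour baseline: predict a compound's potency from its most chemically similar training compound, measured with Tanimoto similarity on molecular fingerprints. The sketch below is an illustrative, hypothetical baseline with made-up data (fingerprints as plain Python sets of "on" bit indices); the source does not specify which simple method the researchers compared against.

```python
def tanimoto(fp_a, fp_b):
    """Tanimoto (Jaccard) similarity of two fingerprint bit sets."""
    if not fp_a and not fp_b:
        return 0.0
    return len(fp_a & fp_b) / len(fp_a | fp_b)

def predict_potency(query_fp, training_set):
    """1-nearest-neighbour prediction: return the potency of the
    most similar training compound and the similarity itself."""
    best_sim, best_potency = -1.0, None
    for fp, potency in training_set:
        sim = tanimoto(query_fp, fp)
        if sim > best_sim:
            best_sim, best_potency = sim, potency
    return best_potency, best_sim

# Made-up training compounds: (fingerprint bits, pKi potency value).
TRAIN = [
    ({1, 4, 7, 9}, 6.2),
    ({1, 4, 8, 12}, 7.5),
    ({2, 5, 11}, 5.1),
]

potency, sim = predict_potency({1, 4, 7, 12}, TRAIN)
print(f"predicted pKi = {potency} (similarity {sim:.2f})")
```

A baseline like this embodies the "similar molecules have similar activity" assumption directly, which is precisely why matching its accuracy is a low bar a GNN should clear before its predictions are credited to learned chemistry.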
The importance of 'explainable AI'
Prof. Dr. Jürgen Bajorath underscores the importance of 'explainable AI', where the focus lies in understanding how machine learning arrives at its results. He believes that developing methods to explain the predictions of complex models is a crucial aspect of AI research. The team's work on EdgeSHAPer and other analysis tools is a promising step in this direction, potentially opening up new pathways in AI research and applications.