Unraveling the role of attention mechanisms in drug design

Updated on 2024-02-02

Today I would like to tell you about a review on attention mechanisms in drug design, published in Briefings in Bioinformatics in January 2024. The article begins with the principles of the current mainstream attention-based methods and their advantages in drug discovery, then discusses their applications across the various stages of drug development, from drug-like molecular screening and target binding to molecular property prediction and molecule generation. Although data quality and model interpretability remain challenges to overcome, deeper research into attention-based models promises revolutionary breakthroughs in the pharmaceutical field and a significantly faster pace of drug development.

Traditional drug discovery typically involves extensive experimentation and manual screening, a time-consuming and costly process: it takes more than a decade and billions of dollars to go from drug discovery to market launch. Along the way, researchers must run large numbers of experiments to evaluate properties such as a molecule's activity, toxicity, and pharmacokinetics. This trial-and-error process eliminates many potential drug candidates and drives up development costs.

With modern AI technology, the field of drug discovery has undergone revolutionary changes. Attention-based models such as GAT, Transformer, BERT, and GPT can process large-scale data more quickly and accurately. In the drug screening stage, these methods enable rapid and accurate screening of large numbers of compounds, accelerating the discovery of potential drug candidates. In drug synthesis design, they can help design new molecular structures to improve the efficiency and success rate of synthesis. In clinical trials, AI models also excel at drug risk assessment and precision medicine: by analyzing large amounts of data, they help doctors choose the best regimen for each patient (Figure 1).

It can be seen that modern AI technologies have increased efficiency, reduced costs, and made future drug discovery methods more forward-looking.

Figure 1 Relevant applications of drug design.

2.1 Principles of Attention Mechanisms

In 2014, Bahdanau et al. first applied attention mechanisms to machine translation. When processing sequence data, the model adaptively allocates attention across sequence positions to capture the importance of each word, allowing it to focus on different parts of the input sequence. Attention scores are obtained by computing the correlation between queries (Q) and keys (K); these scores are then used to take a weighted average of the values (V), yielding the final attention representation. The formula is

$$\mathrm{Attention}(Q, K, V) = \mathrm{softmax}\left(\frac{QK^{\top}}{\sqrt{d_k}}\right)V$$

where $d_k$ is the dimensionality of the keys and the scaling factor $\sqrt{d_k}$ stabilizes the softmax.
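To make the formula concrete, here is a minimal NumPy sketch of scaled dot-product attention. The shapes and random inputs are illustrative assumptions, not tied to any model from the review.

```python
import numpy as np

def scaled_dot_product_attention(q, k, v):
    """Scaled dot-product attention as in the formula above.

    q: (n_queries, d_k), k: (n_keys, d_k), v: (n_keys, d_v).
    Returns the attention output (n_queries, d_v) and the weights.
    """
    d_k = q.shape[-1]
    # Correlation between each query and key, scaled to stabilize the softmax.
    scores = q @ k.T / np.sqrt(d_k)
    # Softmax over the keys turns scores into attention weights per query.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Weighted average of the values gives the final attention representation.
    return weights @ v, weights

rng = np.random.default_rng(0)
q, k, v = rng.normal(size=(4, 8)), rng.normal(size=(6, 8)), rng.normal(size=(6, 16))
out, w = scaled_dot_product_attention(q, k, v)
print(out.shape, w.sum(axis=-1))  # (4, 16); each row of weights sums to 1
```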

Multi-head attention, as shown in Figure 2, introduces multiple independent attention heads, each of which is computed independently. The outputs of the heads are then concatenated (or weighted) to form the final output, as sketched below.
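The following is a minimal illustration of multi-head self-attention in PyTorch (single sequence, no masking); the projection matrices and sizes are invented for the example.

```python
import torch

def multi_head_attention(x, w_q, w_k, w_v, w_o, n_heads):
    """Minimal multi-head self-attention (no masking, single sequence).

    x: (seq_len, d_model); each w_* is a (d_model, d_model) projection.
    """
    seq_len, d_model = x.shape
    d_head = d_model // n_heads
    # Project, then split into independent heads: (n_heads, seq_len, d_head).
    def split(w):
        return (x @ w).reshape(seq_len, n_heads, d_head).transpose(0, 1)
    q, k, v = split(w_q), split(w_k), split(w_v)
    # Each head computes scaled dot-product attention independently.
    scores = q @ k.transpose(-2, -1) / d_head ** 0.5
    heads = torch.softmax(scores, dim=-1) @ v          # (n_heads, seq_len, d_head)
    # Concatenate the head outputs and apply the output projection.
    concat = heads.transpose(0, 1).reshape(seq_len, d_model)
    return concat @ w_o

d_model, n_heads, seq_len = 16, 4, 5
x = torch.randn(seq_len, d_model)
ws = [torch.randn(d_model, d_model) for _ in range(4)]  # w_q, w_k, w_v, w_o
print(multi_head_attention(x, *ws, n_heads=n_heads).shape)  # torch.Size([5, 16])
```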

Figure 2 Architecture of the multi-head attention mechanism.

2.2 Transformer architecture

The overall architecture of the Transformer model is shown in Figure 3. Its self-attention mechanism captures potential long-distance interactions between atoms when characterizing molecules. Compared with recurrent neural networks (RNNs), the Transformer is parallelizable and can efficiently process large-scale datasets. It can also be used to predict molecular properties such as hydrophilicity, lipophilicity, and biological activity. In addition, the Transformer can serve as a generative model, learning the distribution of chemical space and generating molecules with novel structures.
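As an illustration of the property prediction use case, the sketch below wires PyTorch's built-in Transformer encoder over tokenized SMILES strings to regress a single scalar property. The vocabulary size, mean pooling, and sizes are assumptions made for the example, not the architecture of any specific model in the review.

```python
import torch
from torch import nn

class SmilesPropertyModel(nn.Module):
    """Hypothetical sketch: a Transformer encoder over SMILES tokens that
    regresses one molecular property (e.g. logP). Positional encodings are
    omitted for brevity; a real model would add them."""

    def __init__(self, vocab_size=64, d_model=128, n_heads=4, n_layers=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, 1)  # scalar property prediction

    def forward(self, token_ids):             # token_ids: (batch, seq_len)
        h = self.encoder(self.embed(token_ids))
        return self.head(h.mean(dim=1))       # mean-pool tokens, then predict

tokens = torch.randint(0, 64, (8, 20))        # a batch of 8 dummy token sequences
print(SmilesPropertyModel()(tokens).shape)    # torch.Size([8, 1])
```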

Currently, the Transformer is widely applied across small-molecule drug discovery. In drug-drug interaction prediction, a Transformer encoder integrates the features of different drug combinations to predict drug-drug interactions. In cancer drug response prediction, the Transformer can extract drug molecule features and combine them with transcriptome data to predict cancer drug response. In protein-ligand interaction tasks, a Transformer encoder can extract protein-ligand interaction features (character- and fragment-level features) to determine the binding affinity between protein and ligand. Finally, as a generative model, the Transformer can learn molecular structure and properties in the latent space of compounds' SMILES representations and use a decoder to generate new molecules with desired properties under given conditions.
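The generative pattern in the last sentence, decoding new molecules from a learned latent space, can be sketched as a greedy autoregressive loop. Everything here (vocabulary, start/end token ids, greedy decoding, the random latent vector) is a hypothetical stand-in; a real model would be trained and would map the generated token ids back to SMILES.

```python
import torch
from torch import nn

# Hedged sketch of latent-conditioned generation: decode a token sequence
# from a latent vector supplied as the decoder's "memory".
vocab_size, d_model, start_id, end_id, max_len = 32, 64, 0, 1, 30
embed = nn.Embedding(vocab_size, d_model)
decoder = nn.TransformerDecoder(
    nn.TransformerDecoderLayer(d_model, nhead=4, batch_first=True), num_layers=2)
to_logits = nn.Linear(d_model, vocab_size)

latent = torch.randn(1, 1, d_model)     # conditioning point in the latent space
tokens = [start_id]
for _ in range(max_len):                # greedy autoregressive decoding
    tgt = embed(torch.tensor([tokens]))
    logits = to_logits(decoder(tgt, memory=latent))[:, -1]
    next_id = int(logits.argmax(dim=-1))
    if next_id == end_id:
        break
    tokens.append(next_id)
print(tokens)  # generated token ids; a trained model maps these back to SMILES
```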

Figure 3 Schematic diagram of the Transformer model architecture.

3.1 Drug-drug interaction prediction

Attention mechanisms also play an important role in drug-drug interaction (DDI) tasks. Figure 4 illustrates the current mainstream attention-based DDI models. For example, the SA-DDI model proposes a novel substructure-aware attention method that helps medicinal chemists understand which substructures contribute most to a DDI prediction. The Transformer also has clear advantages in capturing the correlations among elements of drug and gene sequences: AttentionDDI, for instance, is a Siamese self-attention multimodal neural network that uses contrastive learning for DDI prediction. The MDF-SA-DDI method combines multi-source drug fusion, multi-source feature fusion, and Transformer self-attention to integrate information from different data sources across multiple feature dimensions, enhancing the model's expressive power.
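To illustrate the Siamese idea behind AttentionDDI, the sketch below embeds each drug's feature sequence with one shared self-attention encoder and scores the pair. The feature sizes and the concatenation-plus-linear scoring rule are assumptions for the example, not the published architecture.

```python
import torch
from torch import nn

class SiameseDDIScorer(nn.Module):
    """Illustrative sketch of the Siamese pattern: one shared self-attention
    encoder embeds each drug, and the two embeddings are combined to score
    an interaction. Not AttentionDDI's exact design."""

    def __init__(self, d_model=64, n_heads=4):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.score = nn.Linear(2 * d_model, 1)

    def embed(self, feats):                     # feats: (batch, seq, d_model)
        h, _ = self.attn(feats, feats, feats)   # shared self-attention encoder
        return h.mean(dim=1)                    # pool to one vector per drug

    def forward(self, drug_a, drug_b):
        pair = torch.cat([self.embed(drug_a), self.embed(drug_b)], dim=-1)
        return torch.sigmoid(self.score(pair))  # interaction probability

a, b = torch.randn(4, 10, 64), torch.randn(4, 10, 64)  # dummy drug feature pairs
print(SiameseDDIScorer()(a, b).shape)  # torch.Size([4, 1])
```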

Figure 4 Attention-based prediction models for drug-drug interactions.

3.2 Molecular property prediction

The Transformer model has also achieved breakthroughs in molecular property prediction. Figure 5 illustrates the current mainstream attention-based molecular property prediction models. For example, the ABT-MPNN model cleverly combines self-attention with a message-passing network, improving how molecular representations are extracted and achieving competitive results on multiple property prediction tasks. The DHTNN model introduces the BEAF activation function and uses a bidirectional attention Transformer to extract molecular representations, allowing the model to update its weights adaptively. PharmHGT excels at capturing heterogeneous molecular graph features: the final representation vector of a molecule is obtained by extracting three views of features at the atomic, functional-group, and node levels. The results show that the attention mechanism helps these models reliably distinguish the importance of different molecular features and adaptively update their weights, which benefits downstream tasks.
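The common thread in these models, letting attention decide how much each atom contributes to the molecular representation, can be shown with a tiny attention readout. This is a generic sketch with illustrative dimensions, not the exact ABT-MPNN, DHTNN, or PharmHGT design.

```python
import torch
from torch import nn

class AttentionReadout(nn.Module):
    """Learns one importance score per atom, then pools atom features into a
    molecule-level vector; the weights double as an interpretability signal."""

    def __init__(self, d_atom=32):
        super().__init__()
        self.scorer = nn.Linear(d_atom, 1)   # one importance score per atom

    def forward(self, atom_feats):           # atom_feats: (n_atoms, d_atom)
        weights = torch.softmax(self.scorer(atom_feats), dim=0)
        # Weighted sum of atom features = molecule representation.
        return (weights * atom_feats).sum(dim=0), weights.squeeze(-1)

atoms = torch.randn(12, 32)                  # e.g. 12 atoms after message passing
mol_vec, atom_importance = AttentionReadout()(atoms)
print(mol_vec.shape, atom_importance.shape)  # torch.Size([32]) torch.Size([12])
```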

With the rapid development of drug design, attention mechanisms can capture the complex relationships within molecular structures, improve task accuracy, and enhance model interpretability.

Figure 5 Attention-based models for molecular property prediction tasks.

In this paper, the authors describe the applications of attention mechanisms and related models in drug design. From drug molecule screening and property prediction to molecule generation, attention mechanisms offer significant advantages. Although challenges remain in data quality, model interpretability, computing resources, and complexity, we can expect more effective and theoretically grounded attention-based approaches to further accelerate drug design and lead to breakthrough developments.
