DOI: 10.1145/3584371.3613014

Synergistic Fusion of Graph and Transformer Features for Enhanced Molecular Property Prediction

Published: 04 October 2023

Abstract

Molecular property prediction is a critical task in computational drug discovery. While recent advances in Graph Neural Networks (GNNs) and Transformers have proven effective and promising, they face the following limitations: Transformer self-attention does not explicitly consider the underlying molecular structure, while GNN feature representations alone are not sufficient to capture the granular and hidden interactions and characteristics that distinguish similar molecules. To address these limitations, we explore synergistically combining pre-trained features from GNNs and Transformers, an approach we term SYN-FUSION. The resulting molecular representation captures both the global structure of the molecule and the characteristics of its individual atoms.
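To make the fusion idea concrete, the following is a minimal sketch of a feature-level fusion head, not the paper's exact architecture: the pre-trained GIN/MolCLR and Chemformer encoders are replaced by random stand-in embeddings, and the class name, embedding dimensions, and MLP head sizes are illustrative assumptions.

import torch
import torch.nn as nn

class FusionHead(nn.Module):
    """Concatenate pre-trained GNN and Transformer embeddings, then predict."""
    def __init__(self, gnn_dim: int, tfm_dim: int, hidden: int = 256, n_out: int = 1):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(gnn_dim + tfm_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, n_out),
        )

    def forward(self, gnn_feat: torch.Tensor, tfm_feat: torch.Tensor) -> torch.Tensor:
        # Fuse the two pre-trained representations into one joint vector
        fused = torch.cat([gnn_feat, tfm_feat], dim=-1)
        return self.mlp(fused)

# Random stand-ins for the pre-trained embeddings (hypothetical dimensions).
gnn_feat = torch.randn(8, 300)   # e.g. graph-level embeddings from a GIN encoder
tfm_feat = torch.randn(8, 512)   # e.g. sequence embeddings from a Chemformer encoder
head = FusionHead(gnn_dim=300, tfm_dim=512)
logits = head(gnn_feat, tfm_feat)  # one logit per molecule for a binary task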
On the MoleculeNet benchmarks, SYN-FUSION with GIN as the GNN and Chemformer as the Transformer outperforms the baselines: it surpasses existing GNN and Transformer methods and matches models that jointly train both kinds of features, winning on 5 of 7 classification datasets and 4 of 6 regression datasets. Relative improvements per dataset: BBBP (6.63%), ClinTox (19.89%), HIV (4.36%), SIDER (7.2%), MUV (9.21%), ESOL (21.3%), Lipo (5.4%), QM7 (38.74%), QM8 (2.09%), and QM9 (55.23%).
To experimentally verify the impact of synergy, we conducted a comparative analysis of the combined model (SYN-FUSION) against the individual models (MolCLR and Chemformer) and their ensemble, on both classification and regression tasks. Without SYN-FUSION, AUC drops by 5.10%-6.15% on ClinTox and RMSE increases by 0.15-0.39 on ESOL. This demonstrates the synergy effect: the combined effect achieved through fusion is greater than that of the individual models and of their ensemble.
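For contrast with the fusion sketch above, a prediction-level ensemble of the two individual models combines only their outputs. The snippet below is a hedged illustration with random stand-in logits, not the actual model predictions or the paper's code.

import torch

gnn_logits = torch.randn(8, 1)   # stand-in for a fine-tuned GNN model's outputs
tfm_logits = torch.randn(8, 1)   # stand-in for a fine-tuned Chemformer's outputs

# Prediction-level ensemble: each model decides on its own and the
# resulting probabilities are simply averaged afterwards.
ensemble_prob = (torch.sigmoid(gnn_logits) + torch.sigmoid(tfm_logits)) / 2

# Feature-level fusion, by contrast, combines the representations before the
# prediction head, so the head can learn cross-model interactions that an
# average of outputs cannot express.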

References

[1] Weihua Hu, Bowen Liu, Joseph Gomes, Marinka Zitnik, Percy Liang, Vijay S. Pande, and Jure Leskovec. 2020. Strategies for Pre-training Graph Neural Networks. In 8th International Conference on Learning Representations (ICLR 2020), Addis Ababa, Ethiopia, April 26-30, 2020. OpenReview.net.
[2] Ross Irwin, Spyridon Dimitriadis, Jiazhen He, and Esben Jannik Bjerrum. 2022. Chemformer: a pre-trained transformer for computational chemistry. Machine Learning: Science and Technology 3, 1 (Jan. 2022), 015022.
[3] Yuyang Wang, Jianren Wang, Zhonglin Cao, and Amir Barati Farimani. 2022. Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4, 3 (March 2022), 279-287.
[4] Chengxuan Ying, Tianle Cai, Shengjie Luo, Shuxin Zheng, Guolin Ke, Di He, Yanming Shen, and Tie-Yan Liu. 2021. Do Transformers Really Perform Badly for Graph Representation? In Advances in Neural Information Processing Systems, A. Beygelzimer, Y. Dauphin, P. Liang, and J. Wortman Vaughan (Eds.).

Published In

BCB '23: Proceedings of the 14th ACM International Conference on Bioinformatics, Computational Biology, and Health Informatics
September 2023, 626 pages
ISBN: 9798400701269
DOI: 10.1145/3584371

Publisher

Association for Computing Machinery

New York, NY, United States
