Publication
AAAI 2024
Workshop paper

A Multi-View Approach Based on Graphs and a Chemical Language Foundation Model for Molecular Property Prediction

Abstract

Pre-trained Language Models have emerged as promising tools for predicting molecular properties, yet their development is still at an early stage, and further research is needed to improve their efficacy and address challenges such as generalization and sample efficiency. In this paper, we present a Multi-View approach that combines latent spaces derived from state-of-the-art chemical models. Our approach relies on two pivotal elements: the embeddings derived from MHG-GNN, which represents molecular structures as graphs, and MoLFormer embeddings rooted in chemical language. The attention mechanism of MoLFormer can identify relations between two atoms even when they are far apart, while the GNN of MHG-GNN can more precisely capture relations among multiple closely located atoms. We demonstrate the superior performance of our proposed Multi-View approach compared to existing state-of-the-art methods, including MoLFormer-XL, which was trained on 1.1 billion molecules, particularly on intricate tasks such as predicting the quantum mechanical properties of small molecules. We assessed our approach on 11 benchmark datasets from MoleculeNet, where it outperformed competitors on 8 of them. We also provide an in-depth analysis of the results obtained on the QM9 dataset, where our proposed approach surpasses its state-of-the-art competitors on 9 of the 12 tasks in this dataset. Our study highlights the potential of latent space fusion and feature integration for advancing molecular property prediction. In this work, we use small versions of MHG-GNN and MoLFormer, which leaves room for further improvement when our approach is trained on larger-scale datasets.
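
To make the latent space fusion idea concrete, the minimal sketch below shows one simple multi-view baseline: per-molecule embeddings from a chemical-language encoder and a graph encoder are concatenated and fed to a downstream regressor. The functions `molformer_embed` and `mhg_gnn_embed` are hypothetical placeholders standing in for the real MoLFormer and MHG-GNN encoders, and the concatenation-plus-ridge setup is only an illustrative assumption, not necessarily the exact fusion mechanism used in the paper.

```python
# Minimal sketch of a multi-view fusion baseline (illustrative only).
# `molformer_embed` and `mhg_gnn_embed` are hypothetical stand-ins for the
# real MoLFormer and MHG-GNN encoders, which produce per-molecule embeddings.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def molformer_embed(smiles_list):
    # Placeholder: a (n_molecules, 768)-dim chemical-language embedding.
    return rng.normal(size=(len(smiles_list), 768))

def mhg_gnn_embed(smiles_list):
    # Placeholder: a (n_molecules, 300)-dim molecular-graph embedding.
    return rng.normal(size=(len(smiles_list), 300))

smiles = ["CCO", "c1ccccc1", "CC(=O)O", "CCN"] * 50  # toy molecule list
y = rng.normal(size=len(smiles))                     # toy property targets

# Multi-view fusion: concatenate the two latent representations per molecule.
X = np.concatenate([molformer_embed(smiles), mhg_gnn_embed(smiles)], axis=1)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
model = Ridge(alpha=1.0).fit(X_tr, y_tr)
print("R^2 on held-out split:", model.score(X_te, y_te))
```

In practice, the placeholder encoders would be replaced by the pre-trained models' inference code, and the ridge regressor by whatever downstream head the task requires; the key point illustrated here is that the two views contribute complementary features to a single fused representation.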