Taku Ito, Luca Cocchi, et al.
ICML 2025
Recent data-efficient molecular generation approaches exploit graph grammars to introduce interpretability into the generative models. However, grammar learning therein relies on expert annotation or unreliable heuristics for algorithmic inference. We propose Foundation Molecular Grammar (FMG), which leverages multi-modal foundation models (MMFMs) to induce an interpretable molecular language. By exploiting the chemical knowledge of an MMFM, FMG renders molecules as images, describes them as text, and aligns information across modalities using prompt learning. FMG can be used as a drop-in replacement for the prior grammar learning approaches in molecular generation and property prediction. We show that FMG not only excels in synthesizability, diversity, and data efficiency but also offers built-in chemical interpretability for automated molecular discovery workflows.
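The abstract outlines a render-then-describe pipeline: draw the molecule as an image, then have a multi-modal foundation model describe it in text. Below is a minimal sketch of what that loop might look like, assuming RDKit for the rendering step; the description step (`describe_molecule_image` and the `mmfm_client` it mentions) is a hypothetical placeholder, not the paper's released implementation.

```python
# Sketch of a render-then-describe step in the spirit of FMG's abstract.
# Rendering uses RDKit; the MMFM query is a hypothetical stub to replace
# with a real multi-modal API call.
from rdkit import Chem
from rdkit.Chem import Draw


def render_molecule(smiles: str, path: str = "molecule.png") -> str:
    """Render a SMILES string to a PNG image and return the file path."""
    mol = Chem.MolFromSmiles(smiles)
    if mol is None:
        raise ValueError(f"Could not parse SMILES: {smiles!r}")
    Draw.MolToFile(mol, path, size=(400, 400))
    return path


def describe_molecule_image(image_path: str) -> str:
    """Hypothetical stand-in for an MMFM call: send the rendered image plus
    a chemistry prompt, get back a textual description of the molecule."""
    # e.g., return mmfm_client.query(
    #     image=image_path,
    #     text="Describe the functional groups visible in this molecule.")
    raise NotImplementedError("Plug in a multi-modal foundation model here.")


if __name__ == "__main__":
    png = render_molecule("CC(=O)Oc1ccccc1C(=O)O")  # aspirin
    print(f"Rendered molecule to {png}")
```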
Saiteja Utpala, Alex Gu, et al.
NAACL 2024
Gosia Lazuka, Andreea Simona Anghel, et al.
SC 2024
Gabriele Picco, Lam Thanh Hoang, et al.
EMNLP 2021