Zatom-1: A Multimodal Flow Foundation Model for 3D Molecules and Materials

📅 2026-02-24
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work proposes Zatom-1, the first unified foundation model capable of jointly generating and predicting both 3D molecules and materials. Addressing the limitation of existing AI approaches that are typically confined to a single chemical domain (molecules or materials) and a single task (generation or prediction), Zatom-1 leverages a Transformer architecture to enable cross-domain representation sharing and transfer. It employs multimodal flow matching to jointly model discrete atomic types and continuous geometric structures, with a unified generative pretraining strategy that provides a general-purpose initialization for diverse downstream tasks. The model achieves state-of-the-art or competitive performance across multiple benchmarks, demonstrates over an order-of-magnitude speedup in generation, and validates positive transfer from material pretraining to molecular property prediction, significantly enhancing both generalization and inference efficiency.

📝 Abstract
General-purpose 3D chemical modeling encompasses molecules and materials, requiring both generative and predictive capabilities. However, most existing AI approaches are optimized for a single domain (molecules or materials) and a single task (generation or prediction), which limits representation sharing and transfer. We introduce Zatom-1, the first foundation model that unifies generative and predictive learning of 3D molecules and materials. Zatom-1 is a Transformer trained with a multimodal flow matching objective that jointly models discrete atom types and continuous 3D geometries. This approach supports scalable pretraining with predictable gains as model capacity increases, while enabling fast and stable sampling. We use joint generative pretraining as a universal initialization for downstream multi-task prediction of properties, energies, and forces. Empirically, Zatom-1 matches or outperforms specialized baselines on both generative and predictive benchmarks, while reducing the generative inference time by more than an order of magnitude. Our experiments demonstrate positive predictive transfer between chemical domains from joint generative pretraining: modeling materials during pretraining improves molecular property prediction accuracy.
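The abstract describes a multimodal flow matching objective that jointly models discrete atom types and continuous 3D geometries. As a rough illustration only (not Zatom-1's actual implementation), the continuous side of flow matching can be sketched under the common assumption of linear interpolation paths from noise to data, with the discrete side handled by independent masking; all function names here are hypothetical:

```python
# Hypothetical sketch of one flow-matching training target, NOT the paper's code.
# Assumes linear paths x_t = (1 - t) * x0 + t * x1, whose target velocity is x1 - x0.
import numpy as np

rng = np.random.default_rng(0)

def flow_matching_targets(x1, t, rng):
    """Interpolate noise -> data and return (x_t, target velocity)."""
    x0 = rng.standard_normal(x1.shape)   # noise sample from the prior
    x_t = (1.0 - t) * x0 + t * x1        # point on the linear path at time t
    v_target = x1 - x0                   # constant velocity along a linear path
    return x_t, v_target

def mask_atom_types(types, t, mask_token, rng):
    """Discrete side (assumed masking scheme): keep each type with prob t, else mask."""
    keep = rng.random(types.shape) < t
    return np.where(keep, types, mask_token)

# Toy "molecule": 5 atoms with 3D coordinates.
coords = rng.standard_normal((5, 3))
x_t, v = flow_matching_targets(coords, t=0.3, rng=rng)
# A model would be trained to regress v from (x_t, t, atom types);
# sampling then integrates the learned velocity field from t=0 to t=1.
```

This is only meant to make the objective concrete; the paper's Transformer, its discrete-flow formulation, and its pretraining recipe are not reproduced here.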

Problem

Research questions and friction points this paper is trying to address.

3D chemical modeling
foundation model
multimodal learning
generative and predictive tasks
molecules and materials
Innovation

Methods, ideas, or system contributions that make the work stand out.

foundation model
multimodal flow matching
3D molecular modeling
generative and predictive learning
cross-domain transfer