GO-Diff: Data-free and amortized global structure optimization

📅 2025-10-15
📈 Citations: 0
Influential: 0
🤖 AI Summary
Global structure optimization without prior data or explicit energy relaxation remains challenging. Method: This paper proposes a diffusion-based, end-to-end framework for generating atomic configurations, guided by a physics-informed energy function. Training uses a Boltzmann-weighted score-matching loss, enabling *ab initio* sampling without labeled structural data. A two-stage self-sampling and model-refinement loop combines amortized inference with iterative improvement, while pretrained models support cross-chemical-system transfer to accelerate convergence on new tasks. Contributions/Results: Experiments show low-energy structure discovery performance comparable to conventional global optimization algorithms, yet with significantly fewer energy evaluations. The method thus offers superior efficiency, generalizability across chemical systems, and model reusability, establishing a data-efficient paradigm for physics-guided generative optimization.

📝 Abstract
We introduce GO-Diff, a diffusion-based method for global structure optimization that learns to directly sample low-energy atomic configurations without requiring prior data or explicit relaxation. GO-Diff is trained from scratch using a Boltzmann-weighted score-matching loss, leveraging only the known energy function to guide generation toward thermodynamically favorable regions. The method operates in a two-stage loop of self-sampling and model refinement, progressively improving its ability to target low-energy structures. Compared to traditional optimization pipelines, GO-Diff achieves competitive results with significantly fewer energy evaluations. Moreover, by reusing pretrained models across related systems, GO-Diff supports amortized optimization, enabling faster convergence on new tasks without retraining from scratch.
Problem

Research questions and friction points this paper is trying to address.

Directly samples low-energy atomic configurations without data
Trains using energy function to target thermodynamically favorable structures
Enables amortized optimization across systems for faster convergence
Innovation

Methods, ideas, or system contributions that make the work stand out.

Diffusion-based sampling without prior data
Boltzmann-weighted training using energy function
Amortized optimization with pretrained model reuse
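The Boltzmann-weighted training idea above can be sketched on a toy 1-D problem. Everything below is an illustrative stand-in, not the paper's implementation: the double-well `energy`, the single-parameter linear score model, and the finite-difference update are all hypothetical simplifications of the diffusion network and self-sampling loop described in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

def energy(x):
    # Toy 1-D double-well energy with minima at x = +/-1; stands in for
    # the known physics-informed energy function the method assumes.
    return (x**2 - 1.0) ** 2

def boltzmann_weights(e, kT=0.5):
    # Normalized Boltzmann factors: low-energy samples dominate the loss.
    w = np.exp(-(e - e.min()) / kT)
    return w / w.sum()

def weighted_dsm_loss(theta, x_noisy, noise, w, sigma=0.3):
    # Denoising score matching with per-sample Boltzmann weights. The
    # "model" is a one-parameter linear score s(x) = -theta * x, a
    # hypothetical stand-in for the diffusion network.
    target = -noise / sigma**2          # standard DSM regression target
    pred = -theta * x_noisy
    return np.sum(w * (pred - target) ** 2)

# Two-stage loop: (1) self-sample configurations, (2) refine the model
# on Boltzmann-weighted denoising targets.
sigma, lr, theta = 0.3, 0.05, 0.0
for _ in range(200):
    x = rng.normal(0.0, 1.0, size=256)   # simplified self-sampling stage
    noise = rng.normal(0.0, sigma, size=x.shape)
    w = boltzmann_weights(energy(x))
    eps = 1e-4                           # finite-difference refinement step
    g = (weighted_dsm_loss(theta + eps, x + noise, noise, w)
         - weighted_dsm_loss(theta - eps, x + noise, noise, w)) / (2 * eps)
    theta -= lr * g
```

Because the weights suppress high-energy samples, the refinement step is driven mostly by configurations near the energy minima, which is the mechanism the summary credits for data-free targeting of thermodynamically favorable regions.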
Nikolaj Rønne
Technical University of Denmark
Tejs Vegge
Professor, Technical University of Denmark
Director of CAPeX - Pioneer Center for Accelerating P2X Materials Discovery
Arghya Bhowmik
Technical University of Denmark