🤖 AI Summary
Global structural optimization without prior data or explicit energy relaxation remains challenging. Method: This paper proposes a diffusion-based, end-to-end atomic configuration generation framework guided by physics-informed energy functions. It employs a Boltzmann-weighted score-matching loss for training, enabling *ab initio* sampling without requiring labeled structural data. A two-stage self-sampling and model-refinement loop combines amortized inference with iterative improvement, while pretrained models support cross-chemical-system transfer to accelerate convergence on new tasks. Contributions/Results: Experiments demonstrate that the method achieves low-energy structure discovery performance comparable to conventional global optimization algorithms while requiring significantly fewer energy evaluations. It thus offers superior efficiency, generalizability across chemical systems, and model reusability, establishing a data-efficient paradigm for physics-guided generative optimization.
📝 Abstract
We introduce GO-Diff, a diffusion-based method for global structure optimization that learns to directly sample low-energy atomic configurations without requiring prior data or explicit relaxation. GO-Diff is trained from scratch using a Boltzmann-weighted score-matching loss, leveraging only the known energy function to guide generation toward thermodynamically favorable regions. The method operates in a two-stage loop of self-sampling and model refinement, progressively improving its ability to target low-energy structures. Compared to traditional optimization pipelines, GO-Diff achieves competitive results with significantly fewer energy evaluations. Moreover, by reusing pretrained models across related systems, GO-Diff supports amortized optimization, enabling faster convergence on new tasks without retraining from scratch.
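To make the training idea concrete, here is a minimal toy sketch of Boltzmann-weighted denoising score matching inside a self-sampling/refinement loop. Everything here is an illustrative assumption, not the paper's actual implementation: the double-well `energy`, the linear `score_model` standing in for a neural network, the Gaussian self-sampling proposal, and all hyperparameters (`kT`, `sigma`, `lr`) are hypothetical choices made only to show the structure of the loss, where each sample's squared score-matching error is weighted by its self-normalized Boltzmann factor exp(-E/kT).

```python
import numpy as np

rng = np.random.default_rng(0)

def energy(x):
    # Toy double-well potential standing in for a physics-informed energy function
    return (x**2 - 1.0)**2

def boltzmann_weights(xs, kT=0.5):
    # Self-normalized Boltzmann weights w_i ∝ exp(-E(x_i)/kT)
    logw = -energy(xs) / kT
    logw -= logw.max()            # subtract max for numerical stability
    w = np.exp(logw)
    return w / w.sum()

def score_model(theta, x):
    # Linear model s_theta(x) = a*x + b; a real method would use a network
    return theta[0] * x + theta[1]

def dsm_step(theta, xs, w, sigma=0.3, lr=1e-2):
    # Denoising score matching: for x_noisy = x + sigma*eps,
    # the regression target for the score is -eps/sigma
    eps = rng.normal(size=xs.shape)
    x_noisy = xs + sigma * eps
    target = -eps / sigma
    resid = score_model(theta, x_noisy) - target
    # Gradient of the Boltzmann-weighted squared error w.r.t. (a, b)
    grad = np.array([(w * resid * x_noisy).sum(), (w * resid).sum()])
    return theta - lr * grad

theta = np.zeros(2)
for _ in range(5):                             # outer two-stage loop
    xs = rng.normal(scale=2.0, size=256)       # stage 1: self-sample configurations
    w = boltzmann_weights(xs)                  # reweight by the known energy
    for _ in range(200):                       # stage 2: refine the score model
        theta = dsm_step(theta, xs, w)
```

The key design point illustrated here is that no labeled low-energy structures are needed: the model's own samples (here, a crude random proposal) are reweighted by the known energy function, so low-energy configurations dominate the score-matching objective and the sampler drifts toward thermodynamically favorable regions over successive rounds.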