Predicting Atomistic Transitions with Transformers

📅 2026-03-05
📈 Citations: 0
✨ Influential: 0
🤖 AI Summary
This work addresses the high computational cost of conventional atomistic transition path sampling, which has limited its practical application. It introduces the Transformer architecture to this domain for the first time, building an efficient surrogate model that learns complex emergent behaviors from atomic simulation data and rapidly predicts atomic transition pathways in nanoclusters. By slightly perturbing the model inputs, the method generates diverse yet physically plausible microstates, with built-in validation of physical consistency to ensure reliability. Experimental results demonstrate that the model substantially reduces computational overhead while maintaining strong generalization performance and adherence to physical principles.
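As a rough illustration of what such a surrogate could look like, here is a minimal PyTorch sketch that treats each atom as a token and predicts per-atom displacements toward the post-transition state. The paper's actual architecture is not given here, so every layer size, name, and the displacement-head design below are assumptions for illustration only.

```python
# Minimal sketch of a transformer surrogate for atomistic transitions.
# NOT the paper's actual model: the per-atom tokenization, layer sizes,
# and the displacement head are illustrative assumptions.
import torch
import torch.nn as nn

class TransitionSurrogate(nn.Module):
    def __init__(self, d_model=128, nhead=8, num_layers=4):
        super().__init__()
        self.embed = nn.Linear(3, d_model)   # lift each atom's (x, y, z) to a token
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)
        self.head = nn.Linear(d_model, 3)    # per-atom displacement to the final state

    def forward(self, coords):               # coords: (batch, n_atoms, 3)
        tokens = self.encoder(self.embed(coords))
        return coords + self.head(tokens)    # predicted post-transition coordinates

model = TransitionSurrogate()
initial = torch.randn(2, 38, 3)              # e.g. two 38-atom nano-cluster states
predicted_final = model(initial)             # same shape: (2, 38, 3)
```

Attention over all atom pairs is what would let such a model capture the collective rearrangements governing a transition, which is the emergent behavior the summary refers to.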

📝 Abstract
Accurate knowledge of the atomistic transition pathways in materials and material surfaces is crucial for many materials science problems. However, conventional simulation techniques used to find these transitions are extremely computationally intensive. Even with large-scale, accelerated material simulations, the computational cost constrains the applicable domain in practice. Machine learning models, with the potential to learn the complex emergent behaviors governing atomistic transitions as fast surrogate models, hold great promise to predict transitions at a vastly reduced computational cost. Here, we demonstrate how transformers can be trained to predict atomistic transitions in nano-clusters. We show how we evaluate the physical validity of the predictions and how a multitude of additional, distinct microstates can be generated by slightly varying the data provided to the model.
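The last two sentences of the abstract describe two concrete steps: screening predictions for physical validity and generating extra microstates by slightly varying the input. Below is a minimal sketch of both ideas, assuming a simple interatomic-distance check and Gaussian input noise; the cutoff, noise scale, and function names are illustrative assumptions, not details from the paper.

```python
# Hedged sketch: (1) reject predictions with unphysical atomic overlaps,
# (2) generate additional microstates by slightly perturbing the input.
# The 1.8 Å cutoff and 0.05 Å noise scale are placeholder assumptions.
import torch

def physically_valid(coords, min_dist=1.8):
    """coords: (n_atoms, 3). True if no two atoms are closer than min_dist."""
    d = torch.cdist(coords, coords)              # pairwise interatomic distances
    d = d + 1e6 * torch.eye(coords.shape[0])     # mask self-distances on the diagonal
    return bool((d > min_dist).all())

def sample_microstates(model, coords, n_samples=10, noise=0.05):
    """coords: (1, n_atoms, 3). Slightly varied inputs -> diverse final states."""
    states = []
    for _ in range(n_samples):
        perturbed = coords + noise * torch.randn_like(coords)
        pred = model(perturbed)
        if physically_valid(pred[0]):            # keep only plausible predictions
            states.append(pred)
    return states
```

Here `model` would be any trained surrogate, such as the sketch above. In practice a stronger validity test (e.g. comparing predicted energies against the underlying interatomic potential) would likely be preferable; the distance check is just the simplest example.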
Problem

Research questions and friction points this paper is trying to address.

atomistic transitions
transition pathways
computational cost
materials science
nano-clusters
Innovation

Methods, ideas, or system contributions that make the work stand out.

Transformers
atomistic transitions
machine learning
surrogate modeling
nano-clusters
Henry Tischler
Computing and Artificial Intelligence Division, Los Alamos National Laboratory, Los Alamos, NM 87545; School of Engineering and Computer Science, University of Denver, Denver, CO 80210; Department of Physics and Astronomy, University of Denver, Denver, CO 80210
Wenting Li
Department of Electrical and Computer Engineering, University of Texas at Austin, Austin, TX 78712
Qi Tang
Computational Science and Engineering, Georgia Institute of Technology
High Performance Computing · Applied Mathematics · Plasma Physics · Scientific Machine Learning
Danny Perez
Los Alamos National Laboratory
Computational Materials Physics
Thomas Vogel
Humboldt-Universität zu Berlin, Germany
Software Engineering · Software Testing · Software Verification · SBSE · Self-Adaptive Systems