🤖 AI Summary
This work addresses the high computational cost of conventional atomistic transition path sampling, which has hindered its practical application. For the first time, the Transformer architecture is introduced into this domain to construct an efficient surrogate model that learns the complex emergent behaviors governing atomistic transitions from simulation data, enabling rapid prediction of atomic transition pathways in nanoclusters. By slightly varying the model's inputs, the method can generate a multitude of diverse yet physically plausible microstates, and the predictions are validated for physical consistency to ensure reliability. Experiments demonstrate that the model substantially reduces computational cost while maintaining strong generalization and adherence to physical principles.
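As a rough illustration of what such a surrogate could look like, the sketch below treats the atoms of a cluster state as a sequence and maps them through a transformer encoder to predicted displaced positions. This is not the paper's actual architecture: the class name `PathwaySurrogate`, all layer sizes, the (x, y, z)-per-atom input representation, and the displacement head are assumptions made for the example.

```python
# Minimal sketch of a transformer surrogate for cluster states.
# Not the paper's model: layer sizes, the (x, y, z)-per-atom input
# representation, and the displacement head are illustrative assumptions.
import torch
import torch.nn as nn

class PathwaySurrogate(nn.Module):
    def __init__(self, d_model=64, nhead=4, num_layers=3):
        super().__init__()
        self.embed = nn.Linear(3, d_model)   # lift each atom's (x, y, z)
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)
        self.head = nn.Linear(d_model, 3)    # per-atom displacement

    def forward(self, coords):               # coords: (batch, n_atoms, 3)
        h = self.encoder(self.embed(coords))
        return coords + self.head(h)         # predicted transition state

model = PathwaySurrogate()
initial = torch.randn(2, 13, 3)              # e.g. two 13-atom clusters
predicted = model(initial)                   # same shape as the input
```

One reason a transformer is a natural fit here: without positional encodings, attention over the unordered set of atoms is permutation-equivariant, so the prediction does not depend on an arbitrary atom ordering.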
Abstract
Accurate knowledge of atomistic transition pathways in materials and on material surfaces is crucial for many materials science problems. However, the conventional simulation techniques used to find these transitions are extremely computationally intensive. Even with large-scale, accelerated materials simulations, the computational cost constrains the applicable domain in practice. Machine learning models, which can learn the complex emergent behaviors governing atomistic transitions and act as fast surrogates, hold great promise for predicting transitions at a vastly reduced computational cost. Here, we demonstrate how transformers can be trained to predict atomistic transitions in nanoclusters. We show how we evaluate the physical validity of the predictions and how a multitude of additional, distinct microstates can be generated by slightly varying the data provided to the model.
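The last sentence suggests a simple recipe one could prototype as follows: perturb the input state with small noise, run the surrogate, and keep only outputs that pass a physical-consistency screen. Everything here is an assumption for illustration, not the paper's actual procedure: the noise scale `sigma`, the threshold `min_dist`, and the use of a minimum interatomic distance as the validity proxy.

```python
# Hedged sketch of input variation plus a physical-consistency filter.
# `model` is any surrogate like the one sketched above; sigma and
# min_dist are illustrative values, and a real check would involve
# more than pair distances (e.g. energies after relaxation).
import torch

def min_pair_distance(coords):
    """Smallest interatomic distance in a (batch, n_atoms, 3) tensor."""
    d = torch.cdist(coords, coords)
    d = d + 1e6 * torch.eye(coords.shape[1])  # mask out self-distances
    return d.min()

def sample_microstates(model, coords, n=8, sigma=0.02, min_dist=0.5):
    """Perturb the input slightly and keep physically plausible outputs."""
    states = []
    for _ in range(n):
        noisy = coords + sigma * torch.randn_like(coords)
        with torch.no_grad():
            pred = model(noisy)
        if min_pair_distance(pred) > min_dist:  # crude validity screen
            states.append(pred)
    return states
```

The pair-distance screen is only the cheapest sanity check (it rejects overlapping atoms); a production pipeline would presumably apply stricter physics-based validation to each accepted microstate.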