🤖 AI Summary
To address the trade-off among accuracy, efficiency, and scalability in machine learning force fields (MLFFs), this work proposes AlphaNet, an SE(3)-equivariant atomic force field based on learnable local coordinate frames. Methodologically, it introduces learnable local frame transitions that efficiently encode atomic geometric environments, integrated with multi-scale environmental representations in a strictly SE(3)-equivariant neural network architecture, ensuring both physical consistency and computational efficiency. Compared with NequIP and DeepPot, AlphaNet achieves significantly higher energy and force prediction accuracy across diverse benchmark tasks, including defected graphene, formate decomposition, zeolites, and surface reactions, while offering a superior accuracy-efficiency trade-off and stronger cross-scale generalization. The framework thus establishes a new paradigm for large-scale molecular dynamics simulations: highly accurate, strictly equivariant, and strongly scalable.
📝 Abstract
We present AlphaNet, a local-frame-based equivariant model designed to deliver both accurate and efficient simulations of atomistic systems. Machine learning force fields (MLFFs) have recently gained prominence in molecular dynamics simulations due to their favorable efficiency-accuracy balance compared with classical force fields and quantum mechanical calculations, alongside their transferability across diverse systems. Despite advances in model accuracy, the efficiency and scalability of MLFFs remain significant obstacles in practical applications. AlphaNet improves both computational efficiency and accuracy by exploiting the local geometric structure of atomic environments through the construction of equivariant local frames and learnable frame transitions. We demonstrate the efficacy of AlphaNet on diverse datasets, including defected graphene, formate decomposition, zeolites, and surface reactions. AlphaNet consistently surpasses well-established models such as NequIP and DeepPot in both energy and force prediction accuracy. Notably, AlphaNet offers one of the best trade-offs between computational efficiency and accuracy among existing models. Moreover, AlphaNet scales across a broad range of system and dataset sizes, affirming its versatility.
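AlphaNet's actual architecture (with its learnable frame transitions) is more involved, but the core idea of an equivariant local frame can be illustrated in a few lines. The sketch below, which is an assumption-laden toy example and not the paper's implementation, builds an orthonormal frame from a central atom's neighbor vectors via Gram-Schmidt; projecting neighbor positions into this frame yields features that are invariant under global rotations, which is what makes frame-based models equivariant once predictions are rotated back out of the frame.

```python
import numpy as np

def local_frame(rel_pos):
    """Build an orthonormal local frame from relative neighbor positions.

    rel_pos: (N, 3) array of neighbor positions relative to the central
    atom. Returns a 3x3 matrix whose rows co-rotate with the input point
    cloud, i.e. an equivariant frame (toy construction, not AlphaNet's).
    """
    # Two equivariant vectors: the mean neighbor direction and a
    # distance-weighted direction (the weights are rotation-invariant).
    w = np.linalg.norm(rel_pos, axis=1)
    v1 = rel_pos.mean(axis=0)
    v2 = (w[:, None] * rel_pos).sum(axis=0)
    # Gram-Schmidt orthonormalization of v1, v2; cross product closes
    # the right-handed triad.
    e1 = v1 / np.linalg.norm(v1)
    u2 = v2 - (v2 @ e1) * e1
    e2 = u2 / np.linalg.norm(u2)
    e3 = np.cross(e1, e2)
    return np.stack([e1, e2, e3])

# Check: features expressed in the local frame are rotation-invariant.
rng = np.random.default_rng(0)
x = rng.normal(size=(5, 3))                  # toy neighborhood
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))  # random orthogonal matrix
if np.linalg.det(Q) < 0:
    Q[:, 0] *= -1                             # force a proper rotation
feat = x @ local_frame(x).T                   # positions in the frame
feat_rot = (x @ Q.T) @ local_frame(x @ Q.T).T # same, after rotating input
assert np.allclose(feat, feat_rot, atol=1e-8)
```

Because the frame itself rotates with the atomic environment, scalar features computed in it are invariant, and any vector output (such as forces) mapped back through the frame transforms correctly under SE(3).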