🤖 AI Summary
To address the high computational cost of modeling long-range interactions in machine learning force fields (MLFFs), this work introduces Euclidean fast attention (EFA), a linear-complexity, attention-like mechanism for data in Euclidean space. The method integrates two key innovations: (1) Euclidean rotary positional encodings (ERoPE), which encode spatial information while rigorously preserving translational, rotational, and reflectional symmetry; and (2) symmetry-constrained embeddings coupled with a linear-scaling self-attention approximation, together enabling equivariant global representation learning for 3D atomic systems. The approach significantly improves prediction accuracy for critical chemical interactions, including van der Waals forces, charge transfer, and long-range polarization, in settings where conventional MLFFs yield incorrect results. With an inference complexity of O(N), it achieves both physical consistency and computational scalability, marking a step toward efficient, physically principled MLFFs for large-scale molecular simulation.
📝 Abstract
Long-range correlations are essential across numerous machine learning tasks, especially for data embedded in Euclidean space, where the relative positions and orientations of distant components are often critical for accurate predictions. Self-attention offers a compelling mechanism for capturing these global effects, but its quadratic complexity presents a significant practical limitation. This problem is particularly pronounced in computational chemistry, where the stringent efficiency requirements of machine learning force fields (MLFFs) often preclude accurately modeling long-range interactions. To address this, we introduce Euclidean fast attention (EFA), a linear-scaling attention-like mechanism designed for Euclidean data, which can be easily incorporated into existing model architectures. A core component of EFA is a set of novel Euclidean rotary positional encodings (ERoPE), which enable efficient encoding of spatial information while respecting essential physical symmetries. We empirically demonstrate that EFA effectively captures diverse long-range effects, enabling EFA-equipped MLFFs to describe challenging chemical interactions for which conventional MLFFs yield incorrect results.
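To make the two ingredients concrete, the sketch below illustrates (a) the standard rotary-encoding trick in one dimension, where query/key feature pairs are rotated by angles proportional to position so their dot product depends only on *relative* position, and (b) kernelized linear attention, which replaces the O(N²) attention matrix with an O(N·d²) accumulation. This is an illustrative 1D toy with a generic `elu+1` feature map, not the paper's actual ERoPE or EFA formulation; all function names here are hypothetical.

```python
import numpy as np

def rope_1d(x, pos, base=10000.0):
    # Hypothetical 1D rotary encoding: rotate consecutive feature pairs by
    # angles proportional to position. Because rotations compose, the dot
    # product of two encoded vectors depends only on their position difference.
    d = x.shape[-1]
    half = d // 2  # assumes even feature dimension
    freqs = base ** (-np.arange(half) / half)          # per-pair frequencies
    ang = pos[:, None] * freqs[None, :]                # (N, half) angles
    cos, sin = np.cos(ang), np.sin(ang)
    x1, x2 = x[:, :half], x[:, half:]
    return np.concatenate([x1 * cos - x2 * sin,
                           x1 * sin + x2 * cos], axis=-1)

def feature_map(x):
    # Positive kernel feature map (elu(x) + 1), a common softmax surrogate
    return np.where(x > 0, x + 1.0, np.exp(x))

def linear_attention(q, k, v):
    # O(N d^2): accumulate sum_j phi(k_j) v_j^T once, then reuse per query,
    # instead of forming the (N, N) attention matrix.
    phi_q, phi_k = feature_map(q), feature_map(k)
    kv = phi_k.T @ v                # (d, d_v) summary of all keys/values
    z = phi_k.sum(axis=0)          # (d,) normalizer accumulator
    return (phi_q @ kv) / (phi_q @ z)[:, None]

def quadratic_attention(q, k, v):
    # O(N^2) reference computation with the same kernel
    w = feature_map(q) @ feature_map(k).T
    w = w / w.sum(axis=1, keepdims=True)
    return w @ v
```

The two attention routines are algebraically identical (matrix multiplication is associative), so the linear-time version loses nothing relative to its quadratic counterpart for this kernel; the paper's contribution lies in extending rotary-style encodings to 3D Euclidean symmetries while keeping this linear-scaling structure.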