MARA: Continuous SE(3)-Equivariant Attention for Molecular Force Fields

📅 2026-02-02
🤖 AI Summary
This work addresses the limitations of existing machine learning force fields, which rely on fixed angular expansions and struggle to flexibly capture local geometric interactions. The authors propose the Modular Angular-Radial Attention (MARA) mechanism—the first to incorporate continuous spherical attention into molecular modeling—by jointly encoding the angular and radial coordinates of neighboring atoms under SE(3) symmetry, enabling geometry-aware adaptive weighting. MARA is designed as a plug-and-play module that integrates seamlessly into SE(3)-equivariant architectures such as MACE without requiring modifications to the backbone network. Experiments demonstrate that MARA significantly improves prediction accuracy for both energy and atomic forces across multiple molecular benchmarks, effectively reduces high-error outliers, and enhances model robustness and generalization.

📝 Abstract
Machine learning force fields (MLFFs) have become essential for accurate and efficient atomistic modeling. Despite their high accuracy, most existing approaches rely on fixed angular expansions, limiting flexibility in weighting local geometric interactions. We introduce Modular Angular-Radial Attention (MARA), a module that extends spherical attention -- originally developed for SO(3) tasks -- to the molecular domain and SE(3), providing an efficient approximation of equivariant interactions. MARA operates directly on the angular and radial coordinates of neighboring atoms, enabling flexible, geometrically informed, and modular weighting of local environments. Unlike existing attention mechanisms in SE(3)-equivariant architectures, MARA can be integrated in a plug-and-play manner into models such as MACE without architectural modifications. Across molecular benchmarks, MARA improves energy and force predictions, reduces high-error events, and enhances robustness. These results demonstrate that continuous spherical attention is an effective and generalizable geometric operator that increases the expressiveness, stability, and reliability of atomistic models.
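To make the abstract's core idea concrete, the following is a minimal toy sketch (not the authors' implementation) of attention that jointly weights neighbors by their angular and radial coordinates. All names (`angular_radial_attention`, `q_dir`, `beta`, `r_scale`) are hypothetical; a real SE(3)-equivariant module such as MARA would expand the angular part in spherical harmonics rather than use a single query direction.

```python
import numpy as np

def angular_radial_attention(rel_pos, feats, q_dir, beta=1.0, r_scale=1.0):
    """Toy sketch of continuous angular-radial attention over one neighborhood.

    rel_pos : (N, 3) displacements from the central atom to its neighbors
    feats   : (N, F) neighbor features
    q_dir   : (3,)  a hypothetical learned query direction
    The logit couples an angular term (alignment with q_dir) with a radial
    decay term; a softmax over neighbors yields adaptive geometric weights.
    """
    r = np.linalg.norm(rel_pos, axis=1)        # radial coordinate per neighbor
    u = rel_pos / r[:, None]                   # unit directions (angular part)
    q = q_dir / np.linalg.norm(q_dir)
    logits = beta * (u @ q) - r / r_scale      # joint angular-radial scoring
    w = np.exp(logits - logits.max())
    w /= w.sum()                               # softmax over the neighborhood
    return w @ feats                           # geometry-weighted aggregation
```

Because the logits depend only on dot products and norms, rotating `rel_pos` and `q_dir` together leaves the output unchanged, which is the kind of symmetry constraint the paper's SE(3)-equivariant setting imposes.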
Problem

Research questions and friction points this paper is trying to address.

machine learning force fields
SE(3)-equivariance
angular expansions
local geometric interactions
molecular modeling
Innovation

Methods, ideas, or system contributions that make the work stand out.

SE(3)-equivariance
spherical attention
modular attention
molecular force fields
geometric deep learning
Francesco Leonardi
Institute of Computer Science, University of Bern, 3012 Bern, Switzerland
Boris Bonev
NVIDIA Research
Scientific Computing · Machine Learning · Numerical Linear Algebra · Computational Physics
Kaspar Riesen
Institute of Computer Science, University of Bern, 3012 Bern, Switzerland