Reciprocal Space Attention for Learning Long-Range Interactions

📅 2025-10-14
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Machine learning interatomic potentials (MLIPs) struggle to efficiently model long-range electrostatic and dispersion interactions. To address this, the authors propose Reciprocal-Space Attention (RSA), an attention mechanism that explicitly models long-range physical effects directly in Fourier (reciprocal) space, without requiring predefined atomic charges or other empirical parameters. RSA maps a linear-scaling attention mechanism into reciprocal space, enabling plug-and-play enhancement of arbitrary local or semi-local MLIPs (e.g., MACE). Benchmarks across diverse systems—dimer binding curves, phosphorene exfoliation energies, and dipole density distributions in liquid water—demonstrate substantial improvements in predictive accuracy. These results support RSA's generalizability for capturing long-range behavior across heterogeneous molecular and materials systems, establishing a foundation for physically grounded, scalable MLIPs.

📝 Abstract
Machine learning interatomic potentials (MLIPs) have revolutionized the modeling of materials and molecules by directly fitting to ab initio data. However, while these models excel at capturing local and semi-local interactions, they often prove insufficient when an explicit and efficient treatment of long-range interactions is required. To address this limitation, we introduce Reciprocal-Space Attention (RSA), a framework designed to capture long-range interactions in the Fourier domain. RSA can be integrated with any existing local or semi-local MLIP framework. The central contribution of this work is the mapping of a linear-scaling attention mechanism into Fourier space, enabling the explicit modeling of long-range interactions such as electrostatics and dispersion without relying on predefined charges or other empirical assumptions. We demonstrate the effectiveness of our method as a long-range correction to the MACE backbone across diverse benchmarks, including dimer binding curves, dispersion-dominated layered phosphorene exfoliation, and the molecular dipole density of bulk water. Our results show that RSA consistently captures long-range physics across a broad range of chemical and materials systems. The code and datasets for this work are available at https://github.com/rfhari/reciprocal_space_attention
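
The central idea described above—aggregating linear-scaling attention through global sums in reciprocal space—can be sketched roughly as follows. This is an illustrative reconstruction, not the authors' implementation: the random projections standing in for learned query/key/value maps, the unnormalized readout, the ReLU feature map, and the truncated k-grid are all assumptions for the sake of a self-contained example.

```python
import numpy as np

def reciprocal_space_linear_attention(positions, features, cell, n_k=2, d=8, seed=0):
    """Hypothetical sketch of linear attention aggregated in reciprocal space.

    positions: (n_atoms, 3) Cartesian coordinates
    features:  (n_atoms, f_dim) per-atom features from a local MLIP backbone
    cell:      (3, 3) periodic cell vectors (rows)
    """
    rng = np.random.default_rng(seed)
    n_atoms, f_dim = features.shape
    # Random projections stand in for learned query/key/value maps (assumption).
    Wq = rng.normal(size=(f_dim, d))
    Wk = rng.normal(size=(f_dim, d))
    Wv = rng.normal(size=(f_dim, d))
    # Non-negative feature maps, as in kernelized (linear) attention.
    q = np.maximum(features @ Wq, 0.0)
    k = np.maximum(features @ Wk, 0.0)
    v = features @ Wv
    # Reciprocal lattice vectors of the periodic cell, on a small truncated grid.
    recip = 2.0 * np.pi * np.linalg.inv(cell).T
    ints = np.array([(a, b, c)
                     for a in range(-n_k, n_k + 1)
                     for b in range(-n_k, n_k + 1)
                     for c in range(-n_k, n_k + 1)
                     if (a, b, c) != (0, 0, 0)])
    kvecs = ints @ recip                                  # (n_kpts, 3)
    phase = np.exp(1j * positions @ kvecs.T)              # (n_atoms, n_kpts)
    # Global key/value "structure factors": one sum over atoms per k-vector,
    # so the cost stays linear in the number of atoms.
    S_kv = np.einsum('ak,ad,ae->kde', phase, k, v)        # sum_j e^{ik.rj} k_j v_j^T
    # Each atom reads the global summary back with its conjugate phase, giving
    # a long-range update that depends only on relative positions r_j - r_i.
    out = np.einsum('ak,kde,ad->ae', np.conj(phase), S_kv, q).real
    return out / len(kvecs)
```

Because every atom's readout pairs its conjugate phase with the globally summed structure factors, the update is exactly translation invariant while still coupling all atom pairs, which is the property that lets such a term act as a plug-in long-range correction to a local backbone.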
Problem

Research questions and friction points this paper is trying to address.

Capturing long-range interactions in machine learning interatomic potentials
Modeling electrostatics and dispersion without empirical assumptions
Improving accuracy across chemical and materials systems
Innovation

Methods, ideas, or system contributions that make the work stand out.

Reciprocal-Space Attention captures long-range Fourier interactions
Linear-scaling attention mapped into Fourier space enables explicit modeling of electrostatics and dispersion
Integrates with existing MLIP frameworks as long-range correction
Hariharan Ramasubramanian
Department of Mechanical Engineering, Carnegie Mellon University, Pittsburgh, Pennsylvania 15213
Alvaro Vazquez-Mayagoitia
Computational Science Division, Argonne National Laboratory, Lemont, IL 60439
Ganesh Sivaraman
Senior Research Scientist, Pindrop
Atul C. Thakur
Aiiso Yufeng Li Family Department of Chemical and Nano Engineering, University of California, San Diego, CA 92093