🤖 AI Summary
Modeling biological, chemical, and physical systems requires simultaneous SE(3) equivariance and efficient global geometric contextualization—a persistent challenge: existing equivariant self-attention scales quadratically (O(N²)), while local message passing fails to capture long-range dependencies. This work introduces, for the first time, long-range convolution into equivariant learning, proposing an SE(3)-equivariant long convolutional architecture grounded in state-space models. It integrates equivariant feature mappings with geometry-aware kernel design to achieve global context modeling at subquadratic complexity O(N log N). The method establishes new state-of-the-art performance on RNA all-atom property prediction and protein molecular dynamics tasks. At 30k atoms, it runs 20× faster than equivariant Transformers and supports sequences 72× longer under identical computational resources.
📝 Abstract
Processing global geometric context while preserving equivariance is crucial when modeling biological, chemical, and physical systems. Yet, this is challenging due to the computational demands of equivariance and global context at scale. Standard methods such as equivariant self-attention suffer from quadratic complexity, while local methods such as distance-based message passing sacrifice global information. Inspired by the recent success of state-space and long-convolutional models, we introduce Geometric Hyena, the first equivariant long-convolutional model for geometric systems. Geometric Hyena captures global geometric context at sub-quadratic complexity while maintaining equivariance to rotations and translations. Evaluated on all-atom property prediction of large RNA molecules and full protein molecular dynamics, Geometric Hyena outperforms existing equivariant models while requiring significantly less memory and compute than equivariant self-attention. Notably, our model processes the geometric context of 30k tokens 20× faster than the equivariant transformer and allows 72× longer context within the same budget.
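The paper's actual equivariant kernel design is not reproduced here, but the source of the O(N log N) complexity claim is the standard FFT convolution theorem used by long-convolutional models such as Hyena: a global (sequence-length) convolution computed in the frequency domain instead of by an O(N²) pairwise sum. A minimal 1-D sketch (function name and shapes are illustrative, not the authors' API):

```python
import numpy as np

def fft_long_conv(x: np.ndarray, k: np.ndarray) -> np.ndarray:
    """Global causal convolution of signal x with a kernel k as long as
    the sequence itself, in O(N log N) via the FFT convolution theorem."""
    n = x.shape[0]
    L = 2 * n  # zero-pad to avoid circular wrap-around
    y = np.fft.irfft(np.fft.rfft(x, L) * np.fft.rfft(k, L), L)
    return y[:n]  # keep the causal part, one output per input position

# Sanity check against the direct O(N^2) convolution
rng = np.random.default_rng(0)
x, k = rng.standard_normal(256), rng.standard_normal(256)
assert np.allclose(fft_long_conv(x, k), np.convolve(x, k)[:256])
```

In an equivariant setting, such a convolution would be applied to invariant (or suitably mapped equivariant) features so that the global mixing does not break SE(3) symmetry; that mapping is the paper's contribution and is not sketched here.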