E(n) Equivariant Topological Neural Networks

📅 2024-05-24
🏛️ arXiv.org
📈 Citations: 3
Influential: 0
🤖 AI Summary
This work addresses the challenge of modeling higher-order, heterogeneous interactions in geometric topological data while preserving E(n)-equivariance, i.e., ensuring that outputs transform consistently under rotations, reflections, and translations of the input. The authors propose E(n)-Equivariant Topological Neural Networks (ETNNs), a framework that embeds E(n)-equivariant message passing into combinatorial topological complexes (e.g., simplicial and cell complexes). This enables native equivariant modeling of geometric features such as positions and velocities and supports arbitrary topological structures. The authors provide a theoretical analysis showing that the model's expressive power exceeds that of existing equivariant graph neural networks. Empirically, ETNNs achieve state-of-the-art or competitive performance on QM9 molecular property prediction and on multi-resolution urban air-pollution regression, while significantly reducing computational overhead compared to prior equivariant topological deep learning methods. The core contribution is a principled integration of E(n)-equivariance with topological deep learning, enabling rigorous geometric reasoning over complex topological domains.
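To make the mechanism concrete, below is a minimal sketch of an E(n)-equivariant message-passing layer in the style of equivariant graph neural networks, the kind of update ETNNs lift from graphs to higher-order cells. All names here (EquivariantLayer, phi_m, phi_h, phi_x) are illustrative assumptions, not the authors' implementation: messages are built from invariant quantities (features and squared distances), and positions are updated along relative direction vectors scaled by invariant weights.

```python
# Minimal sketch of an E(n)-equivariant message-passing layer (EGNN-style).
# Illustration only; names and design choices are assumptions, not the ETNN code.
import torch
import torch.nn as nn

class EquivariantLayer(nn.Module):
    def __init__(self, dim):
        super().__init__()
        # Message network sees only invariant inputs: sender/receiver features + squared distance.
        self.phi_m = nn.Sequential(nn.Linear(2 * dim + 1, dim), nn.SiLU())
        self.phi_h = nn.Sequential(nn.Linear(2 * dim, dim), nn.SiLU())
        self.phi_x = nn.Linear(dim, 1, bias=False)  # invariant scalar weight per message

    def forward(self, h, x, edges):
        src, dst = edges                              # (E,) index tensors
        rel = x[src] - x[dst]                         # relative positions (rotate with the input)
        d2 = (rel ** 2).sum(-1, keepdim=True)         # squared distance (invariant)
        m = self.phi_m(torch.cat([h[src], h[dst], d2], dim=-1))
        # Invariant feature update: sum incoming messages at each receiver.
        agg = torch.zeros_like(h).index_add_(0, dst, m)
        h_new = self.phi_h(torch.cat([h, agg], dim=-1))
        # Equivariant position update: move along relative directions,
        # weighted by an invariant scalar, so rigid motions commute with the layer.
        x_new = x + torch.zeros_like(x).index_add_(0, dst, rel * self.phi_x(m))
        return h_new, x_new
```

With this construction, rotating, reflecting, or translating x before the forward pass leaves h_new unchanged and applies the same rigid motion to x_new, which is the equivariance property described above.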

📝 Abstract
Graph neural networks excel at modeling pairwise interactions, but they cannot flexibly accommodate higher-order interactions and features. Topological deep learning (TDL) has emerged recently as a promising tool for addressing this issue. TDL enables the principled modeling of arbitrary multi-way, hierarchical higher-order interactions by operating on combinatorial topological spaces, such as simplicial or cell complexes, instead of graphs. However, little is known about how to leverage geometric features such as positions and velocities for TDL. This paper introduces E(n)-Equivariant Topological Neural Networks (ETNNs), which are E(n)-equivariant message-passing networks operating on combinatorial complexes, formal objects unifying graphs, hypergraphs, simplicial, path, and cell complexes. ETNNs incorporate geometric node features while respecting rotation, reflection, and translation equivariance. Moreover, being TDL models, ETNNs are natively ready for settings with heterogeneous interactions. We provide a theoretical analysis to show the improved expressiveness of ETNNs over architectures for geometric graphs. We also show how E(n)-equivariant variants of TDL models can be directly derived from our framework. The broad applicability of ETNNs is demonstrated through two tasks of vastly different scales: i) molecular property prediction on the QM9 benchmark and ii) land-use regression for hyper-local estimation of air pollution with multi-resolution irregular geospatial data. The results indicate that ETNNs are an effective tool for learning from diverse types of richly structured data, as they match or surpass SotA equivariant TDL models with a significantly smaller computational burden, thus highlighting the benefits of a principled geometric inductive bias. Our implementation of ETNNs can be found at https://github.com/NSAPH-Projects/topological-equivariant-networks.
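As a companion illustration of the abstract's central object, the sketch below encodes a combinatorial complex as cells of arbitrary rank (nodes, edges or hyperedges, faces, and so on) together with rank-to-rank incidence neighborhoods over which messages would be exchanged. The class and method names are assumptions for illustration only, not the interface of the linked repository.

```python
# Illustration-only sketch of a combinatorial complex: cells of arbitrary rank
# plus incidence neighborhoods between ranks. Names are assumptions, not the repo's API.
from dataclasses import dataclass, field

@dataclass
class CombinatorialComplex:
    # cells[rank] maps a cell id to the frozenset of node ids the cell contains
    cells: dict[int, dict[int, frozenset[int]]] = field(default_factory=dict)

    def add_cell(self, rank: int, nodes: frozenset[int]) -> int:
        cell_id = len(self.cells.setdefault(rank, {}))
        self.cells[rank][cell_id] = nodes
        return cell_id

    def incidence(self, low: int, high: int) -> list[tuple[int, int]]:
        """Pairs (i, j) such that rank-`low` cell i is contained in rank-`high` cell j."""
        return [
            (i, j)
            for i, ni in self.cells.get(low, {}).items()
            for j, nj in self.cells.get(high, {}).items()
            if ni <= nj
        ]

# Example: a triangle with one rank-2 cell (a ring/face) on top of its nodes and edges.
cc = CombinatorialComplex()
for v in range(3):
    cc.add_cell(0, frozenset({v}))       # rank-0 cells: nodes
for e in [{0, 1}, {1, 2}, {0, 2}]:
    cc.add_cell(1, frozenset(e))         # rank-1 cells: edges
cc.add_cell(2, frozenset({0, 1, 2}))     # rank-2 cell: ring
print(cc.incidence(1, 2))                # every edge is contained in the ring
```

In an ETNN-style model, each such neighborhood (for example, edges contained in a face) would define one message-passing relation, with invariant and geometric features attached to the cells of every rank.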
Problem

Research questions and friction points this paper is trying to address.

Model higher-order interactions in neural networks
Incorporate geometric features in topological deep learning
Develop E(n)-equivariant topological neural networks
Innovation

Methods, ideas, or system contributions that make the work stand out.

E(n)-equivariant message-passing networks
Operate on combinatorial complexes, which unify graphs, hypergraphs, simplicial, path, and cell complexes
Incorporate geometric node features while respecting rotation, reflection, and translation equivariance