🤖 AI Summary
Existing topological deep learning methods on combinatorial complexes rely on local attention mechanisms, which incur quadratic computational complexity and dimensional constraints that hinder efficient modeling of higher-order relationships. This work introduces the Mamba architecture to combinatorial complex learning for the first time, reframing message passing as a rank-aware selective state-space modeling problem. This enables adaptive, directional, and long-range information propagation in linear time. The proposed framework unifies the treatment of graphs, hypergraphs, and simplicial complexes, significantly outperforming current approaches across multiple benchmarks while offering superior scalability, depth robustness, and computational efficiency. Theoretical analysis further establishes that its expressive power is upper-bounded by the 1-Weisfeiler-Lehman test.
📝 Abstract
Topological deep learning has emerged for modeling higher-order relational structures beyond the pairwise interactions that standard graph neural networks capture. Although combinatorial complexes offer a unified topological framework, most existing topological deep learning methods rely on local message passing via attention mechanisms, which incurs quadratic complexity and imposes dimensional constraints, limiting scalability and rank-aware information aggregation in higher-order complexes. We propose Combinatorial Complex Mamba (CCMamba), the first unified Mamba-based neural framework for learning on combinatorial complexes. CCMamba reformulates message passing as a selective state-space modeling problem by organizing multi-rank incidence relations into structured sequences processed by rank-aware state-space models. This enables adaptive, directional, and long-range information propagation in linear time without self-attention. We further prove that the expressive power of CCMamba's message passing is upper-bounded by the 1-Weisfeiler-Lehman test. Experiments on graph, hypergraph, and simplicial benchmarks demonstrate that CCMamba consistently outperforms existing methods while exhibiting improved scalability and robustness to depth.
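To make the core idea concrete, here is a minimal sketch of the mechanism the abstract describes: cells of a complex are serialized into a rank-ordered sequence and scanned by a selective (input-dependent) state-space recurrence in linear time. This is an illustrative toy, not the paper's implementation; the cell ordering, parameter shapes, and the names `W_delta`, `W_B`, `W_C` are all assumptions for exposition.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: cells of a combinatorial complex, each with a feature
# vector and a rank (0 = vertices, 1 = edges, 2 = faces, ...).
d_model, d_state = 8, 4
cells = [(rank, rng.standard_normal(d_model)) for rank in [0, 0, 1, 0, 1, 2]]

# One possible rank-aware serialization: order cells by ascending rank so
# lower-rank cells are scanned before the higher-rank cells they bound.
cells.sort(key=lambda c: c[0])

# Selective (Mamba-style) SSM parameters: the step size delta is predicted
# from each token, so how much state is carried forward adapts to content.
A = -np.abs(rng.standard_normal((d_model, d_state)))  # stable state matrix
W_delta = rng.standard_normal(d_model)
W_B = rng.standard_normal((d_model, d_state))
W_C = rng.standard_normal((d_model, d_state))

h = np.zeros((d_model, d_state))  # hidden state carried along the sequence
outputs = []
for rank, x in cells:
    delta = np.log1p(np.exp(x @ W_delta))      # softplus -> positive step
    A_bar = np.exp(delta * A)                  # discretized transition
    B = x @ W_B                                # input-dependent projection
    C = x @ W_C                                # input-dependent readout
    h = A_bar * h + delta * np.outer(x, B)     # selective state update
    outputs.append((h * C).sum(axis=1))        # per-cell output, O(1) each

print(len(outputs), outputs[0].shape)
```

The scan touches each cell once with constant work per step, which is the linear-time contrast to quadratic all-pairs attention highlighted in the abstract; directionality comes from the ordering of the sequence itself.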