Heat Kernel Goes Topological

📅 2025-07-16
📈 Citations: 0
Influential: 0
🤖 AI Summary
Higher-order topological neural networks suffer from excessive computational overhead and struggle to balance expressive power with efficiency. Method: this paper introduces a novel framework based on combinatorial complexes: it defines a Laplacian operator on combinatorial complexes for the first time and leverages it to compute heat kernels efficiently as node descriptors. The method enables multi-scale topological awareness and permutation-equivariant representations, and integrates seamlessly into Transformer architectures. Contributions/Results: theoretically, the method is maximally expressive, distinguishing any pair of non-isomorphic combinatorial complexes and thereby overcoming the discrimination limits of conventional topological methods. Empirically, it matches state-of-the-art handcrafted descriptors on molecular datasets, significantly outperforms existing topological learning approaches in computational efficiency, and scales well.
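The basic pipeline (Laplacian, heat kernel, multi-scale node descriptors) can be sketched in a few lines. This is a minimal illustration only: the paper defines its Laplacian on combinatorial complexes, whereas the sketch below uses the ordinary graph Laplacian as a stand-in, and the function name, scales, and toy graph are all assumptions, not the paper's implementation.

```python
import numpy as np

def heat_kernel_descriptors(A, ts):
    """Multi-scale heat-kernel-signature-style descriptors for each node.

    A  : symmetric adjacency matrix (ordinary graph Laplacian stands in
         for the paper's combinatorial-complex Laplacian)
    ts : array of diffusion times (scales)
    """
    D = np.diag(A.sum(axis=1))        # degree matrix
    L = D - A                         # combinatorial graph Laplacian
    lam, phi = np.linalg.eigh(L)      # eigendecomposition (L is symmetric)
    # Descriptor of node i at scale t_k: sum_j exp(-t_k * lam_j) * phi[i, j]^2,
    # i.e. the i-th diagonal entry of the heat kernel exp(-t_k * L).
    return (phi ** 2) @ np.exp(-np.outer(lam, ts))

# Toy example: a 4-cycle graph, descriptors at three scales
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
hks = heat_kernel_descriptors(A, ts=np.array([0.1, 1.0, 10.0]))
print(hks.shape)  # (4, 3): one 3-scale descriptor per node
```

Because the heat kernel is computed once from an eigendecomposition, descriptors at any number of scales come almost for free, which is one way a heat-kernel approach can sidestep repeated higher-order message passing.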

📝 Abstract
Topological neural networks have emerged as powerful successors of graph neural networks. However, they typically involve higher-order message passing, which incurs significant computational expense. We circumvent this issue with a novel topological framework that introduces a Laplacian operator on combinatorial complexes (CCs), enabling efficient computation of heat kernels that serve as node descriptors. Our approach captures multiscale information and enables permutation-equivariant representations, allowing easy integration into modern transformer-based architectures. Theoretically, the proposed method is maximally expressive because it can distinguish arbitrary non-isomorphic CCs. Empirically, it significantly outperforms existing topological methods in terms of computational efficiency. Besides demonstrating competitive performance with the state-of-the-art descriptors on standard molecular datasets, it exhibits superior capability in distinguishing complex topological structures and avoiding blind spots on topological benchmarks. Overall, this work advances topological deep learning by providing expressive yet scalable representations, thereby opening up exciting avenues for molecular classification and property prediction tasks.
Problem

Research questions and friction points this paper is trying to address.

Efficient computation of heat kernels for topological neural networks
Capturing multiscale information with permutation-equivariant representations
Distinguishing complex topological structures with high computational efficiency
Innovation

Methods, ideas, or system contributions that make the work stand out.

Laplacian operator on combinatorial complexes
Efficient heat kernel computation
Permutation-equivariant transformer integration
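The permutation-equivariance claim in the last bullet can be checked numerically: for a permutation matrix P, the heat kernel of the relabeled operator satisfies exp(-t P L Pᵀ) = P exp(-t L) Pᵀ, so descriptors built from its diagonal simply permute with the nodes. A self-contained sketch, again using the ordinary graph Laplacian as a stand-in for the paper's combinatorial-complex Laplacian:

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)

# Random undirected graph on 6 nodes
A = (rng.random((6, 6)) < 0.5).astype(float)
A = np.triu(A, 1)
A = A + A.T
L = np.diag(A.sum(axis=1)) - A

# Node descriptors: diagonal of the heat kernel at one scale t
t = 0.5
desc = np.diag(expm(-t * L))

# Relabel nodes with a random permutation matrix P
perm = rng.permutation(6)
P = np.eye(6)[perm]
L_perm = P @ L @ P.T
desc_perm = np.diag(expm(-t * L_perm))

# Equivariance: descriptors of the relabeled graph are the permuted descriptors
print(np.allclose(desc_perm, desc[perm]))  # True
```

This is exactly the property that lets such descriptors be fed to a Transformer without imposing an arbitrary node ordering.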