RTD-Lite: Scalable Topological Analysis for Comparing Weighted Graphs in Learning Tasks

📅 2025-03-14
📈 Citations: 0
✨ Influential: 0
📄 PDF
🤖 AI Summary
Existing topological alignment methods for large-scale weighted graphs suffer from high computational complexity and are incompatible with end-to-end learning. Method: We propose the first scalable topological distance algorithm with O(n²) time and space complexity. Our approach avoids persistent homology computation by constructing an auxiliary graph, extracting its minimum spanning tree, hierarchically tracking connected components, and integrating multi-scale differences in connectivity and cluster structure. A differentiable loss function enables topology-aware embedding learning. Contribution/Results: On synthetic and real-world benchmarks, our method achieves up to tens-fold speedups over state-of-the-art alternatives. When integrated as a regularizer into neural network training, it significantly improves topological fidelity and structural preservation in dimensionality reduction and graph representation learning tasks.

πŸ“ Abstract
Topological methods for comparing weighted graphs are valuable in various learning tasks but often suffer from computational inefficiency on large datasets. We introduce RTD-Lite, a scalable algorithm that efficiently compares topological features, specifically connectivity or cluster structures at arbitrary scales, of two weighted graphs with one-to-one correspondence between vertices. Using minimal spanning trees in auxiliary graphs, RTD-Lite captures topological discrepancies with $O(n^2)$ time and memory complexity. This efficiency enables its application in tasks like dimensionality reduction and neural network training. Experiments on synthetic and real-world datasets demonstrate that RTD-Lite effectively identifies topological differences while significantly reducing computation time compared to existing methods. Moreover, integrating RTD-Lite into neural network training as a loss function component enhances the preservation of topological structures in learned representations. Our code is publicly available at https://github.com/ArGintum/RTD-Lite
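The abstract's core idea (minimal spanning trees of auxiliary graphs, compared across all connectivity scales) can be sketched roughly as follows. This is a hypothetical illustration, not the authors' implementation (see the linked repository for that); in particular, the choice of the element-wise minimum as the auxiliary graph and the name `rtd_lite_sketch` are assumptions made for this sketch.

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree

def rtd_lite_sketch(A: np.ndarray, B: np.ndarray) -> float:
    """Rough sketch of an RTD-Lite-style discrepancy between two
    weighted graphs on the same vertex set (hypothetical code)."""
    # Assumed auxiliary graph: element-wise minimum of the two
    # symmetric edge-weight matrices.
    aux = np.minimum(A, B)

    def merge_scales(W: np.ndarray) -> np.ndarray:
        # MST edge weights record the scales at which connected
        # components merge; an n-vertex MST has n - 1 edges.
        mst = minimum_spanning_tree(W).toarray()
        return np.sort(mst[mst > 0])

    # Sum the gaps between each graph's merge scales and those of the
    # auxiliary graph (aux <= A, B element-wise, so each summed term
    # is non-negative). Everything here is O(n^2) time and memory.
    return float((merge_scales(A) - merge_scales(aux)).sum()
                 + (merge_scales(B) - merge_scales(aux)).sum())
```

For identical graphs the auxiliary graph coincides with both inputs and the discrepancy is zero, which matches the intuition of a topological distance.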
Problem

Research questions and friction points this paper is trying to address.

Topological comparison of large weighted graphs is computationally expensive at scale
Existing methods struggle to identify topological differences efficiently on large datasets
Neural network training lacks efficient, differentiable topology-preserving objectives
Innovation

Methods, ideas, or system contributions that make the work stand out.

RTD-Lite: scalable topological graph comparison algorithm
Uses minimal spanning trees for efficient discrepancy detection
Enhances neural network training with topological loss function
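One way such a topological loss term might be wired into training is sketched below: MST edges are selected on detached pairwise distances, then the differentiable distance entries at those edges are gathered so gradients flow into the embedding. This assumes PyTorch and an element-wise-minimum auxiliary graph; `rtd_lite_loss` and `mst_edge_weights` are hypothetical names, not the authors' API.

```python
import torch
from scipy.sparse.csgraph import minimum_spanning_tree

def mst_edge_weights(dist: torch.Tensor) -> torch.Tensor:
    # Select MST edges on detached weights, then gather the
    # corresponding differentiable entries of `dist`.
    mst = minimum_spanning_tree(dist.detach().cpu().numpy()).tocoo()
    idx = (torch.as_tensor(mst.row, dtype=torch.long),
           torch.as_tensor(mst.col, dtype=torch.long))
    return dist[idx]

def rtd_lite_loss(X: torch.Tensor, Z: torch.Tensor) -> torch.Tensor:
    # Hypothetical regularizer: penalize gaps between the sorted
    # component-merge scales of the input space, the embedding
    # space, and their auxiliary (element-wise minimum) graph.
    dx = torch.cdist(X, X)
    dz = torch.cdist(Z, Z)
    aux = torch.minimum(dx, dz)
    wx = torch.sort(mst_edge_weights(dx)).values
    wz = torch.sort(mst_edge_weights(dz)).values
    wa = torch.sort(mst_edge_weights(aux)).values
    return (wx - wa).sum() + (wz - wa).sum()
```

In a training loop this term would typically be added to a task loss, e.g. `loss = task_loss + lam * rtd_lite_loss(X, Z)`, where `lam` is a tunable weight.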