🤖 AI Summary
To address the limitations of existing methods for embedding hierarchical data, namely limited embedding efficiency and geometric fidelity, this paper proposes a novel deep learning architecture grounded in the group structure of hyperbolic space. The core method introduces, for the first time in hyperbolic deep learning, the Cartan decomposition from symmetric spaces, leveraging the dual nature of hyperbolic space as both a solvable Lie group and a Riemannian manifold. This enables the design of neural layers that combine group-homomorphism modeling with metric preservation. Geometric constraints are enforced via diffeomorphic mappings, rigorously underpinned by Lie group theory and hyperbolic geometry to ensure faithful representation of hierarchical relations. Experiments on multiple standard hierarchical benchmarks demonstrate significant improvements in embedding quality and downstream task performance, including classification and link prediction, while offering enhanced interpretability. The work establishes a principled, structure-driven paradigm for hyperbolic deep learning.
📝 Abstract
Hyperbolic deep learning leverages the metric properties of hyperbolic spaces to develop efficient and informative embeddings of hierarchical data. Here, we focus on the solvable group structure of hyperbolic spaces, which follows naturally from their construction as symmetric spaces. This dual nature of Lie group and Riemannian manifold allows us to propose a new class of hyperbolic deep learning algorithms where group homomorphisms are interleaved with metric-preserving diffeomorphisms. The resulting algorithms, which we call Cartan networks, show promising results on various benchmark data sets and open the way to a novel class of hyperbolic deep learning architectures.
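The interplay the abstract describes, group homomorphisms interleaved with metric-preserving maps, can be made concrete in the simplest case: the hyperbolic plane in the upper half-plane model, which carries the structure of the solvable "ax + b" group. The sketch below is not the paper's Cartan network architecture; it is a minimal illustration, with hypothetical function names, of the two properties such layers would exploit: left group translations are hyperbolic isometries, and coordinate maps like `phi` respect the group law.

```python
import math

# Illustrative sketch (not the paper's implementation): the hyperbolic plane
# in the upper half-plane model, identified with the solvable "ax + b" group
# of affine maps x -> a*x + b, a > 0. A point is a pair (b, a) with a > 0.

def mul(g, h):
    """Group law of the solvable group: (b1, a1) * (b2, a2) = (b1 + a1*b2, a1*a2)."""
    (b1, a1), (b2, a2) = g, h
    return (b1 + a1 * b2, a1 * a2)

def dist(p, q):
    """Hyperbolic distance in the upper half-plane model."""
    (b1, a1), (b2, a2) = p, q
    return math.acosh(1 + ((b2 - b1) ** 2 + (a2 - a1) ** 2) / (2 * a1 * a2))

def phi(g, c):
    """A one-parameter family of group endomorphisms: (b, a) -> (c*b, a).
    Satisfies phi(g * h) == phi(g) * phi(h), so it respects the group
    structure (though for c != 1 it is not an isometry)."""
    b, a = g
    return (c * b, a)

p, q = (0.3, 1.2), (-1.0, 0.5)
g = (2.0, 3.0)

# 1) Left translation by a group element preserves the hyperbolic metric:
assert math.isclose(dist(mul(g, p), mul(g, q)), dist(p, q))

# 2) phi is a group homomorphism:
c = 0.7
lhs = phi(mul(p, q), c)
rhs = mul(phi(p, c), phi(q, c))
assert math.isclose(lhs[0], rhs[0]) and math.isclose(lhs[1], rhs[1])
```

In this picture, a layer that alternates maps of the second kind (homomorphisms) with distance-preserving diffeomorphisms of the first kind would act on hyperbolic points without leaving the manifold, which is the structural idea the abstract attributes to Cartan networks.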