🤖 AI Summary
Existing hyperbolic kernel methods suffer from geometric distortion and limited expressivity when modeling hierarchical data, because their curvature is fixed. To address this, we propose an adaptive hyperbolic kernel framework grounded in learnable de Branges–Rovnyak spaces, marking the first integration of curvature awareness into reproducing kernel Hilbert space (RKHS) construction. Our approach introduces a learnable radial kernel and a task-driven feature modulation mechanism. By jointly leveraging isometric Poincaré ball embeddings and parameterized multiplier functions, we design an end-to-end differentiable embedding model. Experiments on multimodal vision-language benchmarks and hierarchical classification tasks demonstrate that our method significantly outperforms state-of-the-art hyperbolic kernel baselines. It substantially reduces embedding distortion, enhances structural fidelity, and improves classification accuracy, thereby offering a more flexible and geometrically faithful representation for hierarchical data.
📝 Abstract
Hierarchical data pervades diverse machine learning applications, including natural language processing, computer vision, and social network analysis. Hyperbolic space, characterized by its negative curvature, has demonstrated strong potential in such tasks due to its capacity to embed hierarchical structures with minimal distortion. Prior evidence indicates that hyperbolic representation capacity can be further enhanced through kernel methods. However, existing hyperbolic kernels still suffer from mild geometric distortion or lack adaptability. This paper addresses these issues by introducing a curvature-aware de Branges–Rovnyak space, a reproducing kernel Hilbert space (RKHS) that is isometric to a Poincaré ball. We design an adjustable multiplier that adaptively selects the RKHS corresponding to a hyperbolic space of any curvature. Building on this foundation, we further construct a family of adaptive hyperbolic kernels, including a novel adaptive hyperbolic radial kernel whose learnable parameters modulate hyperbolic features in a task-aware manner. Extensive experiments on visual and language benchmarks demonstrate that our proposed kernels outperform existing hyperbolic kernels in modeling hierarchical dependencies.
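To make the idea concrete, a minimal sketch of a hyperbolic radial kernel is shown below. This is an illustration, not the paper's implementation: it uses the standard geodesic distance on the Poincaré ball of curvature -c, and `lam` is a placeholder name for a learnable radial-modulation parameter; the paper's curvature-aware RKHS construction and multiplier are not reproduced here.

```python
import numpy as np

def poincare_distance(x, y, c=1.0):
    """Geodesic distance on the Poincare ball of curvature -c (c > 0).

    Uses the closed form d(x, y) = (1/sqrt(c)) * arcosh(1 + 2c||x - y||^2
    / ((1 - c||x||^2)(1 - c||y||^2))), valid for points inside the ball.
    """
    num = 2.0 * c * np.sum((x - y) ** 2)
    den = (1.0 - c * np.dot(x, x)) * (1.0 - c * np.dot(y, y))
    return np.arccosh(1.0 + num / den) / np.sqrt(c)

def radial_kernel(x, y, lam=1.0, c=1.0):
    """Radial profile applied to hyperbolic distance.

    `lam` is a hypothetical learnable parameter standing in for the
    task-aware modulation described in the abstract.
    """
    return np.exp(-lam * poincare_distance(x, y, c))
```

In a trainable setting, `lam` (and the curvature `c`) would be optimized end to end alongside the embedding; note that an exponential of a hyperbolic distance is not guaranteed to be positive definite in general, which is part of what motivates constructing the kernel inside a proper RKHS as the paper does.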