Curvature Learning for Generalization of Hyperbolic Neural Networks

📅 2025-08-24
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Hyperbolic neural networks (HNNs) excel at modeling hierarchical data but are sensitive to the choice of curvature; existing approaches lack theoretical grounding for curvature's impact, and a fixed curvature often leads to suboptimal convergence. Method: We present the first PAC-Bayes-based analysis revealing how curvature in hyperbolic neural networks affects generalization via the smoothness of the loss landscape. Building on this insight, we propose a sharpness-aware curvature learning framework: the upper-level optimization minimizes a scope sharpness measure, while the lower-level optimization performs parameter updates, solved efficiently via implicit differentiation and gradient approximation. Contribution/Results: Our theoretical analysis provides generalization error bounds and convergence guarantees. Experiments demonstrate consistent and significant improvements in generalization across classification, long-tailed recognition, noisy-label learning, and few-shot learning tasks.

📝 Abstract
Hyperbolic neural networks (HNNs) have demonstrated notable efficacy in representing real-world data with hierarchical structures by exploiting the geometric properties of hyperbolic spaces, which are characterized by negative curvature. Curvature plays a crucial role in optimizing HNNs: inappropriate curvatures may cause HNNs to converge to suboptimal parameters, degrading overall performance. So far, a theoretical foundation for the effect of curvature on HNNs has not been developed. In this paper, we derive a PAC-Bayesian generalization bound for HNNs, highlighting the role of curvature in the generalization of HNNs via its effect on the smoothness of the loss landscape. Driven by the derived bound, we propose a sharpness-aware curvature learning method that smooths the loss landscape, thereby improving the generalization of HNNs. In our method, we design a scope sharpness measure for curvatures, which is minimized through a bi-level optimization process. We then introduce an implicit differentiation algorithm that efficiently solves the bi-level optimization by approximating the gradients of curvatures. We present approximation error and convergence analyses of the proposed method, showing that the approximation error is upper-bounded and that the method converges, given bounded gradients of the HNN. Experiments in four settings (classification, learning from long-tailed data, learning from noisy data, and few-shot learning) show that our method improves the performance of HNNs.
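The core idea of sharpness-aware training, which the paper adapts to curvature, can be illustrated on a toy problem: probe the worst-case loss in a small neighborhood of the current parameters, then descend using the gradient at that perturbed point. The sketch below is a minimal illustration of this generic sharpness-aware minimization step on a 1-D quadratic; the loss, step sizes, and radius `rho` are illustrative assumptions, not the paper's actual implementation.

```python
def loss(w):
    # Toy quadratic loss standing in for the HNN training loss.
    return (w - 3.0) ** 2

def grad(w, eps=1e-6):
    # Central-difference approximation of the loss gradient.
    return (loss(w + eps) - loss(w - eps)) / (2 * eps)

def sharpness_aware_step(w, rho=0.05, lr=0.1):
    # 1) Move to the (approximate) worst-case point within a rho-ball;
    #    the loss increase over this ball is a simple sharpness proxy.
    g = grad(w)
    w_adv = w + rho * g / (abs(g) + 1e-12)
    # 2) Descend using the gradient evaluated at the perturbed point,
    #    which biases the iterate toward flat minima.
    return w - lr * grad(w_adv)

w = 0.0
for _ in range(200):
    w = sharpness_aware_step(w)
```

Minimizing the perturbed loss rather than the raw loss is what "smoothing the loss landscape" means operationally: flat minima tolerate the perturbation, sharp minima do not.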
Problem

Research questions and friction points this paper is trying to address.

Studying curvature's impact on hyperbolic neural network generalization
Developing sharpness-aware curvature learning to smooth loss landscape
Improving HNN performance across classification and noisy data tasks
Innovation

Methods, ideas, or system contributions that make the work stand out.

Sharpness-aware curvature learning for hyperbolic neural networks
Bi-level optimization with implicit differentiation algorithm
Upper-bounded approximation error and convergence analysis
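The bi-level structure above (an outer variable tuned against the result of inner parameter updates) can be sketched on a toy problem: the lower level adapts parameters `w` given a hyperparameter `c` (standing in for curvature), and the upper level updates `c` by differentiating through the unrolled inner loop. Here the hypergradient is approximated by finite differences, a simple stand-in for the paper's implicit-differentiation algorithm; all losses and step sizes are illustrative assumptions.

```python
def inner_loss(w, c):
    # Toy training loss coupling parameters w and the outer variable c.
    return (w - c) ** 2

def outer_loss(w):
    # Toy upper-level objective (in the paper: a scope sharpness measure).
    return (w - 2.0) ** 2

def adapted(w, c, lr=0.5, steps=5):
    # Lower level: a few gradient steps on the inner loss, given c.
    for _ in range(steps):
        w = w - lr * 2 * (w - c)
    return w

def hypergrad(w, c, eps=1e-4):
    # Approximate the gradient of the outer objective w.r.t. c through
    # the unrolled inner updates, via central finite differences.
    return (outer_loss(adapted(w, c + eps))
            - outer_loss(adapted(w, c - eps))) / (2 * eps)

c, w = 0.0, 0.0
for _ in range(100):
    c = c - 0.1 * hypergrad(w, c)   # upper-level update of c
    w = adapted(w, c)               # lower-level parameter update
```

Implicit differentiation replaces this unrolling with a linear-system solve at the inner optimum, which is what makes the paper's approach efficient and lets the approximation error be bounded.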
Xiaomeng Fan
Beijing Institute of Technology
Machine Learning, Computer Vision
Yuwei Wu
Ph.D. candidate, GRASP Lab, University of Pennsylvania
Robotics, Trajectory Optimization, Task and Motion Planning
Zhi Gao
Beijing Laboratory of Intelligent Information Technology, School of Computer Science, Beijing Institute of Technology (BIT), Beijing, 100081, P.R. China.
Mehrtash Harandi
Department of Electrical and Computer Systems Engineering, Monash University
Machine Learning, Computer Vision
Yunde Jia
Guangdong Laboratory of Machine Perception and Intelligent Computing, Shenzhen MSU-BIT University, Shenzhen, 518172, P.R. China.