HMamba: Hyperbolic Mamba for Sequential Recommendation

📅 2025-05-14
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing sequential recommendation models (e.g., RNNs and Transformers) suffer from high computational overhead and struggle to capture the hierarchical structure inherent in user preferences; Mamba-based models achieve linear time complexity but operate in Euclidean space and thus fail to model the hyperbolic geometry that recommendation data naturally exhibits. To address these limitations, the paper proposes HMamba (Hyperbolic Mamba), the first end-to-end sequential recommendation framework to integrate hyperbolic geometry with selective state space modeling (SSM). Its core innovations are: (i) a selective state update mechanism formulated in hyperbolic space, (ii) a stable Riemannian gradient optimization scheme, and (iii) a Poincaré ball embedding and projection strategy. Evaluated on four benchmark datasets, HMamba improves Recall@20 by 3–11% while retaining O(L) time complexity, enabling scalable, real-time industrial deployment.
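The Poincaré ball embedding and projection strategy mentioned above can be illustrated with the standard ball operations. This is a minimal sketch (not code from the paper), assuming curvature -c with c = 1 and the usual exponential/logarithmic maps at the origin; the function names and the numerical margin `EPS` are this sketch's own choices.

```python
# Sketch of standard Poincare-ball operations (illustrative, not the paper's code):
# project embeddings into the open unit ball and map tangent vectors onto the
# manifold via the exponential map at the origin.
import numpy as np

EPS = 1e-5  # numerical margin keeping points strictly inside the ball

def project(x, c=1.0):
    """Clip a point back inside the Poincare ball of curvature -c."""
    max_norm = (1.0 - EPS) / np.sqrt(c)
    norm = np.linalg.norm(x, axis=-1, keepdims=True)
    return np.where(norm > max_norm, x / norm * max_norm, x)

def expmap0(v, c=1.0):
    """Exponential map at the origin: tangent (Euclidean) vector -> ball point."""
    sqrt_c = np.sqrt(c)
    norm = np.clip(np.linalg.norm(v, axis=-1, keepdims=True), EPS, None)
    return project(np.tanh(sqrt_c * norm) * v / (sqrt_c * norm), c)

def logmap0(y, c=1.0):
    """Logarithmic map at the origin: ball point -> tangent vector."""
    sqrt_c = np.sqrt(c)
    norm = np.clip(np.linalg.norm(y, axis=-1, keepdims=True), EPS, None)
    return np.arctanh(np.clip(sqrt_c * norm, 0.0, 1.0 - EPS)) * y / (sqrt_c * norm)

# Round trip: a Euclidean item embedding mapped onto the ball and back.
v = np.array([0.3, -0.2, 0.1])
y = expmap0(v)          # lies strictly inside the unit ball
v_back = logmap0(y)     # recovers v up to numerical precision
```

The round trip `logmap0(expmap0(v)) ≈ v` is what lets a hyperbolic layer wrap Euclidean computations (such as a state-space update) between the two maps.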

📝 Abstract
Sequential recommendation systems have become a cornerstone of personalized services, adept at modeling the temporal evolution of user preferences by capturing dynamic interaction sequences. Existing approaches predominantly rely on traditional models, including RNNs and Transformers. Despite their success in local pattern recognition, Transformer-based methods suffer from quadratic computational complexity and a tendency toward superficial attention patterns, limiting their ability to infer enduring preference hierarchies in sequential recommendation data. Recent advances in Mamba-based sequential models introduce linear-time efficiency but remain constrained by Euclidean geometry, failing to leverage the intrinsic hyperbolic structure of recommendation data. To bridge this gap, we propose Hyperbolic Mamba, a novel architecture that unifies the efficiency of Mamba's selective state space mechanism with hyperbolic geometry's hierarchical representational power. Our framework introduces (1) a hyperbolic selective state space that maintains curvature-aware sequence modeling and (2) stabilized Riemannian operations to enable scalable training. Experiments across four benchmarks demonstrate that Hyperbolic Mamba achieves 3-11% improvement while retaining Mamba's linear-time efficiency, enabling real-world deployment. This work establishes a new paradigm for efficient, hierarchy-aware sequential modeling.
Problem

Research questions and friction points this paper is trying to address.

Overcome quadratic complexity and superficial attention in Transformers
Address Euclidean geometry limitations in Mamba-based models
Model hierarchical preferences in sequential recommendation data
Innovation

Methods, ideas, or system contributions that make the work stand out.

Hyperbolic Mamba combines Mamba's efficiency with hyperbolic geometry
Introduces hyperbolic selective state space for curvature-aware modeling
Uses stabilized Riemannian operations for scalable training
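The stabilized Riemannian operations can be sketched as a retraction-based SGD step on the Poincaré ball. This is an assumed, generic form (the page does not give the paper's exact optimizer): the Riemannian gradient is the Euclidean gradient rescaled by the inverse conformal factor (1 - ||x||²)²/4, and stabilization here is a simple clip back inside the ball to avoid boundary blow-up.

```python
# Hedged sketch of a stabilized Riemannian SGD step on the Poincare ball
# (generic textbook form; not necessarily the paper's exact algorithm).
import numpy as np

def riemannian_sgd_step(x, euclid_grad, lr=0.1, eps=1e-5):
    """One retraction-based SGD step that keeps x strictly inside the unit ball."""
    sq_norm = np.sum(x * x, axis=-1, keepdims=True)
    scale = ((1.0 - sq_norm) ** 2) / 4.0   # inverse conformal factor of the ball metric
    x_new = x - lr * scale * euclid_grad   # retraction: plain Euclidean update
    # Stabilization: clip back inside the ball so later log/exp maps stay finite.
    norm = np.linalg.norm(x_new, axis=-1, keepdims=True)
    max_norm = 1.0 - eps
    return np.where(norm > max_norm, x_new / norm * max_norm, x_new)
```

Note the built-in damping: the factor (1 - ||x||²)² shrinks toward zero near the boundary, so steps automatically become smaller where the hyperbolic metric blows up; the final clip is a safety net for finite-precision arithmetic.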
Qianru Zhang
School of Computing and Data Science, The University of Hong Kong, Hong Kong
Honggang Wen
School of Computing and Data Science, The University of Hong Kong, Hong Kong
Wei Yuan
School of Electrical Engineering and Computer Science, The University of Queensland, Australia
Crystal Chen
School of Computer Science, Boston University, USA
Menglin Yang
HKUST(GZ) | Yale University | CUHK
Hyperbolic Representation Learning, Transformer, Recommender System, LLM
Siu-Ming Yiu
Professor of Computer Science, The University of Hong Kong
Cybersecurity, Cryptography, FinTech, Bioinformatics
Hongzhi Yin
Professor and ARC Future Fellow, University of Queensland
Recommender System, Graph Learning, Spatial-temporal Prediction, Edge Intelligence, LLM