🤖 AI Summary
Constructing Steiner Minimal Trees (SMTs) in hyperbolic space is NP-hard, and existing deterministic heuristics such as HyperSteiner are prone to local optima. Randomized HyperSteiner (RHS) addresses this by injecting stochasticity into the Delaunay triangulation step and refining candidate trees with Riemannian gradient descent, coupling topology generation with continuous-space optimization. The added randomness improves global search and robustness, particularly in boundary-dense regions. Evaluated on synthetic benchmarks and single-cell transcriptomic data, RHS outperforms Minimum Spanning Tree (MST), Neighbour Joining, and the original HyperSteiner, achieving up to a 32% reduction in total tree length in near-boundary configurations and demonstrating practical value for hyperbolic phylogenetic reconstruction.
📝 Abstract
We study the problem of constructing Steiner Minimal Trees (SMTs) in hyperbolic space. Exact SMT computation is NP-hard, and existing hyperbolic heuristics such as HyperSteiner are deterministic and often become trapped in locally suboptimal configurations. We introduce Randomized HyperSteiner (RHS), a stochastic Delaunay-triangulation heuristic that incorporates randomness into the expansion process and refines candidate trees via Riemannian gradient descent. Experiments on synthetic datasets and a real-world single-cell transcriptomic dataset show that RHS outperforms Minimum Spanning Tree (MST), Neighbour Joining, and vanilla HyperSteiner (HS). In near-boundary configurations, RHS achieves up to a 32% reduction in total tree length over HS, demonstrating its effectiveness and robustness across diverse data regimes.
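The Riemannian gradient descent refinement mentioned above can be illustrated with a minimal sketch. Assuming the Poincaré ball model (a common choice for hyperbolic embeddings, though the abstract does not name the model), the Riemannian gradient is the Euclidean gradient rescaled by the inverse metric factor `(1 - ||x||²)² / 4`. The sketch below optimizes a single candidate Steiner point against fixed neighbor terminals; the function names, step size, and numeric gradient are illustrative, not the authors' implementation.

```python
import numpy as np

def poincare_dist(u, v):
    """Hyperbolic distance between two points in the open unit ball (Poincare model)."""
    sq = np.sum((u - v) ** 2)
    denom = (1.0 - np.sum(u ** 2)) * (1.0 - np.sum(v ** 2))
    return np.arccosh(1.0 + 2.0 * sq / denom)

def total_length(x, terminals):
    """Total hyperbolic edge length from candidate Steiner point x to its neighbors."""
    return sum(poincare_dist(x, t) for t in terminals)

def riemannian_gd(terminals, x0, lr=0.05, steps=500, eps=1e-6):
    """Refine a Steiner point by Riemannian gradient descent in the Poincare ball.

    The Riemannian gradient rescales the Euclidean gradient by
    (1 - ||x||^2)^2 / 4; a central-difference Euclidean gradient and a simple
    projection back into the ball (in place of the exact exponential map)
    keep this sketch short.
    """
    x = x0.copy()
    for _ in range(steps):
        # numeric Euclidean gradient of the total edge length
        g = np.zeros_like(x)
        for i in range(len(x)):
            e = np.zeros_like(x)
            e[i] = eps
            g[i] = (total_length(x + e, terminals)
                    - total_length(x - e, terminals)) / (2.0 * eps)
        lam = (1.0 - np.sum(x ** 2)) ** 2 / 4.0  # inverse metric factor
        x = x - lr * lam * g
        # retraction: keep the iterate strictly inside the unit ball
        n = np.linalg.norm(x)
        if n >= 1.0:
            x = x / n * (1.0 - 1e-9)
    return x

# illustrative terminals: three points in the unit disk
terminals = [np.array([0.5, 0.0]),
             np.array([-0.25, 0.4]),
             np.array([-0.25, -0.4])]
x0 = np.mean(terminals, axis=0)
x_opt = riemannian_gd(terminals, x0)
```

In a full pipeline, a step like this would be applied to every Steiner point proposed by the (randomized) Delaunay expansion, jointly shrinking the total tree length after the topology is fixed.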