Faster Low-Rank Approximation and Kernel Ridge Regression via the Block-Nyström Method

📅 2025-06-20
📈 Citations: 0 · Influential: 0
🤖 AI Summary
When the data exhibits heavy-tailed spectral decay, the effective dimension of the problem can grow so large that even the standard Nyström method exceeds a practical computational budget. To address this, the paper proposes Block-Nyström, which injects a block-diagonal structure into the Nyström method, significantly reducing its computational cost while recovering strong approximation guarantees. The key technical insight is that, within the same computational budget, combining several smaller Nyström approximations yields stronger estimates of the tail of the input spectrum than using one larger approximation. Building on this, the authors give a recursive preconditioning scheme for efficiently inverting the Block-Nyström matrix, which accelerates kernel ridge regression (KRR), and they derive new statistical learning bounds for a broad class of approximate KRR solvers. Key contributions: (i) the block-structured Nyström decomposition, (ii) a recursive preconditioning mechanism for its inversion, and (iii) approximation and statistical learning guarantees, with applications to preconditioning for large-scale second-order optimization.

📝 Abstract
The Nyström method is a popular low-rank approximation technique for large matrices that arise in kernel methods and convex optimization. Yet, when the data exhibits heavy-tailed spectral decay, the effective dimension of the problem often becomes so large that even the Nyström method may be outside of our computational budget. To address this, we propose Block-Nyström, an algorithm that injects a block-diagonal structure into the Nyström method, thereby significantly reducing its computational cost while recovering strong approximation guarantees. We show that Block-Nyström can be used to construct improved preconditioners for second-order optimization, as well as to efficiently solve kernel ridge regression for statistical learning over Hilbert spaces. Our key technical insight is that, within the same computational budget, combining several smaller Nyström approximations leads to stronger tail estimates of the input spectrum than using one larger approximation. Along the way, we provide a novel recursive preconditioning scheme for efficiently inverting the Block-Nyström matrix, and provide new statistical learning bounds for a broad class of approximate kernel ridge regression solvers.
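To make the abstract's key insight concrete, below is a minimal NumPy sketch of one natural way to inject block-diagonal structure into the Nyström method: the core matrix W = K[S, S] is replaced by its block-diagonal restriction over q smaller landmark sets, so the dominant inversion cost drops from O(k³) to roughly O(k³/q²) for k total landmarks. This is an illustrative reading of the abstract, not the paper's exact construction; the function names, scaling, and choice of landmark sets are assumptions.

```python
import numpy as np

def nystrom(K, idx):
    """Standard Nystrom: K_hat = C W^+ C^T with C = K[:, S], W = K[S, S]."""
    C = K[:, idx]
    W = K[np.ix_(idx, idx)]
    return C @ np.linalg.pinv(W) @ C.T

def block_nystrom(K, blocks):
    """Hypothetical block-diagonal variant: the core matrix is replaced by
    blkdiag(K[S_1,S_1], ..., K[S_q,S_q]), so inverting it costs about
    q * (k/q)^3 instead of k^3 for k landmarks split into q blocks.
    (The paper's exact construction, scaling, and regularization may differ.)"""
    idx = np.concatenate(blocks)
    C = K[:, idx]
    k = len(idx)
    W_inv = np.zeros((k, k))
    off = 0
    for b in blocks:
        m = len(b)
        W_inv[off:off + m, off:off + m] = np.linalg.pinv(K[np.ix_(b, b)])
        off += m
    return C @ W_inv @ C.T

# Toy usage: a PSD Gram matrix and two disjoint landmark sets of size 5 each.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 10))
K = X @ X.T
blocks = [np.arange(0, 5), np.arange(5, 10)]
K_hat = block_nystrom(K, blocks)
print(np.linalg.norm(K - K_hat) / np.linalg.norm(K))
```

Each block inverse touches only a (k/q) × (k/q) matrix, which is where the fixed-budget savings described in the abstract come from.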
Problem

Research questions and friction points this paper is trying to address.

Reducing the computational cost of the Nyström method for large matrices with heavy-tailed spectral decay
Constructing better preconditioners for second-order optimization
Efficiently solving kernel ridge regression for statistical learning over Hilbert spaces (the standard formulation is recalled below)
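
As background for these bullets, the KRR problem in question is the standard Hilbert-space formulation (this display is textbook material via the representer theorem, not a result of the paper):

$$
\hat{f} = \operatorname*{arg\,min}_{f \in \mathcal{H}} \; \frac{1}{n} \sum_{i=1}^{n} \big(f(x_i) - y_i\big)^2 + \lambda \|f\|_{\mathcal{H}}^2,
\qquad
\hat{f}(\cdot) = \sum_{i=1}^{n} \alpha_i \, k(\cdot, x_i), \quad \alpha = (K + \lambda n I)^{-1} y,
$$

where $K$ is the $n \times n$ kernel matrix. The $(K + \lambda n I)^{-1} y$ solve is exactly the large linear system that Nyström-type preconditioning targets.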
Innovation

Methods, ideas, or system contributions that make the work stand out.

Block-Nyström injects a block-diagonal structure into the Nyström method, significantly reducing its computational cost
Within a fixed computational budget, combining several smaller Nyström approximations yields stronger tail estimates of the input spectrum than one larger approximation
A recursive preconditioning scheme efficiently inverts the Block-Nyström matrix (a preconditioned-CG sketch follows this list)
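
The recursive scheme the paper uses to invert the Block-Nyström matrix is not spelled out in this summary, so the sketch below shows the standard, non-recursive pattern it builds on: a rank-k Nyström approximation of K is eigendecomposed and used as a preconditioner for (K + λI)x = y inside conjugate gradients, applied via the Woodbury identity. All names here (nystrom_pcg, precond) are hypothetical; treat this as a sketch of the general technique, not the paper's algorithm.

```python
import numpy as np

def nystrom_pcg(K, y, lam, idx, tol=1e-8, max_iter=500):
    """Solve (K + lam*I) x = y by conjugate gradients, preconditioned with a
    rank-k Nystrom approximation K_hat = U diag(s) U^T (plain, non-block)."""
    n = K.shape[0]
    C = K[:, idx]
    W = K[np.ix_(idx, idx)]
    # Factor K_hat = C W^+ C^T = F F^T with F = C W^{-1/2}, then thin-SVD F.
    ew, Ev = np.linalg.eigh(W)
    ew = np.maximum(ew, 1e-12)                 # guard tiny/negative eigenvalues
    F = C @ (Ev / np.sqrt(ew)) @ Ev.T
    U, sv, _ = np.linalg.svd(F, full_matrices=False)
    s = sv ** 2                                # eigenvalues of K_hat

    def precond(v):
        # Woodbury: (K_hat + lam*I)^{-1} v = (v - U diag(s/(s+lam)) U^T v)/lam
        return (v - U @ ((s / (s + lam)) * (U.T @ v))) / lam

    x = np.zeros(n)
    r = y.astype(float).copy()
    z = precond(r)
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Ap = K @ p + lam * p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) <= tol * np.linalg.norm(y):
            break
        z = precond(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x
```

The better the low-rank approximation captures the spectral tail of K, the better conditioned the preconditioned system becomes, which is why the stronger tail estimates claimed for Block-Nyström translate into faster KRR solves.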