LancBiO: dynamic Lanczos-aided bilevel optimization via Krylov subspace

📅 2024-04-04
🏛️ arXiv.org
📈 Citations: 1
Influential: 0
📄 PDF
🤖 AI Summary
To address the computational bottleneck of inefficient Hessian-vector products in hypergradient computation for bilevel optimization, this paper introduces, for the first time, Lanczos iteration and Krylov subspace projection into this domain. We propose a novel method that dynamically constructs a low-dimensional subspace to efficiently approximate hypergradients. Our approach reduces the original high-dimensional Hessian system inversion to solving a small-scale tridiagonal linear system, achieving a theoretical convergence rate of O(ε⁻¹) and establishing a provably convergent optimization framework. Experiments on synthetic data and two deep learning tasks—hyperparameter optimization and meta-learning—demonstrate that our method accelerates hypergradient computation by 2–5× over baseline approaches while significantly reducing memory overhead. The proposed framework thus delivers superior efficiency, numerical stability, and practical applicability.

📝 Abstract
Bilevel optimization, with broad applications in machine learning, has an intricate hierarchical structure. Gradient-based methods have emerged as a common approach to large-scale bilevel problems. However, the computation of the hyper-gradient, which involves a Hessian inverse vector product, limits the efficiency and is regarded as a bottleneck. To circumvent the inverse, we construct a sequence of low-dimensional approximate Krylov subspaces with the aid of the Lanczos process. As a result, the constructed subspace is able to dynamically and incrementally approximate the Hessian inverse vector product with less effort and thus leads to a favorable estimate of the hyper-gradient. Moreover, we propose a provable subspace-based framework for bilevel problems where one central step is to solve a small-size tridiagonal linear system. To the best of our knowledge, this is the first time that subspace techniques are incorporated into bilevel optimization. This successful trial not only enjoys $\mathcal{O}(\epsilon^{-1})$ convergence rate but also demonstrates efficiency in a synthetic problem and two deep learning tasks.
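To make the central step concrete, here is a minimal NumPy sketch of the general Lanczos idea the abstract describes: building a Krylov subspace from Hessian-vector products and replacing the full Hessian-inverse-vector product with a small tridiagonal solve. The function name and interface are illustrative assumptions, not the paper's implementation, and the dynamic/incremental subspace reuse of LancBiO is omitted.

```python
import numpy as np

def lanczos_solve(hvp, v, k):
    """Approximate H^{-1} v for a symmetric positive-definite H,
    accessed only through Hessian-vector products `hvp`, by running
    k Lanczos iterations and solving a k-by-k tridiagonal system.
    (Illustrative sketch, not the authors' code.)"""
    n = v.shape[0]
    V = np.zeros((n, k))          # orthonormal Krylov basis
    alphas = np.zeros(k)          # diagonal of the tridiagonal T
    betas = np.zeros(max(k - 1, 0))  # off-diagonal of T
    beta0 = np.linalg.norm(v)
    V[:, 0] = v / beta0
    for j in range(k):
        w = hvp(V[:, j])          # the only Hessian access needed
        alphas[j] = V[:, j] @ w
        w -= alphas[j] * V[:, j]
        if j > 0:
            w -= betas[j - 1] * V[:, j - 1]
        if j < k - 1:
            betas[j] = np.linalg.norm(w)
            if betas[j] < 1e-12:  # invariant subspace found early
                k = j + 1
                V, alphas, betas = V[:, :k], alphas[:k], betas[:k - 1]
                break
            V[:, j + 1] = w / betas[j]
    # Small-size tridiagonal system: T y = beta0 * e1, then lift back.
    T = np.diag(alphas) + np.diag(betas, 1) + np.diag(betas, -1)
    rhs = np.zeros(alphas.shape[0])
    rhs[0] = beta0
    y = np.linalg.solve(T, rhs)
    return V @ y                  # approximates H^{-1} v
```

In exact arithmetic, k = n iterations recover the exact solution; the point of the method is that a much smaller k already yields a good hyper-gradient estimate while only touching the Hessian through cheap products.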
Problem

Research questions and friction points this paper is trying to address.

Enhances bilevel optimization via Krylov subspace
Reduces Hessian inverse computation complexity
Improves hyper-gradient estimation efficiency
Innovation

Methods, ideas, or system contributions that make the work stand out.

Dynamic Lanczos process
Krylov subspace approximation
Tridiagonal linear system solution
Bin Gao
State Key Laboratory of Scientific and Engineering Computing, Academy of Mathematics and Systems Science, Chinese Academy of Sciences, Beijing, China
Yan Yang
State Key Laboratory of Scientific and Engineering Computing, Academy of Mathematics and Systems Science, Chinese Academy of Sciences, and University of Chinese Academy of Sciences, Beijing, China
Ya-xiang Yuan
Academy of Mathematics and Systems Science, Chinese Academy of Sciences
operations research · numerical analysis · optimization · mathematics