Fully First-Order Algorithms for Online Bilevel Optimization

📅 2026-02-12
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the challenge of expensive Hessian-vector product (HVP) computations in nonconvex-strongly-convex online bilevel optimization by reformulating the original problem as a single-level online optimization with inequality constraints. By constructing a sequence of Lagrangian functions, the authors propose the first fully first-order online bilevel optimization algorithm that completely eliminates the reliance on implicit differentiation and HVPs. The method integrates online convex optimization, adaptive inner-loop scheduling, and first-order gradient updates, without requiring the common assumption of bounded drift in the inner-level optimal solutions. Theoretical analysis shows that the base algorithm achieves a dynamic regret bound of $O(1 + V_T + H_{2,T})$, and an improved variant further attains a tighter bound of $O(\sqrt{T} + V_T)$ when $V_T \geq O(\sqrt{T})$.
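The value-function reformulation described above can be sketched on a toy problem. Everything below (the quadratic objectives, the fixed multiplier `lam`, the step sizes `eta`, and the inner-loop length `K`) is an illustrative assumption, not the paper's actual algorithm or schedule; it only shows how replacing the inner argmin with the inequality $g(x,y) \le \min_z g(x,z)$ and taking gradient steps on the Lagrangian $L(x,y,z) = f(x,y) + \lambda\,(g(x,y) - g(x,z))$ avoids implicit differentiation and HVPs entirely:

```python
# Toy bilevel instance (assumed, for illustration only):
#   outer  f(x, y) = 0.5*(x - 1)^2 + 0.5*y^2
#   inner  g(x, y) = 0.5*(y - x)^2, strongly convex in y with y*(x) = x,
# so the hyperobjective F(x) = f(x, y*(x)) is minimized at x = 0.5.

def fully_first_order_bilevel(T=500, K=50, eta=0.1, lam=10.0):
    """Gradient steps on the Lagrangian
        L(x, y, z) = f(x, y) + lam * (g(x, y) - g(x, z)),
    where z tracks argmin_z g(x, z). Only first-order derivatives of
    f and g appear -- no Hessian-vector products. A fixed multiplier
    lam leaves an O(1/lam) bias relative to the true bilevel solution;
    the step sizes and inner-loop length here are illustrative choices,
    not the paper's adaptive schedule."""
    x = y = z = 0.0
    for _ in range(T):
        # Inner loop: refresh y (minimizing f + lam*g over y) and
        # z (minimizing g over z) before each outer update on x.
        for _ in range(K):
            y -= eta * (y + lam * (y - x))   # d/dy [f(x,y) + lam*g(x,y)]
            z -= eta * (z - x)               # d/dz g(x, z)
        # Outer step: d/dx L = (x - 1) + lam*((x - y) - (x - z))
        x -= eta * ((x - 1.0) + lam * (z - y))
    return x, y

x, y = fully_first_order_bilevel()
print(round(x, 3), round(y, 3))  # x lands near 0.5, up to O(1/lam) bias
```

With this fixed `lam`, the iterates settle at the biased point $x = (1+\lambda)/(1+2\lambda) \approx 0.524$ rather than the exact $0.5$, which is why schemes of this kind grow the multiplier or adapt the inner loop over time.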

📝 Abstract
In this work, we study non-convex-strongly-convex online bilevel optimization (OBO). Existing OBO algorithms are mainly based on hypergradient descent, which requires access to a Hessian-vector product (HVP) oracle and potentially incurs high computational costs. By reformulating the original OBO problem as a single-level online problem with inequality constraints and constructing a sequence of Lagrangian functions, we eliminate the need for HVPs arising from implicit differentiation. Specifically, we propose a fully first-order algorithm for OBO, and provide theoretical guarantees showing that it achieves regret of $O(1 + V_T + H_{2,T})$. Furthermore, we develop an improved variant with an adaptive inner-iteration scheme, which removes the dependence on the drift variation of the inner-level optimal solution and achieves regret of $O(\sqrt{T} + V_T)$. This regret bound is advantageous when $V_{T}\ge O(\sqrt{T})$.
Problem

Research questions and friction points this paper is trying to address.

online bilevel optimization
non-convex-strongly-convex
Hessian-vector product
computational cost
first-order algorithm
Innovation

Methods, ideas, or system contributions that make the work stand out.

first-order algorithm
bilevel optimization
online optimization
Hessian-vector product
regret bound