Learning to Discover Iterative Spectral Algorithms

📅 2026-02-10
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work proposes AutoSpec, a framework for automatically discovering efficient iterative spectral algorithms tailored to large-scale numerical linear algebra and optimization tasks. AutoSpec combines self-supervised learning with executable matrix polynomial recurrences: a neural network uses coarse spectral information about the input operator to adaptively predict recurrence coefficients, thereby constructing task-specific iterative solvers. The key innovation is the fusion of classical minimax approximation theory with data-driven learning, enabling generalization from small-scale synthetic training problems to large-scale real-world operators. Experiments on real matrices show that the learned algorithms achieve orders-of-magnitude improvements over baselines in accuracy and/or iteration count, while exhibiting the near-equiripple minimax behavior characteristic of optimal spectral approximations.

📝 Abstract
We introduce AutoSpec, a neural network framework for discovering iterative spectral algorithms for large-scale numerical linear algebra and numerical optimization. Our self-supervised models adapt to input operators using coarse spectral information (e.g., eigenvalue estimates and residual norms), and they predict recurrence coefficients for computing or applying a matrix polynomial tailored to a downstream task. The effectiveness of AutoSpec relies on three ingredients: an architecture whose inference pass implements short, executable numerical linear algebra recurrences; efficient training on small synthetic problems with transfer to large-scale real-world operators; and task-defined objectives that enforce the desired approximation or preconditioning behavior across the range of spectral profiles represented in the training set. We apply AutoSpec to discovering algorithms for representative numerical linear algebra tasks: accelerating matrix-function approximation; accelerating sparse linear solvers; and spectral filtering/preconditioning for eigenvalue computations. On real-world matrices, the learned procedures deliver orders-of-magnitude improvements in accuracy and/or reductions in iteration count, relative to basic baselines. We also find clear connections to classical theory: the induced polynomials often exhibit near-equiripple, near-minimax behavior characteristic of Chebyshev polynomials.
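The abstract's core mechanism, predicting "recurrence coefficients for computing or applying a matrix polynomial", generalizes a classical hand-derived case: Chebyshev iteration, whose fixed three-term recurrence applies a scaled-and-shifted Chebyshev polynomial of the operator to the residual and attains the near-equiripple minimax behavior the paper observes in its learned polynomials. A minimal numpy sketch of that classical baseline (function name and parameters are illustrative, not from the paper):

```python
import numpy as np

def chebyshev_iteration(A, b, lam_min, lam_max, num_iters=50):
    """Solve A x = b for symmetric positive definite A via Chebyshev iteration.

    Requires coarse spectral information: bounds [lam_min, lam_max] on the
    spectrum of A. The fixed three-term recurrence below is the hand-derived
    analogue of the recurrence coefficients a learned model would predict.
    """
    theta = 0.5 * (lam_max + lam_min)   # center of the spectral interval
    delta = 0.5 * (lam_max - lam_min)   # half-width of the spectral interval
    sigma1 = theta / delta

    x = np.zeros_like(b)
    r = b - A @ x                        # initial residual
    rho = 1.0 / sigma1
    d = r / theta                        # first update direction
    for _ in range(num_iters):
        x = x + d
        r = r - A @ d
        rho_next = 1.0 / (2.0 * sigma1 - rho)
        # Three-term Chebyshev recurrence for the next update direction.
        d = rho_next * rho * d + (2.0 * rho_next / delta) * r
        rho = rho_next
    return x
```

Unlike Krylov methods such as conjugate gradients, this recurrence uses no inner products, only the spectral bounds, which is why coarse eigenvalue estimates suffice as input and why a learned coefficient predictor can be swapped in for the closed-form formulas.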
Problem

Research questions and friction points this paper is trying to address.

iterative spectral algorithms
numerical linear algebra
matrix-function approximation
spectral filtering
preconditioning
Innovation

Methods, ideas, or system contributions that make the work stand out.

iterative spectral algorithms
neural algorithm discovery
matrix polynomial approximation
self-supervised learning
numerical linear algebra