🤖 AI Summary
This work addresses the lack of efficient, compatible, and user-friendly self-scaling quasi-Newton optimizers in the JAX ecosystem. Building upon the Optimistix library, the authors systematically implement the Broyden family of quasi-Newton methods—including BFGS, DFP, Broyden's method, and their self-scaling variants—and integrate them with a Zoom line search satisfying the strong Wolfe conditions. While the contribution is not algorithmically novel, this note presents the first complete integration of self-scaling Broyden-family methods within JAX, significantly enhancing their reusability and accessibility for the community. The resulting optimizers are open source, interoperate with the existing Optimistix solver interface, and enable JAX users to directly employ a variety of quasi-Newton methods for efficient optimization.
📝 Abstract
We present a JAX implementation of the Self-Scaled Broyden family of quasi-Newton methods, built on the Optimistix~\cite{rader_optimistix_2024} optimisation library. The implementation includes BFGS, DFP, and Broyden's method together with their Self-Scaled variants (SSBFGS, SSDFP, SSBroyden), as well as a Zoom line search satisfying the strong Wolfe conditions. This is a short technical note, not a research paper, as it does not claim any novel contribution; its purpose is to document the implementation and ease the adoption of these optimisers within the JAX community. The code is available at https://github.com/IvanBioli/ssbroyden_optimistix.git.
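To illustrate the core idea behind the self-scaled variants, the sketch below implements the SSBFGS update on the inverse-Hessian approximation in plain NumPy: the standard BFGS update is preceded by rescaling with the Oren–Luenberger factor \(\tau_k = s_k^\top y_k / (y_k^\top H_k y_k)\). This is a minimal illustration on a quadratic with an exact line search, not the repository's JAX/Optimistix code; the function name `ssbfgs_minimize` and all details here are ours.

```python
import numpy as np

def ssbfgs_minimize(A, b, x0, iters=20):
    """Minimise the quadratic f(x) = 0.5 x^T A x - b^T x using a
    self-scaled BFGS (SSBFGS) update of the inverse Hessian H.
    Illustrative sketch only, not the repository's implementation."""
    n = len(b)
    x = np.asarray(x0, dtype=float)
    H = np.eye(n)                       # inverse-Hessian approximation
    g = A @ x - b                       # gradient of the quadratic
    for _ in range(iters):
        if np.linalg.norm(g) < 1e-12:   # already converged
            break
        p = -H @ g                      # quasi-Newton search direction
        alpha = -(g @ p) / (p @ A @ p)  # exact line search (quadratic case)
        s = alpha * p
        x_new = x + s
        g_new = A @ x_new - b
        y = g_new - g
        sy = s @ y
        if sy <= 1e-12:                 # curvature-condition safeguard
            break
        tau = sy / (y @ H @ y)          # Oren-Luenberger self-scaling factor
        rho = 1.0 / sy
        V = np.eye(n) - rho * np.outer(s, y)
        # Scale H before applying the usual BFGS inverse update
        H = tau * (V @ H @ V.T) + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x_star = ssbfgs_minimize(A, b, np.zeros(2))
```

In the actual implementation the exact line search used above is replaced by the Zoom line search enforcing the strong Wolfe conditions, and the update is expressed through Optimistix's solver abstractions so that it composes with JAX transformations.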