Sharper Convergence Rates for Nonconvex Optimisation via Reduction Mappings

📅 2025-06-10
📈 Citations: 0
Influential: 0
🤖 AI Summary
In high-dimensional nonconvex optimisation, optima often form smooth manifolds, arising from overparameterisation or symmetry, which leads to ill-conditioned objectives and slow convergence. This paper proposes a dimensionality-reduction optimisation framework based on reduction mappings: it locally reparameterises the parameter space onto the solution manifold, eliminating redundant degrees of freedom and improving the objective's curvature. The authors give a unified theoretical analysis of how reduction mappings reshape the optimisation landscape, rigorously proving that well-designed reductions can significantly reduce the Hessian condition number and accelerate the convergence of gradient-based algorithms. The framework provides principled explanations for diverse structured acceleration techniques, including weight tying, symmetry constraints, and implicit differentiation, thereby bridging the gap between geometric priors and the empirical gains observed in optimisation practice.

📝 Abstract
Many high-dimensional optimisation problems exhibit rich geometric structures in their set of minimisers, often forming smooth manifolds due to over-parametrisation or symmetries. When this structure is known, at least locally, it can be exploited through reduction mappings that reparametrise part of the parameter space to lie on the solution manifold. These reductions naturally arise from inner optimisation problems and effectively remove redundant directions, yielding a lower-dimensional objective. In this work, we introduce a general framework to understand how such reductions influence the optimisation landscape. We show that well-designed reduction mappings improve curvature properties of the objective, leading to better-conditioned problems and theoretically faster convergence for gradient-based methods. Our analysis unifies a range of scenarios where structural information at optimality is leveraged to accelerate convergence, offering a principled explanation for the empirical gains observed in such optimisation algorithms.
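The conditioning benefit described in the abstract can be illustrated on a toy problem. The quadratic below, the constant `L`, and the closed-form inner solution `y*(x) = x` are illustrative assumptions, not taken from the paper: partially minimising over the stiff direction `y` yields a reduced objective whose Hessian is far better conditioned, so gradient descent converges in far fewer iterations.

```python
import numpy as np

# Hypothetical toy objective with an ill-conditioned coupling (not from the
# paper): f(x, y) = 0.5*x**2 + 0.5*L*(y - x)**2 with a large stiffness L.
L = 100.0

def f_grad(z):
    x, y = z
    return np.array([x + L * (x - y), L * (y - x)])

# Full 2x2 Hessian of f and its condition number (roughly 4*L for large L).
H_full = np.array([[1.0 + L, -L], [-L, L]])
kappa_full = np.linalg.cond(H_full)

# Reduction mapping from the inner problem: y*(x) = argmin_y f(x, y) = x.
# The reduced objective g(x) = f(x, y*(x)) = 0.5*x**2 has Hessian 1, so the
# reduced problem is perfectly conditioned.
def g_grad(x):
    return x

def gd(grad, z0, step, tol=1e-8, max_iter=100_000):
    """Plain gradient descent; returns the iteration count at convergence."""
    z, it = np.asarray(z0, dtype=float), 0
    while np.linalg.norm(grad(z)) > tol and it < max_iter:
        z = z - step * grad(z)
        it += 1
    return it

# Step size 1/lambda_max for the full problem, 1 for the reduced one.
iters_full = gd(f_grad, [1.0, 0.0], 1.0 / np.linalg.eigvalsh(H_full)[-1])
iters_reduced = gd(g_grad, [1.0], 1.0)

print(f"condition number: full ~ {kappa_full:.0f}, reduced = 1")
print(f"GD iterations:    full = {iters_full}, reduced = {iters_reduced}")
```

On this sketch the reduced objective converges in a single step while the full problem needs thousands of iterations, matching the abstract's claim that removing redundant directions improves curvature and speeds up gradient-based methods.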
Problem

Research questions and friction points this paper is trying to address.

Exploiting geometric structure in the set of minimisers for optimisation
Improving objective curvature via reduction mappings for faster convergence
Unifying scenarios that leverage structural information at optimality to accelerate convergence
Innovation

Methods, ideas, or system contributions that make the work stand out.

Reduction mappings that exploit geometric structure in the minimiser set
Reparametrising the parameter space onto the solution manifold
Improved curvature properties yielding provably faster convergence
🔎 Similar Papers
No similar papers found.