ADMM for Structured Fractional Minimization

📅 2024-11-12
🏛️ arXiv.org
📈 Citations: 1
Influential: 0
📄 PDF
🤖 AI Summary
This paper studies a class of structured fractional minimization problems in which the numerator comprises a differentiable term, a simple nonconvex nonsmooth term, a concave nonsmooth term, and a convex nonsmooth term composed with a linear operator, while the denominator is either weakly convex or has a weakly convex square root. To address the slow convergence and poor numerical stability of existing subgradient and smoothing proximal gradient methods, the authors propose FADMM, the first Alternating Direction Method of Multipliers tailored to this structure, in two variants: a Dinkelbach-type method (FADMM-D) and a quadratic-transform-type method (FADMM-Q). By constructing a novel Lyapunov function, they establish convergence to an ε-approximate critical point within an oracle complexity of O(1/ε³). Experiments on sparse Fisher discriminant analysis, robust Sharpe ratio minimization, and robust sparse recovery show that FADMM outperforms baseline methods in both convergence speed and numerical stability.
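
For orientation, the problem class can be sketched in illustrative notation (the symbols below are ours, not necessarily the paper's): f differentiable, h simple nonconvex nonsmooth, p concave nonsmooth, g convex nonsmooth, A a linear operator, and d the denominator described above.

```latex
\min_{x} \; \frac{f(x) + h(x) + p(x) + g(Ax)}{d(x)}
```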

📝 Abstract
This paper considers a class of structured fractional minimization problems. The numerator consists of a differentiable function, a simple nonconvex nonsmooth function, a concave nonsmooth function, and a convex nonsmooth function composed with a linear operator. The denominator is a continuous function that is either weakly convex or has a weakly convex square root. These problems are prevalent in various important applications in machine learning and data science. Existing methods, primarily based on subgradient methods and smoothing proximal gradient methods, often suffer from slow convergence and numerical stability issues. In this paper, we introduce FADMM, the first Alternating Direction Method of Multipliers tailored for this class of problems. FADMM decouples the original problem into linearized proximal subproblems, featuring two variants: one using Dinkelbach's parametric method (FADMM-D) and the other using the quadratic transform method (FADMM-Q). By introducing a novel Lyapunov function, we establish that FADMM converges to $\epsilon$-approximate critical points of the problem within an oracle complexity of $\mathcal{O}(1/\epsilon^{3})$. Extensive experiments on synthetic and real-world datasets, including sparse Fisher discriminant analysis, robust Sharpe ratio minimization, and robust sparse recovery, demonstrate the effectiveness of our approach.

Keywords: Fractional Minimization, Nonconvex Optimization, Proximal Linearized ADMM, Nonsmooth Optimization, Convergence Analysis
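
For context, FADMM-D builds on Dinkelbach's classical parametric scheme. The textbook form of that scheme (not the paper's exact linearized ADMM update) alternates between a parametric subproblem and a ratio update, where $N$ denotes the full numerator and $d$ the denominator:

```latex
x^{k+1} \in \arg\min_{x} \; N(x) - \lambda^{k} d(x),
\qquad
\lambda^{k+1} = \frac{N(x^{k+1})}{d(x^{k+1})}
```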
Problem

Research questions and friction points this paper is trying to address.

Solving structured fractional minimization with nonconvex nonsmooth components
Addressing slow convergence in existing fractional optimization methods
Extending ADMM to fractional objectives arising in machine learning and data science
Innovation

Methods, ideas, or system contributions that make the work stand out.

ADMM for structured fractional minimization problems
Decouples problem into linearized proximal subproblems
Novel Lyapunov function yields convergence to ε-approximate critical points with O(1/ε³) oracle complexity (see the sketch below)
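
As a rough illustration only, here is a minimal Dinkelbach-type outer loop in Python. The callables `N`, `D`, and `inner_solve` are hypothetical placeholders: `N(x)/D(x)` is the fractional objective, and `inner_solve` stands in for FADMM's linearized proximal ADMM steps on the parametric subproblem. This is a sketch of the general Dinkelbach strategy, not the paper's implementation.

```python
# Sketch of a Dinkelbach-type outer loop, as in FADMM-D's parametric
# strategy. N, D, and inner_solve are hypothetical placeholders, not
# the paper's actual subroutines.

def dinkelbach_outer_loop(N, D, inner_solve, x0, tol=1e-8, max_iter=100):
    x = x0
    lam = N(x) / D(x)  # current estimate of the optimal ratio
    for _ in range(max_iter):
        # Approximately minimize the parametric objective N(x) - lam * D(x)
        x = inner_solve(lam, x)
        new_lam = N(x) / D(x)
        if abs(new_lam - lam) <= tol:  # ratio has stabilized
            break
        lam = new_lam
    return x, lam
```

FADMM-Q follows a similar alternating pattern but handles the ratio via the quadratic transform instead of the parametric objective above.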