🤖 AI Summary
Classical convex optimization in ℝⁿ cannot describe minimization problems whose infimum is attained at no finite point; only a sequence heading off to infinity can approach it. Traditional compactifications of ℝⁿ lack either minimality or the analytical expressiveness needed for convex analysis.
Method: This paper introduces the **astral space**, the minimal compactification of ℝⁿ that rigorously incorporates "points at infinity" while preserving essential convex-analytic structure. It allows every linear function to be extended continuously, and it supports a systematic development of convexity, Fenchel conjugacy, and subdifferentials at the points at infinity.
Results: Theoretically, the paper establishes necessary and sufficient conditions for continuity of convex functions on astral space, characterizes the detailed structure of their sets of minimizers, and gives convergence guarantees for descent algorithms. Practically, the framework yields a rigorous, unified analysis of unbounded convex optimization that naturally encompasses both finite and infinite minimizers, overcoming the expressive limitations of prior compactification approaches in convex analysis.
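To make the construction concrete, here is a minimal sketch (our illustration; the notation is not taken from the paper): a sequence marching off in a fixed direction has no limit in ℝⁿ, yet every linear function has a well-defined limit in the extended reals along it, which is exactly the property astral space is built to capture.

```latex
% Illustration: along the sequence x_t = t v (direction v in R^n),
% every linear function u |-> <u, x> converges in [-inf, +inf]:
\[
  \lim_{t \to \infty} \langle u, t v \rangle =
  \begin{cases}
    +\infty & \text{if } \langle u, v \rangle > 0,\\
    0       & \text{if } \langle u, v \rangle = 0,\\
    -\infty & \text{if } \langle u, v \rangle < 0.
  \end{cases}
\]
% Astral space adjoins just enough limit points that all such sequences
% converge and all linear maps extend continuously, with values in [-inf, +inf].
```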
📝 Abstract
Not all convex functions on $\mathbb{R}^n$ have finite minimizers; some can only be minimized by a sequence as it heads to infinity. In this work, we aim to develop a theory for understanding such minimizers at infinity. We study astral space, a compact extension of $\mathbb{R}^n$ to which such points at infinity have been added. Astral space is constructed to be as small as possible while still ensuring that all linear functions can be continuously extended to the new space. Although astral space includes all of $\mathbb{R}^n$, it is not a vector space, nor even a metric space. However, it is sufficiently well-structured to allow useful and meaningful extensions of concepts of convexity, conjugacy, and subdifferentials. We develop these concepts and analyze various properties of convex functions on astral space, including the detailed structure of their minimizers, exact characterizations of continuity, and convergence of descent algorithms.
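A minimal numerical sketch of the phenomenon the abstract describes (our own example, not from the paper): $f(x) = e^{-x}$ is convex on $\mathbb{R}$ with infimum $0$, but no finite $x$ attains it, so gradient descent "minimizes" $f$ only by sending its iterates toward infinity.

```python
import math

# f(x) = exp(-x): convex, infimum 0, but attained by no finite x.
def f(x):
    return math.exp(-x)

def grad(x):
    return -math.exp(-x)

x = 0.0
step = 1.0
for _ in range(1000):
    x -= step * grad(x)  # the gradient is always negative, so x grows

# The iterates diverge (x grows without bound as steps continue),
# while the objective values approach the infimum 0.
print(x, f(x))
```

In astral-space terms, the iterate sequence converges not to a point of $\mathbb{R}$ but to a point at infinity, which is precisely the kind of minimizer the paper's theory is built to handle.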