Progressively Sampled Equality-Constrained Optimization

πŸ“… 2025-09-30
πŸ“ˆ Citations: 0
✨ Influential: 0
πŸ€– AI Summary
This paper addresses continuous nonlinear equality-constrained optimization problems whose constraints are defined by expectations (or empirical averages) over a large number of terms. To reduce the high sample complexity and computational cost of conventional full-sample methods, the authors propose a progressive sampling strategy: starting from a small random sample, the sample set is incrementally expanded across iterations, and a sequential optimization framework is designed that leverages first- and second-order derivative information of the constraint functions. Under standard regularity assumptions, the authors establish theoretical guarantees showing that, with a sufficiently large initial sample, the method improves upon the worst-case sample complexity bound of the full-sample baseline. Numerical experiments on canonical test problems confirm the method's efficiency and practical feasibility.

πŸ“ Abstract
An algorithm is proposed, analyzed, and tested for solving continuous nonlinear-equality-constrained optimization problems where the constraints are defined by an expectation or an average over a large (finite) number of terms. The main idea of the algorithm is to solve a sequence of equality-constrained problems, each involving a finite sample of constraint-function terms, over which the sample set grows progressively. Under assumptions about the constraint functions and their first- and second-order derivatives that are reasonable in some real-world settings of interest, it is shown that -- with a sufficiently large initial sample -- solving a sequence of problems defined through progressive sampling yields a better worst-case sample complexity bound compared to solving a single problem with a full set of samples. The results of numerical experiments with a set of test problems demonstrate that the proposed approach can be effective in practice.
Problem

Research questions and friction points this paper is trying to address.

Solves nonlinear optimization with large-scale expectation constraints
Uses progressive sampling to improve the worst-case sample complexity
Improves efficiency for problems with finite but numerous constraint terms
Innovation

Methods, ideas, or system contributions that make the work stand out.

Progressively sampled equality-constrained optimization algorithm
Sequential solving with growing constraint sample sets
Improved worst-case sample complexity bounds
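The core loop described above, solving a sequence of sampled subproblems while the sample set grows geometrically and each solve is warm-started at the previous solution, can be sketched as follows. This is a minimal illustration on a hypothetical toy instance (a quadratic objective with a linear empirical-average constraint, solved by projected gradient descent), not the paper's algorithm or test problems; all names and parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical instance (not from the paper): minimize ||x - t||^2 subject to
# the empirical-average constraint (1/n) * sum_i (a_i @ x - b_i) = 0,
# evaluated over a growing sample drawn from the N available terms.
N, d = 10_000, 5
A = rng.normal(size=(N, d))   # per-term constraint rows a_i
b = rng.normal(size=N)        # per-term offsets b_i
t = rng.normal(size=d)        # objective target

def solve_subproblem(idx, x0, steps=200, lr=0.1):
    """Projected gradient descent for the sampled subproblem: the sampled
    average constraint reduces to the single hyperplane g @ x = c."""
    g = A[idx].mean(axis=0)
    c = b[idx].mean()
    x = x0.copy()
    for _ in range(steps):
        x = x - lr * 2.0 * (x - t)            # gradient step on ||x - t||^2
        x = x - ((g @ x - c) / (g @ g)) * g   # project back onto the hyperplane
    return x

def progressive_solve(n0=50, growth=2):
    """Grow the sample size geometrically, warm-starting each solve at the
    previous solution, until the full sample is used."""
    x, n = np.zeros(d), n0
    while True:
        idx = rng.choice(N, size=min(n, N), replace=False)
        x = solve_subproblem(idx, x)
        if n >= N:
            return x
        n *= growth

x_star = progressive_solve()
violation = abs((A @ x_star - b).mean())
print(f"full-sample constraint violation: {violation:.2e}")
```

Because the final round uses the full set of terms and ends with a projection onto the sampled constraint, the full-sample constraint violation is driven to numerical precision; the early, cheap rounds serve only to produce a good warm start. The paper's method additionally exploits second-order derivative information, which this first-order sketch omits.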