Multinoulli Extension: A Lossless Continuous Relaxation for Partition-Constrained Subset Selection

📅 2026-03-22
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the problem of approximate submodular subset selection under partition constraints, which existing methods often struggle with due to high query complexity or reliance on hard-to-obtain structural parameters. The authors propose the Multinoulli-SCG algorithm, built upon a novel continuous relaxation framework termed the Multinoulli Extension, which transforms the discrete optimization problem into a continuous one without requiring prior knowledge of problem-specific parameters. This framework enables lossless rounding for arbitrary set functions and further yields two online variants—Multinoulli-OSCG and Multinoulli-OSGA—suitable for dynamic settings. With only $O(1/\varepsilon^2)$ function evaluations, the approach achieves a $(1 - e^{-\alpha})\text{OPT} - \varepsilon$ approximation guarantee for monotone $\alpha$-weakly DR-submodular objectives and corresponding theoretical guarantees for $(\gamma, \beta)$-weakly submodular functions.

📝 Abstract
Identifying the most representative subset for a close-to-submodular objective while satisfying a predefined partition constraint is a fundamental task with numerous applications in machine learning. However, existing distorted local-search methods are often hindered by their prohibitive query complexities and their rigid requirement for prior knowledge of difficult-to-obtain structural parameters. To overcome these limitations, we introduce a novel algorithm called Multinoulli-SCG, which is not only parameter-free but also achieves the same approximation guarantees as the distorted local-search methods with significantly fewer function evaluations. More specifically, when the objective function is monotone $α$-weakly DR-submodular or $(γ,β)$-weakly submodular, our Multinoulli-SCG algorithm can attain a value of $(1-e^{-α})\text{OPT}-ε$ or $(\frac{γ^{2}(1-e^{-(β(1-γ)+γ^2)})}{β(1-γ)+γ^2})\text{OPT}-ε$, respectively, with only $O(1/ε^{2})$ function evaluations, where OPT denotes the optimal value. The cornerstone of our Multinoulli-SCG algorithm is an innovative continuous-relaxation framework named Multinoulli Extension (ME), which effectively converts the discrete subset selection problem subject to partition constraints into a solvable continuous maximization focused on learning the optimal multinoulli priors across the concerned partition. In sharp contrast with the well-established multilinear extension for submodular subset selection, a notable advantage of our proposed ME is its intrinsic capacity to provide a lossless rounding scheme for any set function. Furthermore, based on our proposed ME, we also present two novel online algorithms, namely Multinoulli-OSCG and Multinoulli-OSGA, for the previously unexplored online subset selection problem over partition constraints.
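To make the abstract's core idea concrete, the following is a minimal sketch of what a Multinoulli-Extension-style relaxation could look like: each partition block carries a multinoulli (categorical) prior, the extension value $F(p)=\mathbb{E}_{S\sim p}[f(S)]$ is estimated by Monte-Carlo sampling one element per block, and "lossless rounding" is simply drawing a feasible set from the learned priors. All names here (the coverage objective, block layout, and helper functions) are illustrative assumptions, not the authors' implementation.

```python
import random

def sample_subset(blocks, probs, rng):
    """Draw one element per partition block from its multinoulli prior."""
    return [rng.choices(block, weights=p, k=1)[0]
            for block, p in zip(blocks, probs)]

def multinoulli_extension(f, blocks, probs, n_samples=2000, seed=0):
    """Monte-Carlo estimate of F(p) = E_{S ~ p}[f(S)]."""
    rng = random.Random(seed)
    return sum(f(sample_subset(blocks, probs, rng))
               for _ in range(n_samples)) / n_samples

# Toy monotone coverage objective: number of distinct items covered.
universe = {0: {1, 2}, 1: {2, 3}, 2: {3, 4}, 3: {1, 4}}
f = lambda S: len(set().union(*(universe[e] for e in S)))

blocks = [[0, 1], [2, 3]]          # partition constraint: one pick per block
probs = [[0.5, 0.5], [0.5, 0.5]]   # uniform multinoulli priors

estimate = multinoulli_extension(f, blocks, probs)

# "Lossless rounding": any sample from the priors is already a feasible
# set respecting the partition constraint, so no rounding loss occurs.
rounded = sample_subset(blocks, probs, random.Random(1))
```

Since every feasible set here covers either 3 or 4 items, the estimate necessarily lies between those two values; in the paper's setting, the priors would be optimized by stochastic continuous gradient steps rather than held fixed.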
Problem

Research questions and friction points this paper is trying to address.

subset selection
partition constraints
submodular optimization
representative subset
combinatorial optimization
Innovation

Methods, ideas, or system contributions that make the work stand out.

Multinoulli Extension
continuous relaxation
partition-constrained subset selection
weak submodularity
lossless rounding