Online Two-Stage Submodular Maximization

📅 2025-10-22
📈 Citations: 0
Influential: 0
🤖 AI Summary
This paper studies Online Two-Stage Submodular Maximization (O2SSM): given a sequence of submodular functions arriving online, the goal is to dynamically maintain a feasible base set under matroid constraints so as to maximize, in expectation, the value attained by a subsequently drawn random objective function when optimized over the restricted set. We introduce the O2SSM framework for the class of weighted threshold potential functions, integrating online learning with submodular optimization via an algorithm that combines threshold sampling, gradient estimation, and a greedy strategy respecting the matroid constraints. Theoretically, we establish sublinear $(1-1/e)^2$-regret under general matroids and a tighter $(1-1/e)(1 - e^{-k} k^k / k!)$ bound under rank-$k$ uniform matroids; the latter matches the best-known approximation guarantee for the offline counterpart. Experiments on real-world datasets demonstrate that our method significantly outperforms baseline approaches.
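The greedy strategy referenced above builds on the classic greedy algorithm for monotone submodular maximization, which attains the $(1-1/e)$ factor under a rank-$k$ uniform matroid (i.e., a cardinality constraint). A minimal sketch of that building block — the toy coverage objective and element names below are illustrative, not from the paper:

```python
def greedy_max(f, ground, k):
    """Greedy for monotone submodular f under |S| <= k (rank-k uniform matroid).

    Repeatedly adds the element with the largest marginal gain; for monotone
    submodular f this achieves a (1 - 1/e) approximation.
    """
    S = set()
    for _ in range(k):
        best, gain = None, 0.0
        for e in ground - S:
            delta = f(S | {e}) - f(S)
            if delta > gain:
                best, gain = e, delta
        if best is None:  # no element improves the objective
            break
        S.add(best)
    return S

# Toy coverage objective: f(S) = number of items covered by the chosen sets.
cover = {1: {"a", "b"}, 2: {"b", "c"}, 3: {"c"}, 4: {"d"}}
f = lambda S: len(set().union(*(cover[e] for e in S))) if S else 0
print(greedy_max(f, set(cover), 2))
```

Under a general matroid constraint, the same loop would restrict candidates to elements keeping `S` independent, which is the setting the paper's $(1-1/e)^2$-regret bound addresses.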

📝 Abstract
Given a collection of monotone submodular functions, the goal of Two-Stage Submodular Maximization (2SSM) [Balkanski et al., 2016] is to restrict the ground set so an objective selected u.a.r. from the collection attains a high maximal value, on average, when optimized over the restricted ground set. We introduce the Online Two-Stage Submodular Maximization (O2SSM) problem, in which the submodular objectives are revealed in an online fashion. We study this problem for weighted threshold potential functions, a large and important subclass of monotone submodular functions that includes influence maximization, data summarization, and facility location, to name a few. We design an algorithm that achieves sublinear $(1 - 1/e)^2$-regret under general matroid constraints and $(1 - 1/e)(1-e^{-k}k^k/k!)$-regret in the case of uniform matroids of rank $k$; the latter also yields a state-of-the-art bound for the (offline) 2SSM problem. We empirically validate the performance of our online algorithm with experiments on real datasets.
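Facility location, one of the applications the abstract lists under weighted threshold potentials, takes the form $f(S) = \sum_i \max_{j \in S} v_{ij}$: each client $i$ gets the value of the best open facility. A minimal sketch that numerically verifies monotone submodularity (diminishing returns) on a made-up value matrix `v` — all names here are illustrative, not the authors' code:

```python
import itertools

def facility_location(S, v):
    # f(S) = sum over clients of the best value any open facility offers.
    return sum(max((row[j] for j in S), default=0.0) for row in v)

# Rows are clients, columns are candidate facilities (made-up values).
v = [[3.0, 1.0, 0.0],
     [0.0, 2.0, 2.0],
     [1.0, 0.0, 4.0]]
ground = set(range(3))

def subsets(it):
    it = list(it)
    return itertools.chain.from_iterable(
        itertools.combinations(it, r) for r in range(len(it) + 1))

# Diminishing returns: for A ⊆ B and e ∉ B, gain of e at B <= gain at A.
for A in subsets(ground):
    A = set(A)
    for e in ground - A:
        gain_A = facility_location(A | {e}, v) - facility_location(A, v)
        assert gain_A >= 0.0  # monotonicity
        for B in subsets(ground):
            B = set(B)
            if A <= B and e not in B:
                gain_B = facility_location(B | {e}, v) - facility_location(B, v)
                assert gain_B <= gain_A + 1e-9
```

Influence maximization and data summarization, also named in the abstract, admit similar threshold-potential forms, which is what lets one algorithm cover all three.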
Problem

Research questions and friction points this paper is trying to address.

Online optimization of two-stage submodular functions under matroid constraints
Addressing regret minimization for threshold potential functions in streaming data
Extending offline submodular maximization to dynamic online settings
Innovation

Methods, ideas, or system contributions that make the work stand out.

Online algorithm for submodular maximization with sublinear regret
Handles general matroid constraints with theoretical guarantees
Applies to weighted threshold potential functions in real datasets