🤖 AI Summary
This paper studies the densest subhypergraph problem with arbitrary monotone (not necessarily convex) local hyperedge rewards, generalizing beyond the conventional restriction to convex rewards. Since the problem is NP-hard in this setting, the authors propose two 1/k-approximation algorithms with theoretical guarantees, where k denotes the maximum hyperedge size. The first constructs a surrogate objective by projecting to the nearest set of convex rewards, enabling non-convex rewards to be handled effectively. The second employs an adaptive peeling strategy that departs from the standard greedy peeling paradigm to achieve the same guarantee more efficiently. To the authors' knowledge, this work establishes the first approximation framework for densest subhypergraph under arbitrary monotone non-convex hyperedge rewards, and proves that the obtained approximation ratio is optimal. Extensive experiments on multiple real-world hypergraph datasets demonstrate the efficiency and robustness of the proposed algorithms.
📝 Abstract
We consider a generalization of the densest subhypergraph problem in which nonnegative rewards are given for including partial hyperedges in a dense subhypergraph. Prior work addressed this problem only when the reward functions are convex, in which case it is solvable in polynomial time. We consider the broader setting where rewards are monotone but otherwise arbitrary. We first prove hardness results for a wide class of non-convex rewards, then design a 1/k-approximation by projecting to the nearest set of convex rewards, where k is the maximum hyperedge size. We also design another 1/k-approximation using a faster peeling algorithm, which (somewhat surprisingly) differs from the standard greedy peeling algorithm used to approximate other variants of the densest subgraph problem. Our results include an empirical analysis of our algorithms on several real-world hypergraphs.
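For context, the "standard greedy peeling algorithm" the abstract contrasts against is the classic baseline for ordinary (graph) densest subgraph: repeatedly delete a minimum-degree vertex and keep the best density |E(S)|/|S| seen along the way. The sketch below illustrates that baseline only; the function name and edge-list representation are our own, and this is not the paper's algorithm:

```python
from collections import defaultdict

def greedy_peel_density(edges):
    """Standard greedy peeling for densest subgraph (baseline, not the
    paper's method): peel a minimum-degree vertex at each step and
    return the best edge density |E(S)|/|S| over all intermediate sets."""
    # Build an undirected adjacency structure from the edge list.
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    vertices = set(adj)
    m = len(edges)
    best = m / len(vertices) if vertices else 0.0
    while vertices:
        # Pick a vertex of minimum current degree and remove it.
        v = min(vertices, key=lambda x: len(adj[x]))
        m -= len(adj[v])
        for u in adj[v]:
            adj[u].discard(v)
        del adj[v]
        vertices.discard(v)
        # Record the density of the remaining induced subgraph.
        if vertices:
            best = max(best, m / len(vertices))
    return best

# Example: K4 with a pendant vertex attached; peeling the pendant
# exposes the denser K4 core (6 edges over 4 vertices).
print(greedy_peel_density([(1, 2), (1, 3), (1, 4),
                           (2, 3), (2, 4), (3, 4), (4, 5)]))
```

This simple procedure is a 1/2-approximation for graphs, which is what makes the abstract's observation notable: in the generalized hyperedge-reward setting, a peeling rule different from plain minimum-degree peeling is needed to obtain the 1/k guarantee.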