A Multi-Level Framework for Multi-Objective Hypergraph Partitioning: Combining Minimum Spanning Tree and Proximal Gradient

📅 2025-09-26
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses two key challenges in multi-objective hypergraph partitioning: susceptibility to local optima and insufficient vertex feature diversity. Methodologically, we propose a novel partitioning framework based on nonconvex constraint relaxation and proximal gradient optimization. Our approach integrates an accelerated proximal gradient algorithm with a Prim-built minimum spanning tree (MST), enabling a hierarchical partitioning strategy and recursive MST-based clustering. Key techniques—including vertex subset sampling, greedy migration, and swap-based refinement—are incorporated to enhance partition quality. Experiments on public benchmarks show that our method reduces cut edge count by 2–5% over KaHyPar, with gains up to 35% on specific instances; it also outperforms hMetis on weighted vertex sets and achieves up to 16% improvement when refining hMetis solutions. The core contribution lies in the first synergistic integration of MST-guided structure learning and nonconvex proximal optimization for hypergraph partitioning, significantly improving both feature diversity and global partitioning capability.

📝 Abstract
This paper proposes an efficient hypergraph partitioning framework based on a novel multi-objective non-convex constrained relaxation model. A modified accelerated proximal gradient algorithm is employed to generate diverse $k$-dimensional vertex features, avoiding local optima and enhancing partition quality. Two MST-based strategies are designed for different data scales: for small-scale data, the Prim algorithm constructs a minimum spanning tree that is then pruned and clustered; for large-scale data, a subset of representative nodes is selected to build a smaller MST, and the remaining nodes are assigned accordingly to reduce complexity. To further improve the results, refinement strategies including greedy migration, swapping, and recursive MST-based clustering are applied to the partitions. Experimental results on public benchmark sets demonstrate that the proposed algorithm reduces cut size by approximately 2–5% on average compared to KaHyPar in 2-, 3-, and 4-way partitioning, with improvements of up to 35% on specific instances. On weighted vertex sets in particular, the algorithm outperforms state-of-the-art partitioners including KaHyPar, hMetis, Mt-KaHyPar, and K-SpecPart, highlighting its superior partitioning quality and competitiveness. Furthermore, the proposed refinement strategy improves hMetis partitions by up to 16%. A comprehensive evaluation based on virtual-instance methodology and parameter sensitivity analysis validates the algorithm's competitiveness and characterizes its performance trade-offs.
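The "accelerated proximal gradient" step the abstract refers to follows the standard FISTA pattern: a gradient step on the smooth term, a proximal step on the nonsmooth term, and a momentum extrapolation. A minimal generic sketch of that pattern, demonstrated on a toy lasso problem rather than the paper's modified variant or its hypergraph objective (the toy data `b` and `lam` are made up for illustration):

```python
import math

def fista(grad_f, prox_g, x0, step, n_iter=300):
    """Accelerated proximal gradient (FISTA-style) for min f(x) + g(x),
    where f is smooth with gradient grad_f and g has proximal map prox_g."""
    x_prev = list(x0)
    x = list(x0)
    y = list(x0)
    t = 1.0
    for _ in range(n_iter):
        g = grad_f(y)
        # Proximal step on the gradient-updated point.
        x_prev, x = x, prox_g([yi - step * gi for yi, gi in zip(y, g)], step)
        # Momentum extrapolation (Nesterov acceleration).
        t_next = (1 + math.sqrt(1 + 4 * t * t)) / 2
        y = [xi + ((t - 1) / t_next) * (xi - xpi) for xi, xpi in zip(x, x_prev)]
        t = t_next
    return x

# Toy problem: lasso with A = I, i.e. min 0.5*||x - b||^2 + lam*||x||_1.
b, lam = [1.0, 0.05, -2.0], 0.1
grad = lambda x: [xi - bi for xi, bi in zip(x, b)]
# Prox of lam*||.||_1 is componentwise soft-thresholding.
prox = lambda z, s: [math.copysign(max(abs(zi) - s * lam, 0.0), zi) for zi in z]
x_star = fista(grad, prox, [0.0, 0.0, 0.0], step=1.0)
# Closed-form solution is the soft-threshold of b: [0.9, 0.0, -1.9]
```

With `A = I` the smooth term has Lipschitz constant 1, so `step=1.0` is a valid step size and the iterates converge to the componentwise soft-threshold of `b`.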
Problem

Research questions and friction points this paper is trying to address.

Proposes multi-objective hypergraph partitioning framework
Combines minimum spanning tree with proximal gradient
Improves partitioning quality and reduces cut size
Innovation

Methods, ideas, or system contributions that make the work stand out.

Multi-objective hypergraph partitioning with non-convex relaxation
Accelerated proximal gradient generates diverse vertex features
MST-based strategies and refinement for different data scales
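The small-scale MST strategy (Prim construction, then pruning into clusters) can be illustrated with the standard technique: build the minimum spanning tree, cut its k-1 heaviest edges, and take the remaining connected components as clusters. This is a generic sketch, not the paper's exact pruning procedure; the toy graph and weights are made up.

```python
import heapq

def prim_mst(n, adj):
    """Prim's algorithm on a connected weighted graph.
    adj maps each vertex to a list of (weight, neighbor) pairs.
    Returns the MST as a list of (weight, u, v) edges."""
    visited = [False] * n
    visited[0] = True
    heap = [(w, 0, v) for w, v in adj[0]]
    heapq.heapify(heap)
    mst = []
    while heap and len(mst) < n - 1:
        w, u, v = heapq.heappop(heap)
        if visited[v]:
            continue
        visited[v] = True
        mst.append((w, u, v))
        for w2, x in adj[v]:
            if not visited[x]:
                heapq.heappush(heap, (w2, v, x))
    return mst

def mst_clusters(n, adj, k):
    """Cut the k-1 heaviest MST edges; the k connected components
    that remain are the clusters. Returns a label per vertex."""
    kept = sorted(prim_mst(n, adj))[: n - k]  # drop the k-1 heaviest edges
    parent = list(range(n))
    def find(x):  # union-find with path compression
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    for _, u, v in kept:
        parent[find(u)] = find(v)
    roots = [find(i) for i in range(n)]
    remap = {r: i for i, r in enumerate(dict.fromkeys(roots))}
    return [remap[r] for r in roots]

# Toy graph: vertices 0-1-2 tightly linked, vertex 3 attached by a heavy edge.
adj = {0: [(1, 1)],
       1: [(1, 0), (1, 2)],
       2: [(1, 1), (10, 3)],
       3: [(10, 2)]}
labels = mst_clusters(4, adj, k=2)  # -> [0, 0, 0, 1]
```

The heavy (weight 10) edge is the one the pruning removes, so the dense triangle and the outlier vertex land in separate clusters.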
Yingying Li
UIUC
online control, learning-based control, online learning, safe learning, distributed control
Mingxuan Xie
Xidian University, Xi’an, Shaanxi, China
Hailong You
Xidian University, Xi’an, Shaanxi, China
Yongqiang Yao
Shihezi University, Shihezi, Xinjiang, China
Hongwei Liu
Xidian University, Xi’an, Shaanxi, China