🤖 AI Summary
We address change-point detection in dynamic networks whose expected adjacency matrices are low-rank and whose changes are sparse. We propose MOSAIC, an inferential framework that builds its test statistic from an eigen-decomposition with signal screening, refined by a residual-based adjustment; by the martingale central limit theorem, the resulting statistic is pivotal and asymptotically standard normal under the null. Theoretically, we establish a minimax detection boundary that depends on the sparsity of the change, and MOSAIC attains this boundary up to a logarithmic factor. Empirically, the test is sensitive to subtle structural shifts and shows strong power and robustness on synthetic and real-world network data. The key contribution is unifying low-rank modeling, sparse-signal screening, and pivotal-statistic construction in a single framework for change-point detection in dynamic networks, achieving both near-optimal statistical guarantees and computational feasibility.
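The testing problem sketched above can be written schematically. The notation below is our own shorthand for illustration, not the paper's:

```latex
% Observe symmetric adjacency matrices A_1,\dots,A_T with
% \mathbb{E}[A_t] = P_t. The change-point hypotheses are
H_0:\; P_1 = \cdots = P_T
\quad\text{vs.}\quad
H_1:\; \exists\,\tau:\; P_1 = \cdots = P_\tau \neq P_{\tau+1} = \cdots = P_T,
% where each P_t is low-rank and the change P_{\tau+1} - P_\tau is sparse.
% Under H_0, the residual-adjusted statistic T_n is pivotal:
T_n \;\xrightarrow{\;d\;}\; \mathcal{N}(0,1).
```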
📝 Abstract
We propose a new inference framework, named MOSAIC, for change-point detection in dynamic networks with a simultaneous low-rank and sparse-change structure. We establish the minimax detection boundary, which depends on the sparsity of the changes. We then develop an eigen-decomposition-based test with screened signals that attains the minimax rate in theory, up to a minor logarithmic loss. For practical implementation of MOSAIC, we adjust the theoretical test with a novel residual-based technique, yielding a pivotal statistic that converges to a standard normal distribution under the null hypothesis via the martingale central limit theorem and achieves full power under the alternative hypothesis. We also analyze the minimax testing boundary for dynamic networks without the low-rank structure, which nearly matches known results for high-dimensional mean-vector change-point inference. We showcase the effectiveness of MOSAIC and verify our theoretical results with several simulation examples and a real-data application.
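To make the eigen-decomposition-with-screening idea concrete, here is a minimal sketch on simulated data. This is not the paper's MOSAIC statistic (in particular, the residual adjustment and the exact screening level are omitted); the rank-1 model, block change, screening constant `c`, and CUSUM weighting are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_networks(n=60, T=40, tau=20, shift=0.4):
    """Symmetric Bernoulli adjacency matrices from a rank-1 probability
    model; after time tau, a small block of edge probabilities shifts
    (a sparse change on top of the low-rank baseline)."""
    u = rng.uniform(0.3, 0.7, size=n)
    P0 = np.outer(u, u)                                 # low-rank baseline
    P1 = P0.copy()
    P1[:8, :8] = np.clip(P1[:8, :8] + shift, 0.0, 1.0)  # sparse change block
    A = np.empty((T, n, n))
    for t in range(T):
        P = P0 if t < tau else P1
        upper = (rng.random((n, n)) < P).astype(float)
        A[t] = np.triu(upper, 1)
        A[t] += A[t].T                                  # symmetrize, no self-loops
    return A

def screened_spectral_cusum(A, t, c=0.5):
    """CUSUM-type statistic at candidate split t: screen small entries of
    the before/after mean-difference matrix (sparse-signal screening),
    then take its largest absolute eigenvalue (spectral step)."""
    T, n, _ = A.shape
    D = A[:t].mean(axis=0) - A[t:].mean(axis=0)
    thr = c * np.sqrt(np.log(n) / min(t, T - t))        # entrywise screening level
    D = np.where(np.abs(D) > thr, D, 0.0)
    scale = np.sqrt(t * (T - t) / T)                    # classical CUSUM weighting
    return scale * np.max(np.abs(np.linalg.eigvalsh(D)))

A = simulate_networks()
stats = {t: screened_spectral_cusum(A, t) for t in range(5, 36)}
tau_hat = max(stats, key=stats.get)
print("estimated change point:", tau_hat)
```

With the planted change at time 20, the screened spectral statistic peaks near the true change point; rejecting when the maximum exceeds a calibrated threshold turns this into a test of the null of no change.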