Inferring Change Points in High-Dimensional Regression via Approximate Message Passing

📅 2024-04-11
📈 Citations: 0
Influential: 0
🤖 AI Summary
This paper addresses the precise localization of change points in high-dimensional generalized linear models (GLMs), including linear, logistic, and rectified linear (ReLU) regression. Existing methods struggle to jointly estimate model parameters and change-point locations in high dimensions. To overcome this, the authors propose a novel algorithm based on approximate message passing (AMP) and establish a rigorous high-dimensional asymptotic theory for AMP in change-point inference, deriving state evolution recursions that exactly characterize the Hausdorff estimation error of the change-point estimates. The framework naturally accommodates structured priors and enables approximate Bayesian posterior computation over change-point locations. Theoretically, performance is fully characterized in the proportional regime where $p \propto n$. Empirically, the method performs favorably against baselines on both synthetic and real-world datasets.

📝 Abstract
We consider the problem of localizing change points in a generalized linear model (GLM), a model that covers many widely studied problems in statistical learning including linear, logistic, and rectified linear regression. We propose a novel and computationally efficient Approximate Message Passing (AMP) algorithm for estimating both the signals and the change point locations, and rigorously characterize its performance in the high-dimensional limit where the number of parameters $p$ is proportional to the number of samples $n$. This characterization is in terms of a state evolution recursion, which allows us to precisely compute performance measures such as the asymptotic Hausdorff error of our change point estimates, and allows us to tailor the algorithm to take advantage of any prior structural information on the signals and change points. Moreover, we show how our AMP iterates can be used to efficiently compute a Bayesian posterior distribution over the change point locations in the high-dimensional limit. We validate our theory via numerical experiments, and demonstrate the favorable performance of our estimators on both synthetic and real data in the settings of linear, logistic, and rectified linear regression.
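For context on what a state evolution recursion looks like: in the standard AMP analysis of plain linear regression (a simpler setting than the change-point GLM studied in this paper, and with illustrative notation rather than the paper's own), the recursion tracks an effective noise level $\tau_t$ across iterations:

$$
\tau_{t+1}^2 \;=\; \sigma^2 \;+\; \frac{1}{\delta}\,\mathbb{E}\Big[\big(\eta_t(X + \tau_t Z) - X\big)^2\Big],
\qquad \delta = n/p,\quad Z \sim \mathcal{N}(0,1) \text{ independent of } X,
$$

where $\sigma^2$ is the observation noise variance, $\eta_t$ is the denoiser applied at iteration $t$, and $X$ is drawn from the signal prior. Asymptotic performance measures are then read off the fixed point of this scalar recursion; the paper extends this style of analysis to jointly track signal and change-point estimation.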
Problem

Research questions and friction points this paper is trying to address.

Localizing change points in generalized linear models
Estimating signals and change points via AMP
Characterizing performance in high-dimensional regression
Innovation

Methods, ideas, or system contributions that make the work stand out.

Approximate Message Passing algorithm for change points
State evolution recursion for performance characterization
Bayesian posterior computation for high-dimensional inference
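To make the AMP idea concrete, here is a minimal, hypothetical sketch of a textbook AMP iteration for sparse linear regression (soft-thresholding denoiser with the Onsager correction). This is not the paper's change-point algorithm; all dimensions, the sparsity level, and the threshold choice are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Proportional regime: p parameters, n samples, delta = n / p (illustrative sizes).
n, p = 500, 250
delta = n / p

# Sparse signal and Gaussian design with N(0, 1/n) entries (columns ~ unit norm).
x_true = rng.normal(size=p) * (rng.random(p) < 0.1)
A = rng.normal(scale=1.0 / np.sqrt(n), size=(n, p))
sigma = 0.05
y = A @ x_true + sigma * rng.normal(size=n)

def soft_threshold(v, t):
    """Soft-thresholding denoiser eta_t; also returns its average derivative."""
    out = np.sign(v) * np.maximum(np.abs(v) - t, 0.0)
    deriv = (np.abs(v) > t).mean()
    return out, deriv

# AMP iteration: the (1/delta) * <eta'> * z term is the Onsager correction,
# which keeps the effective noise in the iterates approximately Gaussian.
x = np.zeros(p)
z = y.copy()
tau = np.linalg.norm(z) / np.sqrt(n)  # empirical effective-noise estimate
for _ in range(30):
    x, d = soft_threshold(x + A.T @ z, 1.5 * tau)
    z = y - A @ x + (1.0 / delta) * d * z
    tau = np.linalg.norm(z) / np.sqrt(n)

mse = np.mean((x - x_true) ** 2)
```

The quantity `tau` tracked inside the loop is the empirical counterpart of the state evolution noise level, which is what lets AMP's performance be predicted exactly in the high-dimensional limit.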