🤖 AI Summary
To address the challenge of modeling high-dimensional, time-varying parameters in time-varying exponential family models—such as dynamic graphical models—this paper proposes temporal score matching: rather than estimating the parameters at each time point, it directly learns their time derivatives (i.e., the differential parameters). The key innovation is the first formulation of the temporal score function as a linear function of the differential parameters, coupled with a provably consistent regularized estimation objective. The paper establishes finite-sample normality of the resulting debiased estimator in high-dimensional sparse settings. By integrating high-dimensional sparse regression with asymptotic statistical inference, the method substantially improves both the accuracy and the statistical efficiency of inferring differential structures. Extensive simulations and applications to real-world dynamic network data demonstrate its ability to accurately recover time-varying differences in high-dimensional graph structures.
📝 Abstract
This paper addresses differential inference in time-varying parametric probabilistic models, such as graphical models with changing structures. Instead of estimating a high-dimensional model at each time point and inferring changes afterward, we directly learn the differential parameter, i.e., the time derivative of the parameter. The key idea is to treat the time score function of an exponential family model as a linear model in the differential parameter, enabling direct estimation via time score matching. We prove the consistency of a regularized score matching objective and establish the finite-sample normality of a debiased estimator in high-dimensional settings. The method effectively infers differential structures in high-dimensional graphical models, as verified on simulated and real-world datasets.
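The abstract's central idea—that the time score is linear in the differential parameter, so the derivative can be estimated directly by sparse regression—can be sketched as follows. This is a minimal illustration, not the paper's algorithm: it assumes the time-score evaluations `y` and sufficient-statistic features `F` are already available, and solves the resulting lasso problem with plain proximal gradient (ISTA). All names (`F`, `y`, `lam`) are illustrative.

```python
import numpy as np

# Assumed setup: for an exponential family p(x; theta(t)), the time score is
# (by the abstract's claim) linear in theta_dot, so estimating the sparse
# differential parameter reduces to a regularized linear regression:
#   minimize_b  (1/2n) ||y - F b||^2 + lam ||b||_1

def soft_threshold(v, t):
    """Proximal operator of the l1 norm (element-wise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def lasso_ista(F, y, lam, n_iter=500):
    """Sparse estimate of the coefficient vector via proximal gradient."""
    n, d = F.shape
    step = n / np.linalg.norm(F, 2) ** 2  # 1/L for the smooth quadratic part
    b = np.zeros(d)
    for _ in range(n_iter):
        grad = F.T @ (F @ b - y) / n
        b = soft_threshold(b - step * grad, step * lam)
    return b

# Synthetic example: a sparse ground-truth differential parameter.
rng = np.random.default_rng(0)
n, d = 200, 50
theta_dot = np.zeros(d)
theta_dot[:3] = [2.0, -1.5, 1.0]                   # only 3 nonzero derivatives
F = rng.standard_normal((n, d))                    # stand-in feature matrix
y = F @ theta_dot + 0.1 * rng.standard_normal(n)   # noisy time-score values
est = lasso_ista(F, y, lam=0.1)
print(np.round(est[:5], 2))
```

The l1 penalty is what makes direct estimation viable in the high-dimensional sparse regime the abstract describes; the paper's debiasing step (which corrects the lasso's shrinkage bias to enable normality-based inference) is not shown here.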