Inference in Spreading Processes with Neural-Network Priors

📅 2025-09-02
📈 Citations: 0
Influential: 0
🤖 AI Summary
This paper addresses the joint inference of initial states and propagation pathways for diffusion processes (e.g., epidemic spread) on graphs, where the initial states are generated from node covariates via an unknown nonlinear function, modeled by a neural network, departing from the conventional i.i.d. assumption. The authors propose a Bayesian framework combining belief propagation (BP) and approximate message passing (AMP), termed BP-AMP, with a single-layer perceptron acting as a functional prior over the initial states. The analysis reveals a first-order phase transition and an associated statistical-to-computational gap when the perceptron weights follow a Rademacher distribution. The method fuses the propagation dynamics with the covariate information to enable joint inference from multi-source data. Experiments demonstrate substantial improvements in initial-state recovery over estimators that rely solely on either the network structure or the covariates, providing a framework for inverting stochastic processes on graphs with nodal attributes.
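The generative model described above can be made concrete with a short simulation: node covariates pass through a single-layer perceptron with Rademacher weights to select the initially infected seeds, which then spread over a graph. This is a minimal illustrative sketch, not the paper's exact setup; the graph model, SI-style dynamics, and all parameter values (`n`, `d`, `lam`) are our own assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, T, lam = 200, 30, 5, 0.3  # hypothetical sizes and infection rate

# Known node covariates and unknown Rademacher perceptron weights
F = rng.normal(size=(n, d)) / np.sqrt(d)
w = rng.choice([-1.0, 1.0], size=d)

# Initial state: node i is a seed iff the perceptron output is positive
x0 = (F @ w > 0).astype(int)

# Erdos-Renyi graph (assumed here for illustration), symmetric adjacency
A = (rng.random((n, n)) < 3.0 / n)
A = np.triu(A, 1)
A = (A + A.T).astype(int)

# SI-style spread: each infected neighbour independently transmits
# with probability lam per time step; infected nodes stay infected
x = x0.copy()
traj = [x.copy()]
for t in range(T):
    k = A @ x                          # infected neighbours per node
    p_inf = 1.0 - (1.0 - lam) ** k     # prob. of at least one transmission
    x = np.maximum(x, (rng.random(n) < p_inf).astype(int))
    traj.append(x.copy())
```

The inference problem is then the reverse direction: given partial observations of `traj` and the covariates `F`, recover `x0` (and implicitly `w`).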

📝 Abstract
Stochastic processes on graphs are a powerful tool for modelling complex dynamical systems such as epidemics. A recent line of work focused on the inference problem where one aims to estimate the state of every node at every time, starting from partial observation of a subset of nodes at a subset of times. In these works, the initial state of the process was assumed to be random i.i.d. over nodes. Such an assumption may not be realistic in practice, where one may have access to a set of covariate variables for every node that influence the initial state of the system. In this work, we will assume that the initial state of a node is an unknown function of such covariate variables. Given that functions can be represented by neural networks, we will study a model where the initial state is given by a simple neural network -- notably the single-layer perceptron acting on the known node-wise covariate variables. Within a Bayesian framework, we study how such neural-network prior information enhances the recovery of initial states and spreading trajectories. We derive a hybrid belief propagation and approximate message passing (BP-AMP) algorithm that handles both the spreading dynamics and the information included in the node covariates, and we assess its performance against the estimators that either use only the spreading information or use only the information from the covariate variables. We show that in some regimes, the model can exhibit first-order phase transitions when using a Rademacher distribution for the neural-network weights. These transitions create a statistical-to-computational gap where even the BP-AMP algorithm, despite the theoretical possibility of perfect recovery, fails to achieve it.
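The AMP half of the hybrid algorithm repeatedly denoises noisy scalar estimates of the perceptron weights under the assumed prior. For the Rademacher prior highlighted in the abstract, this denoiser has a simple closed form: if w in {-1, +1} is observed through r = w + N(0, sigma2), the posterior mean is tanh(r / sigma2). A minimal sketch (function name is our own, not from the paper):

```python
import numpy as np

def denoise_rademacher(r, sigma2):
    """Posterior mean and variance of w in {-1, +1} (uniform prior)
    observed through the Gaussian channel r = w + N(0, sigma2).

    Posterior odds: exp(-(r-1)^2 / 2s) vs exp(-(r+1)^2 / 2s),
    whose normalized difference simplifies to tanh(r / s).
    """
    m = np.tanh(r / sigma2)
    v = 1.0 - m ** 2   # E[w^2] = 1 for Rademacher w
    return m, v
```

In a full BP-AMP iteration this scalar rule would be applied componentwise to the weight estimates, while the BP side handles the spreading trajectory; the coupling between the two is where the paper's contribution lies.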
Problem

Research questions and friction points this paper is trying to address.

Estimating initial states and spreading trajectories from partial observations
Incorporating neural-network priors for node covariates in inference
Analyzing phase transitions and recovery challenges in hybrid models
Innovation

Methods, ideas, or system contributions that make the work stand out.

Neural-network priors for initial states
Hybrid BP-AMP algorithm for inference
Handles spreading dynamics and node covariates
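One standard ingredient of fusing covariate information with spreading information is the output channel of the perceptron: given a Gaussian belief N(omega, V) on a node's pre-activation z = (F w)_i, the belief that the node is an initial seed under a noiseless sign readout is Phi(omega / sqrt(V)), with Phi the standard normal CDF. The sketch below illustrates only this generic ingredient, not the paper's full message-passing equations; the function names are our own.

```python
import math

def std_normal_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def seed_probability(omega, V):
    """P(sign(z) = +1) for z ~ N(omega, V): the covariate-side belief
    that a node is an initial seed under a sign output channel."""
    return std_normal_cdf(omega / math.sqrt(V))
```

In the hybrid scheme, beliefs of this form from the covariate side would be combined multiplicatively with the BP messages coming from the observed spreading dynamics.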