Graphon-Level Bayesian Predictive Synthesis for Random Networks

📅 2025-12-20
🤖 AI Summary
This paper develops Bayesian predictive synthesis for random network graphs at the graphon level. It introduces the first graphon-space Bayesian predictive synthesis framework, which aggregates predictive distributions from multiple proxy graphons via weighted fusion, equivalently realizing the L²-projection of the true graphon onto the subspace spanned by the proxies. A non-asymptotic oracle inequality establishes the "combination outperforms individual" statistical advantage. By integrating graphon theory, KL-moment calibration, and Lipschitz stability analysis, the framework controls estimation error for key graph statistics, including edge density, degree distributions, and subgraph densities. It further preserves model closure for heavy-tailed degree distributions and for the exponential random graph model (ERGM) family, and it achieves the minimax-optimal L² convergence rate.
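The weighted-fusion mechanism described above can be sketched numerically: on a discretized grid, the least-squares combination of agent graphons is exactly the L²-projection of the target graphon onto the agents' linear span. The graphons below are invented toy examples (not taken from the paper), chosen so the true graphon lies in the span and the projection recovers it.

```python
import numpy as np

# Toy sketch of graphon-level BPS as an L^2 projection (illustrative only):
# approximate a "true" graphon w* by the least-squares combination of agent
# graphons evaluated on a discretized [0,1]^2 grid.

n = 200
u = (np.arange(n) + 0.5) / n
U, V = np.meshgrid(u, u)

# Invented true graphon and two agent graphons; here w* lies in span(agents)
w_true = 0.3 * U * V + 0.2
agents = [U * V, np.ones_like(U)]

# Stack agents as columns of a design matrix over grid points
A = np.column_stack([a.ravel() for a in agents])
b = w_true.ravel()

# Least-squares weights realize the L^2 projection onto span(agents)
weights, *_ = np.linalg.lstsq(A, b, rcond=None)
w_bps = (A @ weights).reshape(n, n)

def l2_err(w):
    # Root-mean-square distance to the true graphon on the grid
    return np.sqrt(np.mean((w - w_true) ** 2))

errs = [l2_err(a) for a in agents]
print("individual L2 errors:", errs)
print("BPS combination error:", l2_err(w_bps))  # ~0: w* is in the span
```

Because the target sits inside the agent span, the combination error vanishes while each single-agent error stays bounded away from zero, a discrete analogue of the paper's "combination beats components" phenomenon.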

📝 Abstract
Bayesian predictive synthesis provides a coherent Bayesian framework for combining multiple predictive distributions, or agents, into a single updated prediction, extending Bayesian model averaging to allow general pooling of full predictive densities. This paper develops a static, graphon-level version of Bayesian predictive synthesis for random networks. At the graphon level we show that Bayesian predictive synthesis corresponds to the integrated-squared-error projection of the true graphon onto the linear span of the agent graphons. We derive nonasymptotic oracle inequalities and prove that least-squares graphon-BPS, based on a finite number of edge observations, achieves the minimax L^2 rate over this agent span. Moreover, we show that any estimator that selects a single agent graphon is uniformly inconsistent on a nontrivial subset of the convex hull of the agents, whereas graphon-level Bayesian predictive synthesis remains minimax-rate optimal, formalizing a "combination beats components" phenomenon. Structural properties of the underlying random graphs are controlled through explicit Lipschitz bounds that transfer graphon error into error for edge density, degree distributions, subgraph densities, clustering coefficients, and giant component phase transitions. Finally, we develop a heavy-tail theory for Bayesian predictive synthesis, showing how mixtures and entropic tilts preserve regularly varying degree distributions and how exponential random graph model agents remain within their family under log-linear tilting with Kullback-Leibler optimal moment calibration.
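The abstract's Lipschitz-transfer claim has a simple prototype for the easiest statistic: the edge density of a graphon is its integral, so by Cauchy-Schwarz the edge-density gap between two graphons is bounded by their L² distance. The sketch below checks this on invented toy graphons; the specific functions are assumptions for illustration, not the paper's bounds.

```python
import numpy as np

# Toy numerical check (not from the paper) that edge density is 1-Lipschitz
# in the graphon L^2 distance: |rho(w1) - rho(w2)| <= ||w1 - w2||_2,
# where rho(w) is the integral of w over [0,1]^2.

n = 300
u = (np.arange(n) + 0.5) / n
U, V = np.meshgrid(u, u)

# Two invented graphons differing by an oscillating perturbation
w1 = 0.25 * (U + V) + 0.25
w2 = np.clip(w1 + 0.1 * np.sin(6 * np.pi * U) * np.sin(6 * np.pi * V), 0, 1)

def rho(w):
    # Edge density: grid approximation of the integral of w
    return w.mean()

def l2(a, b):
    # Grid approximation of the L^2([0,1]^2) distance
    return np.sqrt(np.mean((a - b) ** 2))

gap = abs(rho(w1) - rho(w2))
bound = l2(w1, w2)
print(gap, "<=", bound)
```

The paper's contribution is the analogous (harder) control for degree distributions, subgraph densities, clustering coefficients, and phase transitions; the edge-density case shown here is the one-line base case of that transfer principle.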
Problem

Research questions and friction points this paper is trying to address.

Develops Bayesian predictive synthesis for random networks at the graphon level.
Demonstrates that combining agent graphons outperforms selecting any single agent.
Analyzes how graphon estimation error propagates to network structural properties.
Innovation

Methods, ideas, or system contributions that make the work stand out.

Graphon-level Bayesian predictive synthesis for networks
Minimax L^2 rate via least-squares graphon-BPS
Heavy-tail theory with mixtures and entropic tilts
Marios Papamichalis
Postdoctoral Associate, Yale University
Statistics · Networks · Causal Inference · Deep Learning
Regina Ruane
Department of Statistics and Data Science, The Wharton School, University of Pennsylvania, 3733 Spruce Street, Philadelphia, PA 19104-6340