From Mice to Trains: Amortized Bayesian Inference on Graph Data

📅 2026-01-05
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the challenges of Bayesian posterior inference on graph-structured data—namely, permutation invariance, scalability, and modeling long-range dependencies—by introducing the first amortized Bayesian inference framework tailored for graphs. The proposed approach employs a two-stage architecture: a summary network maps attributed graphs to fixed-dimensional representations via a permutation-invariant graph encoder, and an inference network flexibly approximates the posterior distributions over parameters at the node, edge, and graph levels. This method enables multi-granular parameter inference and achieves high-fidelity parameter recovery alongside well-calibrated posterior estimates on both synthetic and real-world datasets from biological and logistics domains. The study also provides a systematic evaluation of various neural architectures as the summary network, demonstrating their effectiveness in capturing graph-level statistics for accurate amortized inference.
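The two-stage pipeline described above can be illustrated with a minimal numpy sketch. All dimensions, weights, and function names here are hypothetical stand-ins (random weights in place of trained networks); the point is the shape of the architecture: a permutation-invariant summary network mapping an attributed graph to a fixed-length vector, followed by an inference network producing posterior parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy dimensions (not from the paper).
NODE_DIM, HIDDEN, SUMMARY_DIM, PARAM_DIM = 4, 8, 6, 2

# Random weights stand in for trained networks.
W_msg = rng.normal(size=(NODE_DIM, HIDDEN))
W_sum = rng.normal(size=(HIDDEN, SUMMARY_DIM))
W_inf = rng.normal(size=(SUMMARY_DIM, 2 * PARAM_DIM))

def summary_network(A, X):
    """Permutation-invariant graph encoder: one round of
    mean-aggregated message passing, then mean pooling over nodes."""
    deg = np.maximum(A.sum(axis=1, keepdims=True), 1.0)
    H = np.tanh((A @ X / deg) @ W_msg)       # neighbor aggregation
    return np.tanh(H @ W_sum).mean(axis=0)   # pool -> fixed-length summary

def inference_network(s):
    """Map the graph summary to a Gaussian approximate posterior
    over graph-level parameters (mean and std)."""
    out = s @ W_inf
    mu, log_sigma = out[:PARAM_DIM], out[PARAM_DIM:]
    return mu, np.exp(log_sigma)

# Toy attributed graph: adjacency A, node features X.
A = np.array([[0, 1, 1], [1, 0, 0], [1, 0, 0]], dtype=float)
X = rng.normal(size=(3, NODE_DIM))

mu, sigma = inference_network(summary_network(A, X))

# Relabeling the nodes leaves the summary (hence the posterior) unchanged.
P = np.eye(3)[[2, 0, 1]]                     # permutation matrix
s1 = summary_network(A, X)
s2 = summary_network(P @ A @ P.T, P @ X)
assert np.allclose(s1, s2)
```

Because node aggregation and pooling are symmetric in node order, permuting the adjacency matrix and features yields the identical summary, which is the invariance property the paper requires of the summary network.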

📝 Abstract
Graphs arise across diverse domains, from biology and chemistry to social and information networks, as well as transportation and logistics. Inference on graph-structured data requires methods that are permutation-invariant, scalable across varying sizes and sparsities, and capable of capturing complex long-range dependencies, making posterior estimation of graph parameters particularly challenging. Amortized Bayesian Inference (ABI) is a simulation-based framework that employs generative neural networks to enable fast, likelihood-free posterior inference. We adapt ABI to graph data, addressing these challenges to perform inference on node-, edge-, and graph-level parameters. Our approach couples permutation-invariant graph encoders with flexible neural posterior estimators in a two-module pipeline: a summary network maps attributed graphs to fixed-length representations, and an inference network approximates the posterior over parameters. Several neural architectures can serve as the summary network; we evaluate multiple candidates and assess their performance, in terms of recovery and calibration, on controlled synthetic settings and two real-world domains (biology and logistics).
Problem

Research questions and friction points this paper is trying to address.

graph data
Bayesian inference
posterior estimation
permutation invariance
long-range dependencies
Innovation

Methods, ideas, or system contributions that make the work stand out.

Amortized Bayesian Inference
Graph Neural Networks
Permutation-Invariant Encoding
Posterior Estimation
Simulation-Based Inference