Neural Bayesian Filtering

📅 2025-10-03
📈 Citations: 0
Influential: 0
🤖 AI Summary
Addressing the challenge of tracking multimodal, rapidly shifting latent-state beliefs in partially observable systems, this paper proposes a Bayesian filtering framework built on latent embeddings: it maps high-dimensional belief distributions to fixed-length, low-dimensional embedding vectors and performs a differentiable, particle-inspired update within that embedding space. The approach combines the computational efficiency of classical filtering with the expressive power of deep generative models, mitigating particle impoverishment while supporting end-to-end training and balancing inference speed against modeling accuracy. Experiments in three canonical partially observable environments show that the method robustly captures multimodal posterior distributions, improving state-estimation accuracy and filtering robustness. The framework offers a practical approach to real-time belief maintenance in complex dynamic systems.

📝 Abstract
We present Neural Bayesian Filtering (NBF), an algorithm for maintaining distributions over hidden states, called beliefs, in partially observable systems. NBF is trained to find a good latent representation of the beliefs induced by a task. It maps beliefs to fixed-length embedding vectors, which condition generative models for sampling. During filtering, particle-style updates compute posteriors in this embedding space using incoming observations and the environment's dynamics. NBF combines the computational efficiency of classical filters with the expressiveness of deep generative models - tracking rapidly shifting, multimodal beliefs while mitigating the risk of particle impoverishment. We validate NBF in state estimation tasks in three partially observable environments.
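The particle-style update described in the abstract can be illustrated with a toy numeric sketch. The `decode` and `encode` functions below are simple Gaussian stand-ins for the paper's learned generative model and belief encoder, and the dynamics and observation models are hypothetical one-dimensional examples; this sketches only the sample, propagate, reweight, resample, re-embed loop, not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins for NBF's learned components:
# decode(z, n) samples n states from a generative model conditioned on
# the fixed-length belief embedding z; encode(...) maps a weighted
# particle set back to an embedding. Here both are simple Gaussians.
def decode(z, n):
    mean, log_std = z
    return rng.normal(mean, np.exp(log_std), size=n)

def encode(particles, weights):
    mean = np.sum(weights * particles)
    var = np.sum(weights * (particles - mean) ** 2)
    return (mean, 0.5 * np.log(var + 1e-8))

def dynamics(x):
    # Toy transition model: constant drift plus process noise.
    return x + 1.0 + rng.normal(0.0, 0.1, size=x.shape)

def obs_loglik(y, x):
    # Gaussian observation model with std 0.5.
    return -0.5 * (y - x) ** 2 / 0.25

def nbf_step(z, y, n=512):
    """One particle-style update in embedding space:
    sample -> propagate -> reweight -> resample -> re-embed."""
    x = decode(z, n)                  # sample belief particles
    x = dynamics(x)                   # push through environment dynamics
    logw = obs_loglik(y, x)           # weight by incoming observation
    w = np.exp(logw - logw.max())
    w /= w.sum()
    idx = rng.choice(n, size=n, p=w)  # resample to fight degeneracy
    x = x[idx]
    return encode(x, np.full(n, 1.0 / n))

z = (0.0, 0.0)                        # initial belief embedding (mean, log_std)
for y in [1.1, 2.0, 2.9]:             # stream of observations
    z = nbf_step(z, y)
```

After three updates the embedding's mean tracks the drifting state near 3.0, while its log-std shrinks as observations reduce uncertainty.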
Problem

Research questions and friction points this paper is trying to address.

Tracking belief distributions over hidden states in partially observable systems
Classical particle filters risk impoverishment when beliefs are multimodal or shift rapidly
Balancing the efficiency of classical filters against the expressiveness of deep generative models
Innovation

Methods, ideas, or system contributions that make the work stand out.

Learns latent belief representations for tracking
Maps beliefs to embeddings for generative sampling
Combines particle filters with deep generative models
Christopher Solinas
University of Alberta
Radovan Haluska
Charles University, Prague
David Sychrovsky
Charles University, Prague
Finbarr Timbers
Allen Institute for AI
Nolan Bard
Sony AI
Michael Buro
Professor of Computing Science, University of Alberta
Heuristic Search, Machine Learning, Planning
Martin Schmid
Google DeepMind
Game Theory, Machine Learning
Nathan R. Sturtevant
University of Alberta; Alberta Machine Intelligence Institute (Amii)
Michael Bowling
Amii, University of Alberta
Artificial Intelligence, Machine Learning, Game Theory, Reinforcement Learning, Computer Games