🤖 AI Summary
Energy-based models (EBMs) suffer from limited few-shot and structured-data modeling capability due to the absence of explicit inductive biases. To address this, we propose a hybrid energy model that couples neural-network-based EBMs with exponential-family statistical structure, incorporating parameter-free, interpretable, and modular statistic functions as inductive biases. These biases are explicitly enforced via statistical alignment regularization, which trains the energy function with a score-matching likelihood objective while constraining it to match empirical statistical properties of the data. Our approach is the first to embed parameter-free statistical terms into EBMs, enabling statistically constrained training. Experiments demonstrate significant improvements in data fitting quality and generation fidelity across multiple benchmarks, with particularly pronounced gains on small-scale datasets and distributions exhibiting strong structural dependencies.
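The statistical alignment idea above can be illustrated with a minimal sketch: pick a parameter-free statistic function T, then penalize the gap between the statistics of model samples and of the data. The specific choice of T (first two moments) and the squared-distance penalty are hypothetical illustrations, not the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(1)

# Parameter-free statistic function (hypothetical choice): first two moments.
def T(x):
    return np.stack([x, x**2], axis=-1)

# Statistical alignment penalty: squared distance between the mean statistics
# of model samples and of the data. During training this term would be added
# to the score-matching objective; here we only evaluate it.
def alignment_penalty(model_samples, data):
    return np.sum((T(model_samples).mean(0) - T(data).mean(0)) ** 2)

data = rng.normal(0.0, 1.0, size=5000)
good = rng.normal(0.0, 1.0, size=5000)  # samples whose statistics match the data
bad  = rng.normal(1.0, 2.0, size=5000)  # samples whose statistics do not

# A well-aligned model incurs a much smaller penalty than a misaligned one.
assert alignment_penalty(good, data) < alignment_penalty(bad, data)
```

The penalty is zero exactly when the chosen statistics match, which is the sense in which the hybrid model behaves like an exponential family during training.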
📝 Abstract
With the advent of score-matching techniques for model training and Langevin dynamics for sample generation, energy-based models (EBMs) have regained interest as generative models. Recent EBMs typically define their energy functions with neural networks. In this work, we introduce a novel hybrid approach that combines an EBM with an exponential family model to incorporate inductive bias into data modeling. Specifically, we augment the energy term with a parameter-free statistic function that helps the model capture key data statistics. Like an exponential family model, the hybrid model aims to align its distribution statistics with the data statistics during training, even when it only approximately maximizes the data likelihood. This property enables us to impose statistical constraints on the hybrid model. Our empirical study validates the hybrid model's ability to match statistics. Furthermore, experimental results show that data fitting and generation improve when suitable informative statistics are incorporated into the hybrid model.
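The hybrid construction and the Langevin sampler can be sketched as follows. The energy is a sum of a neural term and an exponential-family term over a parameter-free statistic; for a self-contained example, a smooth toy function stands in for the neural network, the statistic is the Gaussian sufficient statistics, and the weights are fixed rather than learned. All concrete choices here are illustrative assumptions, not the paper's architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Parameter-free statistic function T(x): Gaussian sufficient statistics.
def T(x):
    return np.stack([x, x**2], axis=-1)

theta = np.array([0.0, 0.5])  # weights on the statistics (fixed for the sketch)

def f(x):
    return 0.1 * np.tanh(x)   # toy stand-in for a learned neural energy term

# Hybrid energy: neural term plus exponential-family term.
def energy(x):
    return f(x) + T(x) @ theta

def grad_energy(x, eps=1e-4):
    # Central-difference gradient keeps the sketch free of autodiff libraries.
    return (energy(x + eps) - energy(x - eps)) / (2 * eps)

# Unadjusted Langevin dynamics draws samples from p(x) ∝ exp(-E(x)).
def langevin_sample(n_chains=500, n_steps=2000, step=0.05):
    x = np.zeros(n_chains)
    for _ in range(n_steps):
        x = x - 0.5 * step * grad_energy(x) + np.sqrt(step) * rng.standard_normal(n_chains)
    return x

samples = langevin_sample()
# With theta = [0, 0.5] the exponential-family term is 0.5*x^2, so the samples
# should roughly follow a unit Gaussian, slightly perturbed by f.
print(samples.mean(), samples.var())
```

In the actual method the neural term is trained with score matching and theta interacts with the alignment regularizer; the sketch only shows how the two energy components compose and how samples are drawn.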