High-Dimensional Surrogate Modeling for Closed-Loop Learning of Neural-Network-Parameterized Model Predictive Control

📅 2025-12-12
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Traditional Bayesian optimization struggles with closed-loop learning of high-dimensional neural-network-parameterized model predictive control (MPC), because standard surrogate models—e.g., Matérn-kernel Gaussian processes—fail to capture the structure of dense, high-dimensional parameter spaces. This work demonstrates that Bayesian neural networks (BNNs), particularly infinitely wide BNNs, serve as effective surrogates, overcoming this dimensionality bottleneck and enabling efficient closed-loop optimization of controllers with over one thousand parameters. On a cart-pole (inverted pendulum) task, BNN-based surrogates achieve faster and more stable convergence of the closed-loop cost: finite-width BNNs successfully optimize controllers with hundreds of parameters, while infinitely wide BNNs maintain robust performance beyond 1,000 dimensions—where Matérn-GP surrogates degrade significantly. This points toward a practical recipe for Bayesian closed-loop learning of high-dimensional neural MPC.
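The summary's key ingredient—an infinitely wide BNN surrogate—has a closed form: the prior of an infinitely wide single-hidden-layer ReLU network with Gaussian weights is exactly a Gaussian process with the order-1 arc-cosine kernel (Cho & Saul, 2009), so posterior inference reduces to standard GP regression under that kernel. Below is a minimal NumPy sketch of this idea; the function names, noise level, and usage are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def arccos_kernel(X, Y):
    """Order-1 arc-cosine kernel: the exact prior covariance of an
    infinitely wide one-hidden-layer ReLU BNN (Cho & Saul, 2009).
    X: (n, d), Y: (m, d) -> (n, m) kernel matrix."""
    nx = np.linalg.norm(X, axis=1, keepdims=True)      # (n, 1) row norms
    ny = np.linalg.norm(Y, axis=1, keepdims=True).T    # (1, m) column norms
    cos = np.clip(X @ Y.T / (nx * ny + 1e-12), -1.0, 1.0)
    theta = np.arccos(cos)
    return (nx * ny / np.pi) * (np.sin(theta) + (np.pi - theta) * np.cos(theta))

def posterior(X_train, y_train, X_test, noise=1e-6):
    """GP posterior mean/variance of the closed-loop cost under the
    infinite-width BNN prior (hypothetical surrogate-model interface)."""
    K = arccos_kernel(X_train, X_train) + noise * np.eye(len(X_train))
    Ks = arccos_kernel(X_train, X_test)
    Kss = arccos_kernel(X_test, X_test)
    alpha = np.linalg.solve(K, y_train)
    mean = Ks.T @ alpha
    cov = Kss - Ks.T @ np.linalg.solve(K, Ks)
    return mean, np.maximum(np.diag(cov), 0.0)
```

Because the kernel is evaluated in closed form, the surrogate's cost is independent of the (infinite) network width and scales only with the number of closed-loop experiments, which is what makes it usable for 1,000+-dimensional controller parameterizations.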

📝 Abstract
Learning controller parameters from closed-loop data has been shown to improve closed-loop performance. Bayesian optimization, a widely used black-box and sample-efficient learning method, constructs a probabilistic surrogate of the closed-loop performance from few experiments and uses it to select informative controller parameters. However, it typically struggles with dense high-dimensional controller parameterizations, as they may appear, for example, in tuning model predictive controllers, because standard surrogate models fail to capture the structure of such spaces. This work suggests that the use of Bayesian neural networks as surrogate models may help to mitigate this limitation. Through a comparison between Gaussian processes with Matérn kernels, finite-width Bayesian neural networks, and infinite-width Bayesian neural networks on a cart-pole task, we find that Bayesian neural network surrogate models achieve faster and more reliable convergence of the closed-loop cost and enable successful optimization of parameterizations with hundreds of dimensions. Infinite-width Bayesian neural networks also maintain performance in settings with more than one thousand parameters, whereas Matérn-kernel Gaussian processes rapidly lose effectiveness. These results indicate that Bayesian neural network surrogate models may be suitable for learning dense high-dimensional controller parameterizations and offer practical guidance for selecting surrogate models in learning-based controller design.
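The abstract's description of Bayesian optimization—fit a probabilistic surrogate to a few closed-loop experiments, then pick the next controller parameters via an acquisition function—can be sketched as follows. This is a generic minimal loop with a Matérn-5/2 GP surrogate and expected-improvement acquisition over random candidates; the toy cost, hyperparameters, and helper names are illustrative assumptions, not the paper's setup.

```python
import numpy as np
from math import erf, sqrt, pi

def matern52(X, Y, length=1.0):
    """Matérn-5/2 kernel matrix between rows of X (n, d) and Y (m, d)."""
    r = np.linalg.norm(X[:, None, :] - Y[None, :, :], axis=-1)
    s = sqrt(5.0) * r / length
    return (1.0 + s + s**2 / 3.0) * np.exp(-s)

def expected_improvement(mu, var, best):
    """EI for minimization: expected decrease below the incumbent `best`."""
    sigma = np.sqrt(np.maximum(var, 1e-12))
    z = (best - mu) / sigma
    Phi = 0.5 * (1.0 + np.vectorize(erf)(z / sqrt(2.0)))  # standard normal CDF
    phi = np.exp(-0.5 * z**2) / sqrt(2.0 * pi)            # standard normal PDF
    return (best - mu) * Phi + sigma * phi

def bayes_opt(cost, d, n_init=5, n_iter=20, n_cand=512, seed=0):
    """Minimize a black-box closed-loop cost over d controller parameters."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(-1, 1, size=(n_init, d))              # initial experiments
    y = np.array([cost(x) for x in X])
    for _ in range(n_iter):
        # Fit the GP surrogate to all experiments so far.
        K = matern52(X, X) + 1e-6 * np.eye(len(X))
        alpha = np.linalg.solve(K, y)
        # Score random candidate parameters by expected improvement.
        C = rng.uniform(-1, 1, size=(n_cand, d))
        Ks = matern52(X, C)
        mu = Ks.T @ alpha
        var = 1.0 - np.sum(Ks * np.linalg.solve(K, Ks), axis=0)
        x_next = C[np.argmax(expected_improvement(mu, var, y.min()))]
        # Run one new closed-loop experiment at the chosen parameters.
        X = np.vstack([X, x_next])
        y = np.append(y, cost(x_next))
    return X[np.argmin(y)], y.min()
```

Swapping the `matern52` surrogate for a BNN posterior is exactly the modification the paper studies: the outer loop stays the same, and only the surrogate's mean/variance predictions change.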
Problem

Research questions and friction points this paper is trying to address.

How can Bayesian optimization scale to dense, high-dimensional controller parameterizations?
Standard surrogate models (e.g., Matérn-kernel GPs) fail to capture the structure of such parameter spaces.
Closed-loop learning of neural-network-parameterized MPC requires sample-efficient optimization over hundreds to thousands of parameters.
Innovation

Methods, ideas, or system contributions that make the work stand out.

Finite- and infinite-width Bayesian neural networks as surrogate models for closed-loop Bayesian optimization
Successful optimization of controller parameterizations with hundreds of dimensions; infinite-width BNNs scale beyond 1,000
Faster and more reliable convergence than Matérn-kernel Gaussian processes