AI Summary
Computing parameter sensitivities in nonlinear model predictive control (NMPC) remains challenging, and existing learning-augmented NMPC methods are often restricted to convex or unconstrained formulations.
Method: This paper proposes an end-to-end differentiable NMPC framework. It combines the implicit function theorem (IFT) with an interior-point smoothing of the optimality conditions, enabling efficient and accurate forward- and adjoint-mode sensitivity computation for general nonlinear programs (NLPs). The solver follows a sequential quadratic programming (SQP) architecture that embeds an interior-point method for solving the quadratic subproblems.
Contribution/Results: The framework removes the convexity and absence-of-constraints assumptions of prior work, delivering a general-purpose, efficient, and open-source differentiable NMPC infrastructure. Empirical evaluation shows over a 3× speedup versus mpc.pytorch, significantly facilitating the design and training of learning-based control systems.
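The IFT-based sensitivity computation described above can be illustrated on a toy problem. The sketch below is not the paper's implementation; the one-variable NLP, the barrier parameter `tau`, and all variable names are our own illustrative choices. It solves the smoothed (interior-point) optimality conditions by Newton's method and then differentiates the solution with respect to the parameter via the implicit function theorem.

```python
import numpy as np

# Toy parametric NLP: min_x (x - p)^2  s.t.  x >= 1, with parameter p.
# Smoothed KKT conditions with barrier parameter tau > 0:
#   stationarity:    2*(x - p) - lam        = 0
#   complementarity: lam * (x - 1) - tau    = 0
def F(z, p, tau):
    x, lam = z
    return np.array([2.0 * (x - p) - lam,
                     lam * (x - 1.0) - tau])

def J(z):
    # Jacobian of F with respect to z = (x, lam)
    x, lam = z
    return np.array([[2.0, -1.0],
                     [lam, x - 1.0]])

p, tau = 2.0, 1e-8
z = np.array([2.1, 0.1])               # initial guess (x, lam)
for _ in range(50):                    # plain Newton on the smoothed system
    r = F(z, p, tau)
    if np.linalg.norm(r) < 1e-12:
        break
    z = z + np.linalg.solve(J(z), -r)

# IFT: dz/dp = -J_z^{-1} dF/dp, with dF/dp = [-2, 0]^T for this problem.
dF_dp = np.array([-2.0, 0.0])
dz_dp = np.linalg.solve(J(z), -dF_dp)
# Since the constraint is inactive at p = 2, dx*/dp is close to 1.
```

Because the smoothed conditions are differentiable for any `tau > 0`, the same linear solve that drives Newton's method also yields the sensitivities, which is what makes this approach cheap relative to differentiating through solver iterations.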
Abstract
The efficient computation of parametric solution sensitivities is a key challenge in integrating learning-enhanced methods with nonlinear model predictive control (MPC), as their availability is crucial for many learning algorithms. While approaches presented in the machine learning community are limited to convex or unconstrained formulations, this paper discusses the computation of solution sensitivities of general nonlinear programs (NLPs) using the implicit function theorem (IFT) and the smoothed optimality conditions treated in interior-point methods (IPMs). We detail the sensitivity computation within a sequential quadratic programming (SQP) method that employs an IPM for the quadratic subproblems. The publication is accompanied by an efficient open-source implementation within the framework, providing both forward and adjoint sensitivities for general optimal control problems and achieving speedups exceeding 3× over the state-of-the-art solver mpc.pytorch.
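The abstract distinguishes forward and adjoint sensitivities. A minimal sketch of the difference, using random data rather than any real NMPC problem: once the KKT system has been solved, forward mode solves one linear system per parameter, while adjoint mode solves a single transposed system per scalar loss. The matrices `J`, `F_p`, and `g` below are illustrative stand-ins for the KKT Jacobian, the parameter Jacobian, and a loss gradient.

```python
import numpy as np

rng = np.random.default_rng(0)
n_z, n_p = 5, 3
J = rng.normal(size=(n_z, n_z)) + 5.0 * np.eye(n_z)  # stand-in nonsingular KKT Jacobian
F_p = rng.normal(size=(n_z, n_p))                    # stand-in parameter Jacobian dF/dp
g = rng.normal(size=n_z)                             # gradient of a scalar loss w.r.t. z*

# Forward mode: n_p linear solves give the full sensitivity matrix dz/dp.
dz_dp = -np.linalg.solve(J, F_p)
grad_fwd = dz_dp.T @ g

# Adjoint mode: a single transposed solve, independent of n_p.
y = np.linalg.solve(J.T, g)
grad_adj = -F_p.T @ y

assert np.allclose(grad_fwd, grad_adj)
```

Adjoint mode is the relevant one for training, where a scalar loss is backpropagated through an MPC layer with many parameters; forward mode pays off when few parameters feed many downstream quantities.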