Pinet: Optimizing hard-constrained neural networks with orthogonal projection layers

📅 2025-08-14
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the challenge of strictly enforcing convex constraints when solving parameterized constrained optimization problems with neural networks. We propose a novel architecture featuring an orthogonal projection output layer: during forward propagation, the projection onto the feasible set is efficiently computed via operator splitting; during backward propagation, exact gradients are obtained via the implicit function theorem. This is the first method to explicitly embed hard convex-constraint projection into the network structure while preserving end-to-end differentiability and trainability. Compared to existing learning approaches, our method achieves superior training speed, solution quality, and robustness to hyperparameter tuning, while maintaining similar inference times; it also supports efficient batched inference, solving batches of problems significantly faster than traditional solvers. Notably, the method extends to non-convex trajectory-preference tasks such as multi-vehicle motion planning. The implementation is released as an open-source, GPU-accelerated JAX toolkit.
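To make the forward-pass idea concrete: operator-splitting schemes compute the projection onto an intersection of convex sets using only cheap projections onto each set individually. Below is a minimal NumPy sketch using Dykstra's algorithm, one classical splitting method — illustrative only, not the specific solver used by $Π$net, and all names are made up for this example:

```python
import numpy as np

def project_box(x, lo=0.0, hi=1.0):
    # Euclidean projection onto the box [lo, hi]^n
    return np.clip(x, lo, hi)

def project_halfspace(x, a, b):
    # Euclidean projection onto the halfspace {x : a @ x <= b}
    viol = a @ x - b
    if viol <= 0:
        return x
    return x - viol * a / (a @ a)

def dykstra(x0, a, b, iters=200):
    # Dykstra's splitting: alternate the two simple projections,
    # carrying correction terms p, q so the iterates converge to
    # the projection onto the INTERSECTION of the two sets
    x, p, q = x0.copy(), np.zeros_like(x0), np.zeros_like(x0)
    for _ in range(iters):
        y = project_box(x + p)
        p = x + p - y
        x = project_halfspace(y + q, a, b)
        q = y + q - x
    return x

a = np.array([1.0, 1.0])
x = dykstra(np.array([2.0, 2.0]), a, b=1.0)
# x is the projection of (2, 2) onto [0, 1]^2 ∩ {x1 + x2 <= 1}
```

The point of the splitting view is that neither projection alone needs to know about the other constraint; the scheme composes them.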

📝 Abstract
We introduce an output layer for neural networks that ensures satisfaction of convex constraints. Our approach, $Π$net, leverages operator splitting for rapid and reliable projections in the forward pass, and the implicit function theorem for backpropagation. We deploy $Π$net as a feasible-by-design optimization proxy for parametric constrained optimization problems and obtain modest-accuracy solutions faster than traditional solvers when solving a single problem, and significantly faster for a batch of problems. We surpass state-of-the-art learning approaches in terms of training time, solution quality, and robustness to hyperparameter tuning, while maintaining similar inference times. Finally, we tackle multi-vehicle motion planning with non-convex trajectory preferences and provide $Π$net as a GPU-ready package implemented in JAX with effective tuning heuristics.
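The abstract pairs operator splitting in the forward pass with the implicit function theorem for backpropagation. The IFT angle can be illustrated on the simplest feasible set, a box, where it reduces the projection's Jacobian to a diagonal mask over inactive constraints. A hedged sketch using `jax.custom_vjp` — this is not the $Π$net implementation, just a toy instance of the same differentiate-through-a-projection pattern:

```python
import jax
import jax.numpy as jnp

@jax.custom_vjp
def box_project(x, lo, hi):
    # forward pass: Euclidean projection onto the box [lo, hi]^n
    return jnp.clip(x, lo, hi)

def box_project_fwd(x, lo, hi):
    y = jnp.clip(x, lo, hi)
    # for a box, the implicit function theorem gives a diagonal Jacobian:
    # 1 where the constraint is inactive, 0 where it is active
    mask = (x > lo) & (x < hi)
    return y, mask

def box_project_bwd(mask, g):
    # pass gradients through inactive coordinates, zero the active ones;
    # lo and hi get zero cotangents here for simplicity
    return (jnp.where(mask, g, 0.0), 0.0, 0.0)

box_project.defvjp(box_project_fwd, box_project_bwd)

# the projection layer remains differentiable end to end
loss = lambda x: jnp.sum(box_project(x, 0.0, 1.0) ** 2)
grad = jax.grad(loss)(jnp.array([-0.5, 0.5, 1.5]))
```

For general convex sets the Jacobian is no longer diagonal, which is where the paper's implicit-function-theorem machinery earns its keep.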
Problem

Research questions and friction points this paper is trying to address.

How can neural network outputs be guaranteed to satisfy convex constraints?
How can parametric constrained optimization problems be solved faster than with traditional solvers?
How can multi-vehicle motion planning with non-convex trajectory preferences be handled efficiently?
Innovation

Methods, ideas, or system contributions that make the work stand out.

Orthogonal projection output layer enforces hard convex constraints
Operator splitting for fast, reliable forward-pass projections; implicit function theorem for exact gradients
GPU-ready JAX package with effective tuning heuristics
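The batched-inference claim rests on the fact that a projection layer is ordinary array code, so one network-plus-projection can be vectorized over many problem instances at once. A small JAX sketch using `jax.vmap` — the helper below is hypothetical, not the package's actual API:

```python
import jax
import jax.numpy as jnp

def project_halfspace(x, a, b):
    # Euclidean projection onto {x : a @ x <= b}, written branch-free
    # so it traces cleanly under jit/vmap
    viol = jnp.maximum(a @ x - b, 0.0)
    return x - viol * a / (a @ a)

# batch over problem instances: x and b vary per instance, a is shared
batched_project = jax.vmap(project_halfspace, in_axes=(0, None, 0))

xs = jnp.array([[2.0, 2.0], [0.2, 0.2]])  # one candidate point per problem
bs = jnp.array([1.0, 1.0])                # one offset per problem
ys = batched_project(xs, jnp.array([1.0, 1.0]), bs)
```

The first point is infeasible and gets projected; the second already satisfies its constraint and passes through unchanged.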