Neural Sum-of-Squares: Certifying the Nonnegativity of Polynomials with Transformers

📅 2025-10-15
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Verifying polynomial nonnegativity, a fundamental NP-hard problem with broad applications in non-convex optimization, control, and robotics, is traditionally addressed via sum-of-squares (SOS) relaxation, which requires solving large-scale semidefinite programs (SDPs) whose dimensionality grows quadratically in the size of the monomial basis. This paper introduces the first Transformer-based approach to SOS certification: a learning-augmented basis selection framework in which a data-driven model predicts an almost-minimal monomial basis to drastically reduce the SDP size, complemented by a theoretically analyzed fallback mechanism that guarantees correct termination. Evaluated on over 200 benchmark datasets, the method achieves speedups of more than 100× over state-of-the-art solvers and solves large-scale instances where competing approaches fail, substantially improving the practical scalability and usability of SOS programming.

📝 Abstract
Certifying nonnegativity of polynomials is a well-known NP-hard problem with direct applications spanning non-convex optimization, control, robotics, and beyond. A sufficient condition for nonnegativity is the Sum of Squares (SOS) property, i.e., it can be written as a sum of squares of other polynomials. In practice, however, certifying the SOS criterion remains computationally expensive and often involves solving a Semidefinite Program (SDP), whose dimensionality grows quadratically in the size of the monomial basis of the SOS expression; hence, various methods to reduce the size of the monomial basis have been proposed. In this work, we introduce the first learning-augmented algorithm to certify the SOS criterion. To this end, we train a Transformer model that predicts an almost-minimal monomial basis for a given polynomial, thereby drastically reducing the size of the corresponding SDP. Our overall methodology comprises three key components: efficient training dataset generation of over 100 million SOS polynomials, design and training of the corresponding Transformer architecture, and a systematic fallback mechanism to ensure correct termination, which we analyze theoretically. We validate our approach on over 200 benchmark datasets, achieving speedups of over $100\times$ compared to state-of-the-art solvers and enabling the solution of instances where competing approaches fail. Our findings provide novel insights towards transforming the practical scalability of SOS programming.
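To make the SDP formulation concrete, here is a minimal sketch, not the paper's implementation, of certifying a polynomial as a sum of squares: one searches for a positive semidefinite Gram matrix $Q$ with $p(x) = m(x)^\top Q\, m(x)$ over a chosen monomial basis $m(x)$. In the paper's pipeline the Transformer would supply the small candidate basis; here the example polynomial, the hard-coded basis, and the helper name `sos_certificate` are illustrative assumptions, and cvxpy is assumed as the SDP interface.

```python
# Minimal SOS-certification sketch (illustrative, not the paper's code).
# Certify p(x) >= 0 by finding a PSD Gram matrix Q with p(x) = m(x)^T Q m(x),
# where m(x) is a list of monomials given as exponent tuples.
from itertools import product

import cvxpy as cp

# p(x, y) = 2x^4 + 2x^3*y - x^2*y^2 + 5y^4, stored as {exponent: coefficient}.
poly = {(4, 0): 2.0, (3, 1): 2.0, (2, 2): -1.0, (0, 4): 5.0}


def sos_certificate(poly, basis):
    """Return a PSD Gram matrix certifying `poly` over `basis`, or None."""
    n = len(basis)
    Q = cp.Variable((n, n), PSD=True)

    # Group Gram entries by the exponent of the monomial product m_i * m_j.
    by_exponent = {}
    for i, j in product(range(n), repeat=2):
        exp = tuple(a + b for a, b in zip(basis[i], basis[j]))
        by_exponent.setdefault(exp, []).append(Q[i, j])

    # The basis must be able to produce every monomial that appears in p.
    if any(exp not in by_exponent for exp in poly):
        return None

    # Coefficient matching: Gram entries for each exponent sum to p's coefficient.
    constraints = [cp.sum(cp.hstack(entries)) == poly.get(exp, 0.0)
                   for exp, entries in by_exponent.items()]

    problem = cp.Problem(cp.Minimize(0), constraints)  # pure feasibility SDP
    problem.solve()
    return Q.value if problem.status == cp.OPTIMAL else None


# A small candidate basis (the role the learned model plays): x^2, xy, y^2.
small_basis = [(2, 0), (1, 1), (0, 2)]
gram = sos_certificate(poly, small_basis)
print("SOS certificate found" if gram is not None else "fall back to a larger basis")
```

If the predicted basis cannot reproduce every monomial of $p$, or the resulting SDP is infeasible, the certificate attempt fails and the fallback mechanism described in the abstract would retry with a larger basis.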
Problem

Research questions and friction points this paper is trying to address.

Certifying nonnegativity of polynomials is NP-hard
Sum of Squares certification remains computationally expensive
Reducing the monomial basis is needed to keep the SDP tractable (see the note after this list)
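A standard back-of-the-envelope note on why the basis matters (general SOS facts, not figures from the paper): for a polynomial of degree $2d$ in $n$ variables, the full monomial basis has $N = \binom{n+d}{d}$ elements, and the symmetric Gram matrix contributes $\binom{N+1}{2} = \Theta(N^2)$ scalar SDP variables. For instance, $n = 10$ and $d = 4$ give $N = \binom{14}{4} = 1001$, i.e., roughly $5 \times 10^5$ variables, which is why shrinking the basis shrinks the SDP so dramatically.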
Innovation

Methods, ideas, or system contributions that make the work stand out.

Transformer predicts an almost-minimal monomial basis for SOS
Training dataset of over 100 million generated SOS polynomials
Systematic fallback ensures correct termination (see the sketch after this list)
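A hedged sketch of how such a fallback can guarantee termination (the paper's exact mechanism may differ): try the predicted basis first and, on failure, retry with the full basis of all monomials up to half the polynomial's degree, which always suffices when the polynomial is SOS. It reuses `sos_certificate` from the sketch after the abstract; `full_basis` and `certify_with_fallback` are illustrative names.

```python
# Fallback sketch (illustrative): try the predicted basis, then the full basis.
# Reuses sos_certificate() from the sketch after the abstract above.
from itertools import combinations_with_replacement


def full_basis(num_vars, half_degree):
    """All exponent tuples of total degree <= half_degree (always sufficient
    for an SOS polynomial of degree 2 * half_degree)."""
    basis = []
    for d in range(half_degree + 1):
        for combo in combinations_with_replacement(range(num_vars), d):
            exp = [0] * num_vars
            for var in combo:
                exp[var] += 1
            basis.append(tuple(exp))
    return basis


def certify_with_fallback(poly, predicted_basis, num_vars, half_degree):
    # Stage 1: small basis predicted by the model (small, fast SDP if it works).
    gram = sos_certificate(poly, predicted_basis)
    if gram is not None:
        return gram
    # Stage 2: guaranteed fallback with the full monomial basis (large SDP).
    return sos_certificate(poly, full_basis(num_vars, half_degree))
```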
🔎 Similar Papers
No similar papers found.
Nico Pelleriti
Zuse Institute Berlin
Christoph Spiegel
Zuse Institute Berlin
Shiwei Liu
ELLIS Institute Tübingen
David Martínez-Rubio
Carlos III University
Optimization, Online Learning, Deep Learning
Max Zimmer
Zuse Institute Berlin
Deep Learning, Optimization, Mathematics
Sebastian Pokutta
Zuse Institute Berlin