Using GPUs And LLMs Can Be Satisfying for Nonlinear Real Arithmetic Problems

📅 2026-03-08
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the high computational complexity of quantifier-free nonlinear real arithmetic (NRA) problems, which has long limited the efficiency of traditional solving methods. The authors propose GANRA, a novel SMT solver that combines LLM-guided heuristics with GPU-accelerated, gradient-based parallel optimization to explore the solution space efficiently. On the Sturm-MBO benchmark, GANRA proves satisfiability for more than five times as many instances as the current state-of-the-art solver, with an average runtime of less than one-twentieth of that solver's. These results demonstrate a substantial improvement in both scalability and solving speed for NRA problems.

📝 Abstract
Solving quantifier-free non-linear real arithmetic (NRA) problems is a computationally hard task. To tackle this problem, prior work proposed a promising approach based on gradient descent. In this work, we extend their ideas and combine LLMs and GPU acceleration to obtain an efficient technique. We have implemented our findings in the novel SMT solver GANRA (GPU Accelerated solving of Nonlinear Real Arithmetic problems). We evaluate GANRA on two different NRA benchmarks and demonstrate significant improvements over the previous state of the art. In particular, on the Sturm-MBO benchmark, we can prove satisfiability for more than five times as many instances in less than 1/20th of the previous state-of-the-art runtime.
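The gradient-descent idea the abstract builds on can be sketched in a few lines. The example below is purely illustrative and is not GANRA's implementation: each constraint of a toy NRA formula (x² + y² = 2 and x·y ≥ 0.9, chosen arbitrarily here) contributes a non-negative violation term, so a satisfying assignment is any point where the total loss reaches zero. Running many randomly initialised candidates in one vectorised batch stands in for the GPU parallelism the paper describes.

```python
import numpy as np

# Toy formula:  x^2 + y^2 = 2   AND   x*y >= 0.9
# Loss = (equality residual)^2 + hinge(inequality)^2; zero loss <=> SAT witness.
rng = np.random.default_rng(0)
N = 256                                  # parallel candidate assignments
x = rng.uniform(-2.0, 2.0, N)
y = rng.uniform(-2.0, 2.0, N)
lr = 0.01

for _ in range(3000):
    eq = x**2 + y**2 - 2.0               # residual of the equality constraint
    ineq = np.maximum(0.9 - x * y, 0.0)  # hinge: positive only when violated
    # analytic gradients of  eq**2 + ineq**2  w.r.t. x and y
    gx = 4.0 * eq * x - 2.0 * ineq * y
    gy = 4.0 * eq * y - 2.0 * ineq * x
    x -= lr * gx
    y -= lr * gy

loss = (x**2 + y**2 - 2.0) ** 2 + np.maximum(0.9 - x * y, 0.0) ** 2
best = int(np.argmin(loss))
print(loss[best], x[best], y[best])     # best candidate, near loss 0
```

A near-zero loss certifies satisfiability (the witness can be checked exactly afterwards), but a nonzero loss proves nothing about unsatisfiability; that asymmetry is inherent to this style of numerical search.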
Problem

Research questions and friction points this paper is trying to address.

nonlinear real arithmetic
quantifier-free
SMT solving
satisfiability
computational hardness
Innovation

Methods, ideas, or system contributions that make the work stand out.

GPU acceleration
Large Language Models (LLMs)
nonlinear real arithmetic
SMT solving
Gradient descent