Large Lemma Miners: Can LLMs do Induction Proofs for Hardware?

📅 2025-11-04
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This study investigates whether large language models (LLMs) can autonomously generate inductive proofs required for hardware verification, thereby reducing manual effort by formal verification engineers and enhancing industrial applicability. We propose a neuro-symbolic collaborative framework tailored for RTL designs: a dual-prompting mechanism guides the LLM to synthesize candidate inductive invariants, which are then automatically verified and refined by formal verification tools (e.g., ABC, IC3). The method integrates prompt engineering with symbolic reasoning to establish a closed-loop collaboration between the LLM and the verifier. Evaluated on a benchmark of medium-scale open-source RTL circuits, our approach successfully generates verifiable, correct inductive arguments for 87% of the instances. This work constitutes the first systematic validation of LLMs’ capability in synthesizing hardware-specific inductive proofs, providing both a reproducible technical methodology and empirical evidence for AI-augmented industrial formal verification.

📝 Abstract
Large Language Models (LLMs) have shown potential for solving mathematical tasks. We show that LLMs can be utilized to generate proofs by induction for hardware verification and thereby replace some of the manual work done by Formal Verification engineers and deliver industrial value. We present a neurosymbolic approach that includes two prompting frameworks to generate candidate invariants, which are checked using a formal, symbolic tool. Our results indicate that with sufficient reprompting, LLMs are able to generate inductive arguments for mid-size open-source RTL designs. For 87% of our problem set, at least one of the prompt setups succeeded in producing a provably correct inductive argument.
Problem

Research questions and friction points this paper is trying to address.

Can LLMs generate proofs by induction for hardware verification?
Can such proofs replace some of the manual work of formal verification engineers?
How can a neurosymbolic approach prompt for candidate invariants and check them formally?
Innovation

Methods, ideas, or system contributions that make the work stand out.

Using neurosymbolic approach for hardware verification
Employing prompting frameworks to generate invariants
Verifying inductive arguments with symbolic tools
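The closed loop described above can be sketched in miniature. This is a hypothetical illustration, not the paper's actual interface: the list of `candidates` stands in for LLM proposals across reprompts, and `is_inductive` replaces a symbolic engine such as ABC or IC3 with brute-force enumeration over a tiny transition system (a counter that wraps at 10).

```python
# Toy transition system standing in for an RTL design:
# a counter over 4-bit states that counts 0..9 and wraps.
def init(state):
    return state == 0

def trans(state, nxt):
    return nxt == (0 if state == 9 else state + 1)

def is_inductive(inv, states=range(16)):
    """1-induction check by enumeration (stand-in for a symbolic verifier).

    Base case: the invariant holds in all initial states.
    Inductive step: the invariant is preserved by every transition.
    """
    base = all(inv(s) for s in states if init(s))
    step = all(inv(n) for s in states if inv(s)
               for n in states if trans(s, n))
    return base and step

# Candidate invariants, as an LLM might propose them across reprompts.
candidates = [
    lambda s: s < 9,   # rejected: reachable but not closed under trans (8 -> 9)
    lambda s: s <= 9,  # inductive: holds initially and is preserved
]

proved = None
for inv in candidates:
    if is_inductive(inv):
        proved = inv
        break
    # otherwise: feed the failed check back to the LLM and reprompt
```

The first candidate fails the inductive step (from state 8 the system reaches 9, which violates `s < 9`), so in the real framework the verifier's feedback would drive a reprompt; the second candidate passes both checks.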
Romy Peled
Computer Science Department, Technion
Daniel Kroening
Amazon
Automated verification · testing · model checking
Michael Tautschnig
AWS
Y. Vizel
Computer Science Department, Technion, and Amazon AWS