🤖 AI Summary
Manually writing SystemVerilog Assertions (SVAs) is inefficient and error-prone, while existing large language model (LLM)-based approaches often overlook the structural patterns inherent in hardware designs. This work proposes a structure-aware SVA generation method that encodes RTL modules into abstract syntax tree (AST)-based structural fingerprints, retrieves structurally similar (RTL, SVA) pairs from a knowledge base, and incorporates them into structure-guided prompts that steer LLMs during assertion synthesis. This structure-similarity-guided retrieval mechanism, introduced here for the first time, significantly improves the syntactic correctness, stylistic consistency, and functional accuracy of generated assertions, demonstrating the effectiveness of structure-aware retrieval in industrial-scale formal verification.
📝 Abstract
Formal Verification (FV) relies on high-quality SystemVerilog Assertions (SVAs), but the manual writing process is slow and error-prone. Existing LLM-based approaches either generate assertions from scratch or ignore structural patterns in hardware designs and expert-crafted assertions. This paper presents STELLAR, the first framework that guides LLM-based SVA generation with structural similarity. STELLAR represents RTL blocks as AST structural fingerprints, retrieves structurally relevant (RTL, SVA) pairs from a knowledge base, and integrates them into structure-guided prompts. Experiments show that STELLAR achieves superior syntax correctness, stylistic alignment, and functional correctness, highlighting structure-aware retrieval as a promising direction for industrial FV.
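The retrieval pipeline described above can be sketched in a few lines. Note that this is an illustrative reconstruction, not STELLAR's actual implementation: the choice of node-type bigrams as the structural fingerprint, cosine similarity as the matching metric, and the knowledge-base record layout (`rtl`, `sva`, `ast` fields) are all assumptions made for the sketch.

```python
from collections import Counter
from math import sqrt

def fingerprint(node_types, n=2):
    """Bag of node-type n-grams over a pre-order AST traversal
    (an assumed stand-in for STELLAR's structural fingerprint)."""
    return Counter(tuple(node_types[i:i + n])
                   for i in range(len(node_types) - n + 1))

def cosine(a, b):
    """Cosine similarity between two n-gram count vectors."""
    dot = sum(a[k] * b[k] for k in a.keys() & b.keys())
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query_ast, knowledge_base, k=3):
    """Rank knowledge-base entries by structural similarity to the
    query AST and return the top-k (RTL, SVA) pairs."""
    q = fingerprint(query_ast)
    ranked = sorted(knowledge_base,
                    key=lambda e: cosine(q, fingerprint(e["ast"])),
                    reverse=True)
    return ranked[:k]

def build_prompt(target_rtl, examples):
    """Assemble a structure-guided few-shot prompt from the
    retrieved (RTL, SVA) examples."""
    shots = "\n\n".join(f"// RTL:\n{e['rtl']}\n// SVA:\n{e['sva']}"
                        for e in examples)
    return f"{shots}\n\n// RTL:\n{target_rtl}\n// SVA:\n"
```

In this sketch, a query RTL block whose AST shares many node-type bigrams with a knowledge-base entry ranks that entry first, so its expert-crafted SVA lands in the prompt as a stylistic and structural exemplar for the LLM.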