🤖 AI Summary
Existing specification mining approaches are largely confined to Boolean abstractions of events, making them ill-suited for capturing the rich data-centric behaviors exhibited by modern systems. This work proposes a method that extends specification mining to rich data domains by jointly learning data transformations and temporal specifications. The approach integrates Syntax-Guided Synthesis (SyGuS) with TSL$_f$, an extension of LTL$_f$ that supports first-order predicates and functional updates, enabling the synthesis of expressive, data-aware temporal rules. Evaluated on OpenAI Gymnasium's ToyText environments, the method synthesizes reactive programs while being orders of magnitude more sample-efficient than passive learning baselines and generalizing substantially better to larger problem instances.
📝 Abstract
Mining specifications from execution traces presents an automated way of capturing characteristic system behaviors. However, existing approaches are largely restricted to Boolean abstractions of events, limiting their ability to express data-aware properties. In this paper, we extend mining procedures to operate over richer datatypes. We first establish candidate functions in our domain that cover the set of traces by leveraging Syntax-Guided Synthesis (SyGuS) techniques. To capture these function applications temporally, we formalize the semantics of TSL$_f$, a finite-prefix interpretation of Temporal Stream Logic (TSL) that extends LTL$_f$ with support for first-order predicates and functional updates. This lets us unify the learning of a system's data transformations and temporal specifications into a single procedure. We demonstrate our approach by synthesizing reactive programs from mined specifications on the OpenAI Gymnasium ToyText environments, finding that our method is more robust and orders of magnitude more sample-efficient than passive learning baselines on generalized problem instances.