AI Summary
This work addresses the problem of efficiently learning Linear Temporal Logic over finite traces (LTLf) formulas from limited trajectory data, aiming to improve learning speed while producing more concise formulas. To overcome the unsatisfactory trade-off between efficiency and formula size in existing approaches, we propose a novel learning framework based on Boolean Set Cover: it decomposes LTLf learning into two phases, subformula enumeration and Boolean combination optimization, both implemented efficiently on CPU. Our core innovation is the use of Boolean Set Cover as a scalable subroutine, dramatically mitigating combinatorial explosion. The resulting tool, Bolt, achieves over a 100× speedup on 70% of the benchmarks and generates formulas no larger than the baseline solutions in 98% of cases. This approach offers a new balance between scalability and interpretability for AI verification, program synthesis, and specification learning in cyber-physical systems.
Abstract
Learning formulas in Linear Temporal Logic over finite traces (LTLf) is a fundamental research problem that has found applications in artificial intelligence, software engineering, programming languages, formal methods, control of cyber-physical systems, and robotics. We implement a new CPU-based tool called Bolt that improves over the state of the art by learning formulas more than 100x faster on 70% of the benchmarks, with smaller or equal formulas in 98% of the cases. Our key insight is to leverage a problem called Boolean Set Cover as a subroutine to combine existing formulas using Boolean connectives. Thanks to the Boolean Set Cover component, our approach offers a novel trade-off between efficiency and formula size.
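To make the Boolean Set Cover idea concrete, here is a minimal, hypothetical sketch (not Bolt's actual implementation): each enumerated subformula is abstracted by the set of positive traces it accepts, and, assuming every candidate already rejects all negative traces, a greedy set-cover loop picks candidates whose disjunction accepts every positive trace. The function name, the candidate formulas, and the trace indices are all illustrative assumptions.

```python
def greedy_boolean_cover(candidates, positives):
    """Greedy set cover over trace-acceptance sets (illustrative sketch).

    candidates: dict mapping a subformula (as a string) to the set of
                positive-trace indices it accepts; each candidate is
                assumed to reject all negative traces.
    positives:  set of all positive-trace indices that must be covered.
    Returns a list of subformulas whose disjunction accepts every
    positive trace, or None if no such disjunctive cover exists.
    """
    uncovered = set(positives)
    chosen = []
    while uncovered:
        # Greedy rule: pick the candidate covering the most remaining positives.
        best = max(candidates, key=lambda f: len(candidates[f] & uncovered))
        if not candidates[best] & uncovered:
            return None  # no candidate makes progress; no cover exists
        chosen.append(best)
        uncovered -= candidates[best]
    return chosen

# Hypothetical example: three enumerated subformulas and three positive traces.
cands = {"F a": {0, 1}, "G b": {2}, "a U b": {1, 2}}
print(greedy_boolean_cover(cands, {0, 1, 2}))  # prints ['F a', 'G b']
```

The greedy heuristic gives the classic logarithmic approximation for set cover, which hints at how a Boolean-combination step can stay fast while keeping the resulting disjunction, and hence the learned formula, small.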