Passive Model Learning of Visibly Deterministic Context-free Grammars

📅 2025-08-22
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This work addresses the passive learning of deterministic context-free grammars (DCFGs), modeled as visibly deterministic pushdown automata (VDPDAs). To overcome the inherent computational hardness of DCFG learning, we propose the PAPNI framework—the first extension of the classical RPNI algorithm to DCFG inference. Leveraging prior knowledge of the input alphabet partitioned by stack operations (push/pop/ε), PAPNI reduces VDPDA learning to the coordinated inference of multiple regular models. Crucially, it operates purely passively, requiring no active membership or equivalence queries, and learns end-to-end from labeled positive and negative examples. Experiments on standard DCFG benchmarks demonstrate that PAPNI achieves prediction accuracy comparable to regular-level RPNI, confirming both its theoretical soundness and practical efficacy. By enabling scalable, query-free learning of structured grammars, PAPNI establishes a novel, extensible paradigm for grammar induction beyond the regular hierarchy.

Technology Category

Application Category

📝 Abstract
We present PAPNI, a passive automata learning algorithm capable of learning deterministic context-free grammars, which are modeled with visibly deterministic pushdown automata. PAPNI is a generalization of RPNI, a passive automata learning algorithm capable of learning regular languages from positive and negative samples. PAPNI uses RPNI as its underlying learning algorithm while assuming a priori knowledge of the visibly deterministic input alphabet, that is, the alphabet decomposition into symbols that push to the stack, pop from the stack, or do not affect the stack. In this paper, we show how passive learning of deterministic pushdown automata can be viewed as a preprocessing step of standard RPNI implementations. We evaluate the proposed approach on various deterministic context-free grammars found in the literature and compare the predictive accuracy of learned models with RPNI.
Problem

Research questions and friction points this paper is trying to address.

Learning deterministic context-free grammars passively
Generalizing RPNI algorithm for pushdown automata
Requiring visibly deterministic alphabet decomposition knowledge
Innovation

Methods, ideas, or system contributions that make the work stand out.

Passive automata learning for deterministic context-free grammars
Generalizes RPNI algorithm with visible alphabet decomposition
Uses preprocessing step in standard RPNI implementations
🔎 Similar Papers
No similar papers found.
Edi Muškardin
Edi Muškardin
Sillicon-Austria Labs
Automata LearningModel-Based Testing
T
Tamim Burgstaller
Graz University of Technology