NeSyA: Neurosymbolic Automata

πŸ“… 2024-12-10
πŸ›οΈ arXiv.org
πŸ“ˆ Citations: 1
✨ Influential: 0
πŸ€– AI Summary
This work addresses sequence classification and tagging by introducing NeSyA, an end-to-end differentiable neurosymbolic model tailored to temporal domains. Methodologically, it formalizes symbolic finite-state automata as differentiable neurosymbolic primitives and integrates them with neural perception modules (e.g., an LSTM or Transformer) under probabilistic semantics, enabling gradient-based joint optimization while preserving logical interpretability. The approach explicitly encodes temporal constraints and propositional logic knowledge, balancing generalization capability with model transparency. Empirically, the method substantially outperforms existing neurosymbolic baselines on a synthetic benchmark, and on a real-world event recognition task it demonstrates better out-of-distribution generalization, higher accuracy, and greater robustness than purely neural models.
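The core mechanism described above can be sketched in a few lines. The following is a minimal, hypothetical illustration (not the paper's implementation): a probabilistic automaton whose belief state over automaton states is propagated by an expected transition matrix, i.e., a mixture of per-symbol transition matrices weighted by the perception module's symbol probabilities. Every operation is a matrix product, so the whole computation is differentiable end to end.

```python
import numpy as np

# Toy automaton with 2 states and 2 symbols (all values illustrative).
# T[s] is the transition matrix applied when symbol s is observed
# (rows: current state, columns: next state).
T = np.array([
    [[0.0, 1.0],   # symbol 0: state 0 -> state 1
     [0.0, 1.0]],  # symbol 0: state 1 -> state 1
    [[1.0, 0.0],   # symbol 1: state 0 -> state 0
     [1.0, 0.0]],  # symbol 1: state 1 -> state 0
])

def forward(symbol_probs, T, init):
    """Propagate a belief over automaton states through a sequence.

    At each step the expected transition matrix is the mixture of the
    per-symbol matrices weighted by the (neural) symbol distribution,
    so gradients flow back into the perception module.
    """
    belief = init
    for p in symbol_probs:                  # p: distribution over symbols at step t
        M = np.tensordot(p, T, axes=1)      # expected transition matrix
        belief = belief @ M                 # belief update over automaton states
    return belief

# Stand-in for perception output over a 3-step sequence
# (in the actual system this would come from an LSTM/Transformer).
probs = np.array([[0.9, 0.1], [0.8, 0.2], [0.1, 0.9]])
belief = forward(probs, T, np.array([1.0, 0.0]))
```

The final `belief` is a distribution over automaton states, from which a sequence-level label (e.g., acceptance probability) can be read off.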

πŸ“ Abstract
Neurosymbolic (NeSy) AI has emerged as a promising direction to integrate neural and symbolic reasoning. Unfortunately, little effort has been given to developing NeSy systems tailored to sequential/temporal problems. We identify symbolic automata (which combine the power of automata for temporal reasoning with that of propositional logic for static reasoning) as a suitable formalism for expressing knowledge in temporal domains. Focusing on the task of sequence classification and tagging, we show that symbolic automata can be integrated with neural-based perception, under probabilistic semantics, towards an end-to-end differentiable model. Our proposed hybrid model, termed NeSyA (Neuro Symbolic Automata), is shown to either scale or perform more accurately than previous NeSy systems in a synthetic benchmark and to provide benefits in terms of generalization compared to purely neural systems in a real-world event recognition task.
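The abstract's point that symbolic automata attach propositional logic to automaton transitions can be illustrated with a small sketch. Under probabilistic semantics, the probability of taking an edge is the probability that its propositional guard holds, given the perception module's per-proposition probabilities. The brute-force weighted model counting below is purely illustrative (the guard, propositions, and probabilities are invented for this example; practical systems compile such formulas into circuits rather than enumerating assignments):

```python
import itertools

def guard_prob(guard, prop_probs):
    """Probability that a propositional guard holds, assuming the
    propositions are independent with the given marginal probabilities.

    Enumerates all truth assignments and sums the weight of the
    satisfying ones (brute-force weighted model counting).
    """
    names = sorted(prop_probs)
    total = 0.0
    for values in itertools.product([False, True], repeat=len(names)):
        world = dict(zip(names, values))
        weight = 1.0
        for n in names:
            weight *= prop_probs[n] if world[n] else 1.0 - prop_probs[n]
        if guard(world):
            total += weight
    return total

# Guard on one edge of a symbolic automaton: "a and not b",
# with proposition probabilities coming from a perception module.
p = guard_prob(lambda w: w["a"] and not w["b"], {"a": 0.9, "b": 0.2})
```

With independent propositions, this evaluates the guard's satisfaction probability exactly; the resulting edge probabilities are what feed the automaton's state update.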
Problem

Research questions and friction points this paper is trying to address.

Integrating neural and symbolic reasoning for temporal problems
Combining symbolic automata with neural perception for sequence tasks
Improving accuracy and generalization in neurosymbolic sequence classification
Innovation

Methods, ideas, or system contributions that make the work stand out.

Integrates symbolic automata with neural perception
End-to-end differentiable probabilistic model
Improves accuracy and scalability in sequence tasks