Recurrent Deep Differentiable Logic Gate Networks

📅 2025-08-08
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This paper addresses the challenge of jointly achieving logical interpretability and end-to-end learning in sequence modeling. To this end, it proposes the first Recurrent Deep Differentiable Logic Gate Network (RDDLGN), which embeds Boolean operations (AND, OR, NOT) into a recurrent architecture to enable temporally grounded logical computation. Methodologically, it introduces differentiable logic gate units tightly coupled with hidden-state updates, enabling gradient backpropagation and full end-to-end training. Key contributions include: (1) the first fully differentiable Boolean logic gates instantiated within a recurrent framework; (2) empirical validation of logic-based neural computation on sequence-to-sequence tasks, notably machine translation; and (3) a new paradigm for FPGA-friendly, interpretable recurrent architectures. On the WMT'14 English–German translation task, RDDLGN achieves 5.00 BLEU and 30.9% accuracy during training, approaching GRU performance (5.41 BLEU) and degrading gracefully to 4.39 BLEU at inference.
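The core mechanism described above, differentiable logic gate units driving a recurrent hidden-state update, can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the gate set, the `soft_gate`/`rdln_step` names, and the one-gate-per-unit wiring are assumptions. Inputs live in [0, 1], each Boolean gate is replaced by its probabilistic relaxation, and a softmax over learnable logits selects a mixture of gates so gradients flow end to end.

```python
import numpy as np

# Probabilistic relaxations of Boolean gates: for a, b in [0, 1], each
# returns the probability that the gate outputs 1 given independent inputs.
GATE_FNS = [
    lambda a, b: a * b,              # AND
    lambda a, b: a + b - a * b,      # OR
    lambda a, b: 1.0 - a,            # NOT a (ignores b)
    lambda a, b: a + b - 2 * a * b,  # XOR
]

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def soft_gate(a, b, logits):
    """Differentiable gate: softmax-weighted mixture over candidate gates."""
    w = softmax(logits)
    return sum(wi * f(a, b) for wi, f in zip(w, GATE_FNS))

def rdln_step(h, x, logits):
    """Hypothetical recurrent update: each hidden unit applies one soft gate
    to (previous hidden value, current input value)."""
    return np.array([soft_gate(h[i], x[i], logits[i]) for i in range(len(h))])

# Toy rollout over a short binary sequence.
rng = np.random.default_rng(0)
logits = rng.normal(size=(3, len(GATE_FNS)))   # learnable gate-choice logits
h = np.zeros(3)
for x in [np.array([1., 0., 1.]), np.array([0., 1., 1.])]:
    h = rdln_step(h, x, logits)
```

With sharp (near one-hot) logits the mixture collapses to a single Boolean gate, which is what makes the learned network discretizable after training.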

📝 Abstract
While differentiable logic gates have shown promise in feedforward networks, their application to sequential modeling remains unexplored. This paper presents the first implementation of Recurrent Deep Differentiable Logic Gate Networks (RDDLGN), combining Boolean operations with recurrent architectures for sequence-to-sequence learning. Evaluated on WMT'14 English-German translation, RDDLGN achieves 5.00 BLEU and 30.9% accuracy during training, approaching GRU performance (5.41 BLEU) and graceful degradation (4.39 BLEU) during inference. This work establishes recurrent logic-based neural computation as viable, opening research directions for FPGA acceleration in sequential modeling and other recursive network architectures.
Problem

Research questions and friction points this paper is trying to address.

Exploring recurrent differentiable logic gates for sequence modeling
Combining Boolean operations with recurrent architectures for learning
Evaluating performance in machine translation tasks
Innovation

Methods, ideas, or system contributions that make the work stand out.

First recurrent instantiation of deep differentiable logic gate networks (RDDLGN)
Couples Boolean gate operations with recurrent hidden-state updates
Opens a path to FPGA acceleration in sequential modeling
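The FPGA-friendliness claimed in the Innovation list rests on discretization: after training, each unit's soft gate mixture can be hardened to its most probable Boolean gate, leaving a pure logic circuit. A hedged sketch of that step, with a hypothetical `harden` helper and a small assumed gate set (the paper's actual gate vocabulary and hardening procedure may differ):

```python
import numpy as np

# Hard Boolean versions of a small candidate gate set (inputs/outputs in {0, 1}).
HARD_GATES = [
    lambda a, b: a & b,    # AND
    lambda a, b: a | b,    # OR
    lambda a, b: 1 - a,    # NOT a (ignores b)
    lambda a, b: a ^ b,    # XOR
]

def harden(logits):
    """Replace each unit's soft gate mixture by its argmax gate, yielding a
    Boolean step function suitable for synthesis as a logic circuit."""
    choices = np.argmax(logits, axis=-1)
    def step(h, x):
        return np.array([HARD_GATES[c](h[i], x[i])
                         for i, c in enumerate(choices)], dtype=np.int64)
    return step

# Toy learned logits: unit 0 prefers AND, unit 1 prefers OR.
logits = np.array([[2.0, 0.1, 0.0, 0.0],
                   [0.0, 3.0, 0.0, 0.0]])
step = harden(logits)
out = step(np.array([1, 0]), np.array([1, 1]))  # -> array([1, 1])
```

The gap between the soft mixture used in training and this hard circuit is one plausible reading of the reported drop from 5.00 BLEU (training) to 4.39 BLEU (inference).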
🔎 Similar Papers
No similar papers found.