🤖 AI Summary
This work addresses the challenge in cellular automata (CA) rule learning of reconciling differentiable training with discrete, binary inference. We propose Differentiable Logic Cellular Automata (DiffLogic CA), which integrates a Differentiable Logic Gate Network (DLGN) into the Neural Cellular Automata (NCA) framework, enabling end-to-end gradient-based optimization while strictly preserving binary discrete states during inference. To our knowledge, this is the first successful incorporation of a DLGN into a recurrent CA architecture, unifying continuous trainability with discrete interpretability and supporting both synchronous and asynchronous state updates. Experiments demonstrate that DiffLogic CA exactly recovers the rules of Conway's Game of Life; generates noise-robust checkerboard patterns, self-healing lizard-like morphologies, and multi-color complex spatiotemporal dynamics; and yields learned logic gate structures that transfer as reusable recurrent circuits, significantly enhancing robustness and out-of-distribution generalization.
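To make the summary concrete, the recurrent CA loop can be sketched as follows: every cell computes its next binary state from its 3x3 neighborhood via a learned Boolean circuit, applied either to all cells at once (synchronous) or to a random subset (asynchronous). This is a minimal NumPy illustration, not the paper's implementation; the toroidal boundary, the 0.5 update probability, and the hand-written `life_rule` (standing in for a learned circuit) are our assumptions.

```python
import numpy as np

def step(grid, circuit, synchronous=True, rng=None):
    """One CA step: each cell's next state is a Boolean function
    (the circuit) of its 3x3 neighborhood, with toroidal wrap-around."""
    h, w = grid.shape
    nxt = grid.copy()
    for i in range(h):
        for j in range(w):
            neigh = np.array([grid[(i + di) % h, (j + dj) % w]
                              for di in (-1, 0, 1) for dj in (-1, 0, 1)])
            nxt[i, j] = circuit(neigh)
    if synchronous:
        return nxt
    # Asynchronous variant: each cell commits its new state only with
    # probability 0.5 (an illustrative stochastic update mask).
    mask = rng.random(grid.shape) < 0.5
    return np.where(mask, nxt, grid)

def life_rule(neigh):
    """Conway's Game of Life, hand-coded as the Boolean rule the model
    would learn; neigh[4] is the center cell of the flattened 3x3 patch."""
    alive = neigh[4]
    n = neigh.sum() - alive
    return np.uint8((n == 3) | ((alive == 1) & (n == 2)))
```

Running the synchronous version on a vertical "blinker" yields the familiar horizontal blinker after one step, which is how exact rule recovery can be checked cell by cell.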
📝 Abstract
This paper introduces Differentiable Logic Cellular Automata (DiffLogic CA), a novel combination of Neural Cellular Automata (NCA) and Differentiable Logic Gate Networks (DLGNs). The fundamental computation units of the model are differentiable logic gates, combined into a circuit. During training, the model is fully end-to-end differentiable, allowing gradient-based training; at inference time it operates in a fully discrete state space. This enables learning local update rules for cellular automata while preserving their inherent discrete nature. We demonstrate the versatility of our approach through a series of milestones: (1) fully learning the rules of Conway's Game of Life, (2) generating checkerboard patterns that exhibit resilience to noise and damage, (3) growing a lizard shape, and (4) multi-color pattern generation. Our model successfully learns recurrent circuits capable of generating desired target patterns. For simpler patterns, we observe success with both synchronous and asynchronous updates, demonstrating significant generalization capabilities and robustness to perturbations. We make the case that this combination of DLGNs and NCA represents a step toward programmable matter and robust computing systems that combine binary logic, neural network adaptability, and localized processing. This work, to the best of our knowledge, is the first successful application of differentiable logic gate networks in recurrent architectures.
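The soft-training / hard-inference mechanism at the heart of a differentiable logic gate can be sketched as follows: during training each gate is a softmax-weighted mixture over real-valued relaxations of all 16 two-input Boolean functions; at inference it commits to the single argmax gate, so binary inputs produce exactly binary outputs. This is a minimal NumPy sketch following the standard differentiable logic gate construction; the class and function names are ours, not from the paper.

```python
import numpy as np

# Real-valued relaxations of the 16 two-input Boolean functions, chosen
# so that binary inputs yield exactly binary outputs (AND -> a*b, etc.).
GATE_FNS = [
    lambda a, b: 0.0 * a,                  # FALSE
    lambda a, b: a * b,                    # AND
    lambda a, b: a - a * b,                # a AND NOT b
    lambda a, b: a,                        # A (pass-through)
    lambda a, b: b - a * b,                # NOT a AND b
    lambda a, b: b,                        # B (pass-through)
    lambda a, b: a + b - 2 * a * b,        # XOR
    lambda a, b: a + b - a * b,            # OR
    lambda a, b: 1 - (a + b - a * b),      # NOR
    lambda a, b: 1 - (a + b - 2 * a * b),  # XNOR
    lambda a, b: 1 - b,                    # NOT b
    lambda a, b: 1 - b + a * b,            # a OR NOT b
    lambda a, b: 1 - a,                    # NOT a
    lambda a, b: 1 - a + a * b,            # NOT a OR b
    lambda a, b: 1 - a * b,                # NAND
    lambda a, b: 1.0 + 0.0 * a,            # TRUE
]

def softmax(w):
    e = np.exp(w - w.max())
    return e / e.sum()

class DiffLogicGate:
    """One gate of a DLGN: a soft mixture over all 16 candidate gates
    during training, a single hard (discrete) gate at inference."""

    def __init__(self, rng):
        self.w = rng.normal(size=16)  # learnable gate-choice logits

    def forward_soft(self, a, b):
        # Training mode: differentiable, probability-weighted blend.
        p = softmax(self.w)
        return sum(pi * f(a, b) for pi, f in zip(p, GATE_FNS))

    def forward_hard(self, a, b):
        # Inference mode: commit to the most probable gate.
        return GATE_FNS[int(np.argmax(self.w))](a, b)
```

As the logits sharpen during training, `forward_soft` converges toward the output of `forward_hard`, which is what lets the trained continuous model be read off as a discrete, interpretable circuit.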