🤖 AI Summary
Neural Cellular Automata (NCAs) face three key bottlenecks in high-resolution image generation: quadratic growth of training/inference cost with grid size, limited long-range information propagation due to local neighborhood constraints, and difficulty achieving real-time high-definition output. This paper proposes Implicit Neural Cellular Automata (iNCA), which maps low-dimensional NCA hidden states to arbitrary-resolution images via a lightweight, shared MLP decoder, supporting both texture synthesis and morphogenesis. Methodologically, the paper introduces a coordinate-based implicit decoding architecture and high-resolution-aware loss functions, enabling parallelized, resolution-agnostic inference over cell states. Experiments demonstrate that iNCA achieves real-time full-HD (1920×1080) generation for the first time, with substantially reduced memory and computational overhead, while preserving self-organization and emergent behavior. The approach is validated across 2D/3D regular grids and non-Euclidean surfaces.
📝 Abstract
Neural Cellular Automata (NCAs) are bio-inspired systems in which identical cells self-organize into complex, coherent patterns by repeatedly applying simple local rules. NCAs display striking emergent behaviors, including self-regeneration, generalization and robustness to unseen situations, and spontaneous motion. Despite their success in texture synthesis and morphogenesis, NCAs remain largely confined to low-resolution grids. This limitation stems from (1) training time and memory requirements that grow quadratically with grid size, (2) strictly local information propagation, which impedes long-range cell communication, and (3) the heavy compute demands of real-time inference at high resolution. In this work, we overcome this limitation by pairing NCAs with a tiny, shared implicit decoder, inspired by recent advances in implicit neural representations. After NCA evolution on a coarse grid, the lightweight decoder renders output images at arbitrary resolution. We also propose novel loss functions for both morphogenesis and texture synthesis, specifically tailored for high-resolution output with minimal memory and computation overhead. Combining the proposed architecture and loss functions yields substantial improvements in quality, efficiency, and runtime performance: NCAs equipped with our implicit decoder generate full-HD output in real time while preserving their self-organizing, emergent properties. Moreover, because the shared MLP decodes each cell state independently, inference remains highly parallelizable and efficient. We demonstrate the applicability of our approach across multiple NCA variants (2D grids, 3D grids, and 3D meshes) and multiple tasks, including texture generation and morphogenesis (growing patterns from a seed), showing that with our proposed framework, NCAs seamlessly scale to high-resolution outputs with minimal computational overhead.
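The pipeline the abstract describes — evolve an NCA on a coarse grid, then let a tiny shared MLP decode (cell state, coordinate) pairs into pixels at any target resolution — can be sketched in a few lines of numpy. Everything here is illustrative, not the paper's exact design: the grid/channel sizes, the toroidal 3×3 perception, the nearest-neighbor sampling of cell states, the two-layer decoder, and the random (untrained) weights are all assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: coarse H x W NCA grid with C-dimensional cell states.
H, W, C = 32, 32, 8

def nca_step(state, w1, w2):
    """One NCA update: 3x3 neighborhood perception + per-cell MLP residual."""
    # Gather each cell's 3x3 neighborhood (toroidal wrap, for simplicity).
    neigh = np.concatenate(
        [np.roll(state, (dy, dx), axis=(0, 1))
         for dy in (-1, 0, 1) for dx in (-1, 0, 1)], axis=-1)  # (H, W, 9C)
    hidden = np.maximum(neigh @ w1, 0.0)          # ReLU
    return state + hidden @ w2                    # local residual update

def decode(state, coords, d1, d2):
    """Shared implicit decoder: (sampled cell state, xy coord) -> RGB."""
    # Nearest-neighbor sample the coarse grid at continuous coords in [0, 1).
    iy = np.clip((coords[:, 1] * H).astype(int), 0, H - 1)
    ix = np.clip((coords[:, 0] * W).astype(int), 0, W - 1)
    feat = np.concatenate([state[iy, ix], coords], axis=-1)   # (N, C+2)
    h = np.maximum(feat @ d1, 0.0)
    return 1.0 / (1.0 + np.exp(-(h @ d2)))        # sigmoid -> RGB in (0, 1)

# Random untrained weights, only to show the shapes and data flow.
w1 = rng.normal(0, 0.1, (9 * C, 32)); w2 = rng.normal(0, 0.1, (32, C))
d1 = rng.normal(0, 0.1, (C + 2, 32)); d2 = rng.normal(0, 0.1, (32, 3))

state = rng.normal(0, 0.1, (H, W, C))
for _ in range(16):                               # evolve the coarse NCA
    state = nca_step(state, w1, w2)

# Render at an arbitrary resolution; each pixel is decoded independently,
# which is what makes inference embarrassingly parallel. A small 16:9 size
# is used here; the same call scales to 1920x1080.
out_h, out_w = 108, 192
ys, xs = np.meshgrid(np.linspace(0, 1, out_h, endpoint=False),
                     np.linspace(0, 1, out_w, endpoint=False), indexing="ij")
coords = np.stack([xs.ravel(), ys.ravel()], axis=-1)
image = decode(state, coords, d1, d2).reshape(out_h, out_w, 3)
print(image.shape)  # (108, 192, 3)
```

The key property the sketch reproduces is resolution decoupling: the expensive iterative dynamics run once on the 32×32 grid, while the output resolution only affects the final, fully parallel per-pixel decode.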