GIC-DLC: Differentiable Logic Circuits for Hardware-Friendly Grayscale Image Compression

📅 2026-01-20
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work proposes a hardware-friendly grayscale image compression method that addresses the high computational cost of existing neural image codecs, which hinders their deployment on low-power edge devices. By introducing differentiable logic circuits into image compression for the first time, the approach enables end-to-end training of lookup tables, effectively combining the representational power of neural networks with the energy efficiency of Boolean operations. Evaluated on standard grayscale image datasets, the method outperforms conventional codecs in both reconstruction fidelity and computational efficiency, achieving significantly lower energy consumption and latency. This study thus opens a new pathway toward practical deployment of learned image compression algorithms on resource-constrained edge hardware.
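The summary above describes making lookup tables trainable by relaxing Boolean logic into a differentiable form. A minimal sketch of one common relaxation (a softmax over all 16 two-input Boolean functions, in the spirit of differentiable logic gate networks) is shown below; the function names and shapes are illustrative assumptions, not the paper's actual implementation:

```python
import numpy as np

# Truth tables of all 16 two-input Boolean functions: one row per
# function, columns ordered by input combo index k = 2*a + b.
TRUTH_TABLES = np.array(
    [[(i >> k) & 1 for k in range(4)] for i in range(16)], dtype=float
)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def soft_gate(a, b, logits):
    """Differentiable relaxation of a 2-input logic gate.

    a, b   : real-valued inputs in [0, 1] (relaxed bits)
    logits : 16 learnable scores, one per candidate Boolean function
    Returns the expected gate output under the softmax distribution,
    so gradients flow to the logits during end-to-end training.
    """
    # Probability of each hard input combination, treating a and b
    # as independent relaxed bits; combo index k = 2*a + b.
    p_in = np.array([(1 - a) * (1 - b), (1 - a) * b, a * (1 - b), a * b])
    # Soft output of every candidate function, mixed by softmax weights.
    return softmax(logits) @ (TRUTH_TABLES @ p_in)

def harden(logits):
    """After training, keep only the most likely function: a 4-entry LUT."""
    return TRUTH_TABLES[np.argmax(logits)]
```

At inference time only the hardened table remains, so the trained network reduces to pure table lookups and Boolean operations.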

📝 Abstract
Neural image codecs achieve higher compression ratios than traditional hand-crafted methods such as PNG or JPEG-XL, but often incur substantial computational overhead, limiting their deployment on energy-constrained devices such as smartphones, cameras, and drones. We propose Grayscale Image Compression with Differentiable Logic Circuits (GIC-DLC), a hardware-aware codec in which lookup tables are trained end-to-end, combining the flexibility of neural networks with the efficiency of Boolean operations. Experiments on grayscale benchmark datasets show that GIC-DLC outperforms traditional codecs in compression efficiency while allowing substantial reductions in energy consumption and latency. These results demonstrate that learned compression can be hardware-friendly, offering a promising direction for low-power image compression on edge devices.
Problem

Research questions and friction points this paper is trying to address.

image compression
hardware-friendly
energy-constrained devices
neural codecs
edge computing
Innovation

Methods, ideas, or system contributions that make the work stand out.

Differentiable Logic Circuits
Hardware-Aware Compression
Lookup Table Learning
Energy-Efficient Image Coding
Neural Image Compression
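The "Lookup Table Learning" and "Energy-Efficient Image Coding" points above rest on the fact that a hardened 2-input gate is just a 4-bit table, evaluated with shifts and masks alone. A tiny illustrative sketch (the encoding and function name are assumptions, not taken from the paper):

```python
def lut_eval(table_bits: int, a: int, b: int) -> int:
    """Evaluate a hardened 2-input gate stored as a 4-bit integer LUT.

    table_bits : int in [0, 15]; bit (2*a + b) holds the output for (a, b)
    a, b       : hard input bits (0 or 1)
    Uses only a shift and a mask -- no multiplies -- which is why LUT
    inference maps cheaply onto low-power edge hardware and FPGAs.
    """
    return (table_bits >> ((a << 1) | b)) & 1
```

For example, AND is `table_bits = 0b1000` (output 1 only when both inputs are 1) and XOR is `table_bits = 0b0110`.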
Till Aczel
ETH Zurich
deep learning · representation learning · neural compression
David F. Jenny
ETH Zurich
Simon Buhrer
ETH Zurich
Andreas Lindhardt Plesner
ETH Zurich
Antonio Di Maio
ETH Zurich
R. Wattenhofer
ETH Zurich