IM-LUT: Interpolation Mixing Look-Up Tables for Image Super-Resolution

📅 2025-07-14
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing arbitrary-scale image super-resolution (ASISR) methods face a fundamental trade-off: lookup-table (LUT)-based approaches support only fixed scaling factors, whereas implicit neural representations incur prohibitive computational and memory overhead. To address this, we propose IM-LUT—a highly efficient, lightweight ASISR framework. First, we design a learnable Interpolation-Mixing Network (IM-Net) that dynamically fuses multiple interpolation kernels based on local image features and the target scale. Second, we distill IM-Net offline into a compact, model-free lookup table (IM-LUT), enabling real-time, CPU-only inference via pure table lookups. To our knowledge, IM-LUT is the first method to integrate adaptive interpolation mixing with LUT-based deployment. Extensive experiments demonstrate that IM-LUT consistently outperforms state-of-the-art ASISR methods across multiple benchmarks, achieving superior quality-efficiency trade-offs and unprecedented practicality for edge devices.
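The distillation step described above (replacing a learned predictor with pure table lookups) can be sketched in a toy form. This is an illustrative assumption, not the paper's actual procedure: `build_lut` and `lut_lookup` are hypothetical names, the table here indexes quantized 2x2 patches and stores a single scalar, whereas IM-LUT indexes larger neighborhoods and stores mixing weights conditioned on scale.

```python
import numpy as np

def build_lut(predict_fn, bits=4):
    """Distill a patch predictor into a table over quantized 2x2 patches.

    predict_fn maps a (4,) patch vector in [0, 1] to a scalar.
    Offline cost is levels**4 evaluations; inference then needs none.
    (Toy stand-in for the IM-Net -> IM-LUT distillation idea.)
    """
    levels = 2 ** bits
    grid = np.linspace(0.0, 1.0, levels)
    lut = np.empty((levels,) * 4, dtype=np.float32)
    for idx in np.ndindex(lut.shape):
        lut[idx] = predict_fn(grid[list(idx)])
    return lut

def lut_lookup(lut, patch, bits=4):
    """Inference: quantize the patch and read the precomputed value."""
    levels = 2 ** bits
    idx = tuple(np.clip((patch * (levels - 1)).round().astype(int),
                        0, levels - 1))
    return lut[idx]
```

Because the table is built once offline, runtime cost is a single indexed read per patch, which is what makes CPU-only, real-time inference plausible.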

📝 Abstract
Super-resolution (SR) has been a pivotal task in image processing, aimed at enhancing image resolution across various applications. Recently, look-up table (LUT)-based approaches have attracted interest due to their efficiency and performance. However, these methods are typically designed for fixed scale factors, making them unsuitable for arbitrary-scale image SR (ASISR). Existing ASISR techniques often employ implicit neural representations, which come with considerable computational cost and memory demands. To address these limitations, we propose Interpolation Mixing LUT (IM-LUT), a novel framework that performs ASISR by learning to blend multiple interpolation functions to maximize their representational capacity. Specifically, we introduce IM-Net, a network trained to predict mixing weights for interpolation functions based on local image patterns and the target scale factor. To enhance the efficiency of interpolation-based methods, IM-Net is transformed into IM-LUT, where LUTs replace computationally expensive operations, enabling lightweight and fast inference on CPUs while preserving reconstruction quality. Experimental results on several benchmark datasets demonstrate that IM-LUT consistently achieves a superior balance between image quality and efficiency compared to existing methods, highlighting its potential as a promising solution for resource-constrained applications.
Problem

Research questions and friction points this paper is trying to address.

Enables arbitrary-scale image super-resolution efficiently
Reduces computational cost of neural representations
Balances image quality and processing speed
Innovation

Methods, ideas, or system contributions that make the work stand out.

Blends multiple interpolation functions for ASISR
Uses LUTs to replace expensive operations
Lightweight fast inference on CPUs
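The interpolation-blending idea in the bullets above can be sketched as follows. This is a minimal NumPy toy, assuming just two kernels (nearest and bilinear) and a single global weight vector; the paper's IM-Net instead predicts per-pixel mixing weights from local image patterns and the target scale, and `mix_interpolations` is a hypothetical name.

```python
import numpy as np

def nearest_upscale(img, scale):
    """Nearest-neighbor upscaling of a 2D array by an arbitrary factor."""
    h, w = img.shape
    H, W = int(h * scale), int(w * scale)
    ys = np.minimum((np.arange(H) / scale).astype(int), h - 1)
    xs = np.minimum((np.arange(W) / scale).astype(int), w - 1)
    return img[ys][:, xs]

def bilinear_upscale(img, scale):
    """Bilinear upscaling of a 2D array by an arbitrary factor."""
    h, w = img.shape
    H, W = int(h * scale), int(w * scale)
    ys, xs = np.arange(H) / scale, np.arange(W) / scale
    y0 = np.clip(ys.astype(int), 0, h - 2)
    x0 = np.clip(xs.astype(int), 0, w - 2)
    dy = np.clip(ys - y0, 0.0, 1.0)[:, None]
    dx = np.clip(xs - x0, 0.0, 1.0)[None, :]
    tl, tr = img[y0][:, x0], img[y0][:, x0 + 1]
    bl, br = img[y0 + 1][:, x0], img[y0 + 1][:, x0 + 1]
    return ((1 - dy) * (1 - dx) * tl + (1 - dy) * dx * tr
            + dy * (1 - dx) * bl + dy * dx * br)

def mix_interpolations(img, scale, logits):
    """Blend the two interpolation results with softmax mixing weights.

    `logits` is a global (2,) vector here for simplicity; a learned
    predictor would output it per pixel, conditioned on the scale.
    """
    outs = np.stack([nearest_upscale(img, scale),
                     bilinear_upscale(img, scale)])
    w = np.exp(logits - logits.max())
    w /= w.sum()
    return np.tensordot(w, outs, axes=1)
```

Because every kernel supports any scale factor, the mixture does too, which is how blending sidesteps the fixed-scale limitation of standard LUT-based SR.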
Sejin Park
Department of Electrical Engineering, Korea University
Sangmin Lee
Department of Electrical Engineering, Korea University
Kyong Hwan Jin
Associate Professor of EE, Korea University
machine learning · signal processing · inverse problems · sampling · information theory
Seung-Won Jung
Korea University
Image processing