🤖 AI Summary
Implicit Neural Representations (INRs) are constrained by fixed activation functions, which limits their ability to model high-frequency and heterogeneous signals. To address this, the authors propose a lightweight hybrid architecture, SL$^{2}$A-INR, that prepends a single-layer learnable activation module to a conventional ReLU MLP, integrated with coordinate input encoding. The model is evaluated on image representation, 3D shape reconstruction, and novel-view synthesis. Embedding a learnable activation directly into the INR backbone enhances high-frequency detail representation without increasing network depth. The approach sets new benchmarks across multiple INR tasks, demonstrating strong visual quality and robustness. The implementation is publicly available.
📝 Abstract
Implicit Neural Representation (INR), leveraging a neural network to transform coordinate inputs into corresponding attributes, has recently driven significant advances in several vision-related domains. However, the performance of INR is heavily influenced by the choice of the nonlinear activation function used in its multilayer perceptron (MLP) architecture. To date, multiple nonlinearities have been investigated, but current INRs still face limitations in capturing high-frequency components and diverse signal types. We show that these challenges can be alleviated by introducing a novel approach to INR architecture. Specifically, we propose SL$^{2}$A-INR, a hybrid network that combines a single-layer learnable activation function with an MLP that uses traditional ReLU activations. Our method achieves superior performance across diverse tasks, including image representation, 3D shape reconstruction, and novel view synthesis. Through comprehensive experiments, SL$^{2}$A-INR sets new benchmarks in accuracy, quality, and robustness for INR. Our code is publicly available on~\href{https://github.com/Iceage7/SL2A-INR}{\textcolor{magenta}{GitHub}}.
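The hybrid design described above (a single learnable-activation layer feeding a standard ReLU MLP) can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: the piecewise-linear interpolation, the knot grid, and all layer sizes are assumptions chosen for clarity, and no training loop is shown.

```python
import numpy as np

rng = np.random.default_rng(0)

def pwl_activation(x, knots, values):
    """Hypothetical learnable piecewise-linear activation: interpolate
    each element of x against (knot, value) control points; `values`
    would be the trainable parameters in a real model."""
    return np.interp(x, knots, values)

def relu(x):
    return np.maximum(x, 0.0)

# Illustrative shapes: 2-D coordinates -> 1 attribute (e.g. a pixel value).
d_in, d_hidden, d_out, n_knots = 2, 16, 1, 8

# First layer: linear map followed by the learnable activation.
W0 = rng.standard_normal((d_in, d_hidden))
knots = np.linspace(-3.0, 3.0, n_knots)   # fixed knot grid (assumption)
values = rng.standard_normal(n_knots)     # trainable in practice

# Remaining layers: a conventional ReLU MLP.
W1 = rng.standard_normal((d_hidden, d_hidden))
W2 = rng.standard_normal((d_hidden, d_out))

def inr_forward(coords):
    h = pwl_activation(coords @ W0, knots, values)  # learnable nonlinearity
    h = relu(h @ W1)                                # standard ReLU block
    return h @ W2                                   # predicted attribute

coords = rng.uniform(-1.0, 1.0, size=(4, d_in))  # 4 query coordinates
out = inr_forward(coords)
print(out.shape)  # (4, 1)
```

The key design point mirrored here is that only the first layer carries the expressive, parameterized nonlinearity, while the rest of the network stays a plain ReLU MLP, keeping the added parameter count small.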