🤖 AI Summary
This work addresses the difficulty of optimizing periodic activation functions in implicit neural representations (INRs): such activations often suffer from gradient instability and model multi-scale signals poorly. To overcome these limitations, the authors propose the HOSC activation function, defined as HOSC(x) = tanh(β sin(ω₀x)), which bounds the activation's Lipschitz constant through a tunable saturation parameter β. This design stabilizes gradient propagation while preserving high-frequency signal details. Theoretical analysis and comprehensive experiments across diverse tasks—including image, audio, and video representation, as well as NeRF and signed distance field (SDF) modeling—demonstrate that HOSC matches or outperforms existing methods such as SIREN and FINER. The study also provides practical guidelines for selecting hyperparameters to facilitate effective deployment.
📝 Abstract
Periodic activations such as sine preserve high-frequency information in implicit neural representations (INRs) through their oscillatory structure, but they often suffer from gradient instability and offer limited control over multi-scale behavior. We introduce the Hyperbolic Oscillator with Saturation Control (HOSC) activation, $\text{HOSC}(x) = \tanh\bigl(\beta \sin(\omega_0 x)\bigr)$, which exposes an explicit saturation parameter $\beta$ that bounds the Lipschitz constant of the activation by $\beta \omega_0$. This provides a direct mechanism for tuning gradient magnitudes while retaining a periodic carrier. We provide a mathematical analysis and conduct a comprehensive empirical study across images, audio, video, NeRFs, and SDFs using standardized training protocols. Comparative analysis against SIREN, FINER, and related methods shows where HOSC provides substantial benefits and where it achieves competitive parity. These results establish HOSC as a practical periodic activation for INR applications, and we offer domain-specific guidance on hyperparameter selection. Code is available at the project page: https://hosc-nn.github.io/.
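The activation itself is a one-liner. The following is a minimal NumPy sketch (not the authors' code) that implements $\text{HOSC}(x) = \tanh(\beta \sin(\omega_0 x))$ and numerically checks the Lipschitz bound stated in the abstract: since $\frac{d}{dx}\tanh(\beta \sin(\omega_0 x)) = \beta \omega_0 \cos(\omega_0 x)\,\mathrm{sech}^2(\beta \sin(\omega_0 x))$, the slope magnitude never exceeds $\beta \omega_0$. The default values for `beta` and `omega0` below are illustrative, not taken from the paper.

```python
import numpy as np

def hosc(x, beta=1.0, omega0=30.0):
    """HOSC activation from the abstract: tanh(beta * sin(omega0 * x))."""
    return np.tanh(beta * np.sin(omega0 * x))

# Empirical check of the Lipschitz bound beta * omega0:
# finite-difference slopes of HOSC on a dense grid should never exceed it.
beta, omega0 = 2.0, 30.0
x = np.linspace(-1.0, 1.0, 200_001)
y = hosc(x, beta, omega0)
slopes = np.abs(np.diff(y) / np.diff(x))
assert slopes.max() <= beta * omega0  # derivative magnitude bounded by beta*omega0
```

Raising `beta` saturates the sine carrier toward a square-like wave (sharper features) while `beta * omega0` caps how steep the activation, and hence its gradient contribution, can become.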