🤖 AI Summary
Addressing the challenge of continuous sparse representation for multidimensional data (e.g., images, videos, point clouds), this paper proposes a low-rank implicit neural representation based on CANDECOMP/PARAFAC (CP) decomposition. Our method introduces three key innovations: (1) a Schatten-$p$ quasi-norm variational regularizer to enforce interpretable sparsity in CP decomposition; (2) a Jacobian-smoothing regularizer that avoids singular value decomposition (SVD) and instead employs Hutchinson’s trace estimator to efficiently constrain the spectral norm, thereby enhancing modeling capacity and generalization over continuous domains; and (3) neural parameterization of CP factor functions, balancing nonlinear expressivity with structural interpretability. Experiments on image inpainting, denoising, and point cloud upsampling demonstrate significant improvements over state-of-the-art implicit and tensor-based methods, validating the approach’s effectiveness, robustness, and practical applicability.
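The paper's exact Schatten-$p$ variational form is not reproduced in this summary. For context, the classical $p=1$ instance is the well-known factorized characterization of the nuclear norm, which trades an SVD for a smooth optimization over factor matrices:

```latex
\|X\|_{*} \;=\; \min_{X = U V^{\top}} \;\tfrac{1}{2}\left(\|U\|_F^2 + \|V\|_F^2\right)
```

Analogous factorized surrogates exist for Schatten-$p$ quasi-norms with $0 < p < 1$; this is what makes an SVD-free, gradient-friendly sparsity penalty on the CP factors possible.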
📝 Abstract
Higher-order tensors are well-suited for representing multi-dimensional data, such as color images and videos. Low-rank tensor representation has become essential in machine learning and computer vision, but existing methods like Tucker decomposition offer flexibility at the expense of interpretability. In contrast, while the CANDECOMP/PARAFAC (CP) decomposition provides a more natural and interpretable tensor structure, obtaining sparse solutions remains challenging. Leveraging the rich properties of CP decomposition, we propose a CP-based low-rank tensor function parameterized by neural networks for implicit neural representation (CP-INR). This approach enables continuous data representation beyond structured grids, fully exploiting the non-linearity of tensor data with theoretical guarantees on excess risk bounds. To achieve a sparse CP decomposition, we introduce a variational form of the Schatten-$p$ quasi-norm and prove its relationship to multilinear rank minimization. For smoothness, we propose a regularization term based on the spectral norm of the Jacobian and Hutchinson's trace estimator. Our proposed smoothness regularization is SVD-free and avoids explicit chain rule derivations. It can serve as an alternative to Total Variation (TV) regularization in image denoising tasks and is naturally applicable to continuous data. Extensive experiments on multi-dimensional data recovery tasks, including image inpainting, denoising, and point cloud upsampling, demonstrate the superiority and versatility of our method compared to state-of-the-art approaches.
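The SVD-free smoothness term builds on Hutchinson's trace estimator, $\operatorname{tr}(A) \approx \mathbb{E}[v^\top A v]$ for random probe vectors $v$. A common SVD-free surrogate penalizes $\operatorname{tr}(J^\top J) = \|J\|_F^2$, estimated as $\mathbb{E}[\|Jv\|^2]$ using only Jacobian-vector products; whether the paper uses this exact surrogate is not specified here. The sketch below is an illustration under that assumption, not the paper's implementation, and the function name `hutchinson_frob_sq` is hypothetical. It checks the estimator on a toy linear map whose Jacobian is known in closed form.

```python
import numpy as np

def hutchinson_frob_sq(jvp, dim, n_samples=2000, rng=None):
    """Estimate ||J||_F^2 = tr(J^T J) via Hutchinson's estimator.

    jvp: callable computing the Jacobian-vector product J @ v
         (for a neural network, this comes from autodiff, no SVD needed).
    Uses Rademacher (+/-1) probe vectors, a standard low-variance choice.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    total = 0.0
    for _ in range(n_samples):
        v = rng.choice([-1.0, 1.0], size=dim)  # Rademacher probe
        total += float(np.sum(jvp(v) ** 2))    # ||J v||^2 = v^T J^T J v
    return total / n_samples

# Toy check: for f(x) = A x the Jacobian is A itself,
# so the estimator should approach ||A||_F^2 = 1 + 4 + 9 + 16 = 30.
A = np.array([[1.0, 2.0], [3.0, 4.0]])
est = hutchinson_frob_sq(lambda v: A @ v, dim=2)
```

In an INR training loop, the same estimator would be applied to the network's input-output Jacobian via autodiff JVPs, giving a smoothness penalty over the continuous domain without forming the Jacobian explicitly.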