Low-Rank Implicit Neural Representation via Schatten-p Quasi-Norm and Jacobian Regularization

📅 2025-06-27
📈 Citations: 0
Influential: 0
🤖 AI Summary
Addressing the challenge of continuous sparse representation for multidimensional data (e.g., images, videos, point clouds), this paper proposes a low-rank implicit neural representation based on CANDECOMP/PARAFAC (CP) decomposition. The method introduces three key innovations: (1) a Schatten-$p$ quasi-norm variational regularizer that enforces interpretable sparsity in the CP decomposition; (2) a Jacobian-smoothing regularizer that avoids singular value decomposition (SVD), instead using Hutchinson's trace estimator to efficiently constrain the spectral norm, thereby enhancing modeling capacity and generalization over continuous domains; and (3) neural parameterization of the CP factor functions, balancing nonlinear expressivity with structural interpretability. Experiments on image inpainting, denoising, and point cloud upsampling show significant improvements over state-of-the-art implicit and tensor-based methods, validating the approach's effectiveness, robustness, and practical applicability.
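To make innovation (3) concrete, here is a minimal NumPy sketch of a CP-parameterized implicit representation: each coordinate axis gets its own small factor network mapping a scalar coordinate to $R$ factor values, and the function value at a point is the rank-$R$ CP sum of the per-axis outputs. The names (`make_factor_net`, `cp_inr`) and the untrained random weights are this sketch's own assumptions, not the paper's architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_factor_net(hidden=16, rank=4):
    """Tiny one-hidden-layer MLP mapping a scalar coordinate to `rank`
    factor values. Weights are random here; in the actual method they
    would be trained against the observed tensor entries."""
    W1 = rng.normal(size=(hidden, 1))
    b1 = rng.normal(size=hidden)
    W2 = rng.normal(size=(rank, hidden))
    b2 = rng.normal(size=rank)
    def net(t):
        h = np.tanh(W1 @ np.atleast_1d(t) + b1)
        return W2 @ h + b2
    return net

RANK = 4
u, v, w = (make_factor_net(rank=RANK) for _ in range(3))

def cp_inr(x, y, z):
    """Continuous CP representation: f(x,y,z) = sum_r u_r(x) v_r(y) w_r(z)."""
    return float(np.sum(u(x) * v(y) * w(z)))

# Because the factors are functions of continuous coordinates, the
# representation can be queried off any sampling grid:
val = cp_inr(0.3, -1.2, 0.75)
```

Querying arbitrary coordinates like this is what lets the same trained model do inpainting (query missing grid points) and upsampling (query between grid points).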

📝 Abstract
Higher-order tensors are well-suited for representing multi-dimensional data, such as color images and videos. Low-rank tensor representation has become essential in machine learning and computer vision, but existing methods like Tucker decomposition offer flexibility at the expense of interpretability. In contrast, while the CANDECOMP/PARAFAC (CP) decomposition provides a more natural and interpretable tensor structure, obtaining sparse solutions remains challenging. Leveraging the rich properties of CP decomposition, we propose a CP-based low-rank tensor function parameterized by neural networks for implicit neural representation (CP-INR). This approach enables continuous data representation beyond structured grids, fully exploiting the non-linearity of tensor data with theoretical guarantees on excess risk bounds. To achieve a sparse CP decomposition, we introduce a variational form of the Schatten-p quasi-norm and prove its relationship to multilinear rank minimization. For smoothness, we propose a regularization term based on the spectral norm of the Jacobian and Hutchinson's trace estimator. Our proposed smoothness regularization is SVD-free and avoids explicit chain rule derivations. It can serve as an alternative to Total Variation (TV) regularization in image denoising tasks and is naturally applicable to continuous data. Extensive experiments on multi-dimensional data recovery tasks, including image inpainting, denoising, and point cloud upsampling, demonstrate the superiority and versatility of our method compared to state-of-the-art approaches.
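On the SVD-free smoothness regularizer: since the spectral norm is bounded by the Frobenius norm ($\|J\|_2 \le \|J\|_F$), Hutchinson's estimator gives a cheap surrogate, because $\operatorname{tr}(J^\top J) = \|J\|_F^2 = \mathbb{E}_v[\|Jv\|^2]$ for Rademacher vectors $v$, and each $Jv$ is a Jacobian-vector product that needs no explicit Jacobian or SVD. The sketch below (this illustration's own construction, not the paper's implementation) checks the estimator on a toy linear map, where the Jacobian is the matrix itself and the exact value is known:

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy differentiable map f(x) = A x; its Jacobian is A everywhere,
# so the estimate can be checked against the exact Frobenius norm.
A = rng.normal(size=(5, 5))
f = lambda x: A @ x

def hutchinson_jacobian_fro2(f, x, n_samples=20000, eps=1e-5):
    """Estimate ||J_f(x)||_F^2 = tr(J^T J) = E_v[||J v||^2] with
    Rademacher probes v, using finite-difference Jacobian-vector
    products. (Autodiff JVPs would replace the finite difference
    in a real training loop.)"""
    total = 0.0
    for _ in range(n_samples):
        v = rng.choice([-1.0, 1.0], size=x.shape)
        jvp = (f(x + eps * v) - f(x)) / eps   # approximates J v
        total += float(jvp @ jvp)
    return total / n_samples

x0 = rng.normal(size=5)
est = hutchinson_jacobian_fro2(f, x0)
exact = float(np.sum(A * A))   # ||A||_F^2 = tr(A^T A)
```

Penalizing this estimate during training discourages steep local variation of the representation, which is why the paper can position it as a TV-like smoothness term that extends naturally to continuous (off-grid) data.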
Problem

Research questions and friction points this paper is trying to address.

Develops CP-based low-rank tensor function for implicit neural representation
Introduces Schatten-p quasi-norm for sparse CP decomposition
Proposes Jacobian spectral norm regularization for smooth data recovery
Innovation

Methods, ideas, or system contributions that make the work stand out.

CP-based low-rank tensor neural representation
Schatten-p quasi-norm for sparse decomposition
SVD-free Jacobian spectral norm regularization
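The sparsity regularizer builds on the standard variational characterization of Schatten quasi-norms, which trades an SVD for a minimization over factors. One well-known form (not necessarily the exact one used in the paper) is, for a matrix $X$ and $0 < p \le 1$:

```latex
\|X\|_{S_p}^{p} \;=\; \min_{X = U V^{\top}} \tfrac{1}{2}\left(\|U\|_{S_{2p}}^{2p} + \|V\|_{S_{2p}}^{2p}\right)
```

At $p = 1$ this recovers the familiar nuclear-norm identity $\|X\|_{*} = \min_{X=UV^{\top}} \tfrac{1}{2}(\|U\|_F^2 + \|V\|_F^2)$; for $p < 1$ the right-hand side penalizes small singular values more aggressively, which is what drives the sparse (low multilinear rank) CP solutions the paper targets.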
Zhengyun Cheng
School of Electronics and Information, Northwestern Polytechnical University, Xi’an 710129, China
Changhao Wang
UC Berkeley
Guanwen Zhang
School of Electronics and Information, Northwestern Polytechnical University, Xi’an 710129, China
Yi Xu
School of Control Science and Engineering, Dalian University of Technology, Dalian 116081, China
Wei Zhou
School of Electronics and Information, Northwestern Polytechnical University, Xi’an 710129, China
Xiangyang Ji
Tsinghua University, Beijing 100190, China