Connectivity Shapes Implicit Regularization in Matrix Factorization Models for Matrix Completion

📅 2024-05-22
🏛️ Neural Information Processing Systems
📈 Citations: 0
Influential: 0
🤖 AI Summary
The implicit regularization mechanism of matrix factorization models in matrix completion lacks a unified theoretical explanation, particularly regarding when and why low-rank versus minimum nuclear norm regularization dominates. Method: We identify observation graph connectivity as the key driver of implicit regularization and establish that an implicit preference shift—from minimum nuclear norm to low-rank solutions—occurs via a phase transition as connectivity increases. Building on this, we construct the first intrinsic invariant manifold hierarchy unifying both convergence regimes; theoretically characterize the dynamical conditions under which either the minimum nuclear norm or the minimum rank solution emerges; and provide rigorous proofs via nonconvex optimization analysis, manifold dynamics modeling, and matrix differential geometry. Results: Experiments validate the connectivity–phase transition relationship, with theoretical conditions closely matching empirical observations—significantly advancing the understanding of generalization in overparameterized matrix factorization models.

📝 Abstract
Matrix factorization models have been extensively studied as a valuable test-bed for understanding the implicit biases of overparameterized models. Although both low nuclear norm and low rank regularization have been studied for these models, a unified understanding of when, how, and why they achieve different implicit regularization effects remains elusive. In this work, we systematically investigate the implicit regularization of matrix factorization for solving matrix completion problems. We empirically discover that the connectivity of observed data plays a crucial role in the implicit bias, with a transition from low nuclear norm to low rank as data shifts from disconnected to connected with increased observations. We identify a hierarchy of intrinsic invariant manifolds in the loss landscape that guide the training trajectory to evolve from low-rank to higher-rank solutions. Based on this finding, we theoretically characterize the training trajectory as following the hierarchical invariant manifold traversal process, generalizing the characterization of Li et al. (2020) to include the disconnected case. Furthermore, we establish conditions that guarantee minimum nuclear norm, closely aligning with our experimental findings, and we provide a dynamics characterization condition for ensuring minimum rank. Our work reveals the intricate interplay between data connectivity, training dynamics, and implicit regularization in matrix factorization models.
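The setup the abstract describes can be illustrated with a minimal sketch: factor the completion variable as X = UVᵀ, run gradient descent on the observed entries only, and test connectivity of the bipartite observation graph (rows and columns as nodes, observed entries as edges) that the paper identifies as the driver of the implicit bias. This is not the authors' code; the function names, hyperparameters, and union-find connectivity check are illustrative assumptions.

```python
import numpy as np

def observation_graph_connected(mask):
    """Union-find connectivity of the bipartite observation graph:
    nodes 0..m-1 are rows, m..m+n-1 are columns, and each observed
    entry (i, j) is an edge between row i and column j."""
    m, n = mask.shape
    parent = list(range(m + n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x
    for i, j in zip(*np.nonzero(mask)):
        ri, rj = find(int(i)), find(m + int(j))
        if ri != rj:
            parent[ri] = rj
    return len({find(k) for k in range(m + n)}) == 1

def complete(target, mask, rank=None, lr=0.01, steps=5000,
             init_scale=1e-3, seed=0):
    """Gradient descent on the overparameterized factorization
    X = U V^T, fitting only the entries flagged by `mask` (a simple
    stand-in for the models studied in the paper). Small `init_scale`
    mimics the near-zero initialization under which the implicit
    regularization effects are typically observed."""
    rng = np.random.default_rng(seed)
    m, n = target.shape
    r = rank if rank is not None else min(m, n)
    U = init_scale * rng.standard_normal((m, r))
    V = init_scale * rng.standard_normal((n, r))
    for _ in range(steps):
        resid = mask * (U @ V.T - target)  # error on observed entries only
        U, V = U - lr * resid @ V, V - lr * resid.T @ U
    return U @ V.T
```

With a sketch like this one can reproduce the kind of comparison the abstract points to: hold the target fixed, vary the observation mask from disconnected to connected, and track the nuclear norm and numerical rank of the recovered X.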
Problem

Research questions and friction points this paper is trying to address.

Understanding implicit regularization in matrix factorization models
Exploring data connectivity's role in low-rank vs nuclear norm bias
Characterizing training dynamics on invariant manifolds for matrix completion
Innovation

Methods, ideas, or system contributions that make the work stand out.

Investigates implicit regularization in matrix factorization
Identifies data connectivity's role in regularization bias
Theoretically characterizes hierarchical training trajectory
Zhiwei Bai
Shanghai Jiao Tong University
Machine Learning;Deep Learning
Jiajie Zhao
School of Mathematical Sciences, Institute of Natural Sciences, MOE-LSC, Shanghai Jiao Tong University, Shanghai 200240, P.R. China.
Yaoyu Zhang
Shanghai Jiao Tong University
Deep Learning Theory