Nonlinear Laplacians: Tunable principal component analysis under directional prior information

📅 2025-05-18
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This paper addresses the detection and estimation of rank-one signals from noisy observations under directional prior information (e.g., signal entries biased to be nonnegative), a setting where conventional PCA and direct spectral methods face fundamental limitations. To overcome them, the authors propose a nonlinear Laplacian construction: given the observation matrix $\mathbf{Y}$, apply a nonlinearity $\sigma$ entrywise to the degree vector $\mathbf{Y}\mathbf{1}$, add the result as a diagonal correction, and extract the leading eigenvalue and eigenvector of the resulting matrix. The paper establishes a rigorous theoretical framework coupling nonlinear Laplacian spectral analysis with directional priors, precisely characterizing how the detectability threshold (the critical signal-to-noise ratio at which an outlier eigenvalue appears) depends on $\sigma$. The method is non-iterative, tunable, and computationally efficient. Theory and experiments show a significant reduction in the critical SNR: on models such as the Gaussian planted submatrix problem, a well-chosen $\sigma$ yields algorithms that substantially outperform direct spectral methods while avoiding the complexity of iterative methods such as approximate message passing (AMP).
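
As a concrete illustration of the construction, here is a minimal NumPy sketch (not code from the paper) that forms $\mathbf{Y} + \mathrm{diag}(\sigma(\mathbf{Y}\mathbf{1}))$ and returns its leading eigenpair. The function name and the assumption that $\sigma$ is a vectorized callable are choices made for this sketch:

```python
import numpy as np

def nonlinear_laplacian_top_eigpair(Y, sigma):
    """Leading eigenpair of the nonlinear Laplacian Y + diag(sigma(Y @ 1)).

    Y     : (n, n) symmetric observation matrix
    sigma : vectorized nonlinearity (e.g. np.tanh) applied to the degree vector
    """
    degrees = Y @ np.ones(Y.shape[0])      # degree profile Y 1
    L = Y + np.diag(sigma(degrees))        # diagonal "deformation" by sigma
    eigvals, eigvecs = np.linalg.eigh(L)   # eigh returns ascending eigenvalues
    return eigvals[-1], eigvecs[:, -1]     # top eigenvalue and eigenvector
```

Passing `sigma = lambda d: np.zeros_like(d)` recovers the direct spectral algorithm (the case $\sigma = 0$) that the paper uses as its baseline.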

📝 Abstract
We introduce a new family of algorithms for detecting and estimating a rank-one signal from a noisy observation under prior information about that signal's direction, focusing on examples where the signal is known to have entries biased to be positive. Given a matrix observation $\mathbf{Y}$, our algorithms construct a nonlinear Laplacian, another matrix of the form $\mathbf{Y} + \mathrm{diag}(\sigma(\mathbf{Y}\mathbf{1}))$ for a nonlinear $\sigma: \mathbb{R} \to \mathbb{R}$, and examine the top eigenvalue and eigenvector of this matrix. When $\mathbf{Y}$ is the (suitably normalized) adjacency matrix of a graph, our approach gives a class of algorithms that search for unusually dense subgraphs by computing a spectrum of the graph "deformed" by the degree profile $\mathbf{Y}\mathbf{1}$. We study the performance of such algorithms compared to direct spectral algorithms (the case $\sigma = 0$) on models of sparse principal component analysis with biased signals, including the Gaussian planted submatrix problem. For such models, we rigorously characterize the critical threshold strength of rank-one signal, as a function of the nonlinearity $\sigma$, at which an outlier eigenvalue appears in the spectrum of a nonlinear Laplacian. While identifying the $\sigma$ that minimizes this critical signal strength in closed form seems intractable, we explore three approaches to design $\sigma$ numerically: exhaustively searching over simple classes of $\sigma$, learning $\sigma$ from datasets of problem instances, and tuning $\sigma$ using black-box optimization of the critical signal strength. We find both theoretically and empirically that, if $\sigma$ is chosen appropriately, then nonlinear Laplacian spectral algorithms substantially outperform direct spectral algorithms, while avoiding the complexity of broader classes of algorithms like approximate message passing or general first order methods.
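
To make the comparison with the $\sigma = 0$ baseline concrete, the following self-contained sketch draws one instance of a spiked model in the spirit of the Gaussian planted submatrix problem and compares the two top eigenvalues. The normalization, the signal strength $\lambda = 3$, and the choice $\sigma = \tanh$ are illustrative assumptions for this sketch, not the paper's tuned $\sigma$:

```python
import numpy as np

def top_eigpair(M):
    """Leading eigenvalue/eigenvector of a symmetric matrix."""
    w, V = np.linalg.eigh(M)
    return w[-1], V[:, -1]

rng = np.random.default_rng(0)
n, k, lam = 500, 50, 3.0   # dimensions and signal strength are illustrative

# Planted signal: unit-norm indicator of a size-k subset, plus GOE-like noise
v = np.zeros(n)
v[rng.choice(n, size=k, replace=False)] = 1.0 / np.sqrt(k)
W = rng.normal(size=(n, n))
W = (W + W.T) / np.sqrt(2.0 * n)
Y = lam * np.outer(v, v) + W

# Direct spectral statistic (sigma = 0) vs. a nonlinear Laplacian with tanh
deg = Y @ np.ones(n)
lam_direct, _ = top_eigpair(Y)
lam_nl, u = top_eigpair(Y + np.diag(np.tanh(deg)))
print(f"direct top eigenvalue:    {lam_direct:.3f}")
print(f"nonlinear top eigenvalue: {lam_nl:.3f}")
print(f"overlap |<u, v>|:         {abs(u @ v):.3f}")
```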
Problem

Research questions and friction points this paper is trying to address.

Detecting and estimating rank-one signals in noisy observations under directional priors
Searching for unusually dense subgraphs beyond what direct spectral methods can detect
Designing the nonlinearity $\sigma$ so that nonlinear Laplacian algorithms outperform direct spectral ones
Innovation

Methods, ideas, or system contributions that make the work stand out.

Nonlinear Laplacian for rank-one signal detection
Deformed graph spectrum for dense subgraph search
Numerically optimized nonlinearity $\sigma$ to enhance spectral algorithms (a toy parameter search is sketched below)
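
Of the three design strategies for $\sigma$ described in the abstract, the simplest to illustrate is exhaustive search over a small parametric class. In the toy sketch below, the one-parameter family $\sigma_c(d) = c \tanh(d)$, the model normalization, and all parameter values are hypothetical choices for illustration; each candidate $c$ is scored by the mean overlap of the top eigenvector with the planted direction over Monte Carlo trials:

```python
import numpy as np

rng = np.random.default_rng(1)

def planted_instance(n=300, k=30, lam=2.0):
    """One draw of an illustratively normalized planted submatrix model."""
    v = np.zeros(n)
    v[rng.choice(n, size=k, replace=False)] = 1.0 / np.sqrt(k)
    W = rng.normal(size=(n, n))
    return lam * np.outer(v, v) + (W + W.T) / np.sqrt(2.0 * n), v

def overlap(Y, v, c):
    """Overlap of v with the top eigenvector of Y + diag(c * tanh(Y @ 1))."""
    L = Y + np.diag(c * np.tanh(Y @ np.ones(len(v))))
    _, V = np.linalg.eigh(L)
    return abs(V[:, -1] @ v)

# Exhaustive search over the one-parameter family sigma_c(d) = c * tanh(d);
# c = 0 corresponds to the direct spectral baseline
trials = 20
for c in [0.0, 0.5, 1.0, 2.0, 4.0]:
    score = np.mean([overlap(*planted_instance(), c) for _ in range(trials)])
    print(f"c = {c:.1f}: mean overlap {score:.3f}")
```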
Yuxin Ma
Department of Applied Mathematics and Statistics, Johns Hopkins University
Dmitriy Kunisky
Johns Hopkins University
probability theory · optimization · algorithms