Statistical Limits for Finite-Rank Tensor Estimation

📅 2025-06-07
📈 Citations: 0
Influential: 0
🤖 AI Summary
This paper investigates the statistical limits of high-dimensional finite-rank tensor estimation in two novel settings: (1) tensor recovery under heteroskedastic (independent but not identically distributed) noise, and (2) recovery of an unknown permutation from tensor-valued observations, i.e., the higher-order assignment problem. Combining statistical-physics techniques (the replica and cavity methods) with the theory of Bayes-optimal inference, the authors derive asymptotically exact formulas for the mutual information (equivalently, the Bayes free energy) and the minimum mean-squared error (MMSE) in both regimes. The key contributions are: (i) sharp statistical detectability thresholds (phase transitions) for both settings; (ii) a unified information-theoretic framework for analyzing this class of problems; and (iii) a new modeling paradigm for the higher-order assignment problem grounded in tensor inference and statistical physics.

📝 Abstract
This paper provides a unified framework for analyzing tensor estimation problems that allow for nonlinear observations, heteroskedastic noise, and covariate information. We study a general class of high-dimensional models where each observation depends on the interactions among a finite number of unknown parameters. Our main results provide asymptotically exact formulas for the mutual information (equivalently, the free energy) as well as the minimum mean-squared error in the Bayes-optimal setting. We then apply this framework to derive sharp characterizations of statistical thresholds for two novel scenarios: (1) tensor estimation in heteroskedastic noise that is independent but not identically distributed, and (2) higher-order assignment problems, where the goal is to recover an unknown permutation from tensor-valued observations.
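As a concrete sketch of the model class described above (the notation here is illustrative, not taken from the paper), the simplest instance is a rank-one, order-$k$ spiked tensor with heteroskedastic Gaussian noise:

```latex
% Illustrative rank-one spiked tensor model; the symbols lambda, sigma, W
% are assumed notation, not the paper's. Each observation depends on the
% interaction x_{i_1} x_{i_2} ... x_{i_k} among k unknown coordinates.
Y_{i_1 \cdots i_k}
  = \sqrt{\frac{\lambda}{n^{k-1}}}\, x_{i_1} x_{i_2} \cdots x_{i_k}
  + \sigma_{i_1 \cdots i_k}\, W_{i_1 \cdots i_k},
\qquad
W_{i_1 \cdots i_k} \overset{\mathrm{iid}}{\sim} \mathcal{N}(0,1),
```

where the noise terms are independent but the variances $\sigma_{i_1 \cdots i_k}^2$ need not be equal (heteroskedasticity). Results of the type stated in the abstract characterize the limit of the per-coordinate mutual information, $\lim_{n\to\infty} n^{-1} I(x; Y)$, and the MMSE for recovering $x$.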
Problem

Research questions and friction points this paper is trying to address.

Analyzing tensor estimation with nonlinear observations and noise
Deriving mutual information and error in Bayes-optimal settings
Characterizing statistical thresholds for heteroskedastic noise and permutations
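For the permutation setting, a hedged sketch of a higher-order assignment model (again with assumed notation) is:

```latex
% Illustrative higher-order assignment model: recover the latent permutation
% pi from a reference tensor A and its noisy, index-permuted observation Y.
Y_{i_1 \cdots i_k} = A_{\pi(i_1) \cdots \pi(i_k)} + \sigma\, W_{i_1 \cdots i_k},
\qquad
\pi \in S_n, \quad
W_{i_1 \cdots i_k} \overset{\mathrm{iid}}{\sim} \mathcal{N}(0,1),
```

where the goal is to recover $\pi$ from the pair $(A, Y)$; the detectability threshold then depends on the noise level $\sigma$ and the tensor order $k$.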
Innovation

Methods, ideas, or system contributions that make the work stand out.

Unified framework for tensor estimation analysis
Asymptotically exact mutual information formulas
Sharp statistical thresholds for novel scenarios
Riccardo Rossetti
Department of Statistical Science, Duke University
Galen Reeves
Duke University
information theory · high-dimensional statistics