Extremal graphical modeling with latent variables via convex optimization

📅 2024-03-14
📈 Citations: 2
Influential: 0
📄 PDF
🤖 AI Summary
In multivariate extreme-value modeling, unobserved latent variables can distort the conditional independence structure that conventional extremal graphical models assume is fully observed. To address this, we propose eglatent, a tractable convex optimization method for learning Hüsler–Reiss extremal graphical models in the presence of latent variables. eglatent integrates a sparse-plus-low-rank matrix decomposition into the extremal graph learning framework, jointly estimating the conditional dependence graph among the observed variables and the effect of a few latent variables on them, without requiring all relevant variables to be observed. Theoretically, we establish finite-sample consistency guarantees: eglatent recovers both the conditional graph structure and the number of latent variables. Empirically, it substantially improves graph recovery accuracy on synthetic and real-world datasets. The core contribution is the first tractable convex formulation for latent-variable extremal graph learning that is computationally feasible and statistically grounded.

📝 Abstract
Extremal graphical models encode the conditional independence structure of multivariate extremes and provide a powerful tool for quantifying the risk of rare events. Prior work on learning these graphs from data has focused on the setting where all relevant variables are observed. For the popular class of Hüsler–Reiss models, we propose the eglatent method, a tractable convex program for learning extremal graphical models in the presence of latent variables. Our approach decomposes the Hüsler–Reiss precision matrix into a sparse component encoding the graphical structure among the observed variables after conditioning on the latent variables, and a low-rank component encoding the effect of a few latent variables on the observed variables. We provide finite-sample guarantees of eglatent and show that it consistently recovers the conditional graph as well as the number of latent variables. We highlight the improved performances of our approach on synthetic and real data.
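The paper's own objective is not reproduced here, but the sparse-plus-low-rank decomposition it describes follows a well-known pattern: regularize a likelihood-style fit term with an off-diagonal l1 penalty on the sparse part S (the conditional graph) and a trace penalty on the low-rank part L (the latent effect). Below is a minimal, hypothetical sketch of such an objective in NumPy, using a Gaussian-style log-determinant fit term as a stand-in; all names and the toy data are illustrative, not the actual eglatent program.

```python
import numpy as np

def sl_objective(S, L, Sigma_hat, gamma, lam):
    """Schematic sparse-plus-low-rank objective:
       -logdet(S - L) + tr(Sigma_hat @ (S - L))
       + gamma * ||S||_1,offdiag + lam * tr(L).
    The sparse S encodes the conditional graph; the low-rank L
    encodes the effect of a few latent variables."""
    Theta = S - L
    sign, logdet = np.linalg.slogdet(Theta)
    if sign <= 0:
        return np.inf  # outside the positive-definite feasible set
    off_l1 = np.abs(S).sum() - np.abs(np.diag(S)).sum()
    return -logdet + np.trace(Sigma_hat @ Theta) + gamma * off_l1 + lam * np.trace(L)

# Toy setup: 4 observed variables on a chain graph, 1 latent factor.
rng = np.random.default_rng(0)
S = np.eye(4) + 0.2 * np.diag(np.ones(3), 1) + 0.2 * np.diag(np.ones(3), -1)
u = rng.normal(size=(4, 1))
u /= np.linalg.norm(u)
L = 0.1 * (u @ u.T)                # rank-1: one latent variable
Sigma_hat = np.linalg.inv(S - L)   # idealized: empirical matrix equals truth
val = sl_objective(S, L, Sigma_hat, gamma=0.1, lam=0.1)
```

The convexity of such programs comes from the convex fit term plus the l1 and trace penalties, which is what makes the joint estimation of graph structure and latent dimension tractable.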
Problem

Research questions and friction points this paper is trying to address.

How to learn extremal graphical models when some relevant variables are latent
How to separate the sparse conditional graph from the confounding effect of latent variables
How to recover both the conditional graph and the number of latent variables from observed data alone
Innovation

Methods, ideas, or system contributions that make the work stand out.

A tractable convex program (eglatent) for extremal graphical modeling with latent variables
Decomposes the Hüsler–Reiss precision matrix into sparse and low-rank components
Finite-sample guarantees for consistent recovery of the conditional graph and the number of latent variables