🤖 AI Summary
This work addresses the asymptotic convergence and non-asymptotic maximal inequalities of positive semidefinite matrix-valued supermartingales and backward submartingales. Motivated by the lack, in existing theory, of a systematic convergence analysis and of adaptive stopping-time control under the Loewner order, it establishes a Loewner-order martingale convergence theorem together with adaptive maximal inequalities valid at stopping times, unifying light-tailed, heavy-tailed, and self-normalized settings. The approach integrates matrix analysis, random matrix theory, martingale methods, and dependence modeling to derive novel matrix concentration inequalities. Crucially, these results overcome a key limitation of classical Chernoff-type bounds, which require fixed sample sizes and independence, by accommodating arbitrary (possibly data-dependent) sample sizes and stopping times. This significantly broadens the theoretical applicability and practical utility of matrix concentration in high-dimensional statistical inference and online machine learning.
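To see why a fixed-sample-size Chernoff bound is not enough when the sample size is data-dependent, consider a toy simulation (not from the paper; the matrix model and dimensions here are illustrative assumptions). The running maximum of the largest eigenvalue of a matrix martingale is, by construction, at least as large as its value at the final time, so any bound that is valid only at a fixed time n can fail when evaluated at an adaptively chosen stopping time:

```python
import numpy as np

rng = np.random.default_rng(0)
d, N = 3, 1000

# Toy matrix martingale S_n = X_1 + ... + X_n built from i.i.d. mean-zero
# random symmetric (Gaussian Wigner-type) increments. The paper's setting
# is far more general; this is only a sketch of the phenomenon.
S = np.zeros((d, d))
running_max = -np.inf
for n in range(1, N + 1):
    G = rng.standard_normal((d, d))
    X = (G + G.T) / 2.0          # symmetric, mean zero
    S += X
    # track the largest eigenvalue seen along the whole path
    running_max = max(running_max, np.linalg.eigvalsh(S)[-1])

endpoint = np.linalg.eigvalsh(S)[-1]
# A fixed-n Chernoff bound controls `endpoint` only; an adaptive/maximal
# inequality must control `running_max`, which is at least as large.
print(running_max >= endpoint)
```

The gap between `running_max` and `endpoint` is exactly what time-uniform (Ville-type) maximal inequalities are designed to control.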
📝 Abstract
We explore the asymptotic convergence and nonasymptotic maximal inequalities of supermartingales and backward submartingales in the space of positive semidefinite matrices. These are natural matrix analogs of scalar nonnegative supermartingales and backward nonnegative submartingales, whose convergence and maximal inequalities are the theoretical foundations for a wide and ever-growing body of results in statistics, econometrics, and theoretical computer science. Our results lead to new concentration inequalities for martingale-dependent or exchangeable random symmetric matrices under a variety of tail conditions, ranging from now-standard Chernoff bounds to self-normalized heavy-tailed settings. Further, these inequalities are usually expressed in the Loewner order, are sometimes valid simultaneously for all sample sizes or at an arbitrary data-dependent stopping time, and can often be tightened via an external randomization factor.
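Since the inequalities above are stated in the Loewner order, it may help to recall what that order means computationally: A ⪯ B exactly when B − A is positive semidefinite. A minimal sketch (the function name and tolerance are our own, purely illustrative choices):

```python
import numpy as np

def loewner_leq(A, B, tol=1e-10):
    """Return True if A <= B in the Loewner order, i.e. B - A is PSD.

    eigvalsh is used because B - A is symmetric; eigenvalues are
    returned in ascending order, so checking all of them against -tol
    tests positive semidefiniteness up to numerical tolerance.
    """
    return bool(np.all(np.linalg.eigvalsh(B - A) >= -tol))

A = np.array([[1.0, 0.0], [0.0, 1.0]])
B = np.array([[2.0, 0.5], [0.5, 2.0]])
print(loewner_leq(A, B))  # B - A has eigenvalues 0.5 and 1.5 -> True
print(loewner_leq(B, A))  # A - B has negative eigenvalues -> False
```

Note that the Loewner order is only a partial order: for example, diag(1, 0) and diag(0, 1) are incomparable, which is part of what makes order-based convergence and maximal inequalities for matrices nontrivial.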