🤖 AI Summary
This paper addresses Gaussian hypothesis testing under dependent continuous-time observations, extending the classical Stein lemma to non-i.i.d. continuous-domain settings. Methodologically, it introduces novel concepts—δ-typicality and ε-goodness—and integrates information-spectrum analysis, large-deviations theory, and relative entropy techniques. The main contribution is a generalized Chernoff–Stein lemma for correlated Gaussian processes. This lemma characterizes the interplay among typicality, exponential error decay rates, and correlation structure, yielding a closed-form expression for the asymptotically optimal error exponent, with tightness established for arbitrary covariance structures. The results generalize classical discrete and i.i.d. counterparts and provide a theoretical framework for statistical detection in correlated continuous-time signals.
📝 Abstract
In this manuscript we define the notion of "$\delta$-typicality" for both entropy and relative entropy, as well as a notion of "$\epsilon$-goodness", and provide an extension of Stein's lemma to continuous quantities as well as correlated setups. We apply the derived results to the Gaussian hypothesis testing problem where the observations are possibly correlated.
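In the classical Chernoff–Stein setting, the optimal type-II error exponent of a likelihood-ratio test equals the relative entropy between the two hypotheses. As a minimal numerical sketch of this quantity in the Gaussian case (the covariance matrices below are illustrative placeholders, not taken from the paper), the snippet checks the closed-form relative entropy between two zero-mean correlated Gaussians against a Monte Carlo estimate of the expected log-likelihood ratio under the null:

```python
import numpy as np

def gaussian_kl(S0, S1):
    """D(N(0, S0) || N(0, S1)) in nats, via the standard closed form."""
    d = S0.shape[0]
    S1_inv = np.linalg.inv(S1)
    return 0.5 * (np.trace(S1_inv @ S0) - d
                  + np.log(np.linalg.det(S1) / np.linalg.det(S0)))

def gaussian_logpdf(X, S):
    """Row-wise log-density of N(0, S) at the sample points in X."""
    d = S.shape[0]
    S_inv = np.linalg.inv(S)
    quad = np.einsum('ni,ij,nj->n', X, S_inv, X)
    return -0.5 * (quad + np.log(np.linalg.det(S)) + d * np.log(2 * np.pi))

rng = np.random.default_rng(0)
d = 3

# Hypothetical correlated null covariance vs. an uncorrelated alternative.
A = rng.standard_normal((d, d))
S0 = A @ A.T + d * np.eye(d)   # null hypothesis: correlated Gaussian
S1 = np.eye(d)                 # alternative: white Gaussian

kl = gaussian_kl(S0, S1)

# Monte Carlo: E_{P0}[log p0(X) - log p1(X)] should converge to D(P0 || P1),
# the per-sample error exponent predicted by Stein's lemma.
n = 200_000
X = rng.multivariate_normal(np.zeros(d), S0, size=n)
mc = np.mean(gaussian_logpdf(X, S0) - gaussian_logpdf(X, S1))

print(f"closed-form KL = {kl:.4f} nats, Monte Carlo estimate = {mc:.4f} nats")
```

The same closed-form divergence is what a correlation-aware error exponent must reduce to in the i.i.d. multivariate case; the continuous-time extension in the paper replaces this finite-dimensional expression with a relative entropy functional over the process covariances.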