The exact amount of t-ness that the normal model can tolerate

📅 2026-03-27
📈 Citations: 0
Influential: 0
🤖 AI Summary
This study addresses the robustness of inference under model misspecification when data are generated from a t-distribution but analyzed with a normal model. Within a local asymptotic framework, it establishes a precise boundary for this robustness: when the degrees-of-freedom parameter satisfies \( m \geq 1.458\sqrt{n} \), the misspecified normal maximum likelihood estimator is more precise than the correctly specified but more complex three-parameter t-model. The proofs rely on "corner asymptotics", needed because the crucial parameter \( \gamma = 1/m \) lies at the boundary of the parameter space, and the work additionally proposes compromise estimators that interpolate between the purely normal and fully non-normal approaches. The results are extended to regression settings and to general normal scale mixtures, revealing conditions under which model misspecification paradoxically yields superior precision.
📝 Abstract
Suppose that the normal model is used for data $Y_1,\ldots,Y_n$, but that the true distribution is a t-distribution with location and scale parameters $\xi$ and $\sigma$ and $m$ degrees of freedom. The normal model corresponds to $m=\infty$. Using a local asymptotic framework where $m$ is allowed to increase with $n$, two classes of estimands are identified. One small class, which in particular contains the functions of $\xi$ alone, is only affected by t-ness to the second order, and maximum likelihood estimation in the two- and three-parameter models becomes equivalent. For all other estimands it is shown that if $m\ge1.458\sqrt{n}$, then maximum likelihood estimation using the incorrect normal model is still more precise than using the correct three-parameter model. This is furthermore shown to be true in regression models with t-distributed residuals. We also propose and analyse compromise estimators that in various ways interpolate between the normal and the nonnormal models. A separate section extends the t-ness results to general normal scale mixtures, in which case the tolerance radius around the normal error distribution takes the form of an upper bound $0.3429/\sqrt{n}$ for the variance of the scale mixture distribution. Proving our results requires somewhat nonstandard 'corner asymptotics', since the behaviour of estimators must be studied when the crucial parameter $\gamma=1/m$ is close to zero, which is not an inner point of the parameter space, and the maximum likelihood estimator of $m$ is equal to $\infty$ with positive probability.
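As a quick numerical illustration of the two thresholds quoted in the abstract, the sketch below evaluates them for a few sample sizes. This is not code from the paper; the function name and the choice of sample sizes are ours.

```python
import math

def t_ness_boundaries(n: int) -> tuple[float, float]:
    """Evaluate the paper's two stated tolerance boundaries for sample size n.

    m_min:     smallest t degrees of freedom m for which the misspecified
               normal MLE is still more precise than the three-parameter
               t-model, i.e. m >= 1.458 * sqrt(n).
    var_bound: upper bound 0.3429 / sqrt(n) on the variance of a general
               normal scale-mixture distribution for the normal model to
               remain within the tolerance radius.
    """
    m_min = 1.458 * math.sqrt(n)
    var_bound = 0.3429 / math.sqrt(n)
    return m_min, var_bound

for n in (25, 100, 400):
    m_min, var_bound = t_ness_boundaries(n)
    print(f"n={n:4d}: normal model tolerates m >= {m_min:.2f}, "
          f"scale-mixture variance <= {var_bound:.5f}")
# n= 100: normal model tolerates m >= 14.58, scale-mixture variance <= 0.03429
```

Both thresholds shrink or grow with $\sqrt{n}$: as the sample grows, ever less t-ness (larger $m$, smaller mixture variance) can be tolerated before the correctly specified model wins.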
Problem

Research questions and friction points this paper is trying to address.

t-distribution
normal model misspecification
robustness
asymptotic efficiency
scale mixtures
Innovation

Methods, ideas, or system contributions that make the work stand out.

t-ness tolerance
local asymptotic framework
maximum likelihood efficiency
normal scale mixtures
corner asymptotics