🤖 AI Summary
This paper establishes the Pinsker lower bound for inner-product kernel regression on the high-dimensional sphere $\mathbb{S}^d$, under the precise asymptotic regime where the sample size satisfies $n = \alpha d^{\gamma} (1 + o_d(1))$. Methodologically, it integrates tools from high-dimensional probability, spherical harmonic analysis, kernel method theory, minimax statistical inference, spectral theory, and asymptotics of orthogonal polynomials. The main contribution is the first derivation—within this high-dimensional asymptotic framework—of the **exact Pinsker constant** and a **minimax risk expression with explicit constants**. Unlike prior work that only characterized convergence rates, this result precisely quantifies how the optimal estimation risk depends on the dimension $d$, the sample scaling parameters $\alpha$ and $\gamma$, and the structural properties of the kernel function. The findings thus provide a rigorous theoretical benchmark and quantitative guidance for nonparametric inference in high dimensions.
📝 Abstract
Building on recent studies of large-dimensional kernel regression, particularly those involving inner product kernels on the sphere $\mathbb{S}^{d}$, we investigate the Pinsker bound for inner product kernel regression in such settings. Specifically, we address the scenario where the sample size $n$ is given by $\alpha d^{\gamma}(1+o_{d}(1))$ for some $\alpha, \gamma>0$. We determine the exact minimax risk for kernel regression in this setting, identifying not only the minimax rate but also the exact constant, known as the Pinsker constant, associated with the excess risk.
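To make the asymptotic regime concrete, the following sketch (with illustrative, hypothetical values of $\alpha$ and $\gamma$ — the paper treats them as free parameters) shows how the leading term $\alpha d^{\gamma}$ of the sample size grows with the dimension $d$:

```python
def leading_sample_size(alpha: float, gamma: float, d: int) -> float:
    """Leading term of the regime n = alpha * d**gamma * (1 + o_d(1)),
    ignoring the vanishing o_d(1) correction."""
    return alpha * d**gamma

# Illustrative values only: alpha = 2, gamma = 1.5 are not from the paper.
for d in (10, 100, 1000):
    n = leading_sample_size(alpha=2.0, gamma=1.5, d=d)
    print(f"d = {d:4d}  ->  n ~ {n:.0f}")
```

For instance, with these illustrative parameters, $d = 100$ gives $n \approx 2 \cdot 100^{1.5} = 2000$; the regime ties the sample size polynomially to the dimension rather than letting $n \to \infty$ with $d$ fixed.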