🤖 AI Summary
This paper studies the learning of unknown operators between separable Hilbert spaces from finitely many noisy input-output samples, focusing on minimax risk bounds for uniformly bounded Lipschitz operators. Methodologically, it combines minimax analysis, information-theoretic lower-bound constructions, functional estimation, and operator spectral theory to derive matching or nearly matching upper and lower bounds for this setting. Key contributions include: (1) revealing a “curse of sample complexity” that rules out algebraic convergence rates for generic Lipschitz operators; (2) an essentially sharp characterization when the eigenvalues of the covariance operator decay exponentially; and (3) showing that the risk rate is governed by the spectrum of the covariance operator of the measure that defines the error metric, for both fixed and random designs, including measures with unbounded support. The results provide a unified theoretical benchmark for operator learning.
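To fix ideas, the following display sketches the operator-learning setup the summary refers to; the notation ($F^\dagger$, $\widehat{F}_n$, $\mu$, $R_n$) is illustrative rather than the paper's own, and the squared $L^2_\mu$ loss shown is one natural choice of error metric consistent with the description above:

$$
y_i = F^\dagger(x_i) + \xi_i, \qquad i = 1, \ldots, n, \qquad F^\dagger : \mathcal{H}_X \to \mathcal{H}_Y,
$$

where $F^\dagger$ is an unknown uniformly bounded Lipschitz operator between separable Hilbert spaces and $\xi_i$ is Hilbert-valued Gaussian noise (or Gaussian white noise). The object of study is the minimax risk over the Lipschitz class $\mathcal{F}_{\mathrm{Lip}}$,

$$
R_n \;=\; \inf_{\widehat{F}_n} \; \sup_{F^\dagger \in \mathcal{F}_{\mathrm{Lip}}} \; \mathbb{E}\, \big\| \widehat{F}_n - F^\dagger \big\|_{L^2_\mu}^2,
$$

where $\mu$ is the input measure defining the error metric; per the results summarized above, the decay of $R_n$ is governed by the eigenvalues of the covariance operator of $\mu$.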
📝 Abstract
We develop a minimax theory for operator learning, where the goal is to estimate an unknown operator between separable Hilbert spaces from finitely many noisy input-output samples. For uniformly bounded Lipschitz operators, we prove information-theoretic lower bounds together with matching or near-matching upper bounds, covering both fixed and random designs under Hilbert-valued Gaussian noise and Gaussian white noise errors. The rates are controlled by the spectrum of the covariance operator of the measure that defines the error metric. The setup is general and, in particular, allows measures with unbounded support. A key implication is a curse of sample complexity: the minimax risk for generic Lipschitz operators cannot decay at any algebraic rate in the sample size. We obtain essentially sharp characterizations when the covariance spectrum decays exponentially and provide general upper and lower bounds in slower-decay regimes.
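The curse of sample complexity admits a compact formal reading. The display below is a hedged paraphrase of the abstract's claim in the illustrative notation $R_n$ introduced earlier, not a quotation of the paper's theorem:

$$
\text{for every } \alpha > 0: \qquad \limsup_{n \to \infty} \, n^{\alpha} R_n = \infty,
$$

that is, $R_n \neq O(n^{-\alpha})$ for any $\alpha > 0$, so the risk over the full Lipschitz class can decay at best subalgebraically in $n$. The exponential-decay regime for the covariance eigenvalues is where the paper obtains essentially matching upper and lower bounds.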