Minimax-optimal and Locally-adaptive Online Nonparametric Regression

📅 2024-10-04
📈 Citations: 0
Influential: 0
🤖 AI Summary
This paper addresses locally adaptive optimal estimation in adversarial online nonparametric regression without prior knowledge of smoothness. For general convex losses, the authors propose the first parameter-free algorithm: it tracks a dynamic local Hölder profile via pruned chaining trees, enabling data-driven adaptation to unknown, spatially varying smoothness. Unlike existing approaches that require a global or piecewise-constant Hölder exponent, the method imposes no such assumption and achieves computationally efficient, fully data-dependent local tuning. Theoretically, it attains the minimax-optimal regret bound over Hölder function classes globally, while preserving locally adaptive optimal rates even in regions of heterogeneous smoothness. Compared with methods relying on fixed smoothness assumptions, this approach significantly improves robustness and practical applicability.

📝 Abstract
We study adversarial online nonparametric regression with general convex losses and propose a parameter-free learning algorithm that achieves minimax optimal rates. Our approach leverages chaining trees to compete against Hölder functions and establishes optimal regret bounds. While competing with nonparametric function classes can be challenging, they often exhibit local patterns, such as local Hölder continuity, that online algorithms can exploit. Without prior knowledge, our method dynamically tracks and adapts to different Hölder profiles by pruning a core chaining tree structure, aligning itself with local smoothness variations. This leads to the first computationally efficient algorithm with locally adaptive optimal rates for online regression in an adversarial setting. Finally, we discuss how these notions could be extended to a boosting framework, offering promising directions for future research.
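The chaining-tree idea in the abstract can be sketched, under heavy simplification, as a dyadic tree over [0, 1] whose root-to-leaf coefficient sums form the prediction. Everything below (the class name, the depth-weighted step sizes, the squared-loss update) is an illustrative assumption, not the paper's actual algorithm:

```python
import math

class ChainingTree:
    """A minimal dyadic chaining tree on [0, 1] (illustrative sketch).

    Each node at depth d covers an interval of width 2^-d and stores a
    coefficient; the prediction at x is the sum of the coefficients along
    the root-to-leaf path, a multi-scale decomposition in the spirit of
    the chaining technique. The update rule here is a hypothetical
    simplification, not the paper's exact method.
    """

    def __init__(self, depth=6, lr=0.1):
        self.depth = depth
        self.lr = lr
        self.coef = {}  # (d, k) -> coefficient of interval [k/2^d, (k+1)/2^d)

    def _path(self, x):
        x = min(max(x, 0.0), 1.0 - 1e-12)
        return [(d, int(x * (1 << d))) for d in range(self.depth + 1)]

    def predict(self, x):
        return sum(self.coef.get(node, 0.0) for node in self._path(x))

    def update(self, x, y):
        # Online gradient step on the squared loss; deeper (finer) nodes
        # take smaller steps, so coarse structure is learned first.
        g = self.predict(x) - y
        for d, k in self._path(x):
            self.coef[(d, k)] = self.coef.get((d, k), 0.0) - self.lr * g / (d + 1)

# Repeated updates at one point drive the local prediction toward the
# observed value; a far-away point, sharing only the root of the tree,
# inherits only the coarse component of what was learned.
tree = ChainingTree()
for _ in range(50):
    tree.update(0.3, math.sin(6.28 * 0.3))
```

Because predictions at different inputs share only the coarse nodes of their paths, updates stay largely local, which is the structural property that makes per-region (local Hölder) adaptation possible.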
Problem

Research questions and friction points this paper is trying to address.

Achieving minimax optimal rates in adversarial online nonparametric regression
Adapting dynamically to local Hölder continuity without prior knowledge of smoothness
Obtaining a computationally efficient algorithm with locally adaptive optimal rates
Innovation

Methods, ideas, or system contributions that make the work stand out.

Parameter-free algorithm achieving minimax optimal rates
Dynamic adaptation to local Hölder profiles via tree pruning
Computationally efficient chaining-tree construction