🤖 AI Summary
This work addresses the challenge of selecting an appropriate context-window length for autoregressive neural PDE simulators by proposing SAKE, a low-overhead two-stage algorithm. SAKE first constructs a structured candidate set from physically interpretable system anchors and then efficiently identifies a near-optimal window length via knee-point estimation. The authors present this as the first approach to frame context-window selection as a standalone, lightweight hyperparameter-optimization task. Evaluated across all eight PDE families in PDEBench, SAKE achieves a 67.8% exact hit rate, a 91.7% neighbor hit rate, and an average regret of only 6.1%, while requiring just 5.1% of the full search cost (a 94.9% reduction in computational overhead), striking a strong balance between predictive accuracy and search efficiency.
📝 Abstract
Autoregressive neural PDE simulators predict the evolution of physical fields one step at a time from a finite history, but low-cost context-window selection for such simulators remains an unformalized problem. Existing approaches to context-window selection in time-series forecasting include exhaustive validation, direct low-cost search, and system-theoretic memory estimation, but they are either expensive, brittle, or not directly aligned with downstream rollout performance. We formalize explicit context-window selection for fixed-window autoregressive neural PDE simulators as an independent low-cost algorithmic problem, and propose **System-Anchored Knee Estimation (SAKE)**, a two-stage method that first identifies a small structured candidate set from physically interpretable system anchors and then performs knee-aware downstream selection within it. Across all eight PDEBench families evaluated under the shared L ∈ {1, …, 16} protocol, SAKE is the strongest overall low-cost selector at matched budget among the evaluated methods, achieving 67.8% Exact, 91.7% Within-1, 6.1% mean regret@knee, and a cost ratio of 0.051 (94.9% normalized search-cost savings).
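To make the knee-aware selection stage concrete, here is a minimal sketch of one common knee-detection rule: normalize the (window length, validation error) curve and pick the candidate farthest from the chord joining its endpoints. The candidate anchors, error values, and the `knee_window` helper are illustrative assumptions; the paper's actual anchor construction and selection criterion may differ.

```python
def knee_window(candidates, val_error):
    """Pick the candidate L at the 'knee' of the error curve.

    Normalizes both axes to [0, 1], then returns the candidate with the
    maximum perpendicular distance from the chord joining the first and
    last points of the curve (a standard knee heuristic; illustrative,
    not necessarily the paper's exact rule).
    """
    xs = [float(L) for L in candidates]
    ys = [float(val_error[L]) for L in candidates]
    # Normalize each axis to [0, 1] so the knee is scale-independent.
    xs = [(x - xs[0]) / (xs[-1] - xs[0]) for x in xs]
    ylo, yhi = min(ys), max(ys)
    ys = [(y - ylo) / (yhi - ylo) for y in ys]
    # Chord from the first to the last normalized point.
    dx, dy = xs[-1] - xs[0], ys[-1] - ys[0]
    norm = (dx * dx + dy * dy) ** 0.5
    # Perpendicular distance of every point from that chord.
    dists = [abs(dy * (x - xs[0]) - dx * (y - ys[0])) / norm
             for x, y in zip(xs, ys)]
    return candidates[max(range(len(dists)), key=dists.__getitem__)]

# Hypothetical structured candidate set and validation errors: rollout
# error drops sharply up to L=4, then flattens (diminishing returns).
errors = {1: 0.30, 2: 0.18, 4: 0.08, 8: 0.07, 16: 0.065}
print(knee_window(sorted(errors), errors))  # → 4
```

Because only the small anchored candidate set is evaluated, rather than all sixteen window lengths, the downstream selection cost stays a small fraction of an exhaustive sweep, which is the source of the reported 94.9% search-cost savings.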