🤖 AI Summary
Traditional hyperparameter tuning methods struggle with the oscillatory behavior induced by rotational dynamics in game learning and typically lack adaptivity. To address this, the paper proposes Modal LookAhead (MoLA), which introduces frequency-domain analysis into game optimization for the first time. MoLA adaptively adjusts its hyperparameters by estimating the frequency characteristics of the oscillatory dynamics, analyzed in both continuous and discrete dynamical systems. The method comes with theoretical convergence guarantees while maintaining low computational overhead. Empirical results demonstrate that MoLA significantly accelerates training in games with purely rotational or mixed dynamics, consistently outperforming existing baselines at negligible additional computational cost.
📝 Abstract
Learning in smooth games fundamentally differs from standard minimization due to rotational dynamics, which invalidate classical hyperparameter tuning strategies. Despite its practical importance, effective hyperparameter tuning in games remains underexplored. A notable example is LookAhead (LA), which achieves strong empirical performance but introduces additional parameters that critically influence it. We propose a principled approach to hyperparameter selection in games by leveraging frequency estimation of oscillatory dynamics. Specifically, we analyze oscillations both in continuous-time trajectories and through the spectrum of the discrete dynamics in the associated frequency-based space. Building on this analysis, we introduce \emph{Modal LookAhead (MoLA)}, an extension of LA that adapts its hyperparameters to the problem at hand. We provide convergence guarantees and demonstrate in experiments that MoLA accelerates training in both purely rotational games and mixed regimes, all with minimal computational overhead.
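The paper's exact algorithm is not reproduced here, but the core idea — estimating the rotation frequency of the game dynamics and using it to set LookAhead's synchronization interval — can be illustrated with a minimal sketch. Everything below is an assumption for illustration: the bilinear test game, the FFT-based period estimator, and the "sync every half period" heuristic (`simgrad`, `estimate_period`, `lookahead_game` are hypothetical names, not the paper's API).

```python
import numpy as np

def simgrad(x, y, A, lr):
    # One step of simultaneous gradient descent-ascent on f(x, y) = x @ A @ y.
    gx = A @ y           # df/dx
    gy = A.T @ x         # df/dy
    return x - lr * gx, y + lr * gy

def estimate_period(trace):
    # Crude frequency estimate: dominant FFT bin of one coordinate's trajectory.
    spectrum = np.abs(np.fft.rfft(trace - np.mean(trace)))
    k = 1 + np.argmax(spectrum[1:])   # skip the DC component
    return len(trace) / k             # period measured in iterations

def lookahead_game(A, lr=0.1, alpha=0.5, steps=200, warmup=64):
    rng = np.random.default_rng(0)
    x, y = rng.standard_normal(2), rng.standard_normal(2)

    # Warm-up rollout to estimate the rotation frequency (illustrative heuristic,
    # standing in for the paper's frequency/spectral analysis).
    xw, yw, trace = x.copy(), y.copy(), []
    for _ in range(warmup):
        xw, yw = simgrad(xw, yw, A, lr)
        trace.append(xw[0])
    k_sync = max(2, round(estimate_period(np.array(trace)) / 2))

    # Standard LookAhead: k_sync fast steps, then pull slow weights toward fast ones.
    slow_x, slow_y = x.copy(), y.copy()
    for t in range(steps):
        x, y = simgrad(x, y, A, lr)
        if (t + 1) % k_sync == 0:
            slow_x += alpha * (x - slow_x)
            slow_y += alpha * (y - slow_y)
            x, y = slow_x.copy(), slow_y.copy()
    return slow_x, slow_y
```

On the purely rotational game `A = np.eye(2)`, plain descent-ascent spirals outward, while syncing roughly every half rotation period makes the averaged slow iterates contract toward the equilibrium at the origin — the intuition behind tying LA's hyperparameters to the estimated oscillation frequency.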