🤖 AI Summary
Existing methods for context-dependent black-box optimization struggle to exploit contextual information effectively, and evolutionary algorithms, CMA-ES in particular, lack context-aware initialization strategies. To address this, we propose a context-aware warm-start mechanism for CMA-ES. Our core contribution is the first joint modeling of the CMA-ES distribution parameters (mean vector and covariance matrix) with context encodings via a meta-learned parameter-mapping network. This enables cross-task compression of distribution states and context-embedding-guided covariance calibration, yielding an initialization that is context-driven and transferable across tasks. On multi-task benchmarks, our method converges 2.3–5.1× faster than standard CMA-ES and outperforms state-of-the-art meta-optimization approaches, demonstrating stronger generalization.
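
The core idea, mapping a task's context embedding to a CMA-ES starting mean and covariance, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the linear heads stand in for the meta-learned parameter-mapping network (in the actual method these weights would be trained across tasks, not random), and the context vector, dimensions, and function names are all hypothetical.

```python
import numpy as np

def context_to_cmaes_init(context, dim, rng):
    """Sketch: map a context embedding to (m0, C0) for warm-starting CMA-ES.

    The covariance is built from a lower-triangular Cholesky factor L,
    so C0 = L @ L.T is symmetric positive definite by construction.
    """
    ctx_dim = context.shape[0]
    n_tril = dim * (dim + 1) // 2
    # Placeholder linear heads; the real method would meta-learn these
    # mappings across tasks rather than draw them at random.
    W_mean = rng.standard_normal((dim, ctx_dim)) * 0.1
    W_chol = rng.standard_normal((n_tril, ctx_dim)) * 0.1

    m0 = W_mean @ context               # context-conditioned initial mean
    tril_vals = W_chol @ context        # entries of the Cholesky factor
    L = np.zeros((dim, dim))
    rows, cols = np.tril_indices(dim)
    L[rows, cols] = tril_vals
    # Softplus on the diagonal keeps L full rank, hence C0 positive definite.
    diag = np.arange(dim)
    L[diag, diag] = np.log1p(np.exp(L[diag, diag])) + 1e-3
    return m0, L @ L.T

rng = np.random.default_rng(0)
context = rng.standard_normal(8)        # assumed task context embedding
m0, C0 = context_to_cmaes_init(context, dim=5, rng=rng)
# m0 and C0 would then seed CMA-ES in place of the default
# zero-mean, identity-covariance initialization.
```

Parameterizing the covariance through a Cholesky factor with a positivity-constrained diagonal is a standard way to guarantee a valid (symmetric positive definite) covariance regardless of what the network outputs.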