AI Summary
In shared cloud environments, resource interference severely degrades the robustness and accuracy of existing performance tuning tools. To address this, we propose a tournament-based application performance tuning method, the first to introduce relative performance comparison into automated tuning under noisy conditions. Our approach combines iterative pairwise configuration comparisons, robust execution scheduling, and statistical stability enhancement to suppress the effects of interference. Compared to state-of-the-art tuners, it reduces execution time by more than 27% while keeping performance variability under 0.5%. The method achieves efficiency, precision, and robustness simultaneously, establishing a new paradigm for automated performance optimization in interference-prone cloud environments.
Abstract
This work introduces a new subarea of performance tuning: performance tuning in a shared, interference-prone computing environment. We demonstrate that existing tuners are significantly suboptimal by design because they cannot account for interference during tuning. Our solution, DarwinGame, employs a tournament-based design that systematically compares application executions under different tunable parameter configurations, enabling it to identify the relative performance of those configurations in a noisy environment. Compared to existing solutions, DarwinGame achieves more than a 27% reduction in execution time, with less than 0.5% performance variability. DarwinGame is the first performance tuner designed to help developers tune their applications in shared, interference-prone cloud environments.
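The core idea, comparing configurations relative to each other rather than trusting absolute measurements, can be illustrated with a minimal sketch. The snippet below is not DarwinGame's implementation; it is a hypothetical model in which each "run" returns a configuration's true cost plus random interference, duels are decided by a majority of back-to-back paired runs (so both contenders face similar noise), and a single-elimination tournament advances winners until one configuration remains. The config format, noise model, and round counts are all illustrative assumptions.

```python
import random

def run_config(config, noise=0.3, rng=random):
    # Simulated noisy execution time: the configuration's true cost plus
    # random interference from co-located workloads (hypothetical model).
    return config["cost"] + rng.uniform(0, noise)

def duel(cfg_a, cfg_b, rounds=15, noise=0.3, rng=random):
    # Run both configurations back-to-back in each round so they experience
    # similar interference; the majority of round wins decides the duel.
    wins_a = sum(run_config(cfg_a, noise, rng) < run_config(cfg_b, noise, rng)
                 for _ in range(rounds))
    return cfg_a if wins_a * 2 > rounds else cfg_b

def tournament(configs, noise=0.3, rng=random):
    # Single-elimination bracket: winners advance until one config remains.
    pool = list(configs)
    while len(pool) > 1:
        nxt = [duel(pool[i], pool[i + 1], noise=noise, rng=rng)
               for i in range(0, len(pool) - 1, 2)]
        if len(pool) % 2:
            nxt.append(pool[-1])  # an odd config gets a bye to the next round
        pool = nxt
    return pool[0]

if __name__ == "__main__":
    rng = random.Random(0)
    configs = [{"name": f"cfg{i}", "cost": c}
               for i, c in enumerate([1.0, 0.8, 1.2, 0.9, 1.1, 0.85])]
    print(tournament(configs, rng=rng)["name"])
```

Because each duel only asks which contender is faster under the same ambient noise, per-run interference largely cancels out, which is the intuition behind the paper's claim that relative comparison is more robust than absolute measurement in shared environments.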