DarwinGame: Playing Tournaments for Tuning Applications in Noisy Cloud Environments

πŸ“… 2025-09-29
πŸ“ˆ Citations: 0
✨ Influential: 0
πŸ“„ PDF
πŸ€– AI Summary
In shared cloud environments, resource interference severely degrades the robustness and accuracy of existing performance tuning tools. To address this, we propose a tournament-based application performance tuning methodβ€”the first to introduce relative performance comparison into automated tuning under noisy conditions. Our approach employs iterative pairwise configuration comparisons, robust execution scheduling, and statistical stability enhancement to effectively suppress interference effects. Compared to state-of-the-art tuners, it reduces execution time by over 27% and constrains performance variability to within 0.5%. The method achieves superior efficiency, precision, and robustness simultaneously, establishing a novel paradigm for automated performance optimization in interference-prone cloud environments.

πŸ“ Abstract
This work introduces a new subarea of performance tuning: performance tuning in a shared, interference-prone computing environment. We demonstrate that existing tuners are significantly suboptimal by design because they cannot account for interference during tuning. Our solution, DarwinGame, employs a tournament-based design to systematically compare application executions under different tunable parameter configurations, enabling it to identify the relative performance of those configurations in a noisy environment. Compared to existing solutions, DarwinGame achieves more than a 27% reduction in execution time, with less than 0.5% performance variability. DarwinGame is the first performance tuner designed to help developers tune their applications in shared, interference-prone cloud environments.
Problem

Research questions and friction points this paper is trying to address.

Tuning applications in shared noisy cloud environments
Addressing performance interference during parameter optimization
Systematically comparing configurations in interference-prone computing environments
Innovation

Methods, ideas, or system contributions that make the work stand out.

Tournament-based design for parameter comparison
Systematic performance evaluation in noisy environments
Reduces execution time with minimal performance variability
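The paper does not spell out its algorithm in this summary, but the core idea of a tournament over configurations with interleaved pairwise runs can be sketched as follows. This is a minimal illustration, not DarwinGame's actual implementation: the `run`, `compare`, and `tournament` names, the single-elimination bracket, the trial count, and the median-of-interleaved-runs comparison are all assumptions made for the sketch.

```python
import random
import statistics

def compare(run, a, b, trials=5):
    """Pairwise comparison of two configurations.

    Runs are interleaved so both configurations experience roughly the
    same background interference; the winner is chosen by median runtime,
    which is robust to occasional noisy outliers.
    """
    times_a, times_b = [], []
    for _ in range(trials):
        times_a.append(run(a))  # measured runtime of config a
        times_b.append(run(b))  # measured runtime of config b
    return a if statistics.median(times_a) <= statistics.median(times_b) else b

def tournament(run, configs):
    """Single-elimination tournament over candidate configurations.

    Repeatedly pairs off survivors until one configuration remains;
    only *relative* performance is ever used, never absolute timings.
    """
    pool = list(configs)
    random.shuffle(pool)  # avoid a fixed bracket order
    while len(pool) > 1:
        survivors = []
        for i in range(0, len(pool) - 1, 2):
            survivors.append(compare(run, pool[i], pool[i + 1]))
        if len(pool) % 2:  # odd config out gets a bye this round
            survivors.append(pool[-1])
        pool = survivors
    return pool[0]
```

Because each match compares configurations run back-to-back under the same ambient load, the bracket tends to select the genuinely faster configuration even when absolute runtimes fluctuate between rounds.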
πŸ”Ž Similar Papers
No similar papers found.