🤖 AI Summary
In Python search-based unit test generation, the DynaMOSA and MIO algorithms are sensitive to their hyperparameter configurations: the default settings often yield suboptimal code coverage, while conventional grid search for better settings incurs excessive computational overhead. Method: This paper introduces differential evolution (DE) into the Pynguin framework for automated multi-objective hyperparameter optimization of search-based test generation algorithms. We design a fitness function and encoding scheme tailored to testing objectives and conduct end-to-end tuning experiments on standard benchmarks. Contribution/Results: DE-optimized DynaMOSA achieves significant improvements in branch and line coverage (average +8.2%) over baseline configurations. Moreover, it converges 3.7× faster than grid search while reducing tuning overhead by approximately 65%. This work establishes an efficient, reproducible hyperparameter optimization paradigm for search-based test generation.
📝 Abstract
Search-based test-generation algorithms have countless configuration options. Users rarely adjust these options and usually stick to the default values, which may not lead to the best possible results. Tuning an algorithm's hyperparameters is a way to find better values, but it typically comes with a high demand for resources. Meta-heuristic search algorithms, which effectively solve the test-generation problem itself, have also been proposed as an efficient means of tuning such parameters. In this work we explore the use of differential evolution for tuning the hyperparameters of the DynaMOSA and MIO many-objective search algorithms as implemented in the Pynguin framework. Our results show that the tuned DynaMOSA algorithm yields a significant improvement in the resulting test suites' coverage, and that differential evolution is more efficient than basic grid search.
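To make the tuning setup concrete, here is a minimal sketch of the classic DE/rand/1/bin differential-evolution loop applied to a hyperparameter vector. The encoding (a population size and a crossover rate) and the `coverage` fitness function are illustrative stand-ins, not Pynguin's actual option names or the paper's encoding; in the real setting, evaluating an individual would mean running the test generator (e.g. DynaMOSA) with those hyperparameters and measuring the achieved coverage.

```python
import random

# Illustrative hyperparameter bounds (not Pynguin's real option names):
# (search-population size, crossover rate).
BOUNDS = [(10.0, 100.0), (0.0, 1.0)]

def coverage(params):
    # Stand-in fitness: a smooth surrogate peaking at (50, 0.8) with value 1.0.
    # In practice this would invoke the test generator and return coverage.
    pop_size, crossover_rate = params
    return 1.0 - (pop_size - 50.0) ** 2 / 2500.0 - (crossover_rate - 0.8) ** 2

def clip(x, lo, hi):
    return max(lo, min(hi, x))

def differential_evolution(fitness, bounds, pop_n=12, f=0.5, cr=0.9,
                           gens=50, seed=0):
    """DE/rand/1/bin: mutate with a scaled difference of two random
    individuals, binomially cross over with the target, keep the better."""
    rng = random.Random(seed)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_n)]
    scores = [fitness(ind) for ind in pop]
    for _ in range(gens):
        for i in range(pop_n):
            # Three distinct individuals, all different from the target i.
            a, b, c = rng.sample([j for j in range(pop_n) if j != i], 3)
            j_rand = rng.randrange(len(bounds))  # force at least one mutated gene
            trial = []
            for j, (lo, hi) in enumerate(bounds):
                if rng.random() < cr or j == j_rand:
                    v = pop[a][j] + f * (pop[b][j] - pop[c][j])
                else:
                    v = pop[i][j]
                trial.append(clip(v, lo, hi))
            s = fitness(trial)
            if s >= scores[i]:  # greedy one-to-one selection
                pop[i], scores[i] = trial, s
    best = max(range(pop_n), key=scores.__getitem__)
    return pop[best], scores[best]

best_params, best_cov = differential_evolution(coverage, BOUNDS)
```

Because each fitness evaluation is a full test-generation run, the per-generation cost is dominated by `pop_n` evaluations; DE's appeal over grid search is that it concentrates those expensive evaluations near promising regions instead of spreading them uniformly over the grid.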