The Hidden Cost of Defaults in Recommender System Evaluation

📅 2025-08-28
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work identifies an unreported default configuration—specifically, RecBole’s implicit early stopping—in recommender system evaluation that covertly biases hyperparameter optimization: it prematurely terminates random and Bayesian search, severely restricting the effective search space and amplifying result variance to a degree comparable with algorithmic differences between optimizers. Moving beyond conventional model-centric auditing, this study pioneers a framework-behavior audit, systematically analyzing six recommendation models, two benchmark datasets, and multiple search strategies via execution tracing and variance quantification. Empirical results confirm that such defaults introduce substantial, invisible bias. The paper proposes concrete best practices—including explicit configuration logging, deterministic search-space bounding, and opt-in early stopping—to enhance framework transparency, reproducibility, and experimental rigor across the recommender systems toolchain.
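To make the failure mode concrete, here is a minimal, hypothetical sketch (not RecBole's actual API) of how an implicit patience default applied to the search loop can terminate random search before its budget is spent, shrinking the effective search space without the user noticing:

```python
import random

def random_search(objective, budget, patience=None, seed=0):
    """Random search over a 1-D space. If `patience` is set, stop after
    that many consecutive trials without improvement -- mimicking an
    implicit early-stopping default applied to the search loop itself."""
    rng = random.Random(seed)
    best, since_improve, evaluated = float("-inf"), 0, 0
    for _ in range(budget):
        x = rng.uniform(0.0, 1.0)
        score = objective(x)
        evaluated += 1
        if score > best:
            best, since_improve = score, 0
        else:
            since_improve += 1
            if patience is not None and since_improve >= patience:
                break  # search ends before the budget is exhausted
    return best, evaluated

# Toy objective whose optimum sits in a narrow region, so improvements
# are rare and a small patience tends to cut the search short.
objective = lambda x: -abs(x - 0.9)

full_best, full_n = random_search(objective, budget=100)
stop_best, stop_n = random_search(objective, budget=100, patience=5)
print(full_n, stop_n)  # the patience run typically evaluates far fewer configs
```

Because both runs draw the same trial sequence, the truncated run can only explore a prefix of it: its best score is never better, and the number of configurations actually evaluated can be much smaller than the stated budget, which is exactly the hidden coverage loss the paper measures.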

📝 Abstract
Hyperparameter optimization is critical for improving the performance of recommender systems, yet its implementation is often treated as a neutral or secondary concern. In this work, we shift focus from model benchmarking to auditing the behavior of RecBole, a widely used recommendation framework. We show that RecBole's internal defaults, particularly an undocumented early-stopping policy, can prematurely terminate Random Search and Bayesian Optimization. This limits search coverage in ways that are not visible to users. Using six models and two datasets, we compare search strategies and quantify both performance variance and search path instability. Our findings reveal that hidden framework logic can introduce variability comparable to the differences between search strategies. These results highlight the importance of treating frameworks as active components of experimental design and call for more transparent, reproducibility-aware tooling in recommender systems research. We provide actionable recommendations for researchers and developers to mitigate hidden configuration behaviors and improve the transparency of hyperparameter tuning workflows.
Problem

Research questions and friction points this paper is trying to address.

Auditing hidden defaults in RecBole recommender framework
Quantifying premature termination impact on hyperparameter optimization
Evaluating search strategy variability from undocumented framework logic
Innovation

Methods, ideas, or system contributions that make the work stand out.

Auditing RecBole framework's hidden defaults
Revealing premature termination in optimization methods
Recommending transparent hyperparameter tuning workflows
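The recommended practices above can be illustrated with a small, hypothetical configuration sketch (the class and field names are illustrative, not part of RecBole): every knob that affects the search is stated explicitly, early stopping is strictly opt-in, and the fully resolved configuration is serialized so it can be logged alongside the results of a run:

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class SearchConfig:
    """All settings that affect hyperparameter search, with defaults
    made explicit rather than buried inside the framework."""
    budget: int = 100
    early_stopping: bool = False  # opt-in: never silently enabled
    patience: int = 10            # only relevant when early_stopping is True
    seed: int = 0

def log_config(cfg: SearchConfig) -> str:
    """Serialize the resolved configuration so the exact settings used
    in a run are recorded next to its results."""
    return json.dumps(asdict(cfg), sort_keys=True)

cfg = SearchConfig()
record = log_config(cfg)
print(record)
```

Logging the resolved configuration, rather than only the values the user set, is what exposes defaults like the early-stopping policy the paper audits.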
Hannah Berling, University of Gothenburg, Gothenburg, Sweden
Robin Svahn, University of Gothenburg, Gothenburg, Sweden
Alan Said, University of Gothenburg, Gothenburg, Sweden
Recommender Systems · Personalization · Human-centered Artificial Intelligence