🤖 AI Summary
This report addresses core challenges to computational experiment reproducibility in HPC and systems research, namely strong hardware dependence, complex environment configuration, and high operational costs. Methodologically, it introduces a layered, pragmatic solution framework grounded in a "feasibility–rigor trade-off" principle, and proposes a dual-dimensional (challenge–recommendation) model tailored to authors, reviewers, institutions, and the broader community. The approach integrates structured workshops, cross-role consensus building, and actionable checklists, and explores ecosystem innovations including AI-assisted environment generation and artifact digital libraries. Key contributions include: (1) a comprehensive, lifecycle-spanning practical guide and standardized reproducibility checklist; (2) a cross-role community consensus on reproducible practices within the HPC domain; and (3) a systemic pathway for advancing reproducibility from conceptual principle to institutionalized practice.
📝 Abstract
This report synthesizes findings from the November 2024 Community Workshop on Practical Reproducibility in HPC, which convened researchers, artifact authors, reviewers, and chairs of reproducibility initiatives to address the critical challenge of making computational experiments reproducible in a cost-effective manner. The workshop deliberately focused on systems and HPC computer science research because of its unique requirements, including specialized hardware access and deep system reconfigurability. Through structured discussions, lightning talks, and panel sessions, participants identified key barriers to practical reproducibility and formulated actionable recommendations for the community. The report presents a dual framework of challenges and recommendations organized by target audience (authors, reviewers, organizations, and the community). It characterizes technical obstacles in experiment packaging and review, including completeness of artifact descriptions, acquisition of specialized hardware, and establishment of reproducibility conditions. The recommendations range from immediate practical tools (comprehensive checklists for artifact packaging) to ecosystem-level improvements (refining badge systems, creating artifact digital libraries, and developing AI-assisted environment creation). Rather than advocating for reproducibility regardless of cost, the report emphasizes striking an appropriate balance between reproducibility rigor and practical feasibility, positioning reproducibility as an integral component of scientific exploration rather than a burdensome afterthought. Appendices provide detailed, immediately actionable checklists for authors and reviewers to improve reproducibility practices across the HPC community.