🤖 AI Summary
This work addresses the persistent "novelty treadmill" and reproducibility crisis in software engineering research by proposing the systematic adoption of the Registered Report format at top-tier conferences such as ICSE, FSE, and ISSTA. The approach introduces a two-stage review process that separates the evaluation of research proposals from the assessment of empirical results. By subjecting study designs to peer review before any data collection or experimentation, the model ensures methodological rigor upfront; a second review then evaluates the fidelity and quality of the executed study after completion. This paradigm shifts away from the conventional single-stage review, incentivizing robust, reproducible science over novelty alone. Grounded in a community-wide pre-survey and cross-conference collaboration, the initiative offers a practical pathway toward enhancing the scientific credibility and reproducibility of software engineering research.
📝 Abstract
To address the 'novelty-vicious cycle' and the 'replicability crisis' of the field (both discussed in the survey), we propose abolishing the "ICSE paper" as we know it and replacing it with a two-tier system that also evolves the existing notion of the 'Registered Report'. Authors proposing a new idea, experiment, or analysis would submit a "Registered Proposal" of their idea and the proposed experimental methodology to undergo peer review. The following year, anyone could submit a (shorter) "Results Report" on the realization of the empirical work based on the registered proposals of the previous ICSE (or FSE, ISSTA, ASE, etc.). Both kinds of work should be first-class citizens of the mainstream events. We argue that such a disruptive (heretical?) idea is supported by the responses to the Future of Software Engineering pre-survey.