AI Summary
This work addresses the lack of a sustainable, transparent, and reproducible physical design benchmark in academia, which has hindered fair comparison between 2D and 3D integration approaches, such as face-to-face (F2F) hybrid bonding. To this end, we present RosettaStone 2.0, an open-source evaluation framework built upon OpenROAD-Research that provides a unified RTL-to-GDS reference flow, enabling, for the first time, equitable comparison of 2D and 3D designs within a single platform. The framework incorporates continuous integration (CI) regression testing, the METRICS2.1 evaluation specification, structured logging, and automated report generation. Furthermore, it introduces a community-driven public leaderboard and enforces Developer Certificate of Origin (DCO) compliance, significantly enhancing the reproducibility, transparency, and credibility of academic research in physical design.
Abstract
This paper presents RosettaStone 2.0, an open benchmark translation and evaluation framework built on OpenROAD-Research. RosettaStone 2.0 provides complete RTL-to-GDS reference flows for both conventional 2D designs and Pin-3D-style face-to-face (F2F) hybrid-bonded 3D designs, enabling rigorous apples-to-apples comparison across planar and three-dimensional implementation settings. The framework is integrated within OpenROAD-flow-scripts (ORFS)-Research; it incorporates continuous integration (CI)-based regression testing and provides a standardized evaluation pipeline based on the METRICS2.1 convention, with structured logs and reports generated by ORFS-Research. To support transparent and reproducible research, RosettaStone 2.0 further provides a community-facing leaderboard, governed by verified pull requests with enforced Developer Certificate of Origin (DCO) compliance.
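As a concrete illustration of the DCO mechanism the abstract refers to, the sketch below shows the standard contributor-side step: signing each commit off with `git commit -s`, which appends the Signed-off-by trailer that DCO checks look for. The repository layout and leaderboard file are hypothetical; only the `git commit -s` behavior is standard Git.

```shell
# Sketch of a DCO-compliant submission: every commit must carry a
# Signed-off-by trailer, which `git commit -s` appends automatically.
set -e
tmp=$(mktemp -d)            # throwaway repo for illustration
cd "$tmp"
git init -q
git config user.name "Jane Doe"
git config user.email "jane@example.com"
echo "design,wns_ps,power_mw" > leaderboard.csv   # hypothetical entry format
git add leaderboard.csv
git commit -q -s -m "leaderboard: add example entry"
msg=$(git log -1 --format=%B)
echo "$msg"                 # last line: Signed-off-by: Jane Doe <jane@example.com>
```

A CI check on the leaderboard repository can then reject pull requests whose commits lack this trailer, which is how DCO compliance is commonly enforced in practice.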