π€ AI Summary
Hardware reverse engineering (HRE) research has long suffered from fragmentation and a lack of systematic integration, impeding rigorous technical evaluation and reproducibility. This work presents the first systematization of knowledge (SoK) based on a comprehensive review of 187 peer-reviewed publications over the past two decades, focusing on integrated circuits, FPGAs, and netlist-level reverse engineering. We introduce an artifact-centric reproducibility framework, complemented by benchmark analyses and a survey of relevant legal and policy considerations. Empirical assessment reveals that only seven studies (4%) are fully reproducible. Building on these findings, we propose three key recommendations: enhancing artifact reusability, establishing unified evaluation benchmarks, and clarifying the legal boundaries of HRE research. Collectively, these contributions offer a roadmap to foster interdisciplinary collaboration and advance the field in a principled, reproducible manner.
π Abstract
As hardware serves as the root of trust in modern computing systems, Hardware Reverse Engineering (HRE) is foundational for security assurance. In practice, HRE enables critical security applications, including design verification, supply-chain assurance, and vulnerability discovery. Over the past two decades, academic research on Integrated Circuit (IC), Field-Programmable Gate Array (FPGA), and netlist reverse engineering has steadily grown. However, knowledge remains fragmented across domains and communities, which complicates assessing the state of the art and hampers identifying shared research challenges. In this paper, we present a systematization of knowledge based on an in-depth analysis of 187 peer-reviewed publications. Using this corpus, we characterize technical methods across the HRE workflow and identify technical and organizational challenges that impede research progress. We analyze all 30 artifacts from our corpus using established artifact evaluation practices. Key results could be reproduced for only seven publications (4%). Based on our findings, we derive stakeholder-centric recommendations for academia, industry, and government to enable more coordinated and reproducible HRE research. These recommendations target three cross-cutting opportunities: (i) improving reproducibility and reuse via artifact-centric practices, (ii) enabling rigorous comparability through standardized benchmarks and evaluation metrics, and (iii) improving legal clarity for public HRE research.