What Makes the Best Decomposition? Investigating Binary Decomposition Under FCG Variance

📅 2025-06-24
📈 Citations: 0
Influential: 0
🤖 AI Summary
Binary reuse detection relies on function call graph (FCG)-driven binary decomposition, yet existing methods assume FCG structural stability across compiler versions, an assumption this study shows to be flawed: compiler choices, optimization levels (especially inlining), and target architectures significantly distort FCGs. Method: This work systematically characterizes FCG variability across 17 compilers, 6 optimization levels, and 4 architectures, constructing a large-scale cross-compiler binary dataset; it proposes a decomposition evaluation framework grounded in mapping stability and clustering consistency. Contribution/Results: We identify three robust mapping patterns that persist despite drastic FCG size variations. Empirical evaluation reveals that mainstream decomposition methods suffer concurrent degradation in coverage and community consistency under cross-compiler settings. Our findings provide both theoretical foundations and methodological support for enhancing the robustness of binary reuse detection.

📝 Abstract
Binary decomposition, which splits binary files into modules, plays a critical role in binary reuse detection. Existing binary decomposition works either apply anchor-based methods, extending anchor functions to generate modules, or clustering-based methods, using clustering algorithms to group binary functions; both rely on the assumption that reused code shares similar function call relationships. However, we find that function call graphs (FCGs) vary greatly under different compilation settings, especially under diverse function inlining decisions. In this work, we conduct the first systematic empirical study on the variance of FCGs produced by various compilation settings and explore its effect on binary decomposition methods. We first construct a dataset compiled by 17 compilers, with 6 optimization levels, targeting 4 architectures, and analyze the changes in and mappings between the FCGs. We find that the size of FCGs changes dramatically, while the FCGs remain linked by three kinds of mappings. We then evaluate existing works under this FCG variance; the results show that they face great challenges in cross-compiler evaluation with diverse optimization settings. Finally, we propose a method to identify the optimal decomposition and compare existing decomposition works against it. Existing works either suffer from low coverage or cannot generate stable community similarities.
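To make the clustering-based flavor of decomposition concrete, the sketch below is a hypothetical simplification (not the paper's algorithm): it treats the FCG as an adjacency map and merges functions whose call neighborhoods have high Jaccard similarity. The function names, the `decompose` helper, and the threshold are all illustrative assumptions.

```python
# Hypothetical sketch of clustering-based binary decomposition: group
# functions whose call relationships in the FCG overlap strongly.
# This is NOT the paper's method, just a minimal illustration.
from itertools import combinations

def decompose(fcg, threshold=0.5):
    """fcg: dict mapping function name -> set of callees.
    Returns a list of modules (sets of function names)."""
    # Treat the FCG as undirected: a node's neighbors are its callees plus callers.
    nbrs = {f: set(callees) for f, callees in fcg.items()}
    for f, callees in fcg.items():
        for c in callees:
            nbrs.setdefault(c, set()).add(f)

    # Union-find over functions, with path compression.
    parent = {f: f for f in nbrs}
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    # Merge every pair whose neighbor sets are similar enough (Jaccard index).
    for a, b in combinations(nbrs, 2):
        union = nbrs[a] | nbrs[b]
        if union and len(nbrs[a] & nbrs[b]) / len(union) >= threshold:
            parent[find(a)] = find(b)

    # Collect union-find classes into modules.
    modules = {}
    for f in nbrs:
        modules.setdefault(find(f), set()).add(f)
    return list(modules.values())

# Illustrative FCG: "parse" and "check" share identical call context,
# so they land in the same module; "render" stays separate.
fcg = {
    "parse": {"read_tok", "emit_err"},
    "check": {"read_tok", "emit_err"},
    "render": {"draw"},
}
print(decompose(fcg))
```

This also hints at why inlining is so disruptive: inlining `read_tok` into its callers deletes edges from the FCG, changing the neighbor sets that any such similarity-based grouping depends on.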
Problem

Research questions and friction points this paper is trying to address.

Investigates FCG variance impact on binary decomposition methods
Evaluates existing decomposition works under diverse compilation settings
Proposes method to identify optimal binary decomposition
Innovation

Methods, ideas, or system contributions that make the work stand out.

Systematic study of the impact of FCG variance
Large-scale dataset built from 17 compilers, 6 optimization levels, and 4 architectures
Method to identify the optimal decomposition
Ang Jia
Xi'an Jiaotong University
binary similarity
He Jiang
School of Software, and Key Laboratory for Ubiquitous Network and Service Software of Liaoning Province, Dalian Key Laboratory of Artificial Intelligence, Dalian University of Technology, Dalian 116024, China
Zhilei Ren
Dalian University of Technology
Software Engineering, Testing, Profiling, Evolutionary Computation
Xiaochen Li
School of Software, Dalian University of Technology, China
Ming Fan
Ministry of Education Key Laboratory of Intelligent Networks and Network Security, Xi’an Jiaotong University, China
Ting Liu
Ministry of Education Key Laboratory of Intelligent Networks and Network Security, Xi’an Jiaotong University, China