🤖 AI Summary
This work addresses the high query cost and limited practicality of black-box model inversion attacks (MIAs) against face recognition systems. To tackle this, we propose an efficient, low-query MIA framework tailored to facial privacy evaluation. Methodologically, we introduce long-tailed learning, a concept previously unexplored in MIAs, into surrogate model initialization, yielding more robust initial points for underrepresented facial identities. We further integrate an NGOpt-based adaptive optimizer-selection mechanism that chooses a suitable zeroth-order black-box optimizer (e.g., CMA-ES, SPSA) for the attack. Compared to state-of-the-art methods, our approach reduces query counts by approximately 95% across multiple face recognition models while preserving high-resolution reconstruction fidelity and achieving top-tier attack success rates. This substantial gain in query efficiency and practicality makes black-box MIA a more feasible and scalable benchmark for evaluating privacy risks in face recognition systems.
📝 Abstract
Model Inversion Attacks (MIAs) aim to reconstruct private training data from trained models, leading to privacy leakage, particularly in facial recognition systems. Although many studies have enhanced the effectiveness of white-box MIAs, less attention has been paid to improving efficiency and utility under limited attacker capabilities. Existing black-box MIAs require an impractically large number of queries, incurring significant overhead. We therefore analyze the limitations of existing MIAs and introduce Surrogate Model-based Inversion with Long-tailed Enhancement (SMILE), a high-resolution-oriented and query-efficient MIA for the black-box setting. We first analyze the initialization of MIAs from a data-distribution perspective and propose a long-tailed surrogate training method that yields high-quality initial points. We then strengthen the attack by employing a gradient-free black-box optimization algorithm selected by NGOpt. Our experiments show that SMILE outperforms existing state-of-the-art black-box MIAs while requiring only about 5% of their query overhead.
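The abstract does not spell out the long-tailed surrogate training loss, but one standard long-tailed learning technique it could build on is class-balanced reweighting via the "effective number of samples" (Cui et al.), which up-weights rare identities so the surrogate does not under-fit tail classes. The sketch below is a hypothetical illustration of that reweighting, not SMILE's actual method; the identity counts are made up.

```python
# Hypothetical class-balanced weighting for a long-tailed identity
# distribution (assumption: SMILE's exact loss is not given here).
import numpy as np

def class_balanced_weights(counts: np.ndarray, beta: float = 0.999) -> np.ndarray:
    """Per-class weight w_c = (1 - beta) / (1 - beta**n_c),
    normalized so the weights sum to the number of classes."""
    effective_num = 1.0 - np.power(beta, counts.astype(float))
    weights = (1.0 - beta) / effective_num
    return weights * len(counts) / weights.sum()

# Toy long-tailed distribution: head identities have many face images,
# tail identities only a few.
counts = np.array([500, 200, 50, 10, 2])
w = class_balanced_weights(counts)
```

These weights would multiply the per-class cross-entropy terms during surrogate training, so underrepresented identities contribute proportionally more to the loss and yield better initial points for inversion.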