🤖 AI Summary
Detecting deepfakes (e.g., face swapping and facial reenactment) in eKYC systems remains challenging due to diverse and unseen manipulations. This paper proposes a robust video-level detection method: it leverages a pre-trained face feature extractor to obtain frame-wise identity embeddings, models their temporal consistency, and computes identity deviation from the user's registered reference image, which serves as a trusted anchor. This design significantly enhances generalization against unknown image degradations—including compression, motion blur, and illumination variations—as well as previously unseen forgery types. Extensive experiments on multiple benchmarks demonstrate state-of-the-art performance across diverse forgery generators (FaceSwap, DeepFake, FSGAN, First Order Motion), while maintaining high robustness under cross-domain degradation scenarios. The source code is publicly available.
📝 Abstract
In this paper, we present a deepfake detection algorithm specifically designed for electronic Know Your Customer (eKYC) systems. To ensure the reliability of eKYC systems against deepfake attacks, it is essential to develop a detector capable of identifying both face swapping and face reenactment while remaining robust to image degradation. We address these challenges through three key contributions: (1) Our approach evaluates a video's authenticity by detecting temporal inconsistencies in identity vectors extracted by face recognition models, enabling comprehensive detection of both face swapping and face reenactment. (2) In addition to processing the input video, the algorithm uses a registered image (assumed to be genuine) to calculate identity discrepancies between the input video and the registered image, significantly improving detection accuracy. (3) We find that employing a face feature extractor trained on a larger dataset improves both detection performance and robustness to image degradation. Our experimental results show that the proposed method accurately detects both face swapping and face reenactment and is robust to various forms of unseen image degradation. Our source code is publicly available at https://github.com/TaikiMiyagawa/DeepfakeDetection4eKYC.
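The two detection cues above can be sketched in a few lines. This is a minimal illustration, not the paper's exact formulation: it assumes frame-wise identity embeddings have already been extracted by a pre-trained face recognition model, and the way the two cues are combined into a single score (a plain sum) is a hypothetical choice for clarity.

```python
import numpy as np


def cosine(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two identity embeddings."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def deepfake_score(frame_embeddings: list, reference_embedding: np.ndarray) -> float:
    """Toy video-level suspicion score (higher = more likely fake).

    Combines the paper's two cues (illustrative combination, not the
    authors' exact model):
      1. temporal inconsistency: how much the identity drifts between
         consecutive frames;
      2. identity deviation: how far the video's identity is from the
         registered reference image (assumed genuine).
    """
    # Cue 1: worst-case identity drift between consecutive frames.
    consecutive = [cosine(a, b) for a, b in zip(frame_embeddings, frame_embeddings[1:])]
    temporal_inconsistency = 1.0 - min(consecutive)

    # Cue 2: average distance of all frames from the trusted reference.
    to_reference = [cosine(e, reference_embedding) for e in frame_embeddings]
    identity_deviation = 1.0 - float(np.mean(to_reference))

    return temporal_inconsistency + identity_deviation
```

A genuine video (all frames carrying the registered identity) yields a score near zero, while a swapped or reenacted segment raises either the temporal term, the reference term, or both; a real detector would learn this decision from the embedding sequence rather than use a fixed sum.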