Demographic Variability in Face Image Quality Measures

📅 2024-09-25
🏛️ Biometrics and Electronic Signatures
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This study systematically evaluates the demographic fairness of all face image quality assessment (FIQA) measures specified in ISO/IEC 29794-5 across age, gender, and skin tone, addressing potential demographic bias in online identity verification. Using a large-scale annotated face dataset, standardized quality measures, stratified group-wise comparisons, and multiple statistical tests, it presents the first empirical, multi-dimensional cross-group analysis of the FIQA measures mandated by an international standard. Most measures show no statistically significant disparities across demographic groups; only two exhibit robust, observable differences, specifically with respect to skin tone. These findings provide empirical evidence to inform revisions of the FIQA standard, guide bias mitigation strategies, and improve fairness in biometric systems, filling a key gap in the fairness evaluation of international biometric standards.

📝 Abstract
Face image quality assessment (FIQA) algorithms are being integrated into online identity management applications. These applications allow users to upload a face image as part of a document issuance process; the image is then run through a quality assessment step to ensure it meets quality and compliance requirements. Concerns about demographic bias have long been raised regarding biometric systems, given the societal issues such bias may cause. It is therefore important that any demographic variability in FIQA algorithms is assessed so that it can be mitigated. In this work, we study demographic variability in all face image quality measures included in the ISO/IEC 29794-5 international standard across three demographic variables: age, gender, and skin tone. The results are promising and show no clear bias toward any specific demographic group for most measures. Only two quality measures show considerable variation in their outcomes across groups on the skin tone variable.
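The group-wise comparison described above can be illustrated with a minimal, hypothetical sketch: pool the quality scores of one measure from two demographic groups and run a permutation test on the difference of mean scores. All names and data below are illustrative assumptions, not the paper's actual protocol, which evaluates the ISO/IEC 29794-5 measures with multiple statistical tests.

```python
import random

def mean_gap(scores_a, scores_b):
    """Absolute difference of mean quality scores between two groups."""
    return abs(sum(scores_a) / len(scores_a) - sum(scores_b) / len(scores_b))

def permutation_p_value(scores_a, scores_b, n_perm=2000, seed=0):
    """Two-group permutation test on the gap in mean quality scores.

    Shuffles group labels n_perm times and counts how often a random
    split produces a gap at least as large as the observed one.
    """
    rng = random.Random(seed)
    observed = mean_gap(scores_a, scores_b)
    pooled = list(scores_a) + list(scores_b)
    n_a = len(scores_a)
    extreme = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        if mean_gap(pooled[:n_a], pooled[n_a:]) >= observed:
            extreme += 1
    # Add-one smoothing so the p-value is never exactly zero.
    return (extreme + 1) / (n_perm + 1)
```

A small p-value for some measure and demographic split would flag it for closer inspection, in the spirit of the two skin-tone-sensitive measures reported here; a nonsignificant result is consistent with the paper's finding for most measures.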
Problem

Research questions and friction points this paper is trying to address.

FIQA Algorithm
Bias Evaluation
Diverse Populations
Innovation

Methods, ideas, or system contributions that make the work stand out.

FIQA Algorithm
Fairness Evaluation
Biometric Systems