Human, Algorithm, or Both? Gender Bias in Human-Augmented Recruiting

📅 2026-03-06
📈 Citations: 0
Influential: 0
🤖 AI Summary
This study addresses the concern that AI-driven hiring may exacerbate gender bias and investigates the empirically underexplored impact of human-AI collaboration on fairness. Leveraging real-world data from a hiring platform, it systematically compares gender equity outcomes across three decision-making paradigms: fully human, fully AI, and human-AI collaborative. By quantitatively analyzing gender distributions among candidates and disparities across key engagement stages—browsing, clicking, and contacting—the research reveals that human-AI collaboration yields the most gender-balanced candidate pools. Moreover, when human recruiters search proactively after engaging with AI recommendations, gender fairness in subsequent interactions improves significantly. These findings demonstrate that collaborative decision-making can surpass either standalone approach, leading to more equitable hiring outcomes.

📝 Abstract
Recent years have seen rapid growth in the market for HR technology, and for AI-driven HR solutions in particular. This popularity has also drawn increased attention to the negative aspects of using AI to support hiring practices, such as the risk of reinforcing existing biases against vulnerable groups based on gender or other sensitive attributes. Combining human experience with AI efficiency in making recruiting and selection decisions has the potential to help mitigate these biases, but despite a considerable amount of research on fairness in algorithmic hiring, actual empirical evaluations comparing the fairness of human, AI, and human-augmented decision-making remain scarce. In this study, we address this gap by presenting a quantitative analysis of gender bias across three scenarios of a real-world recruitment platform: (1) recruiters searching a CV database manually for relevant candidates, (2) AI-driven matching between candidates and jobs, and (3) a combination of human and AI-driven recruiting. We find that human recruiters produce lists of candidates that are fairer in terms of gender than the AI-only solution, with more deliberation by humans resulting in fairer outcomes. However, the combination of human and AI-driven recruiting is more than the sum of its parts and produces the fairest candidate lists: interacting with the slate of recommended candidates first before manually searching for additional candidates has a beneficial effect on the gender fairness of the set of candidates that are viewed, clicked, and contacted afterwards. Our work provides one of the first empirical comparisons of fairness across human, AI, and hybrid recruiting processes, offering evidence to inform the development of more equitable hiring practices and highlighting the importance of human oversight for mitigating bias in algorithmic hiring.
Problem

Research questions and friction points this paper addresses.

gender bias
algorithmic hiring
human-AI collaboration
recruiting fairness
HR technology
Innovation

Methods, ideas, or system contributions that make the work stand out.

human-AI collaboration
gender bias
algorithmic fairness
recruiting
hybrid decision-making