Discretion in the Loop: Human Expertise in Algorithm-Assisted College Advising

📅 2025-05-19
📈 Citations: 0
Influential: 0
🤖 AI Summary
This study addresses the underexamined role of human expert intervention in algorithmic early-warning systems. Leveraging randomized controlled trial data from Georgia State University, we integrate causal graph modeling, heterogeneous treatment effect estimation, structured log analysis, and qualitative coding of advisor notes to establish, for the first time, a “human expertise–algorithm output–student outcome” causal framework and propose testable identification conditions for “expert-directed intervention.” Results indicate that approximately 67% of advisor interventions rely on contextual factors—such as financial hardship or family crises—not captured by the algorithm. Expert involvement significantly improves student retention and graduation rates while enhancing system fairness. We further identify multiple statistically significant intervention patterns. These findings introduce a novel evaluation paradigm and design principles for algorithm-augmented educational decision-making, advancing both theoretical understanding and practical implementation of human-in-the-loop AI systems in education.

📝 Abstract
In higher education, many institutions use algorithmic alerts to flag at-risk students and deliver advising at scale. While much research has focused on evaluating algorithmic predictions, relatively little is known about how discretionary interventions by human experts shape outcomes in algorithm-assisted settings. We study this question using rich quantitative and qualitative data from a randomized controlled trial of an algorithm-assisted advising program at Georgia State University. Taking a mixed-methods approach, we examine whether and how advisors use context unavailable to an algorithm to guide interventions and influence student success. We develop a causal graphical framework for human expertise in the interventional setting, extending prior work on discretion in purely predictive settings. We then test a necessary condition for discretionary expertise using structured advisor logs and student outcomes data, identifying several interventions that meet the criterion for statistical significance. Accordingly, we estimate that 2 out of 3 interventions taken by advisors in the treatment arm were plausibly "expertly targeted" to students using non-algorithmic context. Systematic qualitative analysis of advisor notes corroborates these findings, showing that advisors incorporate diverse forms of contextual information -- such as personal circumstances, financial issues, and student engagement -- into their decisions. Finally, we explore the broader implications of human discretion for long-term outcomes and equity, using heterogeneous treatment effect estimation. Our results offer theoretical and practical insight into the real-world effectiveness of algorithm-supported college advising, and underscore the importance of accounting for human expertise in the design, evaluation, and implementation of algorithmic decision systems.
Problem

Research questions and friction points this paper is trying to address.

How human experts shape outcomes in algorithm-assisted advising
Whether advisors use non-algorithmic context to guide interventions
The impact of human discretion on long-term outcomes and equity
Innovation

Methods, ideas, or system contributions that make the work stand out.

Mixed-methods approach combining quantitative and qualitative data
Causal graphical framework for human expertise
Heterogeneous treatment effect estimation for equity
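To make the heterogeneous-treatment-effect idea concrete, here is a minimal sketch of a T-learner on synthetic data: fit separate outcome models for treated and control units, then take the difference of their predictions as the estimated conditional average treatment effect (CATE). This is an illustration only, not the paper's actual estimator; the covariates (GPA, credits), the outcome, and the effect pattern are all hypothetical assumptions.

```python
# Hedged sketch of a T-learner for heterogeneous treatment effects.
# All data is synthetic; variable names (gpa, credits, treated) are
# hypothetical and not taken from the paper's dataset.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
n = 4000

# Hypothetical student covariates and a randomized advising treatment.
gpa = rng.uniform(1.0, 4.0, n)
credits = rng.integers(0, 120, n).astype(float)
treated = rng.integers(0, 2, n)

# Synthetic outcome: the advising treatment helps low-GPA students more
# (the true CATE shrinks as GPA rises), mimicking effect heterogeneity.
true_cate = np.clip(1.5 - 0.4 * gpa, 0.0, None)
outcome = (0.5 * gpa + 0.01 * credits + treated * true_cate
           + rng.normal(0.0, 0.5, n))

X = np.column_stack([gpa, credits])

# T-learner: one outcome model per arm; CATE = difference of predictions.
m1 = GradientBoostingRegressor().fit(X[treated == 1], outcome[treated == 1])
m0 = GradientBoostingRegressor().fit(X[treated == 0], outcome[treated == 0])
cate_hat = m1.predict(X) - m0.predict(X)

# Estimated effects should be larger for low-GPA than high-GPA students.
low = cate_hat[gpa < 2.0].mean()
high = cate_hat[gpa > 3.0].mean()
print(f"mean CATE (GPA < 2.0): {low:.2f}, (GPA > 3.0): {high:.2f}")
```

An equity analysis like the paper's would then compare estimated CATEs across student subgroups rather than across a single covariate as done here.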