Beyond Explanation: Evidentiary Rights for Algorithmic Accountability

📅 2026-03-23
📈 Citations: 0
✨ Influential: 0
🤖 AI Summary
This study addresses the limitations of current algorithmic accountability mechanisms, which overemphasize explanations while lacking enforceable rights to evidence, thereby failing to support substantive disputes. Reconceptualizing algorithmic accountability not as a transparency issue but as a matter of procedural rights, this work introduces a novel five-category typology of "dispute failures" and proposes a new paradigm centered on the right to evidence access and the right to counterfactual interrogation. By integrating legal case analysis, counterfactual input probing, and a procedural rights framework, the approach enables effective scrutiny of algorithmic behavior without requiring disclosure of internal model details. An analysis of 168 litigation cases reveals that, absent liability exemptions, plaintiffs with evidence access rights prevail in 97% of cases, compared to only 9% for those without such rights.

๐Ÿ“ Abstract
Algorithmic accountability scholarship has focused heavily on explanation, helping affected parties understand why decisions were made. We argue this focus is insufficient. Explanation without evidentiary access does not enable meaningful contestation. A person told "your risk score was 0.73" understands the decision but cannot verify the score, test alternatives, or produce counter-evidence. We introduce a taxonomy of contestation failures, showing that most accountability interventions address only one failure mode (opacity) while leaving four others unaddressed. Drawing on analysis of 168 legal cases spanning algorithmic decision-making contexts, we find that contestation faces a two-gate structure: a procedural gate (evidentiary access) and a doctrinal gate (substantive liability rules). Among litigated cases, those without evidence access almost never succeed (9%); those with access succeed at rates approaching 97% in domains without liability shields. Where doctrinal immunities apply (e.g., Section 230), even full evidentiary scrutiny produces no liability. This association almost certainly reflects selection effects; our empirical contribution is diagnostic rather than causal. The data identify where contestation fails among observable cases, not whether providing access would change outcomes for currently-excluded cases. We propose evidentiary rights as the missing procedural component, and develop counterfactual interrogation rights that allow affected parties to probe decision systems with modified inputs and observe whether outcomes change, without requiring disclosure of model internals. This reframes algorithmic accountability from a transparency problem to a procedural rights problem.
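The counterfactual interrogation right described in the abstract can be sketched as a black-box query protocol: the affected party submits variants of their own input and observes whether the outcome changes, with no access to model internals. The sketch below is illustrative only; `risk_score` is a hypothetical stand-in for an opaque decision system, not anything from the paper.

```python
# Illustrative sketch of counterfactual interrogation: an affected party
# queries a decision system strictly as a black box, varying one input
# field at a time and observing whether the outcome changes.
# No weights, code, or internal details of the system are exposed.

def risk_score(applicant: dict) -> float:
    """Hypothetical opaque decision system (placeholder logic)."""
    score = 0.2
    if applicant["prior_defaults"] > 0:
        score += 0.4
    if applicant["income"] < 30_000:
        score += 0.2
    return round(score, 2)

def counterfactual_probe(system, applicant: dict, field: str, alternatives):
    """Return the baseline outcome and the outcome for each counterfactual
    value of a single input field, leaving all other fields fixed."""
    baseline = system(applicant)
    results = {}
    for value in alternatives:
        variant = {**applicant, field: value}  # modify only one field
        results[value] = system(variant)
    return baseline, results

applicant = {"prior_defaults": 1, "income": 25_000}
baseline, probed = counterfactual_probe(
    risk_score, applicant, "prior_defaults", [0, 1, 2]
)
print(baseline)   # 0.8
print(probed[0])  # 0.4 -> the outcome is sensitive to prior defaults
```

The point of the protocol is that the probe answers "would my outcome differ under modified inputs?" (here, clearing the default history drops the score), which supports contestation without requiring disclosure of the model itself.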
Problem

Research questions and friction points this paper is trying to address.

algorithmic accountability
evidentiary rights
contestation failures
procedural access
explanation
Innovation

Methods, ideas, or system contributions that make the work stand out.

evidentiary rights
algorithmic accountability
counterfactual interrogation
contestation
procedural rights