AI Summary
This work addresses the lack of a precise definition of "personalization" in existing algorithmic recourse methods, a gap that hinders systematic evaluation of its impact on validity, cost, and plausibility. The paper formalizes personalization as individual actionability, incorporating hard constraints (restricting the set of features an individual can act on) and soft constraints (modeling users' preferences over the values and costs of recommended actions) within a causal recourse framework. It further introduces a pre-hoc user-prompting mechanism, in which users express preferences before any recommendation is generated, to enable personalized recourse. Experiments show that hard constraints can substantially degrade both the validity and plausibility of recourse recommendations. Moreover, incorporating individual actionability reveals disparities in recourse cost and plausibility across socio-demographic groups, exposing a trade-off between personalization and fairness.
Abstract
Algorithmic recourse aims to provide actionable recommendations that enable individuals to change unfavorable model outcomes, and prior work has extensively studied properties such as efficiency, robustness, and fairness. However, the role of personalization in recourse remains largely implicit and underexplored. While existing approaches incorporate elements of personalization through user interactions, they typically lack an explicit definition of personalization and do not systematically analyze its downstream effects on other recourse desiderata.
In this paper, we formalize personalization as individual actionability, characterized along two dimensions: hard constraints that specify which features are individually actionable, and soft, individualized constraints that capture preferences over action values and costs. We operationalize these dimensions within the causal algorithmic recourse framework, adopting a pre-hoc user-prompting approach in which individuals express preferences via rankings or scores prior to the generation of any recourse recommendation. Through extensive empirical evaluation, we investigate how personalization interacts with key recourse desiderata, including validity, cost, and plausibility. Our results highlight important trade-offs: individual actionability constraints, particularly hard ones, can substantially degrade the plausibility and validity of recourse recommendations across amortized and non-amortized approaches. Notably, we also find that incorporating individual actionability can reveal disparities in the cost and plausibility of recourse actions across socio-demographic groups. These findings underscore the need for principled definitions, careful operationalization, and rigorous evaluation of personalization in algorithmic recourse.
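To make the two dimensions of individual actionability concrete, the following is a minimal, hypothetical sketch (not the paper's implementation): hard constraints are modeled as a set of features the individual cannot act on, soft constraints as per-feature effort weights elicited pre-hoc from user rankings or scores. All names (`UserPreferences`, `personalized_cost`, `best_recourse`) are illustrative assumptions.

```python
# Hypothetical sketch of personalization as individual actionability
# in recourse selection. Not the paper's actual method.
from dataclasses import dataclass, field


@dataclass
class UserPreferences:
    # Hard constraints: features this individual cannot act on at all.
    immutable: set = field(default_factory=set)
    # Soft constraints: per-feature effort weights elicited pre-hoc
    # (e.g. from user rankings or scores); higher means costlier to change.
    effort: dict = field(default_factory=dict)


def personalized_cost(action, prefs):
    """Cost of a recourse action {feature: delta} under individual actionability.

    Returns None if the action touches a feature forbidden by a hard constraint.
    """
    if any(f in prefs.immutable for f in action):
        return None  # not individually actionable
    return sum(prefs.effort.get(f, 1.0) * abs(d) for f, d in action.items())


def best_recourse(candidate_actions, prefs):
    """Pick the cheapest candidate action that respects the user's constraints."""
    scored = [(a, personalized_cost(a, prefs)) for a in candidate_actions]
    feasible = [(a, c) for a, c in scored if c is not None]
    return min(feasible, key=lambda ac: ac[1], default=(None, None))
```

In this toy setup, hard constraints shrink the feasible action set (which is why they can hurt validity and plausibility), while soft constraints only reweight costs, steering the search toward actions the user finds easier without ruling any out.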