A Profit-Based Measure of Lending Discrimination

📅 2025-12-23
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This paper proposes a profit-based fairness metric for auditing loan pricing algorithms: if loans to one demographic group are systematically less profitable, that group has effectively received favorable lending decisions. Method: Using approximately 80,000 personal loan records from a major U.S. fintech platform, the study combines financial econometrics, credit risk modeling, and fairness auditing, applying empirical regression analysis and counterfactual calibration to compare underwriting models that include or exclude protected attributes (e.g., gender, race). Contribution/Results: Loans to men and Black borrowers yielded lower profits than loans to other groups. These disparities trace to miscalibration in the platform's underwriting model, which overestimates default risk for women and underestimates it for Black borrowers, a side effect of conventional "fairness through unawareness." Explicitly including race and gender in the underwriting model would correct the miscalibration and the corresponding profit disparities. The work uses realized commercial profit, rather than predictive accuracy or statistical parity alone, as the benchmark for fairness evaluation, illustrating a tension between competing notions of fairness.

📝 Abstract
Algorithmic lending has transformed the consumer credit landscape, with complex machine learning models now commonly used to make or assist underwriting decisions. To comply with fair lending laws, these algorithms typically exclude legally protected characteristics, such as race and gender. Yet algorithmic underwriting can still inadvertently favor certain groups, prompting new questions about how to audit lending algorithms for potentially discriminatory behavior. Building on prior theoretical work, we introduce a profit-based measure of lending discrimination in loan pricing. Applying our approach to approximately 80,000 personal loans from a major U.S. fintech platform, we find that loans made to men and Black borrowers yielded lower profits than loans to other groups, indicating that men and Black applicants benefited from relatively favorable lending decisions. We trace these disparities to miscalibration in the platform's underwriting model, which underestimates credit risk for Black borrowers and overestimates risk for women. We show that one could correct this miscalibration -- and the corresponding lending disparities -- by explicitly including race and gender in underwriting models, illustrating a tension between competing notions of fairness.
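The abstract's core mechanism, miscalibration under "fairness through unawareness," can be sketched numerically. Below is a minimal illustration on synthetic data (not the paper's dataset or model, and the group rates are invented): a group-blind model can only predict the pooled default rate, so it over-predicts risk for the lower-risk group and under-predicts it for the higher-risk group, while a group-aware model is calibrated within each group.

```python
import numpy as np

def calibration_by_group(pred, default, group):
    """Per-group calibration gap: mean predicted risk minus realized default rate."""
    return {g: pred[group == g].mean() - default[group == g].mean()
            for g in np.unique(group)}

# Synthetic illustration: group 1 defaults at 15%, group 0 at 5%.
rng = np.random.default_rng(1)
n = 20_000
group = rng.integers(0, 2, n)
default = rng.random(n) < np.where(group == 0, 0.05, 0.15)

blind_pred = np.full(n, default.mean())          # ignores group: one pooled rate
aware_pred = np.where(group == 0, 0.05, 0.15)    # uses group membership

blind_gap = calibration_by_group(blind_pred, default, group)
aware_gap = calibration_by_group(aware_pred, default, group)
# blind_gap: positive for group 0 (risk overestimated),
#            negative for group 1 (risk underestimated)
# aware_gap: near zero for both groups
```

This mirrors the direction of the paper's finding: a model blinded to a protected attribute misprices both groups, in opposite directions.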
Problem

Research questions and friction points this paper is trying to address.

How can lending algorithms that exclude protected attributes be audited for discriminatory pricing?
Do realized loan profits differ systematically across demographic groups on a fintech platform?
Does correcting such disparities require including protected attributes, putting competing fairness notions in tension?
Innovation

Methods, ideas, or system contributions that make the work stand out.

Profit-based measure of lending discrimination, benchmarked on realized loan profits rather than predictive accuracy or statistical parity
Audit method that traces profit disparities to miscalibration in the underwriting model's risk estimates
Correction of disparities by explicitly including race and gender in underwriting models
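The profit-based measure above can be sketched as a simple group comparison. This is a hedged illustration on synthetic data (group labels and profit figures are hypothetical, not from the paper): under a profit-based notion of fairness, loans to different groups should be equally profitable on average, so a large gap flags relatively favorable pricing for the low-profit group.

```python
import numpy as np

def profit_disparity(profits, groups):
    """Mean realized profit per group, and the max-min gap across groups."""
    profits = np.asarray(profits, dtype=float)
    groups = np.asarray(groups)
    means = {g: profits[groups == g].mean() for g in np.unique(groups)}
    gap = max(means.values()) - min(means.values())
    return means, gap

# Synthetic example: group "A" was priced generously, so its loans
# earn less profit on average than group "B"'s.
rng = np.random.default_rng(0)
profits = np.concatenate([rng.normal(50, 10, 1000),   # group A
                          rng.normal(80, 10, 1000)])  # group B
groups = np.array(["A"] * 1000 + ["B"] * 1000)
means, gap = profit_disparity(profits, groups)
# means["A"] < means["B"]: group A received relatively favorable decisions
```

In the paper's framing, a nonzero gap like this (after appropriate statistical controls) is the evidence of discrimination, with the lower-profit group the one that benefited.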