AI Summary
Shapley value estimation suffers from exponential computational complexity ($O(2^n)$), and Kernel SHAP, the most popular approximation, lacks non-asymptotic theoretical guarantees. To address these limitations, this paper proposes Leverage SHAP, the first lightweight Shapley value approximation algorithm to incorporate leverage score sampling. The method establishes a theoretical connection to agnostic active learning and constructs a weighted least-squares estimator, achieving provable non-asymptotic error bounds with only $O(n \log n)$ model evaluations. Unlike Kernel SHAP, Leverage SHAP provides rigorous finite-sample guarantees while substantially reducing computational overhead. Empirical evaluations show that Leverage SHAP outperforms even the highly optimized baseline implementation in the official SHAP library, in both accuracy and efficiency.
Abstract
Originally introduced in game theory, Shapley values have emerged as a central tool in explainable machine learning, where they are used to attribute model predictions to specific input features. However, computing Shapley values exactly is expensive: for a general model with $n$ features, $O(2^n)$ model evaluations are necessary. To address this issue, approximation algorithms are widely used. One of the most popular is the Kernel SHAP algorithm, which is model-agnostic and remarkably effective in practice. However, to the best of our knowledge, Kernel SHAP has no strong non-asymptotic complexity guarantees. We address this issue by introducing Leverage SHAP, a lightweight modification of Kernel SHAP that provides provably accurate Shapley value estimates with just $O(n \log n)$ model evaluations. Our approach takes advantage of a connection between Shapley value estimation and agnostic active learning by employing leverage score sampling, a powerful regression tool. Beyond theoretical guarantees, we show that Leverage SHAP consistently outperforms even the highly optimized implementation of Kernel SHAP available in the ubiquitous SHAP library [Lundberg & Lee, 2017].
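To make the core regression tool concrete, the sketch below illustrates generic leverage score sampling for an overdetermined least-squares problem: rows are sampled with probability proportional to their leverage scores and reweighted so the subsampled objective is an unbiased estimate of the full one. This is a minimal, self-contained illustration of the sampling technique, not the paper's Leverage SHAP algorithm; all matrix sizes and variable names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic overdetermined system A x ~= b (sizes are illustrative).
m, d = 2000, 5
A = rng.standard_normal((m, d))
x_true = rng.standard_normal(d)
b = A @ x_true + 0.01 * rng.standard_normal(m)

# Leverage score of row i = squared norm of row i of Q, where A = QR.
# The scores sum to d and measure how influential each row is for the fit.
Q, _ = np.linalg.qr(A)
leverage = np.sum(Q**2, axis=1)
probs = leverage / leverage.sum()

# Sample s rows with replacement and rescale each sampled row by
# 1/sqrt(s * p_i), which makes the subsampled least-squares objective
# an unbiased estimator of the full objective.
s = 200
idx = rng.choice(m, size=s, p=probs)
scale = 1.0 / np.sqrt(s * probs[idx])
A_s = A[idx] * scale[:, None]
b_s = b[idx] * scale

# Solve the much smaller reweighted problem.
x_hat, *_ = np.linalg.lstsq(A_s, b_s, rcond=None)
print(np.linalg.norm(x_hat - x_true))
```

With only $s = 200$ of the $m = 2000$ rows, the subsampled solution is close to the true coefficients; Leverage SHAP applies this idea to the constrained weighted regression that defines Shapley values, where the leverage scores can be computed in closed form rather than via a QR factorization.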