🤖 AI Summary
This work proposes ReasonX, a declarative explanation framework grounded in a closed algebra of operators over theories of linear constraints, addressing key limitations of existing explainable AI (XAI) methods: insufficient abstraction, weak interactivity, and difficulty integrating symbolic knowledge. ReasonX unifies a declarative query algebra with symbolic reasoning, letting users inject background or common-sense knowledge as linear constraints to obtain interactive explanations of decision tree models at multiple levels of abstraction. The framework employs mixed-integer linear programming (MILP) to reason jointly over factual and contrastive (counterfactual) instances, and implements a meta-interpreter for the query algebra through a Python frontend coupled with a constraint logic programming (CLP) backend. The paper demonstrates ReasonX's explanatory capabilities through qualitative case studies and compares it quantitatively against other state-of-the-art XAI tools.
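To make the MILP idea concrete, here is a minimal sketch (not ReasonX's actual API): each root-to-leaf path of a decision tree is a conjunction of linear constraints (here, axis-aligned bounds), and a contrastive instance is the closest point inside a target-class leaf that also satisfies user background knowledge (here, "age is immutable"). For axis-aligned boxes the L1-closest point is found by clipping coordinate-wise; ReasonX itself delegates the general case to a MILP solver. The toy tree, feature names, and data below are made up for illustration.

```python
# Toy sketch of contrastive reasoning over a decision tree encoded as
# linear constraints. Hypothetical example, not the ReasonX API.

EPS = 1e-6          # approximates strict inequalities like income > 50
INF = float("inf")

# 'approve' leaves of a toy tree over features (income, age),
# each a dict of per-feature (lower, upper) bounds.
approve_leaves = [
    {"income": (50 + EPS, INF), "age": (-INF, INF)},
    {"income": (30 + EPS, 50), "age": (40 + EPS, INF)},
]

def clip(x, lo, hi):
    return min(max(x, lo), hi)

def closest_contrastive(instance, immutable=("age",)):
    """L1-closest point in an 'approve' leaf, holding immutable
    features fixed (a simple piece of background knowledge)."""
    best, best_cost = None, INF
    for box in approve_leaves:
        # Background constraint: immutable features cannot be moved,
        # so the leaf must already admit their current values.
        if any(not box[f][0] <= instance[f] <= box[f][1]
               for f in immutable):
            continue
        cand = {f: v if f in immutable else clip(v, *box[f])
                for f, v in instance.items()}
        cost = sum(abs(cand[f] - v) for f, v in instance.items())
        if cost < best_cost:
            best, best_cost = cand, cost
    return best, best_cost

factual = {"income": 45.0, "age": 35.0}   # a denied applicant
cf, cost = closest_contrastive(factual)
# cf raises income just above 50; age stays fixed at 35
```

The second leaf (income in (30, 50] and age > 40) is ruled out here by the immutability constraint on age, so the minimal change is on income alone; this mirrors how injected background knowledge prunes the feasible region of the MILP.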
📝 Abstract
Explaining opaque Machine Learning (ML) models has become an increasingly important challenge. However, current eXplainable AI (XAI) methods suffer from several shortcomings, including insufficient abstraction, limited user interactivity, and inadequate integration of symbolic knowledge. We propose ReasonX, an explanation tool based on expressions (or queries) in a closed algebra of operators over theories of linear constraints. ReasonX provides declarative and interactive explanations for decision trees, which may represent the ML models under analysis or serve as global or local surrogate models for any black-box predictor. Users can express background or common-sense knowledge as linear constraints. This allows for reasoning at multiple levels of abstraction, ranging from fully specified examples to under-specified or partially constrained ones. ReasonX leverages Mixed-Integer Linear Programming (MILP) to reason over the features of factual and contrastive instances. We present here the architecture of ReasonX, which consists of a Python layer, closer to the user, and a Constraint Logic Programming (CLP) layer, which implements a meta-interpreter of the query algebra. The capabilities of ReasonX are demonstrated through qualitative examples, and compared to other XAI tools through quantitative experiments.
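The abstract's point about under-specified instances can be sketched as a feasibility question: with some features left unconstrained, which predictions remain possible? In the toy below, each leaf of a hypothetical tree is a box of linear bounds, a partial instance is itself a box, and a prediction is reachable iff the two boxes intersect. (ReasonX answers such queries via MILP feasibility checks; the tree and intervals here are illustrative assumptions, not the tool's API.)

```python
# Toy sketch of reasoning over an under-specified instance:
# which classes can any completion of a partial instance receive?

INF = float("inf")

# Leaves of a toy tree over (income, age): (bounds, predicted class).
leaves = [
    ({"income": (50, INF), "age": (-INF, INF)}, "approve"),
    ({"income": (30, 50), "age": (40, INF)}, "approve"),
    ({"income": (30, 50), "age": (-INF, 40)}, "deny"),
    ({"income": (-INF, 30), "age": (-INF, INF)}, "deny"),
]

def intersects(a, b):
    # Two boxes overlap iff their intervals overlap on every feature.
    return all(max(a[f][0], b[f][0]) < min(a[f][1], b[f][1])
               for f in a)

def possible_classes(partial):
    """Classes reachable by some completion of a partial instance."""
    return {cls for box, cls in leaves if intersects(box, partial)}

# Income known to lie in (35, 45), age entirely unconstrained:
# both outcomes remain possible, so the instance is under-determined.
print(possible_classes({"income": (35, 45), "age": (-INF, INF)}))
```

Tightening the partial instance (say, constraining age as well) shrinks the set of reachable classes, which is exactly the kind of interactive, multi-granular query the abstract describes.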