Using Certifying Constraint Solvers for Generating Step-wise Explanations

📅 2025-11-13
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing methods for generating unsatisfiability explanations in constraint solving suffer from low efficiency, hindering practical deployment in explainable AI. Method: This paper proposes a framework of abstract proofs grounded in verifiable solver output, enabling efficient translation of formal proofs into step-wise, human-understandable explanations. It designs a lightweight abstraction mechanism to construct a logically tractable proof skeleton and introduces a serialization algorithm with semantic-aware trimming and equivalence-based simplification, substantially reducing explanation length and logical complexity. Contribution/Results: The approach achieves a 10×–100× speedup over state-of-the-art methods while preserving both explanation accuracy and intelligibility. It is the first work to systematically transform verifiable proofs into compact, interpretable inference chains, establishing a new paradigm for formal reasoning in explainable AI.

📝 Abstract
In the field of Explainable Constraint Solving, it is common to explain to a user why a problem is unsatisfiable. A recently proposed method for this is to compute a sequence of explanation steps. Such a step-wise explanation shows individual reasoning steps involving constraints from the original specification, that in the end explain a conflict. However, computing a step-wise explanation is computationally expensive, limiting the scope of problems for which it can be used. We investigate how we can use proofs generated by a constraint solver as a starting point for computing step-wise explanations, instead of computing them step-by-step. More specifically, we define a framework of abstract proofs, in which both proofs and step-wise explanations can be represented. We then propose several methods for converting a proof to a step-wise explanation sequence, with special attention to trimming and simplification techniques to keep the sequence and its individual steps small. Our results show our method significantly speeds up the generation of step-wise explanation sequences, while the resulting step-wise explanation has a quality similar to the current state-of-the-art.
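The core idea in the abstract, starting from a solver proof and trimming it down to the steps that the final conflict actually depends on, can be illustrated with a small sketch. This is not the paper's implementation; the `ProofStep` representation, labels, and `trim_proof` function are hypothetical, and the sketch only shows the backward-reachability trimming idea, not the abstraction or simplification techniques.

```python
# Illustrative sketch (not the paper's code): a proof is a list of steps,
# each deriving a new constraint from premises that are either original
# constraints or earlier derived constraints. Trimming keeps only the
# steps the final contradiction depends on; the surviving steps, in
# order, form a step-wise explanation sequence.
from dataclasses import dataclass

@dataclass(frozen=True)
class ProofStep:
    derived: str               # label of the constraint this step derives
    premises: tuple[str, ...]  # labels of the constraints it is derived from

def trim_proof(steps: list[ProofStep]) -> list[ProofStep]:
    """Keep only steps reachable backwards from the last step,
    which is assumed to derive the contradiction ('false')."""
    by_derived = {s.derived: s for s in steps}
    needed: set[str] = set()
    stack = [steps[-1].derived]
    while stack:
        label = stack.pop()
        if label in needed or label not in by_derived:
            continue  # original constraints have no deriving step
        needed.add(label)
        stack.extend(by_derived[label].premises)
    return [s for s in steps if s.derived in needed]

proof = [
    ProofStep("x>=2", ("c1", "c2")),
    ProofStep("y<=0", ("c3",)),        # side derivation, unused below
    ProofStep("x<=1", ("c4",)),
    ProofStep("false", ("x>=2", "x<=1")),
]
explanation = trim_proof(proof)
# The step deriving 'y<=0' is dropped; the remaining three steps
# form a shorter step-wise explanation of the conflict.
```

In this toy example the trimmed sequence reads as an explanation: derive `x>=2`, derive `x<=1`, conclude `false`. The paper's methods additionally keep each individual step small, which this sketch does not attempt.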
Problem

Research questions and friction points this paper is trying to address.

Generating step-wise explanations for unsatisfiable constraint problems efficiently
Converting solver proofs into simplified explanation sequences automatically
Reducing computational cost while maintaining explanation quality standards
Innovation

Methods, ideas, or system contributions that make the work stand out.

Using solver proofs for step-wise explanations
Converting proofs into explanation sequences
Trimming techniques to simplify explanation steps
Ignace Bleukx
KU Leuven, Department of Computer Science; Leuven.AI, Celestijnenlaan 200A, 3000 Leuven, Belgium
Maarten Flippo
Delft University of Technology, Mekelweg 5, 2628 CD Delft, The Netherlands
Bart Bogaerts
Research Professor, DTAI lab, Department of Computer Science, KU Leuven
Combinatorial Optimization, Proofs, Knowledge Representation, Logic
Emir Demirović
Assistant Professor of Computer Science, Delft University of Technology
combinatorial optimisation, optimal decision trees, constraint programming, MaxSAT, machine learning
Tias Guns
KU Leuven, Department of Computer Science; Leuven.AI, Celestijnenlaan 200A, 3000 Leuven, Belgium