Heterogeneous Graph Neural Networks for Assumption-Based Argumentation

📅 2025-11-12
📈 Citations: 0
Influential: 0
🤖 AI Summary
Computing stable extensions in assumption-based argumentation (ABA) frameworks is computationally intractable. Method: This paper introduces heterogeneous graph neural networks (HGNNs) for ABA reasoning, proposing ABAGCN and ABAGAT architectures that explicitly model the support, derivation, and attack relations among assumptions, rules, and conclusions. Leveraging residual heterogeneous convolutions and multi-head attention, the models are trained on the ICCMA 2023 benchmark and synthetic datasets, with hyperparameters tuned via Bayesian optimization. A polynomial-time algorithm then reconstructs extensions from the model's predictions. Contribution/Results: The approach achieves a node-level F1 score of 0.71, significantly outperforming existing GNN baselines. Extension reconstruction attains F1 > 0.85 on small ABA frameworks and maintains roughly 0.58 on large ones, balancing efficiency and accuracy. This work advances scalable, approximate reasoning for structured argumentation.

📝 Abstract
Assumption-Based Argumentation (ABA) is a powerful structured argumentation formalism, but exact computation of extensions under stable semantics is intractable for large frameworks. We present the first Graph Neural Network (GNN) approach to approximate credulous acceptance in ABA. To leverage GNNs, we model ABA frameworks via a dependency graph representation encoding assumptions, claims and rules as nodes, with heterogeneous edge labels distinguishing support, derive and attack relations. We propose two GNN architectures, ABAGCN and ABAGAT, that stack residual heterogeneous convolution or attention layers, respectively, to learn node embeddings. Our models are trained on the ICCMA 2023 benchmark, augmented with synthetic ABAFs, with hyperparameters optimised via Bayesian search. Empirically, both ABAGCN and ABAGAT outperform a state-of-the-art GNN baseline that we adapt from the abstract argumentation literature, achieving a node-level F1 score of up to 0.71 on the ICCMA instances. Finally, we develop a sound polynomial-time extension-reconstruction algorithm driven by our predictor: it reconstructs stable extensions with F1 above 0.85 on small ABAFs and maintains an F1 of about 0.58 on large frameworks. Our work opens new avenues for scalable approximate reasoning in structured argumentation.
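The abstract's dependency-graph encoding (assumptions, claims and rules as nodes; support, derive and attack edges) can be sketched in plain Python. The class and function names below are illustrative assumptions, not the paper's actual code:

```python
from dataclasses import dataclass

@dataclass
class ABAFramework:
    """Toy ABA framework (hypothetical structure, not the paper's format).

    rules are (head, body) pairs meaning head <- body; contrary maps each
    assumption to its contrary atom.
    """
    assumptions: set
    rules: list        # list of (head, [body atoms])
    contrary: dict     # assumption -> contrary atom

def dependency_graph(f: ABAFramework):
    """Build typed edge lists over assumption, rule, and claim nodes,
    mirroring the support / derive / attack relations described above."""
    edges = {"support": [], "derive": [], "attack": []}
    for i, (head, body) in enumerate(f.rules):
        rule_node = f"r{i}"
        edges["derive"].append((rule_node, head))       # a rule derives its head claim
        for atom in body:
            edges["support"].append((atom, rule_node))  # body atoms support the rule
    for a, c in f.contrary.items():
        edges["attack"].append((c, a))                  # deriving a contrary attacks the assumption
    return edges

# Example: assumptions {a, b}; one rule p <- a; p is the contrary of b.
f = ABAFramework({"a", "b"}, [("p", ["a"])], {"b": "p"})
g = dependency_graph(f)
# g["support"] == [("a", "r0")], g["derive"] == [("r0", "p")], g["attack"] == [("p", "b")]
```

In a GNN pipeline, each typed edge list would feed a relation-specific message-passing channel, which is what distinguishes the heterogeneous architectures from a homogeneous baseline.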
Problem

Research questions and friction points this paper is trying to address.

Approximating credulous acceptance in Assumption-Based Argumentation
Modeling ABA frameworks via heterogeneous dependency graph representation
Developing scalable approximate reasoning for large argumentation frameworks
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses heterogeneous graph neural networks for ABA
Proposes ABAGCN and ABAGAT with residual layers
Develops polynomial time extension-reconstruction algorithm
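A residual heterogeneous convolution of the kind the innovations above describe might look like the following NumPy sketch. Relation-specific weight matrices, mean aggregation, and the ReLU nonlinearity are assumptions for illustration; the paper's exact layer may differ:

```python
import numpy as np

def hetero_conv_layer(h, edges, weights):
    """One residual heterogeneous convolution step (illustrative sketch).

    h:       dict node_id -> feature vector, all of dimension d
    edges:   dict relation name -> list of (src, dst) pairs
    weights: dict relation name -> (d, d) weight matrix, one per edge type
    Each node mean-aggregates relation-transformed messages from its
    in-neighbours; a residual connection adds its own features back.
    """
    d = len(next(iter(h.values())))
    agg = {n: np.zeros(d) for n in h}
    deg = {n: 0 for n in h}
    for rel, pairs in edges.items():
        W = weights[rel]                       # separate parameters per relation
        for src, dst in pairs:
            agg[dst] += W @ h[src]             # message transformed by relation type
            deg[dst] += 1
    out = {}
    for n, x in h.items():
        msg = agg[n] / deg[n] if deg[n] else np.zeros(d)
        out[n] = np.maximum(0.0, x + msg)      # residual connection + ReLU
    return out
```

Stacking several such layers and reading out per-node acceptance scores is the general pattern an ABAGCN-style model would follow; an attention variant would replace the per-relation mean with learned attention weights.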