Towards explainable decision support using hybrid neural models for logistic terminal automation

📅 2025-09-09
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
To address the weak interpretability and insufficient causal reliability of deep learning models in logistics terminal automation—hindering critical decision-making—this paper proposes an endogenously interpretable hybrid neuro-symbolic dynamical systems framework. The framework integrates neural-symbolic reasoning with system dynamics modeling, unifying concept-level interpretability, mechanistic explanation, and causal machine learning. It preserves rigorous causal mechanisms while enhancing model expressiveness and automation capability. Evaluated on the AutoMoTIF intermodal terminal case study, the method achieves high-precision forecasting and transparent, auditable decision support, demonstrating its effectiveness and deployability in complex cyber-physical systems. This work establishes a novel modeling paradigm for intelligent logistics that simultaneously ensures predictive performance, trustworthiness, and full traceability.

Technology Category

Application Category

📝 Abstract
The integration of Deep Learning (DL) in System Dynamics (SD) modeling for transportation logistics offers significant advantages in scalability and predictive accuracy. However, these gains are often offset by the loss of explainability and causal reliability $-$ key requirements in critical decision-making systems. This paper presents a novel framework for interpretable-by-design neural system dynamics modeling that synergizes DL with techniques from Concept-Based Interpretability, Mechanistic Interpretability, and Causal Machine Learning. The proposed hybrid approach enables the construction of neural network models that operate on semantically meaningful and actionable variables, while retaining the causal grounding and transparency typical of traditional SD models. The framework is conceived to be applied to real-world case-studies from the EU-funded project AutoMoTIF, focusing on data-driven decision support, automation, and optimization of multimodal logistic terminals. We aim at showing how neuro-symbolic methods can bridge the gap between black-box predictive models and the need for critical decision support in complex dynamical environments within cyber-physical systems enabled by the industrial Internet-of-Things.
Problem

Research questions and friction points this paper is trying to address.

Bridging black-box predictive models with explainable decision support
Integrating deep learning into system dynamics for logistics automation
Enhancing causal reliability and transparency in neural network models
Innovation

Methods, ideas, or system contributions that make the work stand out.

Hybrid neural system dynamics modeling framework
Combines deep learning with interpretability techniques
Uses semantically meaningful and actionable variables
🔎 Similar Papers
No similar papers found.