🤖 AI Summary
To address the challenges of sparse connectivity, high latency, frequent disconnections, and partially observable node states in Delay-Tolerant Networks (DTNs), this paper proposes an adaptive routing method based on Partially Observable Markov Decision Processes (POMDPs). The method introduces a novel dependency-aware node failure model, leveraging Continuous-Time Markov Chains (CTMCs) to capture inter-node failure correlations and repairability, thereby relaxing the conventional fully observable Markov Decision Process (MDP) assumption. Implemented within the JuliaPOMDP framework and coupled with the DtnSim simulator, the approach enables efficient online decision-making under uncertainty. Experimental results demonstrate significant reductions in end-to-end delay and improvements in message delivery ratio, alongside strong scalability. This work establishes a new paradigm for robust, adaptive routing in uncertain DTN environments.
📝 Abstract
Routing in Delay-Tolerant Networks (DTNs) is inherently challenging due to sparse connectivity, long delays, and frequent disruptions. While Markov Decision Processes (MDPs) have been used to model uncertainty, they assume full state observability, an assumption that breaks down in partitioned DTNs, where each node operates with inherently partial knowledge of the network state. In this work, we investigate the role of Partially Observable Markov Decision Processes (POMDPs) for DTN routing under uncertainty. We introduce and evaluate a novel model: Dependent Node Failures (DNF), which captures correlated node failures via repairable node states modeled as Continuous-Time Markov Chains (CTMCs). We implement the model using JuliaPOMDP and integrate it with DTN simulations via DtnSim. Our evaluation demonstrates that POMDP-based routing yields improved delivery ratios and delay performance under uncertain conditions while maintaining scalability. These results highlight the potential of POMDPs as a principled foundation for decision-making in future DTN deployments.