🤖 AI Summary
Probabilistic bisimilarity in labelled Markov chains is not robust against small perturbations of the transition probabilities: the induced distance function can be discontinuous, which undermines its reliability in practice, especially for models whose probabilities are estimated from experimental data.
Method: We introduce a notion of robust probabilistic bisimilarity that guarantees continuity of the induced distance function. Building on this notion, we design an efficient algorithm for computing it that is both theoretically grounded and practically applicable, combining fixed-point theory, rational function optimization, and numerical stability analysis.
Contribution/Results: We prove the algorithm's correctness and convergence. Experiments show substantially improved stability and faster convergence of the distance computation, mitigating the discontinuities caused by perturbed transition probabilities.
📝 Abstract
Despite its prevalence, probabilistic bisimilarity suffers from a lack of robustness under minuscule perturbations of the transition probabilities. This can lead to discontinuities in the probabilistic bisimilarity distance function, undermining its reliability in practical applications where transition probabilities are often approximations derived from experimental data. Motivated by this limitation, we introduce the notion of robust probabilistic bisimilarity for labelled Markov chains, which ensures the continuity of the probabilistic bisimilarity distance function. We also propose an efficient algorithm for computing robust probabilistic bisimilarity and show that it performs well in practice, as evidenced by our experimental results.
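The discontinuity the abstract refers to can be seen on a classic three-state example (this is an illustrative sketch, not the paper's construction or algorithm): state `s` has label `a` and a self-loop with probability 1; state `t` also has label `a` but moves to an absorbing state `bad` with a different label with probability `eps`, and self-loops otherwise. At `eps = 0` the two states are bisimilar, so their distance is 0; for any `eps > 0` the undiscounted bisimilarity distance jumps to 1. Because `s`'s successor distribution is a Dirac, the optimal Kantorovich coupling in the distance's fixed-point characterization is forced, and the fixed point can be computed by simple iteration:

```python
# Minimal sketch of the discontinuity of the undiscounted probabilistic
# bisimilarity distance (state names and setup are illustrative assumptions).
# Chain: s --1--> s (label 'a');
#        t --(1-eps)--> t, t --eps--> bad (label 'a');
#        bad --1--> bad (label 'b', so delta(s, bad) = 1).

def bisim_distance_s_t(eps: float, iters: int = 10_000) -> float:
    """Value-iterate the distance fixed point for this specific chain.

    Since s's successor distribution is Dirac on s, the Kantorovich
    coupling pairs s with each successor of t, giving the update
    delta <- (1 - eps) * delta(s, t) + eps * delta(s, bad), with
    delta(s, bad) = 1 because the labels differ.
    """
    delta = 0.0  # iterate up from 0 toward the least fixed point
    for _ in range(iters):
        delta = (1 - eps) * delta + eps * 1.0
    return delta

print(bisim_distance_s_t(0.0))   # → 0.0  (s and t are bisimilar)
print(bisim_distance_s_t(0.01))  # → 1.0  (any perturbation snaps the distance to 1)
```

The jump from 0 to 1 under an arbitrarily small `eps` is exactly the kind of instability that a robust notion of bisimilarity, as proposed in the paper, is designed to rule out.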