FLOC 2018: FEDERATED LOGIC CONFERENCE 2018
Deciding Probabilistic Bisimilarity Distance One for Labelled Markov Chains

Authors: Qiyi Tang and Franck van Breugel

Paper Information

Title: Deciding Probabilistic Bisimilarity Distance One for Labelled Markov Chains
Authors: Qiyi Tang and Franck van Breugel
Proceedings: CAV All Papers
Editors: Georg Weissenbacher, Hana Chockler and Igor Konnov
Keywords: labelled Markov chain, probabilistic bisimilarity, probabilistic bisimilarity distance
Abstract:

ABSTRACT. Probabilistic bisimilarity, due to Larsen and Skou, is an equivalence relation that captures which states of a labelled Markov chain behave the same. Since this behavioural equivalence only identifies states that transition with exactly the same probability to states that behave exactly the same, it is not robust, as first observed by Giacalone, Jou and Smolka. The probabilistic bisimilarity distances, first defined by Desharnais, Gupta, Jagadeesan and Panangaden, provide a quantitative generalization of probabilistic bisimilarity. The distance between states captures how similar their behaviour is: the smaller the distance, the more alike the states behave. In particular, states are probabilistically bisimilar if and only if their distance is zero. This quantitative notion is robust in that small changes in the transition probabilities result in small changes in the distances.
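
For concreteness, the following is a minimal sketch of the standard fixed-point characterization of these distances due to Desharnais, Gupta, Jagadeesan and Panangaden; the notation (ℓ for the labelling, τ for the transition function, Ω(μ, ν) for the set of couplings of two distributions μ and ν) is an assumption of this sketch rather than taken from the paper.

```latex
% Sketch: the probabilistic bisimilarity distances are the least fixed point of \Delta,
% where \ell(s) is the label of state s, \tau(s) its transition distribution, and
% \Omega(\mu,\nu) is the set of couplings of \mu and \nu (the Kantorovich lifting).
\Delta(d)(s,t) =
  \begin{cases}
    1 & \text{if } \ell(s) \neq \ell(t) \\[4pt]
    \displaystyle\min_{\omega \in \Omega(\tau(s),\tau(t))} \; \sum_{u,v \in S} \omega(u,v)\, d(u,v) & \text{otherwise}
  \end{cases}
```

Under this characterization, distance zero coincides with probabilistic bisimilarity, and distance one marks maximal behavioural difference, which is the case the paper's decision procedure targets.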

During the last decade, several algorithms have been proposed to approximate and compute the probabilistic bisimilarity distances. The main result of this paper is an algorithm that decides distance one in O(n^2 + m^2) time, where n is the number of states and m is the number of transitions of the labelled Markov chain. This decision procedure is the key new ingredient of our algorithm to compute the distances. The state-of-the-art algorithm, which combines algorithms due to Derisavi, Hermanns and Sanders and due to Bacci, Bacci, Larsen and Mardare, can compute distances for labelled Markov chains with up to 150 states. For one such labelled Markov chain, that algorithm takes more than 49 hours, whereas our new algorithm takes only 13 milliseconds. Furthermore, our algorithm can compute distances for labelled Markov chains with more than 10,000 states in less than 50 minutes.
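
The abstract does not spell out the decision procedure itself, so the sketch below is only one plausible reading consistent with the stated O(n^2 + m^2) bound, not the authors' algorithm: it treats a pair of states as having distance less than one exactly when it can reach a pair of probabilistic bisimilar states in a product graph whose edges follow transitions of positive probability between label-matching pairs. All names (distance_one_pairs, succ, bisimilar) are hypothetical.

```python
# Hedged sketch, not the paper's algorithm: decide which state pairs have
# probabilistic bisimilarity distance one, assuming the set of probabilistic
# bisimilar pairs (distance zero) has already been computed.
from collections import deque

def distance_one_pairs(states, label, succ, bisimilar):
    """states: iterable of states; label: dict state -> label;
    succ: dict state -> collection of successors reached with positive probability;
    bisimilar: set of (s, t) pairs of probabilistic bisimilar states,
    assumed reflexive and symmetric."""
    states = list(states)
    # Predecessor lists of the product graph: (s, t) -> (u, v) whenever the labels
    # of s and t match and both s -> u and t -> v have positive probability.
    pred = {(u, v): [] for u in states for v in states}
    for s in states:
        for t in states:
            if label[s] == label[t]:
                for u in succ[s]:
                    for v in succ[t]:
                        pred[(u, v)].append((s, t))
    # Backward breadth-first search from the distance-zero pairs.
    reaches_zero = set(bisimilar)
    queue = deque(reaches_zero)
    while queue:
        pair = queue.popleft()
        for p in pred[pair]:
            if p not in reaches_zero:
                reaches_zero.add(p)
                queue.append(p)
    # Under this sketch, the remaining pairs are exactly those at distance one.
    return {(s, t) for s in states for t in states if (s, t) not in reaches_zero}
```

Building the product graph and running the breadth-first search are both linear in its size, which has at most n^2 vertices and m^2 edges, so this sketch stays within the O(n^2 + m^2) budget quoted above.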

Pages: 19
Talk: Jul 15 16:45 (Session 107A: Probabilistic Systems)
Paper: