Affiliation:
1. Indian Institute of Technology Kanpur, India
2. Indian Institute of Technology Bombay, India
3. CNRS, UMR IRISA, Rennes, France
4. National University of Singapore, Singapore
Abstract
A finite-state Markov chain M can be regarded as a linear transform operating on the set of probability distributions over its node set. The iterative application of M to an initial probability distribution μ0 will generate a trajectory of probability distributions. Thus, a set of initial distributions will induce a set of trajectories. It is an interesting and useful task to analyze the dynamics of M as defined by this set of trajectories. The novel idea here is to carry out this task in a symbolic framework. Specifically, we discretize the probability value space [0,1] into a finite set of intervals I = {I1, I2, ..., Im}. A concrete probability distribution μ over the node set {1, 2, ..., n} of M is then symbolically represented as D, a tuple of intervals drawn from I, where the ith component of D is the interval in which μ(i) falls. The set D of discretized distributions is a finite alphabet. Hence, the trajectory generated by repeated applications of M to an initial distribution will induce an infinite string over this alphabet. Given a set of initial distributions, the symbolic dynamics of M will then consist of a language L of infinite strings over the alphabet D.
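As an informal illustration of the discretization just described (this code is not from the paper), the Python sketch below maps a distribution to its tuple of interval indices and unrolls a finite prefix of the induced symbolic trajectory. The interval endpoints, the example chain M, and the initial distribution mu0 are hypothetical choices made purely for demonstration.

import numpy as np

# Hypothetical discretization of [0,1] into right-closed intervals
# I1 = [0, 0.25], I2 = (0.25, 0.5], I3 = (0.5, 0.75], I4 = (0.75, 1],
# given here by their right endpoints.
CUTS = [0.25, 0.5, 0.75, 1.0]

def discretize(mu, cuts=CUTS):
    # Return the tuple D whose ith component is the index of the
    # interval containing mu(i).
    return tuple(next(k for k, c in enumerate(cuts) if p <= c) for p in mu)

def symbolic_trajectory(M, mu0, steps):
    # Yield the first `steps` letters of the infinite word induced by
    # iterating the (row-stochastic) chain M on the initial distribution mu0.
    mu = np.asarray(mu0, dtype=float)
    for _ in range(steps):
        yield discretize(mu)
        mu = mu @ M  # one application of the linear transform

# Toy 2-node chain and initial distribution, chosen arbitrarily.
M = np.array([[0.9, 0.1],
              [0.4, 0.6]])
mu0 = [1.0, 0.0]
print(list(symbolic_trajectory(M, mu0, 5)))

Running this prints five letters of the symbolic trajectory, each letter being a pair of interval indices, one per node of the chain.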
Our main goal is to verify whether L meets a specification given as a linear-time temporal logic formula φ. In our logic, an atomic proposition asserts that the current probability of a node falls in the interval I drawn from I. If L is an ω-regular language, one can hope to solve our model-checking problem (whether L ⊧ φ?) using standard techniques. However, we show that, in general, this is not the case. Consequently, we develop the notion of an ϵ-approximation, based on the transient and long-term behaviors of the Markov chain M. Briefly, the symbolic trajectory ξ' is an ϵ-approximation of the symbolic trajectory ξ iff (1) ξ' agrees with ξ during its transient phase; and (2) both ξ and ξ' are within an ϵ-neighborhood at all times after the transient phase. Our main results are that one can effectively check whether (i) for each infinite word in L, at least one of its ϵ-approximations satisfies the given specification; and (ii) for each infinite word in L, all of its ϵ-approximations satisfy the specification. These verification results are strong in that they apply to all finite-state Markov chains.
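To make the ϵ-approximation relation concrete, here is a minimal sketch, again not taken from the paper, that compares equal-length prefixes of two symbolic trajectories. It assumes the transient length theta is already known and measures closeness after the transient by the gap between intervals; both are simplifying assumptions made for illustration. In this sketch an interval is a (low, high) pair and a letter is a tuple of such intervals, one per node.

def interval_gap(a, b):
    # Smallest distance between a point of interval a and a point of b
    # (0 if the intervals overlap or touch).
    (lo1, hi1), (lo2, hi2) = a, b
    return max(0.0, max(lo1, lo2) - min(hi1, hi2))

def is_eps_approximation(xi, xi_prime, theta, eps):
    # Check that xi_prime agrees with xi letter-for-letter during the
    # transient phase (the first `theta` positions) and that afterwards
    # every per-node pair of intervals lies within an eps-neighborhood.
    assert len(xi) == len(xi_prime)
    for t, (d, d_prime) in enumerate(zip(xi, xi_prime)):
        if t < theta:
            if d != d_prime:
                return False
        elif any(interval_gap(a, b) > eps for a, b in zip(d, d_prime)):
            return False
    return True

With this relation in hand, the two decision questions above ask, for every infinite word in L, whether some (respectively, every) ϵ-approximation of it satisfies φ; the sketch only checks the relation itself on finite prefixes.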
Funder
Department of Science and Technology, India
Agence Nationale de la Recherche
Ministry of Education - Singapore
Publisher
Association for Computing Machinery (ACM)
Subject
Artificial Intelligence, Hardware and Architecture, Information Systems, Control and Systems Engineering, Software
Cited by 21 articles.
1. Skolem and positivity completeness of ergodic Markov chains;Information Processing Letters;2024-08
2. On Robustness for the Skolem, Positivity and Ultimate Positivity Problems;Logical Methods in Computer Science;2024-06-05
3. CTL Model Checking of MDPs over Distribution Spaces: Algorithms and Sampling-based Computations;Proceedings of the 27th ACM International Conference on Hybrid Systems: Computation and Control;2024-05-14
4. Linear dynamical systems with continuous weight functions;Proceedings of the 27th ACM International Conference on Hybrid Systems: Computation and Control;2024-05-14
5. Measurement-Based Verification of Quantum Markov Chains;Lecture Notes in Computer Science;2024