Abstract
Let T be a non-empty set of n × n stochastic matrices. Define T_k = {A_1 A_2 ··· A_k : A_i ∈ T, i = 1, …, k}. The sequence T_1, T_2, ··· is called a Markov set-chain. An important problem in this area is to determine when such a set-chain converges. This paper gives a notion of a sequential limiting set and shows how it can be used to obtain a result on set-chain convergence.
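The construction of T_k can be illustrated computationally. The sketch below is not the paper's method; it simply enumerates all k-fold products drawn from a small, hypothetical finite set T of 2 × 2 stochastic matrices and reports the entrywise diameter of each T_k, a rough proxy for whether the set-chain is settling toward a limiting set. The example matrices and the diameter measure are illustrative assumptions, not taken from the source.

```python
import itertools
import numpy as np

# Hypothetical finite set T of 2 x 2 stochastic matrices (rows sum to 1).
T = [
    np.array([[0.9, 0.1],
              [0.2, 0.8]]),
    np.array([[0.7, 0.3],
              [0.4, 0.6]]),
]

def set_chain_element(T, k):
    """Return T_k: all products A_1 A_2 ... A_k with each A_i drawn from T."""
    products = []
    for choice in itertools.product(T, repeat=k):
        P = np.eye(T[0].shape[0])
        for A in choice:
            P = P @ A
        products.append(P)
    return products

def diameter(S):
    """Maximum entrywise distance between members of S; a shrinking
    diameter suggests the set-chain is contracting toward a limit set."""
    return max(np.abs(P - Q).max() for P in S for Q in S)

for k in range(1, 6):
    Tk = set_chain_element(T, k)
    print(f"k={k}: |T_k|={len(Tk)}, diameter={diameter(Tk):.4f}")
```

For a finite T with m matrices, T_k contains at most m^k products, so this brute-force enumeration is only practical for small k; it is meant to make the definition concrete, not to test convergence at scale.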
Publisher
Cambridge University Press (CUP)
Subject
Statistics, Probability and Uncertainty; General Mathematics; Statistics and Probability