Abstract
Markov trials are a sequence of dependent trials with two outcomes, success and failure, which are the states of a two-state Markov chain. The distribution of the number of successes in n Markov trials and the first-passage time to a specified number of successes are obtained using an augmented Markov chain model.
Publisher
Cambridge University Press (CUP)
Subject
Applied Mathematics, Statistics and Probability
Cited by
14 articles.