Question 10.5.P.2

Consider a Markov chain on {1, 2, 3, 4} with transition matrix

P=\left[\begin{array}{cccc} 2/3 & 1/2 & 0 & 0 \\ 1/3 & 1/6 & 1/2 & 0 \\ 0 & 1/3 & 1/6 & 1/2 \\ 0 & 0 & 1/3 & 1/2 \end{array}\right].

a. If the Markov chain starts at state 2, find the expected number of steps required to reach state 4.

b. If the Markov chain starts at state 2, find the probability that state 1 is reached before state 4.


a. Reorder the states as {4, 1, 2, 3} and make state 4 into an absorbing state to produce the canonical form

P=\left[\begin{array}{cccc} 1 & 0 & 0 & 1/3 \\ 0 & 2/3 & 1/2 & 0 \\ 0 & 1/3 & 1/6 & 1/2 \\ 0 & 0 & 1/3 & 1/6 \end{array}\right]=\left[\begin{array}{cc} I & S \\ 0 & Q \end{array}\right],

with S=\left[\begin{array}{ccc} 0 & 0 & 1/3 \end{array}\right] and Q=\left[\begin{array}{ccc} 2/3 & 1/2 & 0 \\ 1/3 & 1/6 & 1/2 \\ 0 & 1/3 & 1/6 \end{array}\right].

So

M=(I-Q)^{-1}=\left[\begin{array}{ccc} 57/4 & 45/4 & 27/4 \\ 15/2 & 15/2 & 9/2 \\ 3 & 3 & 3 \end{array}\right]=\left[\begin{array}{ccc} 14.25 & 11.25 & 6.75 \\ 7.50 & 7.50 & 4.50 \\ 3.00 & 3.00 & 3.00 \end{array}\right],

where the rows and columns of M correspond to the transient states 1, 2, 3.

The expected number of steps required to reach state 4, starting at state 2, is the sum of the entries in the column of M corresponding to state 2, which is

11.25+7.50+3.00=21.75.
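A quick numerical check, not part of the printed solution (the NumPy code and variable names below are my own), reproduces M and the column sum for state 2:

```python
import numpy as np

# Column-stochastic transition matrix from the problem statement:
# column j holds the probabilities of moving FROM state j+1.
P = np.array([[2/3, 1/2, 0,   0  ],
              [1/3, 1/6, 1/2, 0  ],
              [0,   1/3, 1/6, 1/2],
              [0,   0,   1/3, 1/2]])

# With state 4 absorbing, the transient states are 1, 2, 3, so Q is the
# upper-left 3x3 block of P and M is the fundamental matrix (I - Q)^{-1}.
Q = P[:3, :3]
M = np.linalg.inv(np.eye(3) - Q)

print(M[:, 1])        # column for state 2: [11.25  7.5   3.  ]
print(M[:, 1].sum())  # expected number of steps: 21.75
```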

b. Make states 1 and 4 into absorbing states and reorder the states as {1, 4, 2, 3} to produce the canonical form

P=\left[\begin{array}{cccc} 1 & 0 & 1/2 & 0 \\ 0 & 1 & 0 & 1/3 \\ 0 & 0 & 1/6 & 1/2 \\ 0 & 0 & 1/3 & 1/6 \end{array}\right]=\left[\begin{array}{cc} I & S \\ 0 & Q \end{array}\right],

with S=\left[\begin{array}{cc} 1/2 & 0 \\ 0 & 1/3 \end{array}\right] and Q=\left[\begin{array}{cc} 1/6 & 1/2 \\ 1/3 & 1/6 \end{array}\right].

So

A=SM=S(I-Q)^{-1}=\left[\begin{array}{cc} 1/2 & 0 \\ 0 & 1/3 \end{array}\right]\left[\begin{array}{cc} 30/19 & 18/19 \\ 12/19 & 30/19 \end{array}\right]=\left[\begin{array}{cc} 15/19 & 9/19 \\ 4/19 & 10/19 \end{array}\right],

where the rows of A correspond to the absorbing states 1 and 4 and the columns to the transient states 2 and 3.

Thus the probability that, starting at state 2, state 1 is reached before state 4 is the entry in A whose row corresponds to state 1 and whose column corresponds to state 2; this entry is 15/19.
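As a similar optional check (again my own sketch, not part of the text), the matrix A of absorption probabilities can be computed directly:

```python
import numpy as np

P = np.array([[2/3, 1/2, 0,   0  ],
              [1/3, 1/6, 1/2, 0  ],
              [0,   1/3, 1/6, 1/2],
              [0,   0,   1/3, 1/2]])

absorbing = [0, 3]   # states 1 and 4 (0-based indices)
transient = [1, 2]   # states 2 and 3

Q = P[np.ix_(transient, transient)]    # transient-to-transient block
S = P[np.ix_(absorbing, transient)]    # transient-to-absorbing block
A = S @ np.linalg.inv(np.eye(2) - Q)   # absorption probabilities

print(A)        # ≈ [[0.789, 0.474], [0.211, 0.526]] = [[15/19, 9/19], [4/19, 10/19]]
print(A[0, 0])  # P(reach state 1 before state 4 | start at state 2) = 15/19 ≈ 0.7895
```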
