Question 11.5.23: Assume that an ergodic Markov chain has states s1, s2, ..., sk.

Assume that an ergodic Markov chain has states s_1, s_2, \dots, s_k. Let S^{(n)}_j denote the number of times that the chain is in state s_j in the first n steps. Let w denote the fixed probability row vector for this chain. Show that, regardless of the starting state, the expected value of S^{(n)}_j, divided by n, tends to w_j as n \rightarrow \infty. Hint: If the chain starts in state s_i, then the expected value of S^{(n)}_j is given by the expression

\sum\limits_{h=0}^{n}{p_{ij}^{(h)}} .


Assume that the chain is started in state s_i. Let X^{(n)}_j equal 1 if the chain is in state s_j on the nth step and 0 otherwise. Then

S^{(n)}_j = X^{(0)}_j + X^{(1)}_j + X^{(2)}_j + \cdots + X^{(n)}_j ,

and

E(X^{(n)}_j) = p^{(n)}_{ij} .

Thus

E(S^{(n)}_j)= \sum\limits_{h=0}^{n}{p^{(h)}_{ij}}.

It now follows from Exercise 16 that

\underset{n\rightarrow \infty }{\lim} \frac{E(S^{(n)}_{j})}{n} = w_j.
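As a numerical illustration of this limit, the sketch below computes E(S^{(n)}_j) = \sum_{h=0}^{n} p^{(h)}_{ij} for a small ergodic chain and checks that E(S^{(n)}_j)/n is close to w_j. The 3-state matrix P here is a made-up example, not one from the text.

```python
import numpy as np

# Hypothetical 3-state ergodic transition matrix (illustrative only).
P = np.array([[0.5, 0.25, 0.25],
              [0.2, 0.6,  0.2 ],
              [0.3, 0.3,  0.4 ]])

# The fixed probability vector w satisfies wP = w: it is the left
# eigenvector of P for eigenvalue 1, normalized to sum to 1.
vals, vecs = np.linalg.eig(P.T)
w = np.real(vecs[:, np.argmax(np.real(vals))])
w = w / w.sum()

# E(S_j^(n)) = sum_{h=0}^{n} p_ij^(h), starting from state i = 0.
n = 5000
Ph = np.eye(3)                 # P^0 = I
expected_visits = np.zeros(3)
for h in range(n + 1):
    expected_visits += Ph[0]   # row i = 0 of P^h gives p_{0j}^{(h)}
    Ph = Ph @ P

frac = expected_visits / n
print(frac)   # close to w
print(w)
```

Starting from a different state (a different row of P^h) changes the early terms of the sum but not the limit, which is the point of the exercise.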
