Assume that an ergodic Markov chain has states $s_1, s_2, \ldots, s_k$. Let $S_j^{(n)}$ denote the number of times that the chain is in state $s_j$ in the first $n$ steps. Let $w$ denote the fixed probability row vector for this chain. Show that, regardless of the starting state, the expected value of $S_j^{(n)}$, divided by $n$, tends to $w_j$ as $n \to \infty$. Hint: If the chain starts in state $s_i$, then the expected value of $S_j^{(n)}$ is given by the expression
\sum\limits_{h=0}^{n}{p_{ij}^{(h)}} .
Assume that the chain is started in state $s_i$. Let $X_j^{(n)}$ equal 1 if the chain is in state $s_j$ on the $n$th step and 0 otherwise. Then
S^{(n)}_j = X^{(0)}_j + X^{(1)}_j + X^{(2)}_j + \cdots + X^{(n)}_j,
and
E(X^{(n)}_j) = p^{(n)}_{ij}.
Thus
E(S^{(n)}_j)= \sum\limits_{h=0}^{n}{p^{(h)}_{ij}}.
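This identity can be checked numerically. Below is a minimal sketch on a hypothetical 2-state ergodic chain (the transition matrix is illustrative, not from the text): the exact sum $\sum_{h=0}^{n} p_{ij}^{(h)}$ is computed from matrix powers and compared against a Monte Carlo estimate of $E(S_j^{(n)})$.

```python
import numpy as np

# Illustrative 2-state ergodic transition matrix (an assumption for the demo).
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])
i, j, n = 0, 1, 20  # start state s_i, counted state s_j, number of steps

# Exact expected count: sum of h-step transition probabilities p_ij^(h), h = 0..n.
expected = sum(np.linalg.matrix_power(P, h)[i, j] for h in range(n + 1))

# Monte Carlo estimate of E(S_j^(n)) by simulating the chain from state s_i.
rng = np.random.default_rng(0)
trials = 20000
total = 0
for _ in range(trials):
    state = i
    count = 1 if state == j else 0      # X_j^(0)
    for _ in range(n):
        state = rng.choice(2, p=P[state])
        if state == j:
            count += 1                  # X_j^(h) = 1 when the chain is in s_j
    total += count
estimate = total / trials

print(expected, estimate)  # the two values agree up to Monte Carlo error
```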
It now follows from Exercise 16 that
\underset{n\rightarrow \infty }{\lim} \frac{E(S^{(n)}_{j})}{n} = w_j.
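The limit itself can also be illustrated numerically. The sketch below (same hypothetical 2-state chain as an assumption) computes $w$ as the left eigenvector of $P$ for eigenvalue 1 and shows that $\frac{1}{n}\sum_{h=0}^{n} p_{ij}^{(h)}$ approaches $w_j$ as $n$ grows.

```python
import numpy as np

# Same illustrative 2-state ergodic chain (an assumption for the demo).
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])

# Fixed probability vector w: left eigenvector of P for eigenvalue 1,
# normalized so its entries sum to 1.
vals, vecs = np.linalg.eig(P.T)
w = np.real(vecs[:, np.argmax(np.real(vals))])
w = w / w.sum()

i, j = 0, 1
for n in (10, 100, 1000):
    avg = sum(np.linalg.matrix_power(P, h)[i, j] for h in range(n + 1)) / n
    print(n, avg, w[j])  # avg tends to w_j as n increases
```

Note that the limit does not depend on the starting state $i$; rerunning with `i = 1` gives the same limiting value $w_j$.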