Assume that an ergodic Markov chain has states $s_1, s_2, \ldots, s_k$. Let $S_j(n)$ denote the number of times that the chain is in state $s_j$ in the first $n$ steps. Let $\mathbf{w}$ denote the fixed probability row vector for this chain. Show that, regardless of the starting state, the expected value of $S_j(n)$, divided by $n$, tends to $w_j$ as $n \to \infty$. Hint: If the chain starts in state $s_i$, then the expected value of $S_j(n)$ is given by the expression
\sum_{h=0}^{n} p_{ij}^{(h)}.
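The claimed limit can be checked numerically. The sketch below (a minimal illustration, not part of the exercise; the $3\times 3$ transition matrix is an arbitrary example of an ergodic chain) computes the fixed vector $\mathbf{w}$, then accumulates $\frac{1}{n}\sum_{h=0}^{n} p_{ij}^{(h)}$ via powers of $P$ and compares each row to $\mathbf{w}$:

```python
import numpy as np

# Hypothetical 3-state ergodic chain (all entries positive, so the
# chain is regular, hence ergodic). Illustrative values only.
P = np.array([[0.5, 0.25, 0.25],
              [0.2, 0.6,  0.2 ],
              [0.3, 0.3,  0.4 ]])
k = P.shape[0]

# Fixed probability row vector w: solve w P = w together with
# sum(w) = 1 as an overdetermined least-squares system.
A = np.vstack([P.T - np.eye(k), np.ones(k)])
b = np.append(np.zeros(k), 1.0)
w, *_ = np.linalg.lstsq(A, b, rcond=None)

def expected_occupation_fraction(P, n):
    """Return the matrix whose (i, j) entry is E[S_j(n)] / n when the
    chain starts in state s_i, i.e. (1/n) * sum_{h=0}^{n} P^h."""
    total = np.zeros_like(P)
    Ph = np.eye(k)          # P^0 = I
    for _ in range(n + 1):  # h = 0, 1, ..., n
        total += Ph
        Ph = Ph @ P
    return total / n

for n in (10, 100, 1000):
    M = expected_occupation_fraction(P, n)
    print(n, np.max(np.abs(M - w)))  # deviation shrinks as n grows
```

Every row of the resulting matrix approaches $\mathbf{w}$, mirroring the claim that the limit does not depend on the starting state $s_i$; the deviation decays on the order of $1/n$ for a regular chain.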