Consider a sequence of independent trials, each of which is equally likely to result in any of the outcomes 0, 1, . . . , m−1. Say that a round begins with the first trial, and that a new round begins each time outcome 0 occurs. Let N denote the number of trials that it takes until all of the outcomes 1, . . . , m−1 have occurred in the same round. Also, let T_{j} denote the number of trials that it takes until j distinct outcomes have occurred, and let I_{j} denote the jth distinct outcome to occur. (Therefore, outcome I_{j} first occurs at trial T_{j}.)
(a) Argue that the random vectors (I_{1}, . . . , I_{m}) and (T_{1}, . . . , T_{m}) are independent.
(b) Define X by letting X = j if outcome 0 is the jth distinct outcome to occur. (Thus, I_{X} = 0.) Derive an equation for E[N] in terms of E[T_{j}], j = 1, . . . , m−1 by conditioning on X.
(c) Determine E[T_{j}], j = 1, . . . , m−1.
Hint: See Exercise 42 of Chapter 2.
(d) Find E[N].
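Because the round structure and the definitions of N, T_j, and I_j are easy to misread, here is a minimal simulation sketch of the setup before the solution (the function name, the seed, and the choice m = 5 are illustrative, not part of the exercise): it draws trials uniformly from 0, 1, . . . , m−1, records T_j and I_j at each first occurrence of a new outcome, and records N once every one of 1, . . . , m−1 has been seen since the most recent 0.

```python
import random

def simulate(m, rng):
    """One realization of the process: uniform trials on 0, 1, ..., m-1.

    Returns N (trials until all of 1, ..., m-1 occur in the same round),
    T = [T_1, ..., T_m] (trial of the j-th first occurrence), and
    I = [I_1, ..., I_m] (the j-th distinct outcome to occur).
    """
    I, T = [], []
    current_round = set()          # outcomes among 1, ..., m-1 seen since the last 0
    N = None
    trial = 0
    while N is None or len(I) < m:
        trial += 1
        x = rng.randrange(m)
        if x not in I:             # a new distinct outcome: record I_j and T_j
            I.append(x)
            T.append(trial)
        if N is None:
            if x == 0:             # outcome 0 starts a new round
                current_round = set()
            else:
                current_round.add(x)
                if len(current_round) == m - 1:
                    N = trial      # all of 1, ..., m-1 occurred in this round
    return N, T, I

rng = random.Random(2024)
N, T, I = simulate(5, rng)
print("N =", N, "  T =", T, "  I =", I)
```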
(a) By symmetry, for any value of \left(T_1, \ldots, T_m\right), the conditional distribution of \left(I_1, \ldots, I_m\right) is uniform over the m! permutations of 0, 1, \ldots, m-1. Because this conditional distribution does not depend on the value of \left(T_1, \ldots, T_m\right), the two random vectors are independent.
(b) Conditioning on X, and using that P\{X=i\}=\frac{1}{m} for i=1, \ldots, m by the symmetry of part (a),
\begin{aligned} E[N] & =\sum\limits_{i=1}^m E[N \mid X=i] P\{X=i\} \\& =\frac{1}{m} \sum\limits_{i=1}^m E[N \mid X=i] \\& =\frac{1}{m}\left(\sum\limits_{i=1}^{m-1}\left(E\left[T_i\right]+E[N]\right)+E\left[T_{m-1}\right]\right)\end{aligned}
The final equality holds because, if X=i with i \leq m-1, the first round ends at trial T_i with at least one of the outcomes 1, \ldots, m-1 still missing, and from the next trial onward the process starts over; hence E[N \mid X=i]=E\left[T_i \mid X=i\right]+E[N]=E\left[T_i\right]+E[N], where the last step uses the independence of X and T_i implied by part (a). If X=m, then all of the outcomes 1, \ldots, m-1 occur before the first 0, so N=T_{m-1} and E[N \mid X=m]=E\left[T_{m-1}\right]. Solving the displayed equation for E[N] gives
E[N]=E\left[T_{m-1}\right]+\sum\limits_{i=1}^{m-1} E\left[T_i\right]
(c) This is a coupon-collecting calculation: once j-1 distinct outcomes have occurred, each trial yields a new outcome with probability \frac{m+1-j}{m}, so the expected number of additional trials until the jth distinct outcome appears is \frac{m}{m+1-j}. Hence
E\left[T_i\right]=\sum\limits_{j=1}^i \frac{m}{m+1-j}, \quad i=1, \ldots, m-1
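As a quick numerical cross-check of this formula (not part of the original solution; the parameter values and helper name below are arbitrary), a direct simulation of the number of trials needed to collect j distinct outcomes can be compared with \sum\limits_{k=1}^{j} \frac{m}{m+1-k}.

```python
import random

def trials_for_j_distinct(m, j, rng):
    """Number of uniform trials on 0, ..., m-1 until j distinct outcomes occur."""
    seen = set()
    trials = 0
    while len(seen) < j:
        trials += 1
        seen.add(rng.randrange(m))
    return trials

m, j, runs = 6, 4, 200_000
rng = random.Random(1)
estimate = sum(trials_for_j_distinct(m, j, rng) for _ in range(runs)) / runs
formula = sum(m / (m + 1 - k) for k in range(1, j + 1))
print(f"simulated E[T_{j}] = {estimate:.3f}, formula gives {formula:.3f}")
```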
(d) Combining parts (b) and (c),
\begin{aligned} E[N] & =E\left[T_{m-1}\right]+\sum\limits_{i=1}^{m-1} E\left[T_i\right] \\& =\sum\limits_{j=1}^{m-1} \frac{m}{m+1-j}+\sum\limits_{i=1}^{m-1} \sum\limits_{j=1}^i \frac{m}{m+1-j} \\& =\sum\limits_{j=1}^{m-1} \frac{m}{m+1-j}+\sum\limits_{j=1}^{m-1}(m-j) \frac{m}{m+1-j} \\& =m \sum\limits_{j=1}^{m-1} \frac{(m-j)+1}{m+1-j}=m(m-1)\end{aligned}
where the third equality interchanges the order of summation (the term \frac{m}{m+1-j} appears once for each i with j \leq i \leq m-1, that is, m-j times).
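The closed form m(m−1) is simple enough that a Monte Carlo sanity check is cheap. The sketch below (helper name, seed, and sample sizes are arbitrary choices, not part of the solution) simulates N directly from the round description and compares its sample mean with m(m−1).

```python
import random

def trials_until_complete_round(m, rng):
    """Trials until all of 1, ..., m-1 have occurred since the most recent 0."""
    current_round = set()
    trials = 0
    while len(current_round) < m - 1:
        trials += 1
        x = rng.randrange(m)
        if x == 0:
            current_round = set()    # a new round begins
        else:
            current_round.add(x)
    return trials

rng = random.Random(7)
runs = 50_000
for m in (2, 3, 5):
    estimate = sum(trials_until_complete_round(m, rng) for _ in range(runs)) / runs
    print(f"m = {m}: simulated E[N] = {estimate:.2f}, m(m-1) = {m * (m - 1)}")
```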