Let X and Y be two random variables for which only six possible events, A_{1}, A_{2}, A_{3}, A_{4}, A_{5}, A_{6}, are defined:

i        | 1   | 2   | 3   | 4   | 5   | 6
P(A_{i}) | 0.3 | 0.1 | 0.1 | 0.2 | 0.2 | 0.1
X_{i}    | -1  | 2   | 2   | -1  | -1  | 2
Y_{i}    | 0   | 2   | 0   | 1   | 2   | 1

(a) What is the joint PDF of X and Y?
(b) Calculate the marginal distributions of X and Y.
(c) Are both variables independent?
(d) Determine the distribution of U = X + Y.
(e) Calculate E(U) and Var(U) and compare them with E(X) + E(Y) and Var(X) + Var(Y), respectively.
(a) The joint PDF is:
(b) The marginal distributions are obtained from the row and column sums of the joint PDF, respectively:
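The row/column summation described above can be cross-checked with a short sketch (not part of the original solution); the joint probabilities below are taken from the table of events.

```python
# Joint PDF from the event table: keys are (x, y), values are probabilities.
joint = {(-1, 0): 0.3, (-1, 1): 0.2, (-1, 2): 0.2,
         (2, 0): 0.1, (2, 1): 0.1, (2, 2): 0.1}

# Marginals are the row sums (over y) and the column sums (over x).
p_x, p_y = {}, {}
for (x, y), p in joint.items():
    p_x[x] = p_x.get(x, 0.0) + p
    p_y[y] = p_y.get(y, 0.0) + p

print(p_x)  # X-marginal: P(X=-1)=0.7, P(X=2)=0.3 (up to floating-point noise)
print(p_y)  # Y-marginal: P(Y=0)=0.4, P(Y=1)=0.3, P(Y=2)=0.3
```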
(c) The random variables X and Y are independent if
P(X = x, Y = y) = P(X = x)P(Y = y) ∀x, y.
However, in our example we have, for instance,
P(X = −1, Y = 0) = 0.3 ≠ P(X = −1) · P(Y = 0) = 0.7 · 0.4 = 0.28.
Hence, the two variables are not independent.
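The pairwise comparison behind this check can be sketched as follows (the joint table and marginals are restated so the snippet is self-contained):

```python
# Joint PDF and marginals from (a) and (b).
joint = {(-1, 0): 0.3, (-1, 1): 0.2, (-1, 2): 0.2,
         (2, 0): 0.1, (2, 1): 0.1, (2, 2): 0.1}
p_x = {-1: 0.7, 2: 0.3}
p_y = {0: 0.4, 1: 0.3, 2: 0.3}

# X and Y are independent iff the joint PDF equals the product of the
# marginals for EVERY pair (x, y); one violating pair is enough to refute it.
independent = all(abs(joint[(x, y)] - p_x[x] * p_y[y]) < 1e-12
                  for x in p_x for y in p_y)
print(independent)  # False: e.g. P(X=-1, Y=0) = 0.3 but 0.7 * 0.4 = 0.28
```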
(d) The joint distribution of X and Y can be used to obtain the desired distribution of U. For example, if X = −1 and Y = 0, then U = X + Y = −1. The respective probability is P(U = −1) = 0.3 because P(U = −1) = P(X = −1, Y = 0) = 0.3 and there is no other combination of X- and Y-values which yields X + Y = −1. The distribution of U is given below.
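The bookkeeping in (d), i.e. summing the joint probabilities over all (x, y) pairs with the same value of x + y, can be sketched as:

```python
from collections import defaultdict

# Joint PDF from (a).
joint = {(-1, 0): 0.3, (-1, 1): 0.2, (-1, 2): 0.2,
         (2, 0): 0.1, (2, 1): 0.1, (2, 2): 0.1}

# Distribution of U = X + Y: collect the probability mass by value of x + y.
p_u = defaultdict(float)
for (x, y), p in joint.items():
    p_u[x + y] += p

print(dict(sorted(p_u.items())))
# U takes the values -1, 0, 1, 2, 3, 4 with probabilities 0.3, 0.2, 0.2, 0.1, 0.1, 0.1
```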
(e) We calculate
E(U) = \sum\limits_{k=-1}^{4} k \cdot P(U = k) = (−1) · 0.3 + 0 · 0.2 + 1 · 0.2 + 2 · 0.1 + 3 · 0.1 + 4 · 0.1 = 0.8
E(X) = (−1) · 0.7 + 2 · 0.3 = −0.1
E(Y) = 0 · 0.4 + 1 · 0.3 + 2 · 0.3 = 0.9
E(U²) = 0.3 · (−1)² + 0.2 · 0² + 0.2 · 1² + 0.1 · 2² + 0.1 · 3² + 0.1 · 4² = 3.4
E(X²) = 0.7 · (−1)² + 0.3 · 2² = 1.9
E(Y²) = 0.4 · 0² + 0.3 · 1² + 0.3 · 2² = 1.5
Var(U) = E(U²) − [E(U)]² = 3.4 − (0.8)² = 2.76
Var(X) = E(X²) − [E(X)]² = 1.9 − (−0.1)² = 1.89
Var(Y) = E(Y²) − [E(Y)]² = 1.5 − (0.9)² = 0.69.
It can be seen that E(X) + E(Y) = −0.1 + 0.9 = 0.8 = E(U). This makes sense because we know from (7.31) that E(X + Y) = E(X) + E(Y). However, Var(U) = 2.76 ≠ Var(X) + Var(Y) = 1.89 + 0.69 = 2.58. This follows from (7.7.1), which says that Var(X ± Y) = Var(X) + Var(Y) ± 2Cov(X, Y), and therefore Var(X + Y) = Var(X) + Var(Y) only if the covariance is 0. In our example the covariance is nonzero: from the joint PDF, E(XY) = (−1) · 1 · 0.2 + (−1) · 2 · 0.2 + 2 · 1 · 0.1 + 2 · 2 · 0.1 = 0 (the terms with Y = 0 vanish), so Cov(X, Y) = E(XY) − E(X)E(Y) = 0 − (−0.1) · 0.9 = 0.09. Indeed, Var(X) + Var(Y) + 2Cov(X, Y) = 1.89 + 0.69 + 0.18 = 2.76 = Var(U). Note that although X and Y are not independent (see (c)), dependence alone does not force Cov(X, Y) ≠ 0; here the covariance simply happens to be nonzero.
E(X + Y ) = E(X) + E(Y ) (additivity). (7.31)
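The numbers in (e), including the covariance term that explains why Var(U) ≠ Var(X) + Var(Y), can be reproduced from the joint PDF with a short sketch (the joint table is restated so the snippet stands alone):

```python
# Joint PDF from (a).
joint = {(-1, 0): 0.3, (-1, 1): 0.2, (-1, 2): 0.2,
         (2, 0): 0.1, (2, 1): 0.1, (2, 2): 0.1}

# Expectation of an arbitrary function g(X, Y) under the joint PDF.
def expect(g):
    return sum(p * g(x, y) for (x, y), p in joint.items())

e_x = expect(lambda x, y: x)            # E(X)  = -0.1
e_y = expect(lambda x, y: y)            # E(Y)  =  0.9
e_u = expect(lambda x, y: x + y)        # E(U)  =  0.8
var_x = expect(lambda x, y: x * x) - e_x ** 2          # Var(X) = 1.89
var_y = expect(lambda x, y: y * y) - e_y ** 2          # Var(Y) = 0.69
var_u = expect(lambda x, y: (x + y) ** 2) - e_u ** 2   # Var(U) = 2.76
cov_xy = expect(lambda x, y: x * y) - e_x * e_y        # Cov(X, Y) = 0.09

# Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y) holds exactly:
print(abs(var_u - (var_x + var_y + 2 * cov_xy)) < 1e-12)
```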
Joint PDF of X and Y (part (a)):

       | Y = 0 | Y = 1 | Y = 2
X = -1 | 0.3   | 0.2   | 0.2
X = 2  | 0.1   | 0.1   | 0.1

Marginal distributions of X and Y (part (b)):

x        | -1  | 2
P(X = x) | 0.7 | 0.3

y        | 0   | 1   | 2
P(Y = y) | 0.4 | 0.3 | 0.3

Distribution of U = X + Y (part (d)):

k        | -1  | 0   | 1   | 2   | 3   | 4
P(U = k) | 0.3 | 0.2 | 0.2 | 0.1 | 0.1 | 0.1