Let X_{1} and X_{2} be independent exponential random variables with rates \mu_{1} and \mu_{2}. Find the conditional density of X_{1} given that X_{1}+X_{2}=t .
To begin, note that if f(x, y) is the joint density of X, Y, then the joint density of X and X + Y is
f_{X,X+Y}(x,t)=f(x,t-x)
To verify the preceding, note that
P(X\leqslant x,X+Y\leqslant t)=\int\int_{u\leqslant x,u+v\leqslant t}f(u,v)d v\,d u=\int_{-\infty}^{x}\int_{-\infty}^{t-u}f(u,v)d v\,d u
=\int_{-\infty}^{x}\int_{-\infty}^{t}f(u,y-u)d y\,d u
where the final equality follows from the change of variable v = y − u. Differentiating the preceding joint distribution function first with respect to t and then with respect to x completes the verification.
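Because the exponential case is concrete, this verification can also be checked symbolically. The following is a minimal sketch using sympy, with arbitrary concrete rates \mu_{1}=2, \mu_{2}=3 chosen only to keep the computation simple; since the exponentials are supported on (0, \infty), the limits of integration are adjusted accordingly.

```python
import sympy as sp

x, t, u, y = sp.symbols('x t u y', positive=True)
mu1, mu2 = 2, 3  # illustrative concrete rates, not from the text

# joint density of the independent exponentials X and Y
f = lambda a, b: mu1 * sp.exp(-mu1 * a) * mu2 * sp.exp(-mu2 * b)

# P(X <= x, X + Y <= t) after the substitution v = y - u; since the
# exponentials live on (0, oo), the region is 0 <= u <= x, u <= y <= t
F = sp.integrate(f(u, y - u), (y, u, t), (u, 0, x))

# differentiating first in t and then in x should recover f(x, t - x)
print(sp.simplify(sp.diff(F, t, x) - f(x, t - x)))  # prints 0
```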
Applying the preceding to our example yields
f_{X_{1}|X_{1}+X_{2}}(x|t)={\frac{f_{X_{1},X_{1}+X_{2}}(x,t)}{f_{X_{1}+X_{2}}(t)}}=\frac{\mu_{1}e^{-\mu_{1}x}\mu_{2}e^{-\mu_{2}(t-x)}}{f_{X_{1}+X_{2}}(t)},\quad0\leqslant{x}\leqslant t
=C e^{-(\mu_{1}-\mu_{2})x},\quad0\leqslant x\leqslant t
where
C={\frac{\mu_{1}\mu_{2}e^{-\mu_{2}t}}{f_{X_{1}+X_{2}}(t)}}
Now, if \mu_{1}=\mu_{2}, then
f_{X_{1}|X_{1}+X_{2}}(x|t)=C,\quad0\leqslant x\leqslant t
yielding that C = 1/t and that, given X_{1}+X_{2}=t, X_{1} is uniformly distributed on (0, t).
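This uniformity is easy to see in simulation: conditioning on the event X_{1}+X_{2}=t can be approximated by keeping only those sample pairs whose sum falls in a thin window around t. A minimal sketch follows; the rate, window width, and sample size are illustrative choices, not from the text.

```python
import numpy as np

rng = np.random.default_rng(0)
mu, t, eps, n = 1.0, 2.0, 0.01, 10_000_000  # illustrative values

x1 = rng.exponential(1 / mu, n)  # numpy parameterizes by scale = 1/rate
x2 = rng.exponential(1 / mu, n)

# approximate the conditioning event {X1 + X2 = t} by a thin window around t
cond = x1[np.abs(x1 + x2 - t) < eps]

# a uniform (0, t) variable has mean t/2 and variance t^2/12
print(cond.mean(), t / 2)       # both close to 1.0
print(cond.var(), t**2 / 12)    # both close to 0.333
```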
On the other hand, if \mu_{1}\neq\mu_{2}, then we use
1=\int_{0}^{t}f_{X_{1}|X_{1}+X_{2}}(x|t)\,d x=\frac{C}{\mu_{1}-\mu_{2}}\left(1-e^{-(\mu_{1}-\mu_{2})t}\right)
to obtain
C={\frac{\mu_{1}-\mu_{2}}{1-e^{-(\mu_{1}-\mu_{2})t}}}
thus yielding the result
f_{X_{1}|X_{1}+X_{2}}(x|t)={\frac{(\mu_{1}-\mu_{2})e^{-(\mu_{1}-\mu_{2})x}}{1-e^{-(\mu_{1}-\mu_{2})t}}},\quad0\leqslant x\leqslant t
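The same windowing trick checks this case: the empirical conditional distribution function should match the one obtained by integrating the density above. Again a minimal sketch, with rates, t, window width, and sample size chosen only for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
mu1, mu2, t, eps, n = 2.0, 1.0, 1.5, 0.01, 10_000_000  # illustrative values

x1 = rng.exponential(1 / mu1, n)  # numpy parameterizes by scale = 1/rate
x2 = rng.exponential(1 / mu2, n)
cond = x1[np.abs(x1 + x2 - t) < eps]

# conditional CDF implied by the density above:
# F(x) = (1 - e^{-(mu1-mu2)x}) / (1 - e^{-(mu1-mu2)t}), 0 <= x <= t
d = mu1 - mu2
cdf = lambda x: (1 - np.exp(-d * x)) / (1 - np.exp(-d * t))

for q in (0.25 * t, 0.5 * t, 0.75 * t):
    print((cond <= q).mean(), cdf(q))  # each pair should nearly agree
```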
An interesting byproduct of our analysis is that
f_{X_{1}+X_{2}}(t)={\frac{\mu_{1}\mu_{2}e^{-\mu_{2}t}}{C}}=\begin{cases}\mu^{2}t e^{-\mu t}, & \text{if}\,\,\mu_{1}=\mu_{2}=\mu\\ {\frac{\mu_{1}\mu_{2}\left(e^{-\mu_{2}t}-e^{-\mu_{1}t}\right)}{\mu_{1}-\mu_{2}}}, & \text{if}\,\,\mu_{1}\neq\mu_{2}\end{cases}
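As a final sanity check, the \mu_{1}\neq\mu_{2} expression should integrate to 1 over (0, \infty) and should agree with the direct convolution of the two exponential densities. A minimal numerical sketch, with illustrative rates:

```python
import numpy as np
from scipy.integrate import quad

mu1, mu2 = 2.0, 1.0  # illustrative rates, not from the text

f_sum = lambda t: mu1 * mu2 * (np.exp(-mu2 * t) - np.exp(-mu1 * t)) / (mu1 - mu2)

# the byproduct density should integrate to 1 over (0, infinity) ...
print(quad(f_sum, 0, np.inf)[0])  # ~1.0

# ... and should agree with the direct convolution of the two densities
conv = lambda t: quad(
    lambda x: mu1 * np.exp(-mu1 * x) * mu2 * np.exp(-mu2 * (t - x)), 0, t
)[0]
for t in (0.5, 1.0, 2.0):
    print(f_sum(t), conv(t))  # each pair should nearly agree
```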