Let X_{1}, X_{2}, \ldots, X_{n} be n i.i.d. random variables which follow a uniform distribution, U(0, θ). Write down the likelihood function and argue, without differentiating the function, what the maximum likelihood estimate of θ is.
The probability density function of U(0, θ) is
f(x) = \frac{1}{\theta } if 0 < x < θ and 0 otherwise.
Note that this is the PDF from Definition 8.2.1 with a = 0 and b = θ. Since the observations are independent, the likelihood is the product of the individual densities, so
L\left(x_{1},x_{2}, \ldots , x_{n}\mid \theta \right) = \left(\frac{1}{\theta }\right)^{n} if 0 < x_{i} < θ for all i = 1, \ldots, n, and 0 otherwise.
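As a minimal numerical sketch of this piecewise likelihood (the sample values below are hypothetical, chosen only for illustration):

```python
import numpy as np

def likelihood(theta, x):
    """Likelihood of U(0, theta) for an i.i.d. sample x:
    (1/theta)^n if every x_i lies in (0, theta), else 0."""
    x = np.asarray(x, dtype=float)
    if theta <= 0 or np.any(x <= 0) or np.any(x >= theta):
        return 0.0
    return (1.0 / theta) ** len(x)

sample = [0.9, 2.3, 1.4]          # hypothetical sample
print(likelihood(3.0, sample))    # (1/3)^3 ≈ 0.037
print(likelihood(2.0, sample))    # 0.0: theta is below max(sample)
```

The second call returns 0 because θ = 2.0 is smaller than the observation 2.3, violating the constraint 0 < x_{i} < θ.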
One can see that L\left(x_{1},x_{2}, \ldots , x_{n}\mid \theta \right) increases as θ decreases. The maximum of the likelihood function is therefore achieved at the smallest valid θ. The PDF requires 0 < x_{i} < θ for every i, so any valid θ must satisfy θ ≥ max (x_{1}, x_{2}, . . . , x_{n}) = x_{(n)}, and the smallest such value is θ = x_{(n)}. Thus, the maximum likelihood estimate of θ is \hat{\theta} = x_{(n)}, the greatest observed value in the sample.
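This argument can be checked numerically: evaluating the likelihood on a grid of candidate θ values, the maximiser lands at (the grid point closest to, from above) the sample maximum. The simulation below is a sketch under an arbitrarily chosen true θ and seed:

```python
import numpy as np

rng = np.random.default_rng(0)
theta_true = 5.0                                  # assumed true parameter
x = rng.uniform(0, theta_true, size=20)
x_max = x.max()                                   # x_(n), the sample maximum
n = len(x)

# Likelihood of U(0, theta): (1/theta)^n when theta covers all observations,
# and 0 otherwise (here theta >= x_max is used as the validity condition).
def likelihood(theta):
    return (1.0 / theta) ** n if theta >= x_max else 0.0

# Evaluate on a grid of candidate theta values and locate the maximiser.
thetas = np.linspace(0.01, 10.0, 2000)
L_vals = np.array([likelihood(t) for t in thetas])
theta_hat = thetas[np.argmax(L_vals)]

print(x_max, theta_hat)   # theta_hat sits just above x_max, up to grid spacing
```

The grid maximiser differs from x_{(n)} only by the grid resolution, in line with the argument above: the likelihood is zero below x_{(n)} and strictly decreasing above it.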