Let \( X_1, X_2, \dots, X_{18} \) be a random sample from the distribution
\[
f(x; \theta) = \begin{cases}
\frac{2x}{\theta} e^{-x^2/\theta}, & x>0,\\
0, & x \leq 0.
\end{cases}
\]
What is the maximum likelihood estimator (MLE) of \( \theta \)?
Hint: The maximum likelihood estimator (MLE) of a parameter is found by differentiating the log-likelihood function with respect to the parameter and setting the derivative equal to zero. When the log-likelihood involves a sum of squares, the MLE is typically the sample mean of the squared observations.
Step 1: Write the likelihood function.
The likelihood function \( L(\theta) \) for the sample is given by:
\[
L(\theta) = \prod_{i=1}^{18} f(X_i; \theta) = \prod_{i=1}^{18} \frac{2X_i}{\theta} e^{-X_i^2/\theta}.
\]
Thus, the log-likelihood function is:
\[
\ell(\theta) = \sum_{i=1}^{18} \left[ \ln \left( \frac{2X_i}{\theta} \right) - \frac{X_i^2}{\theta} \right] = \sum_{i=1}^{18} \left( \ln(2X_i) - \ln(\theta) - \frac{X_i^2}{\theta} \right).
\]
Step 2: Differentiate and find the MLE.
The derivative of the log-likelihood function with respect to \( \theta \) is:
\[
\frac{d}{d\theta} \ell(\theta) = -\frac{18}{\theta} + \frac{1}{\theta^2} \sum_{i=1}^{18} X_i^2.
\]
Setting this equal to 0 and solving for \( \theta \), we get:
\[
\hat{\theta} = \frac{1}{18} \sum_{i=1}^{18} X_i^2.
\]
To confirm this is a maximum, note that the second derivative at \( \hat{\theta} \) is
\[
\ell''(\hat{\theta}) = \frac{18}{\hat{\theta}^2} - \frac{2}{\hat{\theta}^3} \sum_{i=1}^{18} X_i^2 = \frac{18}{\hat{\theta}^2} - \frac{36}{\hat{\theta}^2} = -\frac{18}{\hat{\theta}^2} < 0.
\]
Step 3: Conclusion.
Thus, the MLE of \( \theta \) is \( \hat{\theta} = \frac{1}{18} \sum_{i=1}^{18} X_i^2 \), and the correct answer is (A).