We are given a random sample \( X_1, X_2, \dots, X_n \) from a normal distribution \( N(-\theta, \theta) \), where \( \theta > 0 \) is the unknown parameter. We need to determine which of the following options is correct.
Step 1: Understanding the Distribution
The random variables \( X_1, X_2, \dots, X_n \) are from the normal distribution \( N(-\theta, \theta) \), which means:
The mean of the distribution is \( -\theta \),
The variance of the distribution is \( \theta \).
Step 2: Factorization Theorem
To determine which statistic is minimal sufficient, we apply the Factorization Theorem: a statistic \( T \) is sufficient if the likelihood can be written as \( L(\theta) = g(T(\mathbf{X}), \theta)\, h(\mathbf{X}) \), where \( h \) does not depend on \( \theta \). The likelihood for this model is \[ L(\theta) = \prod_{i=1}^{n} \frac{1}{\sqrt{2\pi \theta}} \exp \left( -\frac{(X_i + \theta)^2}{2\theta} \right) = \frac{1}{(2\pi \theta)^{n/2}} \exp \left( -\sum_{i=1}^{n} \frac{(X_i + \theta)^2}{2\theta} \right). \] Expanding the square, \( \sum_{i=1}^{n} (X_i + \theta)^2 = \sum_{i=1}^{n} X_i^2 + 2\theta \sum_{i=1}^{n} X_i + n\theta^2 \), so \[ L(\theta) = \underbrace{\frac{1}{(2\pi \theta)^{n/2}} \exp \left( -\frac{\sum_{i=1}^{n} X_i^2}{2\theta} - \frac{n\theta}{2} \right)}_{g\left(\sum_i X_i^2,\, \theta\right)} \cdot \underbrace{\exp \left( -\sum_{i=1}^{n} X_i \right)}_{h(\mathbf{X})}. \] Since the factor \( \exp\left(-\sum_i X_i\right) \) is free of \( \theta \), the likelihood depends on the data, through \( \theta \), only via \( \sum_{i=1}^{n} X_i^2 \).
Step 3: Identifying the Minimal Sufficient Statistic
The statistic \( T = \sum_{i=1}^{n} X_i^2 \) is sufficient for \( \theta \) by the factorization above. It is also minimal: for two samples \( \mathbf{x} \) and \( \mathbf{y} \), the likelihood ratio \[ \frac{L(\theta; \mathbf{x})}{L(\theta; \mathbf{y})} = \exp \left( -\frac{\sum_i x_i^2 - \sum_i y_i^2}{2\theta} \right) \exp \left( -\sum_i x_i + \sum_i y_i \right) \] is constant in \( \theta \) if and only if \( \sum_i x_i^2 = \sum_i y_i^2 \). Thus \( \sum_{i=1}^{n} X_i^2 \) is the minimal sufficient statistic.
Step 4: Evaluating the Other Options
(A) \( \sum_{i=1}^{n} X_i \): This is not sufficient: the factor \( \exp\left( -\sum_i X_i^2 / (2\theta) \right) \) of the likelihood depends on \( \theta \), and \( \sum_i X_i^2 \) cannot be recovered from \( \sum_i X_i \) alone.
(C) \( \left( \bar{X}, S^2 \right) = \left( \frac{1}{n} \sum_{i=1}^{n} X_i, \frac{1}{n-1} \sum_{j=1}^{n} (X_j - \bar{X})^2 \right) \): This pair is sufficient, but it is not minimal, since the one-dimensional statistic \( \sum_{i=1}^{n} X_i^2 \) is already sufficient. It is also not complete in this model: \( E(\bar{X}) = -\theta \) and \( E(S^2) = \theta \), so \( E(\bar{X} + S^2) = 0 \) for every \( \theta > 0 \) even though \( \bar{X} + S^2 \) is not identically zero.
(D) \( -\frac{1}{n} \sum_{i=1}^{n} X_i = -\bar{X} \): Since \( E(\bar{X}) = -\theta \), this is an unbiased estimator of \( \theta \), but it is not sufficient, for the same reason as (A). Thus, the correct answer is (B) \( \sum_{i=1}^{n} X_i^2 \).
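The minimality argument above can be checked numerically: for two samples sharing the same sum of squares, the log-likelihood ratio should not vary with \( \theta \). A small sketch (the two samples are arbitrary, chosen only so that their sums of squares agree while their sums differ):

```python
import math

def log_lik(theta, xs):
    """Log-likelihood of a sample from N(-theta, theta)."""
    n = len(xs)
    return (-0.5 * n * math.log(2 * math.pi * theta)
            - sum((x + theta) ** 2 for x in xs) / (2 * theta))

x = [1.0, 2.0, 2.0]    # sum of squares = 9, sum = 5
y = [-1.0, -2.0, 2.0]  # sum of squares = 9, sum = -1
assert sum(v * v for v in x) == sum(v * v for v in y)

# Because the sums of squares match, the log-likelihood ratio
# log L(theta; x) - log L(theta; y) is constant in theta:
ratios = [log_lik(t, x) - log_lik(t, y) for t in (0.5, 1.0, 2.0, 5.0)]
print(all(abs(r - ratios[0]) < 1e-9 for r in ratios))  # True
```

Here the constant value of the ratio is \( \exp(-\sum x_i + \sum y_i) \) on the likelihood scale, exactly as the formula in Step 3 predicts.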
Let \( (X_1, X_2, X_3) \) follow the multinomial distribution with the number of trials being 100 and the probability vector \( \left( \frac{3}{10}, \frac{1}{10}, \frac{3}{5} \right) \).
Then \( E(X_2 | X_3 = 40) \) equals:
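For a multinomial, conditioning on one cell leaves the remaining trials multinomially distributed over the other cells with renormalised probabilities, so \( X_2 \mid X_3 = 40 \sim \text{Binomial}\left(60, \frac{1/10}{3/10 + 1/10}\right) \). A sketch computing this, with a Monte Carlo check of the conditional mean:

```python
import random

# Given X3 = x3, the remaining n - x3 trials fall in categories 1 and 2
# with renormalised probabilities, so X2 | X3 = x3 ~ Binomial(n - x3, p2/(p1+p2)).
n = 100
p1, p2, p3 = 3 / 10, 1 / 10, 3 / 5
x3 = 40

cond_p2 = p2 / (p1 + p2)          # 1/4
exact = (n - x3) * cond_p2
print(exact)  # 15.0

# Monte Carlo check of the conditional mean:
random.seed(0)
draws = [sum(random.random() < cond_p2 for _ in range(n - x3))
         for _ in range(20000)]
print(round(sum(draws) / len(draws), 2))
```

So \( E(X_2 \mid X_3 = 40) = 60 \times \frac{1}{4} = 15 \).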
An electricity utility company charges ₹7 per kWh. If a 40-watt desk light is left on for 10 hours each night for 180 days, what would be the cost of energy consumption? If the desk light is instead on for 2 more hours each night over the same 180 days, what would be the percentage increase in the cost of energy consumption?
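The arithmetic is: energy in kWh is power (kW) × hours × days, and cost is energy × tariff. Since cost is linear in hours, going from 10 to 12 hours per night is a 20% increase. A quick check:

```python
# Quick arithmetic check for the desk-light question (₹7/kWh tariff, 40 W bulb):
rate = 7                              # rupees per kWh
energy_10h = 40 * 10 * 180 / 1000     # kWh at 10 h/night over 180 nights
energy_12h = 40 * 12 * 180 / 1000     # kWh with 2 extra hours per night

cost_10h = energy_10h * rate
cost_12h = energy_12h * rate
increase = (cost_12h - cost_10h) / cost_10h * 100

print(round(cost_10h, 2), round(cost_12h, 2), round(increase, 2))  # 504.0 604.8 20.0
```

So the 10-hour usage costs ₹504, and the extra 2 hours per night raises the cost by 20%.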
In the context of the given figure, which one of the following options correctly represents the entries in the blocks labelled (i), (ii), (iii), and (iv), respectively?

A bag contains Violet (V), Yellow (Y), Red (R), and Green (G) balls. On counting them, the following results are obtained:
(i) The sum of Yellow balls and twice the number of Violet balls is 50.
(ii) The sum of Violet and Green balls is 50.
(iii) The sum of Yellow and Red balls is 50.
(iv) The sum of Violet and twice the number of Red balls is 50.
Which one of the following Pie charts correctly represents the balls in the bag?
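The four conditions form a linear system in the counts \( V, Y, R, G \). A small sketch that recovers the counts by substitution and a direct search over the Violet count (variable names are illustrative):

```python
# Solve the four ball-count equations:
#   (i)  Y + 2V = 50   (ii) V + G = 50   (iii) Y + R = 50   (iv) V + 2R = 50
solutions = []
for v in range(0, 26):            # Y = 50 - 2V >= 0 forces V <= 25
    y = 50 - 2 * v                # from (i)
    g = 50 - v                    # from (ii)
    if (50 - v) % 2 == 0:         # (iv) needs 50 - V to be even
        r = (50 - v) // 2         # from (iv)
        if y + r == 50:           # check (iii)
            solutions.append((v, y, r, g))

print(solutions)  # [(10, 30, 20, 40)]
```

With \( V = 10, Y = 30, R = 20, G = 40 \), the total is 100 balls, so the pie-chart shares are V 10%, Y 30%, R 20%, G 40% (sector angles 36°, 108°, 72°, 144°).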