Step 1: Understanding the probability mass function.
We are given a Bernoulli probability mass function: \( X \) takes the value 1 with probability \( \theta \) and the value 0 with probability \( 1 - \theta \).
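Assuming this standard Bernoulli form \( P(X = 1) = \theta \), \( P(X = 0) = 1 - \theta \), the two moments used in the steps below follow directly:

\[
E[X] = 0 \cdot (1 - \theta) + 1 \cdot \theta = \theta,
\qquad
E[X^2] = 0^2 \cdot (1 - \theta) + 1^2 \cdot \theta = \theta.
\]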
Step 2: Unbiased estimator.
An estimator \( T(X) \) is unbiased if \( E[T(X)] = \theta \). For \( X_1 \) and \( X_2 \), we calculate the expected value of \( \frac{X_1 + X_2}{2} \), which is:
\[
E\left[\frac{X_1 + X_2}{2}\right] = \frac{E[X_1] + E[X_2]}{2} = \frac{\theta + \theta}{2} = \theta.
\]
Thus, \( \frac{X_1 + X_2}{2} \) is an unbiased estimator of \( \theta \).
Next, for the other estimators:
- \( \frac{X_1^2 + X_2}{2} \) is also unbiased: since \( X_1 \) takes only the values 0 and 1, we have \( X_1^2 = X_1 \), so \( E[X_1^2] = E[X_1] = \theta \). Hence \( E\left[\frac{X_1^2 + X_2}{2}\right] = \frac{\theta + \theta}{2} = \theta \).
- \( \frac{X_1^2 + X_2^2}{2} \) is unbiased for the same reason: \( E[X_1^2] = E[X_2^2] = \theta \), so the estimator has expectation \( \theta \).
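As a sanity check, the unbiasedness of all three estimators can be verified by a small Monte Carlo simulation. This is an illustrative sketch, assuming \( X_1, X_2 \) are i.i.d. Bernoulli(\( \theta \)); the function name `simulate` and the trial count are choices made here, not part of the problem.

```python
import random

def simulate(theta, n_trials=200_000, seed=0):
    """Estimate the mean of estimators (A), (B), (C) over many trials.

    Assumes X1, X2 are i.i.d. Bernoulli(theta). If each estimator is
    unbiased, its empirical mean should be close to theta.
    """
    rng = random.Random(seed)
    sums = [0.0, 0.0, 0.0]
    for _ in range(n_trials):
        x1 = 1 if rng.random() < theta else 0
        x2 = 1 if rng.random() < theta else 0
        sums[0] += (x1 + x2) / 2           # (A)
        sums[1] += (x1**2 + x2) / 2        # (B)
        sums[2] += (x1**2 + x2**2) / 2     # (C)
    return [s / n_trials for s in sums]

means = simulate(0.3)
```

Note that since \( x^2 = x \) for \( x \in \{0, 1\} \), the three estimators coincide on every sample, so their empirical means agree exactly, each hovering near \( \theta \).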
Step 3: Conclusion.
The correct answers are (A), (B), and (C).