1) Convergence in distribution vs. probability:
If a sequence \( X_n \) converges in distribution to a real constant \( c \), the limiting CDF is the step function \( F(x) = \mathbf{1}\{x \geq c\} \), which is continuous everywhere except at \( x = c \). For any \( \epsilon > 0 \), convergence of the CDFs at the continuity points \( c - \epsilon \) and \( c + \epsilon/2 \) gives \( P(|X_n - c| \geq \epsilon) \leq F_{X_n}(c - \epsilon) + 1 - F_{X_n}(c + \epsilon/2) \to 0 + 1 - 1 = 0 \) as \( n \to \infty \). This is exactly the definition of convergence in probability. Hence, convergence in distribution to a constant implies convergence in probability to that constant.
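A quick numerical sketch of this fact, using a hypothetical sequence \( X_n \sim N(c, 1/n) \) (an assumption for illustration, not part of the question): the variance shrinks, so \( X_n \) converges in distribution to the constant \( c \), and the empirical probability of landing \( \epsilon \)-far from \( c \) should drop toward 0.

```python
import numpy as np

rng = np.random.default_rng(0)
c, eps = 5.0, 0.1

# X_n ~ Normal(c, 1/n) converges in distribution to the constant c.
# Convergence in probability predicts P(|X_n - c| >= eps) -> 0 as n grows.
for n in [10, 100, 1000, 10000]:
    samples = rng.normal(loc=c, scale=1 / np.sqrt(n), size=100_000)
    prob = np.mean(np.abs(samples - c) >= eps)
    print(n, prob)
```

The printed probabilities decay toward 0, matching the limit \( P(|X_n - c| \geq \epsilon) \to 0 \).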
2) Explanation of the other options:
(B) Convergence in probability does not imply convergence in 3rd mean: \( P(|X_n - X| \geq \epsilon) \to 0 \) says nothing about the moments \( E|X_n - X|^3 \), which can blow up on an event of shrinking probability. Convergence in probability is strictly weaker than convergence in mean of any order.
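A standard textbook counterexample (an assumed construction, not taken from the question) makes this concrete: let \( X_n = n \) with probability \( 1/n \) and \( 0 \) otherwise. The exact values below need no simulation.

```python
# X_n = n with probability 1/n, and 0 otherwise.
# For any eps in (0, n): P(|X_n - 0| >= eps) = 1/n -> 0,
# so X_n -> 0 in probability; yet E|X_n - 0|^3 = n^3 * (1/n) = n^2 -> infinity,
# so X_n does not converge to 0 in 3rd mean.
for n in [10, 100, 1000]:
    prob_far = 1 / n              # P(|X_n| >= eps)
    third_moment = n**3 * (1 / n)  # E|X_n - 0|^3 = n^2
    print(n, prob_far, third_moment)
```

The probability of deviation vanishes while the 3rd moment diverges, exhibiting the gap between the two modes of convergence.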
(C) Convergence in distribution of two sequences separately does not imply that their sum converges in distribution to the sum of the limits: convergence in distribution constrains only the marginal laws of \( X_n \) and \( Y_n \), not their joint distribution, and the law of \( X_n + Y_n \) depends on that joint distribution.
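A classical counterexample sketch (an assumed construction for illustration): take \( X_n = Z \) and \( Y_n = -Z \) for every \( n \), with \( Z \sim N(0,1) \). Each sequence converges in distribution to \( N(0,1) \), but the sum is identically \( 0 \), not the \( N(0,2) \) one would get by summing independent copies of the limits.

```python
import numpy as np

rng = np.random.default_rng(1)
z = rng.standard_normal(100_000)

# X_n = Z and Y_n = -Z: each has law N(0,1) for every n,
# so each converges in distribution to N(0,1).
x, y = z, -z
print(np.var(x), np.var(y))    # both close to 1
# But X_n + Y_n = 0 identically, not N(0, 2):
print(np.max(np.abs(x + y)))   # exactly 0
```

The perfectly negative dependence between the sequences collapses the sum, which is exactly the information the marginal limits do not carry.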
(D) Convergence of expectations (\( E[X_n] \to E[X] \)) does not guarantee convergence in 1st mean (\( E|X_n - X| \to 0 \)): expectations can converge through cancellation of large positive and negative excursions. Convergence in expectation is a weaker condition than convergence in 1st mean.
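A standard counterexample sketch (an assumed construction, not from the question): let \( X_n = +n \) with probability \( 1/(2n) \), \( -n \) with probability \( 1/(2n) \), and \( 0 \) otherwise. The exact computation below shows \( E[X_n] \to 0 \) while \( E|X_n| \) stays fixed at 1.

```python
# X_n = +n w.p. 1/(2n), -n w.p. 1/(2n), 0 otherwise.
# E[X_n] = 0 for every n, so expectations converge to E[X] = 0 (X = 0),
# but E|X_n - 0| = n/(2n) + n/(2n) = 1 for all n,
# so X_n does not converge to 0 in 1st mean.
for n in [10, 100, 1000]:
    mean = n * (1 / (2 * n)) + (-n) * (1 / (2 * n))   # E[X_n]
    abs_mean = n * (1 / (2 * n)) + n * (1 / (2 * n))  # E|X_n|
    print(n, mean, abs_mean)
```

The positive and negative spikes cancel in expectation but not in absolute value, which is precisely why convergence of expectations is the weaker notion.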