Step 1: For an integrating (averaging) voltmeter with integration time \(T\), the equivalent noise bandwidth is \(B_{\mathrm{eq}}=\dfrac{1}{2T}\). With input white noise of one-sided bandwidth \(B=5\,\text{kHz}\) and RMS value \(\sigma=10\ \text{mV}\), the output noise standard deviation is
\[
\sigma_{\text{avg}}=\sigma\sqrt{\frac{B_{\mathrm{eq}}}{B}}
=\sigma\sqrt{\frac{1}{2TB}}=0.01\,\text{V}\sqrt{\frac{1}{2T\cdot 5000}}
=\frac{0.1\ \text{mV}}{\sqrt{T}}.
\]
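As a quick numerical sanity check (not part of the original derivation), the snippet below evaluates this expression for a given \(T\); the function name `sigma_avg_mV` and its default arguments are illustrative:

```python
import math

def sigma_avg_mV(T_s, sigma_mV=10.0, B_Hz=5000.0):
    """Output noise std (mV) after averaging for T_s seconds:
    sigma * sqrt(B_eq / B), with B_eq = 1 / (2*T)."""
    return sigma_mV * math.sqrt(1.0 / (2.0 * T_s * B_Hz))

print(sigma_avg_mV(1.0))  # 0.1 -> matches 0.1 mV / sqrt(T) at T = 1 s
```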
Step 2: “99% accurate” for the 20 mV DC level means the error band is \(\pm 1\%\times 20\ \text{mV}=\pm 0.2\ \text{mV}\). For 95% certainty (\(\approx 1.96\sigma\) for Gaussian noise),
\[
1.96\,\sigma_{\text{avg}} \le 0.2\ \text{mV} \Rightarrow
\sigma_{\text{avg}}\le 0.102\ \text{mV}.
\]
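The allowed noise level can be checked the same way (again a sketch, with illustrative variable names):

```python
tol_mV = 0.01 * 20.0  # +/-1% of the 20 mV DC level
z_95 = 1.96           # two-sided 95% Gaussian quantile
print(tol_mV / z_95)  # 0.10204... mV, the bound on sigma_avg
```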
Step 3: Using \(\sigma_{\text{avg}}=\dfrac{0.1}{\sqrt{T}}\ \text{mV}\):
\[
\frac{0.1}{\sqrt{T}} \le 0.102
\ \Rightarrow\ \sqrt{T}\ge \frac{0.1}{0.102}\approx 0.98
\ \Rightarrow\ T \ge 0.98^{2}\approx 0.96\ \text{s}\approx 1.0\ \text{s}.
\]
\]
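The final arithmetic, verified numerically with the same illustrative names:

```python
sigma_coeff_mV = 0.1       # sigma_avg = 0.1 mV / sqrt(T), from Step 1
sigma_max_mV = 0.2 / 1.96  # allowed noise std, from Step 2
T_min = (sigma_coeff_mV / sigma_max_mV) ** 2
print(T_min)               # 0.9604 s, rounded up to ~1.0 s
```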
Thus the minimum averaging time is \(\boxed{1.0\ \text{s}}\).