Step 1: Recall the definitions of returns to scale.
- Constant returns to scale: output increases in exact proportion to inputs → \(f(tx_1, tx_2) = t f(x_1, x_2)\) for all \(t > 0\).
- Increasing returns to scale: output increases more than proportionately → \(f(tx_1, tx_2) > t f(x_1, x_2)\) for all \(t > 1\).
- Decreasing returns to scale: output increases less than proportionately → \(f(tx_1, tx_2) < t f(x_1, x_2)\) for all \(t > 1\).
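As a quick sanity check of the decreasing-returns condition, we can test it numerically on a hypothetical Cobb-Douglas technology \(f(x_1, x_2) = x_1^{0.3} x_2^{0.3}\), whose exponents sum to \(0.6 < 1\). The specific functional form and input values below are illustrative assumptions, not part of the question:

```python
# Numeric check of decreasing returns to scale for a hypothetical
# Cobb-Douglas technology f(x1, x2) = x1**0.3 * x2**0.3.
# Exponents sum to 0.6 < 1, so doubling both inputs should yield
# less than double the output.

def f(x1, x2):
    return x1 ** 0.3 * x2 ** 0.3

x1, x2, t = 4.0, 9.0, 2.0          # arbitrary illustrative inputs, t > 1
scaled = f(t * x1, t * x2)         # output after scaling both inputs by t
proportional = t * f(x1, x2)       # proportional benchmark t * f(x1, x2)

print(scaled < proportional)       # decreasing RTS: scaled output falls short
```

Here \(f(tx_1, tx_2) = t^{0.6} f(x_1, x_2) < t f(x_1, x_2)\) whenever \(t > 1\), so the comparison prints `True`.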
Step 2: Apply to question.
Since the question asks for decreasing returns to scale, the defining condition (for \(t > 1\)) is:
\[
f(tx_1, tx_2)<t f(x_1, x_2)
\]
Final Answer:
\[
\boxed{f(tx_1, tx_2)<t f(x_1, x_2)}
\]