Question:

Innovative Labs, a startup focused on developing intelligent language models, is training a neural network to improve its text prediction accuracy. During training, the team fine-tunes the weights of the neural network based on the error (loss) obtained in the previous iteration in order to minimize error. This practice is known as _ _ _ _ _ _ _ _ _ _ _ _ _ .

Hint: Backpropagation helps neural networks learn by minimizing error through weight adjustment.
  • Forward Propagation
  • Activation Function
  • Back Propagation
  • Deep Learning

The correct option is (C): Back Propagation

Solution and Explanation

Step 1: Understanding the process.
The question describes adjusting the weights of a neural network based on the error (loss) from the previous output.

Step 2: Key concept.
Backpropagation is the algorithm used to update weights by propagating the error backward through the network to minimize loss.

Step 3: Evaluating the options.
(A) Forward propagation computes the network's output; it does not correct error.
(B) An activation function determines a neuron's output, not how weights are updated.
(C) Correct. Backpropagation updates the weights using the error.
(D) Deep learning is a broader field, not this specific weight-update procedure.
Step 4: Conclusion.
Thus, the process is Back Propagation.
Final Answer: Back Propagation.
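The idea in Step 2 can be sketched in code. Below is a minimal illustration of one backpropagation step for a single sigmoid neuron trained with squared-error loss; the names (`w`, `b`, `lr`) and the learning rate are illustrative assumptions, not part of the question.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def backprop_step(w, b, x, target, lr=0.5):
    # Forward propagation: compute the neuron's output.
    z = w * x + b
    y = sigmoid(z)
    # Loss: squared error between the output and the target.
    loss = 0.5 * (y - target) ** 2
    # Backward pass: the chain rule gives dL/dw and dL/db.
    dL_dy = y - target
    dy_dz = y * (1.0 - y)        # derivative of the sigmoid
    dL_dw = dL_dy * dy_dz * x
    dL_db = dL_dy * dy_dz
    # Update the weights in the direction that reduces the loss.
    return w - lr * dL_dw, b - lr * dL_db, loss

# Repeating the step drives the loss down, which is exactly the
# practice the question describes.
w, b = 0.1, 0.0
for _ in range(200):
    w, b, loss = backprop_step(w, b, x=1.0, target=1.0)
```

Forward propagation (option A) is only the first half of this loop; the weight update driven by the error is what makes it backpropagation.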