Dropout
Definition
A regularization technique that randomly deactivates a fraction of neurons during training to prevent overfitting and improve generalization.
In-Depth Explanation
During each training step, dropout randomly sets a fraction of neuron outputs to zero (typically 20-50%). This forces the network to learn redundant representations rather than relying too heavily on any single neuron. At inference time, all neurons are active; to keep expected activations consistent, outputs are scaled by the keep probability (or, in the common inverted-dropout variant, surviving activations are scaled up by 1/(1 - p) during training, so no scaling is needed at inference). Dropout effectively trains an ensemble of sub-networks that share weights.
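The mechanism above can be sketched in a few lines of numpy. This is a minimal illustration of the inverted-dropout forward pass, not any particular framework's implementation; the function name and signature are my own:

```python
import numpy as np

def dropout_forward(x, p=0.5, training=True, rng=None):
    """Inverted dropout: zero each activation with probability p and scale
    survivors by 1/(1 - p), so the expected activation matches inference.
    At inference (training=False) the input passes through unchanged."""
    if not training or p == 0.0:
        return x
    rng = rng if rng is not None else np.random.default_rng(0)
    mask = rng.random(x.shape) >= p  # True where the neuron is kept
    return x * mask / (1.0 - p)

x = np.ones((4, 8))
out = dropout_forward(x, p=0.5)
# Each entry is either 0.0 (dropped) or 2.0 (kept and scaled by 1/0.5),
# so the expected value of every output stays 1.0.
```

Scaling during training (rather than at inference) is the design choice most modern frameworks make, because it leaves the inference path untouched.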
Real-World Example
Applying 30% dropout to hidden layers means each training step uses only 70% of neurons, making the model more robust.