Batch Normalization (BatchNorm)
Definition
A technique that normalizes the inputs of each layer to have zero mean and unit variance, stabilizing and accelerating neural network training.
In-Depth Explanation
Batch normalization addresses internal covariate shift—the changing distribution of layer inputs during training. It normalizes activations across the batch dimension, then applies learned scale and shift parameters. Benefits include faster convergence, higher learning rates, reduced sensitivity to initialization, and some regularization effect.
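The normalize-then-scale-and-shift step described above can be sketched as follows. This is a minimal NumPy illustration of the training-time forward pass only; the function name, shapes, and epsilon value are illustrative, and real implementations also track running statistics for use at inference time.

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    # Normalize across the batch dimension (axis 0) so each
    # feature has zero mean and unit variance within the batch.
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)
    # Learned scale (gamma) and shift (beta) let the network
    # recover any representation the normalization removed.
    return gamma * x_hat + beta

# Example: a batch of 4 samples with 3 features each.
x = np.array([[1., 2., 3.],
              [4., 5., 6.],
              [7., 8., 9.],
              [10., 11., 12.]])
out = batch_norm(x, gamma=np.ones(3), beta=np.zeros(3))
```

With gamma = 1 and beta = 0, each column of `out` has zero mean and (approximately) unit variance; during training, gamma and beta are updated by gradient descent like any other parameters.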
Real-World Example
Adding batch normalization layers after each convolutional layer in a deep image classifier to train 10x faster.