Batch Normalization

Batch Normalization is not a Norm!

This article discusses the different types of normalization used in Convolutional Neural Networks and explains why Batch Normalization is necessary. It also presents an experiment in which BatchNorm layers in an object-recognition model are randomly dropped, controlled by a hyperparameter, and the resulting effect on validation accuracy is measured.
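As a rough sketch of the experimental setup, the random-dropping idea could look like the following. This is an illustration, not the article's actual code: the layer names, the `build_layers` helper, and the `keep_prob` hyperparameter are all assumptions standing in for whatever model and knob the experiment really uses.

```python
import random

def build_layers(n_blocks, keep_prob, rng=None):
    """Build a conv-net layer list where each BatchNorm layer is kept
    only with probability `keep_prob` (the hyperparameter varied in
    the experiment). Layer names here are purely illustrative."""
    rng = rng or random.Random()
    layers = []
    for i in range(n_blocks):
        layers.append(f"conv{i}")
        if rng.random() < keep_prob:
            # BatchNorm survives the random drop for this block
            layers.append(f"batchnorm{i}")
        layers.append(f"relu{i}")
    return layers

# keep_prob=1.0 keeps every BatchNorm layer; keep_prob=0.0 drops them all.
print(build_layers(3, keep_prob=1.0))
print(build_layers(3, keep_prob=0.0))
```

Sweeping `keep_prob` from 1.0 down to 0.0 and retraining at each setting would then trace out how validation accuracy degrades as BatchNorm layers disappear.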