Machine Learning

Batch Normalization is not a Norm!

This article discusses the different types of normalization used in a Convolutional Neural Network and why Batch Normalization is necessary. It also describes an experiment in which BatchNorm layers are randomly dropped from an Object Recognition model, with the drop rate controlled by a hyperparameter, and the resulting models are compared on validation accuracy. A sketch of such a setup follows below.
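The article itself does not include code, so here is a minimal illustrative sketch in PyTorch of what "randomly dropping BatchNorm layers based on a hyperparameter" could look like. The class name `RandomBNNet`, the keep-probability parameter `bn_keep_prob`, the channel widths, and the CIFAR-style 3x32x32 input are all assumptions for illustration, not the article's actual implementation.

```python
# A minimal sketch (assumed, not the article's code): each conv block
# keeps its BatchNorm layer only with probability `bn_keep_prob`,
# decided once at construction time.
import random
import torch
import torch.nn as nn

class RandomBNNet(nn.Module):
    def __init__(self, bn_keep_prob=0.5, num_classes=10):
        super().__init__()
        layers = []
        in_ch = 3
        for out_ch in (32, 64, 128):
            layers.append(nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1))
            # Hyperparameter-controlled coin flip: keep or drop this BatchNorm.
            if random.random() < bn_keep_prob:
                layers.append(nn.BatchNorm2d(out_ch))
            layers.append(nn.ReLU())
            layers.append(nn.MaxPool2d(2))
            in_ch = out_ch
        self.features = nn.Sequential(*layers)
        self.head = nn.Linear(128, num_classes)

    def forward(self, x):
        x = self.features(x)
        x = x.mean(dim=(2, 3))  # global average pooling
        return self.head(x)

# Usage: build models with different keep probabilities, train each,
# and compare their validation accuracy.
model = RandomBNNet(bn_keep_prob=0.7)
print(model(torch.randn(4, 3, 32, 32)).shape)  # torch.Size([4, 10])
```

Deciding the drops at construction time (rather than per forward pass) keeps each trained model's architecture fixed, so accuracy differences can be attributed to how many normalization layers it retained.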

Is ReLU ReLUvant?

This article discusses the role of ReLU in a Convolutional Neural Network. It also describes an experiment in which activation layers are randomly dropped from an Object Recognition model, with the drop rate controlled by a hyperparameter, and the resulting models are compared on validation accuracy. A sketch of the analogous setup follows below.
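As with the previous article, here is a minimal illustrative sketch in PyTorch of randomly dropping activation layers; the name `RandomReLUNet`, the parameter `act_keep_prob`, and the input shape are assumptions, not the article's actual code. Note that a block without its ReLU is purely linear, so stacked convolutions lose expressive power, which is what the experiment probes.

```python
# A minimal sketch (assumed, not the article's code): each conv block
# keeps its ReLU activation only with probability `act_keep_prob`,
# decided once at construction time.
import random
import torch
import torch.nn as nn

class RandomReLUNet(nn.Module):
    def __init__(self, act_keep_prob=0.5, num_classes=10):
        super().__init__()
        layers = []
        in_ch = 3
        for out_ch in (32, 64, 128):
            layers.append(nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1))
            layers.append(nn.BatchNorm2d(out_ch))
            # Hyperparameter-controlled coin flip: keep or drop this ReLU.
            if random.random() < act_keep_prob:
                layers.append(nn.ReLU())
            layers.append(nn.MaxPool2d(2))
            in_ch = out_ch
        self.features = nn.Sequential(*layers)
        self.head = nn.Linear(128, num_classes)

    def forward(self, x):
        x = self.features(x)
        return self.head(x.mean(dim=(2, 3)))  # global average pooling

# Usage: sweep act_keep_prob, train each model, and compare validation accuracy.
model = RandomReLUNet(act_keep_prob=0.3)
print(model(torch.randn(4, 3, 32, 32)).shape)  # torch.Size([4, 10])
```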