Deep Learning

Batch Normalization is not a Norm!

This article discusses the different types of normalization used in Convolutional Neural Networks and explains why Batch Normalization is necessary. It includes an experiment that randomly drops BatchNorm layers in an object-recognition model, controlled by a hyperparameter, and measures the effect on validation accuracy.
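A minimal sketch of the layer-dropping idea, assuming a PyTorch model; the function name `drop_batchnorm_layers` and the drop-probability hyperparameter `p_drop` are illustrative, not taken from the article:

```python
import random
import torch.nn as nn

def drop_batchnorm_layers(model: nn.Module, p_drop: float, seed: int = 0) -> int:
    """Replace BatchNorm2d layers in `model` with nn.Identity, each with probability `p_drop`.

    Returns the number of BatchNorm layers that were dropped.
    """
    rng = random.Random(seed)
    dropped = 0
    for parent in model.modules():
        for name, child in list(parent.named_children()):
            if isinstance(child, nn.BatchNorm2d) and rng.random() < p_drop:
                setattr(parent, name, nn.Identity())  # skip normalization at this position
                dropped += 1
    return dropped

# Example: drop roughly half of the BatchNorm layers of a small ResNet before training.
from torchvision.models import resnet18
model = resnet18(num_classes=10)
print(drop_batchnorm_layers(model, p_drop=0.5), "BatchNorm layers dropped")
```

Training the modified model at several values of `p_drop` and recording validation accuracy gives the comparison against the unmodified baseline described above.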

Is ReLU ReLUvant?

This article discusses the role of ReLU activations in a Convolutional Neural Network. It also includes an experiment that randomly drops activation layers in an object-recognition model, controlled by a hyperparameter, and measures the effect on validation accuracy.
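The activation experiment can follow the same replacement pattern as the BatchNorm sketch above, here targeted at `nn.ReLU` modules (again, the function name and hyperparameter are assumptions):

```python
import random
import torch.nn as nn

def drop_activation_layers(model: nn.Module, p_drop: float, seed: int = 0) -> int:
    """Replace ReLU layers in `model` with nn.Identity, each with probability `p_drop`."""
    rng = random.Random(seed)
    dropped = 0
    for parent in model.modules():
        for name, child in list(parent.named_children()):
            if isinstance(child, nn.ReLU) and rng.random() < p_drop:
                setattr(parent, name, nn.Identity())  # adjacent convolutions now compose linearly
                dropped += 1
    return dropped
```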

Facial feature manipulation using Adversarial Networks

Given a facial image, this project generates 24 different images by altering the race, gender, and age of the person.
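One way to read the "24 images" is as a grid of attribute combinations fed to an attribute-conditioned generator. The sketch below is purely illustrative: the 4 x 2 x 3 split of the attributes, the one-hot conditioning, and the `DummyGenerator` interface are all assumptions, not details from the project.

```python
import itertools
import torch
import torch.nn as nn

# Illustrative attribute grid: 4 x 2 x 3 = 24 combinations (the project's actual split is not stated).
RACES = [f"race_{i}" for i in range(4)]
GENDERS = ["female", "male"]
AGES = ["young", "middle_aged", "old"]

def attribute_vectors():
    """Yield a label tuple and a concatenated one-hot attribute vector for every combination."""
    for r, g, a in itertools.product(range(len(RACES)), range(len(GENDERS)), range(len(AGES))):
        vec = torch.zeros(len(RACES) + len(GENDERS) + len(AGES))
        vec[r] = 1.0
        vec[len(RACES) + g] = 1.0
        vec[len(RACES) + len(GENDERS) + a] = 1.0
        yield (RACES[r], GENDERS[g], AGES[a]), vec

class DummyGenerator(nn.Module):
    """Stand-in for a trained attribute-conditioned generator (hypothetical interface)."""
    def __init__(self, n_attrs: int = 9):
        super().__init__()
        self.net = nn.Conv2d(3 + n_attrs, 3, kernel_size=3, padding=1)

    def forward(self, img, attrs):
        # Broadcast the attribute vector over the spatial grid and concatenate with the image.
        b, _, h, w = img.shape
        a = attrs.view(b, -1, 1, 1).expand(b, attrs.shape[-1], h, w)
        return torch.tanh(self.net(torch.cat([img, a], dim=1)))

generator = DummyGenerator()
face = torch.randn(1, 3, 128, 128)  # stand-in for a preprocessed facial image
with torch.no_grad():
    outputs = {labels: generator(face, vec.unsqueeze(0)) for labels, vec in attribute_vectors()}
print(len(outputs))  # 24 edited images, one per attribute combination
```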

Stanford DAWNBench - CIFAR10

An attempt to beat the DAWNBench CIFAR10 benchmark using a modified ResNet9.
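The repo's exact modifications to ResNet9 are not listed here; for context, a minimal ResNet9-style network of the kind used in fast CIFAR-10 training runs looks roughly like this (the layer widths and the adaptive pooling head are assumptions):

```python
import torch
import torch.nn as nn

def conv_bn(c_in, c_out, pool=False):
    """Conv -> BatchNorm -> ReLU, with an optional 2x2 max-pool."""
    layers = [
        nn.Conv2d(c_in, c_out, kernel_size=3, padding=1, bias=False),
        nn.BatchNorm2d(c_out),
        nn.ReLU(inplace=True),
    ]
    if pool:
        layers.append(nn.MaxPool2d(2))
    return nn.Sequential(*layers)

class Residual(nn.Module):
    """Two conv-bn-relu blocks added back to the input (identity shortcut)."""
    def __init__(self, c):
        super().__init__()
        self.block = nn.Sequential(conv_bn(c, c), conv_bn(c, c))

    def forward(self, x):
        return x + self.block(x)

class ResNet9(nn.Module):
    def __init__(self, num_classes=10):
        super().__init__()
        self.features = nn.Sequential(
            conv_bn(3, 64),
            conv_bn(64, 128, pool=True),
            Residual(128),
            conv_bn(128, 256, pool=True),
            conv_bn(256, 512, pool=True),
            Residual(512),
            nn.AdaptiveMaxPool2d(1),
        )
        self.classifier = nn.Linear(512, num_classes)

    def forward(self, x):
        x = self.features(x).flatten(1)
        return self.classifier(x)

model = ResNet9()
logits = model(torch.randn(2, 3, 32, 32))  # CIFAR-10 sized input
print(logits.shape)  # torch.Size([2, 10])
```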

Tiny-ImageNet Challenge

A modified version of the DenseNet architecture trained on the Tiny-ImageNet dataset.
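The specific DenseNet modifications are not detailed here; one common adaptation for Tiny-ImageNet's 200 classes and 64x64 images is to shrink the stem of a torchvision DenseNet-121, as sketched below (the 3x3/stride-1 stem and the dropped pooling layer are assumptions):

```python
import torch
import torch.nn as nn
from torchvision.models import densenet121

def tiny_imagenet_densenet(num_classes: int = 200) -> nn.Module:
    """DenseNet-121 adapted for 64x64 Tiny-ImageNet inputs (one possible modification)."""
    model = densenet121(num_classes=num_classes)
    # The default 7x7/stride-2 stem plus max-pool shrinks a 64x64 image to 16x16 before the
    # first dense block; a 3x3/stride-1 stem without the pool preserves more spatial detail.
    model.features.conv0 = nn.Conv2d(3, 64, kernel_size=3, stride=1, padding=1, bias=False)
    model.features.pool0 = nn.Identity()
    return model

model = tiny_imagenet_densenet()
logits = model(torch.randn(2, 3, 64, 64))
print(logits.shape)  # torch.Size([2, 200])
```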