GCP data lab – MNIST NN with TensorFlow
Dropout: A Simple Way to Prevent Neural Networks from Overfitting
Srivastava, Hinton, Krizhevsky, Sutskever, Salakhutdinov (JMLR, 2014), https://www.cs.toronto.edu/~hinton/absps/JMLRdropout.pdf
Softmax alone (a single linear layer): few parameters, so overfitting is rarely a concern.
Deep NN: many parameters, so overfitting must be prevented during training.
Regularisation (e.g. L2 weight decay): shapes the learned weights, so its effect carries over to both training and testing.
Pruning: removes weights permanently, so it also affects both training and testing.
Dropout: active only during training; at test time the full network is used, with activations rescaled to match the training-time expectation (inverted dropout does this rescaling during training instead).
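A minimal sketch of the inverted-dropout mechanism in NumPy (illustrative only, not the TensorFlow implementation; the function name and keep_prob parameter are our own): during training each unit is zeroed with probability 1 - keep_prob and the survivors are scaled by 1/keep_prob, so the expected activation is unchanged and the test-time layer is simply the identity.

```python
import numpy as np

def inverted_dropout(x, keep_prob, rng):
    """Zero each unit with probability (1 - keep_prob), then rescale
    survivors by 1/keep_prob so E[output] == input. Test time: identity."""
    mask = (rng.random(x.shape) < keep_prob).astype(x.dtype)
    return x * mask / keep_prob

rng = np.random.default_rng(0)
x = np.ones((4, 8))
y = inverted_dropout(x, keep_prob=0.5, rng=rng)
# Surviving units become 2.0, dropped units become 0.0;
# the expected value of each unit remains 1.0.
```

In TensorFlow this is what a Dropout layer does when called in training mode; during inference the layer passes activations through unchanged, which is why dropout is "only in training".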