GCP Datalab – MNIST NN with TensorFlow

by allenlu2007

Reference:

Srivastava, Hinton, Krizhevsky, Sutskever, and Salakhutdinov, "Dropout: A Simple Way to Prevent Neural Networks from Overfitting," Journal of Machine Learning Research 15 (2014). https://www.cs.toronto.edu/~hinton/absps/JMLRdropout.pdf

Softmax regression alone: a single layer of weights (784 × 10 + 10 biases on MNIST) has so few parameters that there is little need to worry about overfitting.
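A minimal sketch of such a softmax-only classifier, assuming TensorFlow 2.x with tf.keras (the Datalab environment in the original post shipped an older TensorFlow API, but the model is the same):

```python
import tensorflow as tf

# Plain softmax regression on MNIST: one Dense layer, 7,850 parameters.
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train = x_train.reshape(-1, 784).astype("float32") / 255.0
x_test = x_test.reshape(-1, 784).astype("float32") / 255.0

model = tf.keras.Sequential([
    tf.keras.layers.Dense(10, activation="softmax", input_shape=(784,)),
])
model.compile(optimizer="sgd",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=5, batch_size=128)
print(model.evaluate(x_test, y_test))
```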


Deep NN: many parameters, so overfitting must be prevented during training. Common countermeasures (a sketch follows the list):

* regularization

* pruning

* dropout
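A sketch of how regularization and dropout look in code, assuming tf.keras; the layer sizes are illustrative, and the 0.5 hidden-unit drop rate follows the Srivastava et al. paper. Pruning usually operates on an already-trained network and is omitted here.

```python
import tensorflow as tf

# Deep MNIST classifier combining L2 weight regularization and dropout.
model = tf.keras.Sequential([
    # L2 regularization penalizes large weights during training.
    tf.keras.layers.Dense(512, activation="relu", input_shape=(784,),
                          kernel_regularizer=tf.keras.regularizers.l2(1e-4)),
    # Dropout randomly zeroes 50% of activations on each training step.
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.Dense(512, activation="relu"),
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```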


Regularization (e.g., L2 weight decay): changes the learned weights, so it affects both training and testing.

Pruning: changes the network structure, so it affects both training and testing.

Dropout: applied only in training; at test time the full network runs with no units dropped.
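This train/test asymmetry is easy to demonstrate. A minimal sketch, assuming tf.keras's Dropout layer: note that modern frameworks implement "inverted" dropout, scaling the surviving activations by 1/(1 − rate) during training, whereas the original paper instead multiplies the weights by the retention probability p at test time; the two are equivalent in expectation.

```python
import tensorflow as tf

# Dropout acts only when training=True; at inference it is an identity.
drop = tf.keras.layers.Dropout(0.5)
x = tf.ones((1, 8))

print(drop(x, training=True))   # roughly half the entries zeroed, rest = 2.0
print(drop(x, training=False))  # identical to x: dropout disabled at test time
```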

