Let's take a look at one of the images in the set. It looks like a 4! Pass cmap='jet' to imshow to display it with the jet colormap.
exampleNumber = 2  # Pick the example we want to visualize
example = batch[exampleNumber, :]  # Then we load that example
plt.imshow(np.reshape(example, [28, 28]), cmap='jet')  # Next we reshape it to 28x28 and display it
Before we can get to training our model using the data, we will have to define a few functions that the training and testing process can use.
Here we define the loss function for softmax regression.
def getLoss(w, x, y, lam):
    m = x.shape[0]  # First we get the number of training examples
    y_mat = oneHotIt(y)  # Next we convert the integer class coding into a one-hot representation
    scores = np.dot(x, w)  # Then we compute raw class scores given our input and current weights
    prob = softmax(scores)  # Next we perform a softmax on these scores to get their probabilities
    loss = (-1 / m) * np.sum(y_mat * np.log(prob)) + (lam / 2) * np.sum(w * w)  # We then find the loss of the probabilities
    grad = (-1 / m) * np.dot(x.T, (y_mat - prob)) + lam * w  # And compute the gradient for that loss
    return loss, grad
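For reference, this is the regularized cross-entropy loss the code computes. With m examples, one-hot label matrix Y, predicted probability matrix P = softmax(XW), and regularization strength lambda, the loss and its gradient (matching the two lines above term by term) are:

$$L(W) = -\frac{1}{m}\sum_{i=1}^{m}\sum_{k} Y_{ik}\,\log P_{ik} \;+\; \frac{\lambda}{2}\sum_{j,k} W_{jk}^2$$

$$\nabla_W L = -\frac{1}{m}\,X^{\top}(Y - P) \;+\; \lambda W$$

Note that the gradient of the regularization term is lambda * W, which is why the code adds lam*w rather than (lam/2)*w.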
The function below converts integer class coding, where the labels are a one-dimensional array, into a one-hot variant, where the array has size m (examples) x n (classes).