Fill the gaps to train a deep neural network.
If you want to read what a function (e.g. dnn) does, use ? to access its documentation (i.e. ?dnn).
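
As a minimal sketch of what the filled-in call could look like — assuming dnn here refers to cito::dnn and using iris as a stand-in for the exercise data set:

```r
library(cito)

# Fit a small deep neural network: Sepal.Length as a function of all other
# columns. hidden = units per hidden layer, lr = learning rate.
model <- dnn(Sepal.Length ~ .,
             data   = iris,
             hidden = c(50L, 50L),  # two hidden layers with 50 units each
             lr     = 0.01,         # learning rate
             epochs = 100,          # number of training epochs
             loss   = "mse")        # regression loss

# ?dnn opens the help page listing all available hyperparameters.
```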
Don’t be shy about playing with the hyperparameters. What happens when you adjust the learning rate? Does the number of epochs have an influence? What about the overall architecture of your network?
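
A sketch of the kind of experimentation suggested above, under the same assumptions (cito::dnn, iris as stand-in data); each call changes one hyperparameter so you can compare the reported losses across runs:

```r
# Higher learning rate: training may become unstable or diverge.
m_lr   <- dnn(Sepal.Length ~ ., data = iris, lr = 0.1,  epochs = 100)

# Fewer epochs: the network may stop before the loss has plateaued.
m_ep   <- dnn(Sepal.Length ~ ., data = iris, lr = 0.01, epochs = 20)

# Deeper/wider architecture: more capacity, but slower to train and
# easier to overfit.
m_arch <- dnn(Sepal.Length ~ ., data = iris, lr = 0.01, epochs = 100,
              hidden = c(100L, 100L, 100L))
```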