#### COMP9444 Neural Networks and Deep Learning

### Quiz 3 (Hidden Units and Convolution)

This is an optional quiz to test your understanding of
the material from Weeks 3 and 4.

- Sketch the following activation functions, and write their formulas:
Sigmoid, Tanh, ReLU.
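As a quick self-check after sketching them, the three functions can be evaluated numerically. This NumPy sketch is just a reference implementation of the standard formulas, not part of the quiz:

```python
import numpy as np

def sigmoid(z):
    # sigmoid(z) = 1 / (1 + e^(-z)), output range (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def tanh(z):
    # tanh(z) = (e^z - e^(-z)) / (e^z + e^(-z)), output range (-1, 1)
    return np.tanh(z)

def relu(z):
    # ReLU(z) = max(0, z), output range [0, infinity)
    return np.maximum(0.0, z)

print(sigmoid(0.0), tanh(0.0), relu(-2.0))  # 0.5 0.0 0.0
```

Note that sigmoid and tanh saturate for large |z|, while ReLU does not saturate for positive inputs.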

- Explain how Dropout is used for neural networks, in both the training and testing phases.
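For reference, one common formulation ("inverted" dropout) can be sketched as below; the classic formulation instead leaves training activations unscaled and multiplies the weights by the keep probability at test time:

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout_train(activations, keep_prob=0.5):
    # Training: each unit is independently kept with probability keep_prob
    # and zeroed otherwise; dividing by keep_prob ("inverted" dropout)
    # keeps the expected activation equal to the test-time activation.
    mask = rng.random(activations.shape) < keep_prob
    return activations * mask / keep_prob

def dropout_test(activations):
    # Testing: all units are active and, with inverted dropout,
    # no rescaling is needed.
    return activations
```

The mask is redrawn for every training example (or mini-batch), so each update trains a different thinned sub-network.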

- Explain what is meant by overfitting in neural networks, and list four different methods for avoiding it.

- Write the formula for the Softmax loss function.
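As a numerical reference (assuming the loss meant is the cross-entropy of the softmax probabilities, as used in the lectures), the computation can be sketched as:

```python
import numpy as np

def softmax(z):
    # softmax(z)_i = e^(z_i) / sum_j e^(z_j)
    # subtracting max(z) improves numerical stability without
    # changing the result
    e = np.exp(z - np.max(z))
    return e / e.sum()

def softmax_loss(z, target):
    # cross-entropy (negative log-likelihood) of the correct class:
    # E = -log softmax(z)_target = -z_target + log(sum_j e^(z_j))
    return -np.log(softmax(z)[target])
```

For uniform outputs over n classes the loss is log(n), which is a handy sanity check for an untrained classifier.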

- Write the formula for the activation
*Z*^{i}_{j,k}
of the node at location (*j,k*) in the *i*^{th} filter of a
convolutional neural network, where this node is connected by weights
*K*^{i}_{l,m,n} to all nodes in an
*M × N* window across the *L* channels of the previous layer,
assuming bias weights are included and the activation function is *g*().
How many free parameters would there be in this layer?
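For reference, writing *V*_{l,j,k} for the activations of the previous layer (a symbol introduced here, not fixed by the question), one consistent formulation is:

```latex
Z^{i}_{j,k} = g\Big( b_i \;+\; \sum_{l=1}^{L} \sum_{m=1}^{M} \sum_{n=1}^{N}
              K^{i}_{l,m,n} \, V_{l,\, j+m,\, k+n} \Big)
```

Because the weights are shared across all spatial locations, each filter contributes *L M N* weights plus one bias, so a layer with *I* filters has *I*(*L M N* + 1) free parameters, independent of the layer's spatial size.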

- If the previous layer has size
*J × K*,
and a filter of size *M × N* is applied with stride
*s* and zero-padding of width *P*,
what will be the size of the resulting convolutional layer?
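One way to check an answer is to count the valid filter positions directly; this hypothetical helper (names are illustrative) applies the standard formula:

```python
def conv_output_size(J, K, M, N, s, P):
    # With zero-padding of width P on each side, the filter's top-left
    # corner can sit at offsets 0, s, 2s, ... up to (J + 2P - M), giving
    # floor((J + 2P - M) / s) + 1 positions along the first dimension,
    # and likewise for the second.
    rows = (J + 2 * P - M) // s + 1
    cols = (K + 2 * P - N) // s + 1
    return rows, cols

print(conv_output_size(32, 32, 5, 5, 1, 2))  # (32, 32): "same" padding
```

With stride 1 and *P* = (*M* − 1)/2 the output matches the input size, which is why that choice is often called "same" padding.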

- If max pooling with filter size
*F* and stride *s*
is applied to a layer of size *J × K*,
what will be the size of the resulting (downsampled) layer?
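Max pooling is the same sliding-window arithmetic with no padding, so the answer can be checked the same way (again, the helper name is illustrative):

```python
def pool_output_size(J, K, F, s):
    # A pooling window of size F applied with stride s and no padding
    # yields floor((J - F) / s) + 1 outputs along each dimension.
    return (J - F) // s + 1, (K - F) // s + 1

print(pool_output_size(24, 24, 2, 2))  # (12, 12)
```

The common choice *F* = 2, *s* = 2 halves each spatial dimension.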