COMP9444 Neural Networks and Deep Learning
Quiz 2 (Probability and Backprop Variations)
This is an optional quiz to test your understanding of
the material from Week 2.
- Write the formula for a Gaussian distribution with mean μ
and standard deviation σ.
- Write the formula for Bayes' Rule,
in terms of a cause A and an effect B.
- Write the formula for the Entropy H(p) of a continuous
probability distribution p().
- Write the formula for the Kullback-Leibler Divergence
DKL(p || q)
between two continuous probability distributions
p() and q().
- Write the formulas for these Loss functions: Squared Error, Cross Entropy, Weight Decay.
(remember to define any variables you use)
- In the context of Supervised Learning,
explain the difference between Maximum Likelihood estimation
and Bayesian Inference.
- Briefly explain the concept of Momentum, as an enhancement for Gradient Descent.
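For self-checking, one standard set of forms for the distribution and information-theoretic quantities asked about above is (using x for the variable of integration):

```latex
% Gaussian with mean \mu and standard deviation \sigma
p(x) = \frac{1}{\sqrt{2\pi}\,\sigma}\,
       \exp\!\left(-\frac{(x-\mu)^2}{2\sigma^2}\right)

% Bayes' Rule, in terms of a cause A and an effect B
P(A \mid B) = \frac{P(B \mid A)\,P(A)}{P(B)}

% Entropy of a continuous distribution p()
H(p) = -\int p(x)\,\log p(x)\,dx

% Kullback-Leibler Divergence between p() and q()
D_{KL}(p \,\|\, q) = \int p(x)\,\log\frac{p(x)}{q(x)}\,dx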
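For the loss-function question, one common set of definitions is sketched below; the variable names (z_i for network outputs, t_i for targets, w_j for weights, λ for the decay constant) are a choice of notation, not prescribed by the quiz:

```latex
% Squared Error, over outputs z_i and targets t_i
E = \frac{1}{2}\sum_i (z_i - t_i)^2

% Cross Entropy, for targets t_i \in \{0,1\} and outputs z_i \in (0,1)
E = -\sum_i \bigl( t_i \log z_i + (1 - t_i)\log(1 - z_i) \bigr)

% Weight Decay, a penalty added to the loss, summed over weights w_j
E_{\text{decay}} = \frac{\lambda}{2}\sum_j w_j^2
```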
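As a hint for the Maximum Likelihood vs. Bayesian Inference question: Maximum Likelihood commits to a single parameter estimate, while Bayesian Inference maintains a full posterior distribution over parameters. In symbols (θ for parameters, D for the training data):

```latex
% Maximum Likelihood: a point estimate of the parameters
\theta_{ML} = \arg\max_{\theta} P(D \mid \theta)

% Bayesian Inference: a posterior distribution over parameters,
% combining the likelihood with a prior P(\theta)
P(\theta \mid D) = \frac{P(D \mid \theta)\,P(\theta)}{P(D)}
```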
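For the Momentum question, the idea can be sketched in a few lines of code: instead of stepping along the raw gradient, accumulate a "velocity" that averages past gradients, which smooths oscillations and speeds progress along consistent directions. This is a minimal illustration on a toy 1-D quadratic; the function and hyperparameter names (lr, beta) are illustrative, not part of the quiz.

```python
# Gradient descent with momentum on f(w) = 0.5 * w**2,
# whose gradient is simply w.

def momentum_gd(grad, w0, lr=0.1, beta=0.9, steps=300):
    """Minimise a function given its gradient, using momentum."""
    w = w0
    v = 0.0  # velocity: exponentially decaying sum of past gradients
    for _ in range(steps):
        v = beta * v + grad(w)  # accumulate gradient history
        w = w - lr * v          # step along the smoothed direction
    return w

w_final = momentum_gd(lambda w: w, w0=5.0)
```

With beta = 0, this reduces to plain gradient descent; larger beta gives past gradients more influence on the current step.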