COMP9444 Neural Networks and Deep Learning
Term 3, 2019

Exercises 3: Probability


  1. Bayes' Rule

    One bag contains 2 red balls and 3 white balls. Another bag contains 3 red balls and 2 green balls. One of these bags is chosen at random, and two balls are drawn randomly from that bag, without replacement. Both of the balls turn out to be red. What is the probability that the first bag is the one that was chosen?
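    After working the problem by hand, you can verify your answer numerically. The sketch below (function name is ours, for illustration only) applies Bayes' rule with exact fractions: each bag has prior 1/2, and the likelihood of drawing two reds without replacement is the product of the two per-draw probabilities.

    ```python
    from fractions import Fraction

    def posterior_bag1():
        """P(bag 1 | both balls red), computed via Bayes' rule."""
        prior = Fraction(1, 2)           # each bag chosen with probability 1/2
        # P(RR | bag 1): two reds without replacement from 2 red, 3 white
        like1 = Fraction(2, 5) * Fraction(1, 4)
        # P(RR | bag 2): two reds without replacement from 3 red, 2 green
        like2 = Fraction(3, 5) * Fraction(2, 4)
        # posterior = prior * likelihood / total evidence
        return prior * like1 / (prior * like1 + prior * like2)
    ```

    Using Fraction rather than floats keeps the arithmetic exact, so the result matches the hand calculation precisely.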

  2. Entropy and Kullback-Leibler Divergence

    Consider these two probability distributions on the same space Ω = {A, B, C, D}:

    p = ⟨ ½, ¼, ⅛, ⅛ ⟩
    q = ⟨ ¼, ⅛, ⅛, ½ ⟩
    1. Construct a Huffman tree for each of the distributions p and q.
    2. Compute the entropy H(p).
    3. Compute the KL-Divergence in each direction, DKL(p || q) and DKL(q || p). Which one is larger? Why?
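    Once you have computed the entropy and both divergences by hand, a short Python sketch (using base-2 logarithms, so all results are in bits; the function names are ours) lets you check the arithmetic:

    ```python
    import math

    p = [1/2, 1/4, 1/8, 1/8]
    q = [1/4, 1/8, 1/8, 1/2]

    def entropy(dist):
        # H(p) = -sum_i p_i log2 p_i
        return -sum(pi * math.log2(pi) for pi in dist if pi > 0)

    def kl(a, b):
        # D_KL(a || b) = sum_i a_i log2(a_i / b_i)
        return sum(ai * math.log2(ai / bi) for ai, bi in zip(a, b) if ai > 0)
    ```

    Note that kl(p, q) and kl(q, p) generally differ, which is why the KL-Divergence is not a metric; comparing the two values here illustrates the asymmetry the exercise asks about.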


Make sure you try answering the exercises yourself before checking the Sample Solutions.