Extension Revision Solutions - Prolog, Natural Language Processing, IAC

  1. Prolog programming

    1. Briefly explain the concept of memoization, and its relation to the predicates assert and asserta.

      Solution: Memoization is the process of recording the results of complicated computations or inferences that may be needed again later, so that they need not be re-computed. In Prolog this can be done by asserting each fact as it is computed. It is often necessary to use asserta (rather than assertz) to ensure that the asserted result is found before the rule that would otherwise be used to re-derive it.


        % Memoized factorial: after computing factorial(N-1), the result is
        % asserted at the front of the database with a cut, so any later call
        % with that argument matches the stored clause and never re-runs the
        % recursive rule.
        factorial(0, 1).
        factorial(N, NFact) :-
            N > 0,
            Nminus1 is N - 1,
            factorial(Nminus1, Nminus1Fact),
            asserta((factorial(Nminus1, Nminus1Fact) :- !)),
            NFact is N * Nminus1Fact.
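      The same idea can be sketched in Python with an explicit cache (the names here are illustrative, not part of the Prolog answer): each computed result is stored so that a repeated call looks it up instead of recursing, just as asserta places the computed fact in front of the recursive rule.

```python
# A Python sketch of the same memoization idea (illustrative only):
# the cache dict plays the role of the asserted facts, and checking it
# first mirrors asserta placing the stored result before the rule.
cache = {0: 1}  # base case, like the fact factorial(0, 1).

def factorial(n):
    if n in cache:        # a stored result short-circuits the computation
        return cache[n]
    result = n * factorial(n - 1)
    cache[n] = result     # "assert" the newly computed result
    return result

print(factorial(5))  # 120; the cache now holds the factorials of 0..5
```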

    2. What is the output of the following query?
             ?- functor(owns(jim, book(author(joanna, rowling),
                               title("Harry Potter and the Half-Blood Prince"))),
                        F, A).

      Solution: F = owns, A = 2. functor/3 unifies its second argument with the principal functor of the term (owns) and its third argument with that term's arity; owns takes two arguments (jim and the book term).


    3. Given the following facts:
             likes(mary, pizza).
             likes(mary, curry).
             likes(john, beer).
             likes(john, pizza).
             is_liked(X) :- likes(_, X).
      what is the output of the following queries?
      1. ?- setof(X, is_liked(X), List1).
        X = _G187
        List1 = [beer, curry, pizza]

      2. ?- findall(X, likes(_, X), List2).
        X = _G187
        List2 = [pizza, curry, beer, pizza]
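      The difference between the two answers can be mimicked in Python (an illustrative sketch, with the facts from the question stored in source order): setof returns a sorted, duplicate-free list of solutions, while findall returns every solution in database order.

```python
# The likes/2 facts from the question, in source order.
likes = [("mary", "pizza"), ("mary", "curry"),
         ("john", "beer"), ("john", "pizza")]

# findall(X, likes(_, X), List2): all solutions, in order, duplicates kept.
list2 = [x for _, x in likes]
print(list2)   # ['pizza', 'curry', 'beer', 'pizza']

# setof(X, is_liked(X), List1): solutions sorted and de-duplicated.
list1 = sorted(set(x for _, x in likes))
print(list1)   # ['beer', 'curry', 'pizza']
```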


  2. Natural Language Processing

    1. Write down the formula for computing the probability of a sentence w1, w2, …, wT with the tag sequence C1, C2, …, CT.

      Solution: ∏_{i=1}^{T} Pr(w_i | C_i) × ∏_{i=1}^{T} Pr(C_i | C_{i-1})
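      The formula can be evaluated directly for a toy sentence; the probability tables below are invented purely for illustration.

```python
# Toy lexical-generation and transition tables (numbers invented for
# illustration): Pr(word | tag) and Pr(tag | previous tag).
# "<s>" is an assumed start-of-sentence marker standing in for C_0.
lex = {("the", "DET"): 0.6, ("dog", "N"): 0.1, ("barks", "V"): 0.05}
trans = {("<s>", "DET"): 0.5, ("DET", "N"): 0.7, ("N", "V"): 0.4}

def sentence_prob(words, tags):
    """Product over i of Pr(w_i | C_i) * Pr(C_i | C_{i-1})."""
    p = 1.0
    prev = "<s>"
    for w, c in zip(words, tags):
        p *= lex[(w, c)] * trans[(prev, c)]
        prev = c
    return p

p = sentence_prob(["the", "dog", "barks"], ["DET", "N", "V"])
print(p)  # 0.6*0.5 * 0.1*0.7 * 0.05*0.4 = 0.00042 (up to float rounding)
```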

    2. Describe, with an example, a Markov chain. What would you need to add to your Markov chain in order to have a Hidden Markov model?

      Solution: A Markov chain is an edge-labelled directed graph, where each node represents a "state", e.g. a lexical category, and each edge label is the probability of moving to the state at the end of that directed arc.

      Markov chain diagram

      To convert our example into a Hidden Markov Model (HMM), we would need to add to each node a table of output probabilities - in this case lexical generation probabilities: e.g. for state N, the probability of producing each possible noun while in that state.
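      As a minimal sketch (state names and all probabilities invented for illustration): the Markov chain itself is just a transition table, and the HMM adds a per-state output table.

```python
import random

# A Markov chain over lexical categories: each state has only a table of
# transition probabilities (numbers invented for illustration).
chain = {
    "DET": [("N", 1.0)],
    "N":   [("V", 0.6), ("N", 0.4)],
    "V":   [("DET", 0.7), ("N", 0.3)],
}

def step(state):
    """Move to a next state according to the transition probabilities."""
    r, acc = random.random(), 0.0
    for nxt, p in chain[state]:
        acc += p
        if r < acc:
            return nxt
    return nxt  # guard against float rounding

# To turn the chain into a Hidden Markov Model, each state additionally
# gets a table of output (lexical generation) probabilities, e.g. which
# words the state can emit:
emit = {
    "DET": [("the", 0.7), ("a", 0.3)],
    "N":   [("dog", 0.5), ("cat", 0.5)],
    "V":   [("barks", 1.0)],
}
```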

  3. Briefly describe an IAC network, paying attention to:

    1. competitive pools
    2. inhibitory and excitatory connections
    3. the output function of a unit in an IAC net

    Solution: An IAC network is a collection of nodes and weighted connections, normally bidirectional connections.

    The nodes are divided into competitive pools. All the interconnections between nodes in any pool are inhibitory; i.e. their weights are negative.

    Connections between nodes in different pools, when they exist, are excitatory connections - i.e. the weights are positive. (Nodes are unconnected iff the weights are zero.)

    Output function. Denote the weight of the connection between node i and node j by wij, and let the current activation of node j be aj. Then the total net input of node j is:

    netj = Σi wij ai + extinputj
    and then the output of node j is aj + Δaj, where:
    if netj > 0:
        Δaj = (max − aj) · netj − decay · (aj − rest)
    otherwise:
        Δaj = (aj − min) · netj − decay · (aj − rest)

    where max, min, decay, and rest are parameters signifying the maximum and minimum activation, the decay rate in the absence of input, and the resting value to which activation decays in the absence of input.
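    The update rule can be written out directly. In the sketch below the parameter values (max = 1, min = -0.2, rest = -0.1, decay = 0.1) are typical defaults assumed for illustration; they are not given in the text above.

```python
# Sketch of the IAC activation update for a two-node competitive pool.
# Parameter values are assumed typical defaults, not from the text above.
MAX, MIN, REST, DECAY = 1.0, -0.2, -0.1, 0.1

def net_input(j, weights, activations, ext_input):
    """net_j = sum_i w_ij * a_i + extinput_j."""
    return sum(w * a for w, a in zip(weights[j], activations)) + ext_input[j]

def delta_a(a_j, net_j):
    """Change in activation of node j for one update step."""
    if net_j > 0:
        return (MAX - a_j) * net_j - DECAY * (a_j - REST)
    else:
        return (a_j - MIN) * net_j - DECAY * (a_j - REST)

# Two nodes in one pool, so their mutual connection is inhibitory
# (weight -0.2); node 0 receives external input, node 1 does not.
weights = [[0.0, -0.2], [-0.2, 0.0]]
acts = [0.0, 0.0]
ext = [0.4, 0.0]
for _ in range(10):
    nets = [net_input(j, weights, acts, ext) for j in range(2)]
    acts = [a + delta_a(a, n) for a, n in zip(acts, nets)]
print(acts)  # node 0 rises toward MAX; node 1 is inhibited below rest
```

    Running the loop shows the competitive behaviour: the externally driven node wins the pool while the inhibited node's activation settles between min and rest.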

CRICOS Provider Code No. 00098G