Linking Syntax and Semantics

Aim:
To describe an interpretation algorithm that uses the syntactic analysis of a sentence, together with grammar rules augmented by feature information, to specify how the semantic analysis of a phrase is derived from the semantic analyses of its constituents. We shall show how this works for a simple grammar and then investigate a few more complicated examples, including the handling of auxiliary verbs and prepositional phrases.
The concepts of lambda-expression and lambda-reduction are central to the handling of certain grammar rules.
Reference: Chapter 9 of Allen, J.: Natural Language Understanding, 2nd ed., Benjamin Cummings, 1995.
Keywords: CNP, common noun phrase, lambda reduction, VAR feature
Plan:
  • Every constituent must have an interpretation - lambda-expressions, lambda-reduction
  • Example: grammar rules with semantic interpretation via features
  • The VAR feature
  • Lexicon entries with SEM features
  • Handling PPs and VPs with lambda-expressions
  • Handling different types of PPs


Interpretation Algorithm


Lambda Expressions


Lambda Expressions 2


Lambda Expressions 3


9.2 Grammar / Lexicon with Semantic Interpretation


Parse Tree with Logical Form (SEM) Features

The whole of the parse/semantic tree for this sentence (Mary sees Jack) is shown below (= Fig. 9.1 of Allen):


Grammatical Rules with SEM Features

Like Table 9.3 in Allen
1. (S SEM (?semvp ?semnp)) → (NP SEM ?semnp) (VP SEM ?semvp)
2. (VP VAR ?v SEM (lambda a2 (?semv ?v a2))) → (V[_none] SEM ?semv)
3. (VP VAR ?v SEM (lambda a3 (?semv ?v a3 ?semnp))) →
   (V[_np] SEM ?semv) (NP SEM ?semnp)
4. (NP WH - VAR ?v SEM (PRO ?v ?sempro)) → (PRO SEM ?sempro)
5. (NP VAR ?v SEM (NAME ?v ?semname)) → (NAME SEM ?semname)
6. (NP VAR ?v SEM (?semdet ?v : (?semcnp ?v))) →
   (DET SEM ?semdet) (CNP SEM ?semcnp)
7. (CNP SEM ?semn) → (N SEM ?semn)
7(CNP SEM ?semn) → (N SEM ?semn)

Head_feature(S, VP, NP, CNP) = VAR
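Each rule's SEM is a template over ?variables that get filled in from the constituents' SEMs. A minimal sketch of this instantiation step (the name `instantiate` and the dict-of-bindings representation are assumptions, not Allen's notation):

```python
def instantiate(template, bindings):
    """Fill ?variables in a rule's SEM template with bound sub-SEMs."""
    if isinstance(template, str) and template.startswith('?'):
        return bindings[template]
    if isinstance(template, list):
        return [instantiate(t, bindings) for t in template]
    return template

# Rule 5: (NP VAR ?v SEM (NAME ?v ?semname)) → (NAME SEM ?semname)
np_sem = instantiate(['NAME', '?v', '?semname'],
                     {'?v': 'j1', '?semname': '"Jill"'})
print(np_sem)  # → ['NAME', 'j1', '"Jill"']

# Rule 1: (S SEM (?semvp ?semnp)) pairs the VP's lambda-expression
# with the subject NP's SEM; lambda-reduction then applies it.
s_sem = instantiate(['?semvp', '?semnp'],
                    {'?semvp': ['lambda', 'x', ['SEES1', 'ev1', 'x']],
                     '?semnp': np_sem})
print(s_sem)
```

Note that rule 1 merely builds the application (?semvp ?semnp); producing the final logical form still requires a lambda-reduction step.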


The VAR feature


Lexical Entries with Semantic Information

Table 9.2 of Allen
a(det AGR 3s SEM A)
can(aux SUBCAT base SEM CAN1)
car(n SEM CAR1 AGR 3s)
cry(v SEM CRY1 VFORM base SUBCAT _none)
decide(v SEM DECIDES1 VFORM base SUBCAT _none)
decide(v SEM DECIDES-ON1 VFORM base SUBCAT _pp:on)
dog(n SEM DOG1 AGR 3s)


Lexical Entries with Semantic Information 2

fish(n SEM FISH1 AGR 3s)
fish(n SEM (PLUR FISH1) AGR 3p)
house(n SEM HOUSE1 AGR 3s)
has(aux VFORM pres AGR 3s SUBCAT pastprt SEM perf)
he(pro SEM HE1 AGR 3s)
in(p PFORM {LOC MOT} SEM AT-LOC1)
Jill(name AGR 3s SEM "Jill")

Lexical Entries with Semantic Information 3

man(n SEM MAN1 AGR 3s)
men(n SEM (PLUR MAN1) AGR 3p)
on(p PFORM {LOC on} SEM ON-LOC1)
saw(v SEM SEES1 VFORM past SUBCAT _np)
see(v SEM SEES1 VFORM base SUBCAT _np IRREG-PAST + EN-PASTPRT +)
she(pro SEM SHE1 AGR 3s)
the(det SEM THE AGR {3s 3p})
to(to AGR - VFORM inf)
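One plausible way to store entries like the ones above is as a map from word to a list of feature structures, one per sense; ambiguous words such as fish simply get several entries. The dict representation and the `lookup` helper below are illustrative assumptions, not Allen's implementation.

```python
# A small slice of the lexicon above, one dict per word sense.
LEXICON = {
    'a':    [{'cat': 'det', 'AGR': '3s', 'SEM': 'A'}],
    'dog':  [{'cat': 'n', 'SEM': 'DOG1', 'AGR': '3s'}],
    'fish': [{'cat': 'n', 'SEM': 'FISH1', 'AGR': '3s'},
             {'cat': 'n', 'SEM': ['PLUR', 'FISH1'], 'AGR': '3p'}],
    'saw':  [{'cat': 'v', 'SEM': 'SEES1', 'VFORM': 'past', 'SUBCAT': '_np'}],
    'the':  [{'cat': 'det', 'SEM': 'THE', 'AGR': {'3s', '3p'}}],
}

def lookup(word, cat):
    """Return all entries for `word` with the given syntactic category."""
    return [e for e in LEXICON.get(word, []) if e['cat'] == cat]

print(lookup('fish', 'n'))  # two senses: singular and plural
```

The parser would consult `lookup` when scanning each input word, creating one lexical constituent per matching sense.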


Handling Semantic Interpretation

The chart parser can be modified so that it handles semantic interpretation as follows:


Interpreting an example sentence: Jill saw the dog

  1. The word Jill is parsed as a name. A new discourse variable, j1, is generated, and set as the VAR feature of the NAME.

  2. This constituent is used with rule 5 to build an NP. Since VAR is a head feature, VAR j1 is passed up to the NP, and the SEM is built using rule 5 to give SEM (NAME j1 "Jill").

  3. The lexical entry for the word saw generates a V constituent with the SEM <PAST SEES1> and a new VAR ev1.

  4. The lexical entry for the produces a DET constituent with SEM THE.


Interpreting an example sentence 2

  5. The lexical entry for dog produces an N constituent with SEM DOG1 and VAR d1. This in turn gives rise to a CNP constituent with the same SEM and VAR, via rule 7.

  6. Rule 6 combines the SEMs THE and DOG1 with the VAR d1 to produce an NP with the SEM (THE d1 : (DOG1 d1)) and VAR d1.

  7. (THE d1 : (DOG1 d1)) is combined with the SEM of the verb and its VAR by rule 3 to form a VP with VAR ev1 and SEM

    (lambda x (<PAST SEES1> ev1 x (THE d1 : (DOG1 d1))))


Interpreting an example sentence 3

  8. This is then combined with the subject NP (NAME j1 "Jill") to form the SEM

    (<PAST SEES1> ev1 (NAME j1 "Jill") (THE d1 : (DOG1 d1)))

    and VAR ev1 (after lambda-reduction).
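The derivation above can be traced end to end in code. This is a sketch only: the discourse variables (j1, ev1, d1) are written by hand here, where a real parser would generate them fresh, and the list representation of logical forms is an assumption of these notes' notation.

```python
def reduce_lambda(lam, arg):
    """Apply (lambda x body) to arg by substituting arg for x."""
    _, var, body = lam
    def sub(e):
        if e == var:
            return arg
        return [sub(x) for x in e] if isinstance(e, list) else e
    return sub(body)

subj = ['NAME', 'j1', '"Jill"']                               # rule 5
obj  = ['THE', 'd1', ':', ['DOG1', 'd1']]                     # rule 6
vp   = ['lambda', 'x', [['PAST', 'SEES1'], 'ev1', 'x', obj]]  # rule 3
s    = reduce_lambda(vp, subj)                                # rule 1
print(s)
# → [['PAST', 'SEES1'], 'ev1', ['NAME', 'j1', '"Jill"'],
#    ['THE', 'd1', ':', ['DOG1', 'd1']]]
```

The printed result is the logical form of step 8: the subject NP has been substituted for the lambda variable x in the VP's SEM.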


Completed Parse Tree for Jill saw the dog (Fig. 9.5 Allen)


9.3 Prepositional Phrases and Verb Phrases

Auxiliary Verbs


Prepositional Phrases and Verb Phrases 2


Prepositional Phrases


PP Modifying an NP


PP Modifying an NP 2


PP Modifying an NP 3


PP Modifying an NP 4


PP Modifying an NP 5


PP Modifying a VP


PP Modifying a VP 2

You are now in a position to do all of the exercises at http://www.cse.unsw.edu.au/~cs9414/Exercises/semantics.html


PP Modifying a VP 3

Parse tree for cry in the corner using this rule (Allen Fig. 9.6, corrected):


PP as a subcategorized constituent in a VP


PP as a subcategorized constituent in a VP 2


Using the PRED Feature in Grammar Rules


Logical Forms of PRED + and PRED – PPs


Parse Trees for PRED + and PRED –


Summary: Semantic Interpretation
The semantic interpretation algorithm is feature-driven: reading an augmented grammar rule from right to left describes how to build the SEM feature of the phrase from the SEM features of its constituents. When the semantic description has a gap (as with a VP awaiting its subject), the SEM feature is a lambda-expression.

CRICOS Provider Code No. 00098G
Copyright (C) Bill Wilson, 2006, except where another source is acknowledged.