Lecture 17
Semantic Analysis: Syntax-Driven Semantics
CS 4705
Review
• Some representations of meaning:
– First order logic
– Frames
– etc.
• Some linguistically relevant categories we want to
represent
– Predicates, arguments, variables, quantifiers
– Categories, events, time, aspect
• Today: how can we compute meanings over these
categories, in these representations, from sentences?
Compositional Semantics
• The meaning of the whole is made up of the
meaning of its parts
– George cooks. Dan eats. Dan is sick.
– Cook(George) Eat(Dan) Sick(Dan)
– If George cooks and Dan eats, Dan will get sick.
(Cook(George) ∧ Eat(Dan)) → Sick(Dan)
Sick(Dan) → Cook(George)
• Part of the meaning derives from the people and
activities it’s about (predicates and arguments, or,
nouns and verbs) and part from the way they are
ordered and related grammatically: syntax
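As a tiny illustration (not part of the slides; the string-building helpers are made up), the predicate-argument meanings above can be written as functions applied to constant arguments:

    # Predicates applied to their arguments, mirroring the examples above.
    Cook = lambda x: f"Cook({x})"
    Eat  = lambda x: f"Eat({x})"
    Sick = lambda x: f"Sick({x})"
    print(Cook("George"), Eat("Dan"), Sick("Dan"))   # Cook(George) Eat(Dan) Sick(Dan)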
Syntax-Driven Semantics
          S   : eat(Dan)
        /   \
      NP     VP
       |      |
      Nom     V
       |      |
       N     eats
       |
      Dan
• So… can we link up syntactic structures to corresponding semantic representations to produce
the ‘meaning’ of a sentence in the course of
parsing it?
Specific vs. General-Purpose Rules
• We don’t want to have to specify for every
possible parse tree what semantic representation it
maps to
• We want to identify general mappings from parse
trees to semantic representations:
– Again (as with feature structures) we will augment the
lexicon and the grammar
– Rule-to-rule hypothesis: a mapping exists between rules
of the grammar and rules of semantic representation
Semantic Attachments
• Extend each grammar rule with instructions on
how to map the components of the rule to a
semantic representation (grammars are getting
complex)
S → NP VP {VP.sem(NP.sem)}
• Each semantic function is defined in terms of the
semantic representation of choice
• Problem: how to define these functions and how to
specify their composition so we always get the
meaning representation we want from our
grammar?
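A rough sketch of the idea in Python (illustrative only; the rule table and names below are hypothetical, not from the lecture): a semantic attachment such as {VP.sem(NP.sem)} is just a function from the children's semantics to the parent's semantics, stored alongside the CFG rule.

    # Hypothetical sketch: pair each CFG rule with a function that builds
    # the parent's semantics from the semantics of its children.
    ATTACHMENTS = {
        ("S", ("NP", "VP")): lambda np_sem, vp_sem: vp_sem(np_sem),  # {VP.sem(NP.sem)}
    }

    # Example: 'Dan eats', with NP.sem = Dan and VP.sem = λx.eat(x)
    np_sem = "Dan"
    vp_sem = lambda x: f"eat({x})"
    print(ATTACHMENTS[("S", ("NP", "VP"))](np_sem, vp_sem))   # eat(Dan)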
A ‘Simple’ Example
AyCaramba serves meat.
• Associating constants with constituents
– ProperNoun → AyCaramba {AyCaramba}
– MassNoun → meat {Meat}
• Defining functions to produce these from input
– NP → ProperNoun {ProperNoun.sem}
– NP → MassNoun {MassNoun.sem}
– Assumption: meaning representations of children are passed up to parents for non-branching constituents
• Verbs here are where the action is
– V → serves {∃e,x,y Isa(e,Serving) ∧ Server(e,x) ∧ Served(e,y)}
– Will every verb have its own distinct representation?
– Predicate(Agent,Patient)…
• How do we combine these pieces?
– VP → V NP
– Goal: ∃e,x Isa(e,Serving) ∧ Server(e,x) ∧ Served(e,Meat)
– VP semantics must tell us
• Which vars to be replaced by which args?
• How this replacement is done?
Lambda Notation
• Extension to FOPC
  λx P(x)
  λ + variable(s) + FOPC expression in those variables
• Lambda binding
  – Apply a λ-expression to logical terms to bind the λ-expression's parameters to those terms (lambda reduction)
  – Simple process: substitute the terms for the variables in the λ-expression
  λx P(x)(car)
  P(car)
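As a loose analogy (not part of the slides; P here is a stand-in predicate that just builds a logical-form string), Python's own lambda shows the same substitute-and-reduce behavior:

    # λx P(x) applied to the term 'car' reduces to P(car).
    P = lambda x: f"P({x})"
    expr = lambda x: P(x)     # corresponds to λx P(x)
    print(expr("car"))        # lambda reduction yields P(car)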
• Lambda notation provides requisite verb
semantics
– Formal parameter list makes variables within the body
of the logical expression available for binding to
external arguments provided by e.g. NPs
– Lambda reduction implements the replacement
• Semantic attachment for
  – V → serves
    {∃e,x,y Isa(e,Serving) ∧ Server(e,y) ∧ Served(e,x)}
  becomes
    {λx ∃e,y Isa(e,Serving) ∧ Server(e,y) ∧ Served(e,x)}
  – Now 'x' is available to be bound when V.sem is applied to NP.sem via VP → V NP {V.sem(NP.sem)}
  – λ-application binds x to the value of NP.sem (Meat)
  – λ-reduction replaces x within the λ-expression with Meat
  – Value of VP.sem becomes:
    {∃e,y Isa(e,Serving) ∧ Server(e,y) ∧ Served(e,Meat)}
• Similarly, we need a semantic attachment for
  S → NP VP {VP.sem(NP.sem)}
  to add the subject NP to our semantic representation of AyCaramba serves meat
  – We need another λ-expression in the value of VP.sem
  – But currently V.sem doesn't give us one
  – So, we change V.sem to include another λ-expression
  – V → serves
    {λx λy ∃e Isa(e,Serving) ∧ Server(e,y) ∧ Served(e,x)}
• VP semantics (V.sem(NP.sem)) binds the outer λ-expression to the object NP (Meat) but leaves the inner λ-expression for subsequent binding to the subject NP when the semantics of S is determined:
  {∃e Isa(e,Serving) ∧ Server(e,AyCaramba) ∧ Served(e,Meat)}
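A small sketch of the two-step binding just described, with Python lambdas standing in for the λ-expressions (the role and constant names follow the slides; the code itself is only illustrative):

    # V.sem for 'serves': λx λy ∃e Isa(e,Serving) ∧ Server(e,y) ∧ Served(e,x)
    serves = lambda x: lambda y: (
        f"∃e Isa(e,Serving) ∧ Server(e,{y}) ∧ Served(e,{x})")

    vp_sem = serves("Meat")       # VP → V NP: V.sem(NP.sem) binds x to Meat
    s_sem = vp_sem("AyCaramba")   # S → NP VP: VP.sem(NP.sem) binds y to AyCaramba
    print(s_sem)   # ∃e Isa(e,Serving) ∧ Server(e,AyCaramba) ∧ Served(e,Meat)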
Some Additional Problems to Solve
• Complex terms
  A restaurant serves meat.
  – 'a restaurant': ∃x Isa(x,Restaurant)
  – ∃e Isa(e,Serving) ∧ Server(e,<∃x Isa(x,Restaurant)>) ∧ Served(e,Meat)
  – Complex terms allow quantified expressions to appear where terms can, by providing rules to turn them into well-formed FOPC expressions
• Quantifier scope
Every restaurant serves meat.
Every restaurant serves every meat.
• Appropriate representations for other constituents?
  – Adjective phrases: intersective semantics (a small sketch follows this list)
    Nom → Adj Nom {λx Nom.sem(x) ∧ Isa(x,Adj.sem)}
    Adj → tiny
    e.g. 'cheap restaurant': λx Isa(x,Restaurant) ∧ Isa(x,Cheap)
    But… fake gun?
    ∃x Isa(x,Gun) ∧ AM(x,Fake)
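A small sketch of the intersective rule (the variable names are made up for illustration):

    # Nom.sem for 'restaurant' and Adj.sem for 'cheap'
    nom_restaurant = lambda x: f"Isa({x},Restaurant)"
    adj_cheap = "Cheap"

    # Nom → Adj Nom: {λx Nom.sem(x) ∧ Isa(x,Adj.sem)}
    nom_cheap_restaurant = lambda x: f"{nom_restaurant(x)} ∧ Isa({x},{adj_cheap})"
    print(nom_cheap_restaurant("x"))   # Isa(x,Restaurant) ∧ Isa(x,Cheap)

Applied blindly to 'fake', the same rule would assert Isa(x,Gun) ∧ Isa(x,Fake), which is why the non-intersective AM(x,Fake) representation above is needed instead.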
Doing Compositional Semantics
• To incorporate semantics into the grammar we must
  – Figure out the right representation for a single constituent based on the parts of that constituent (e.g. Adj)
  – Figure out the right representation for a category of constituents based on the other grammar rules making use of that constituent (e.g. Nom → Adj Nom)
• This gives us a set of function-like semantic attachments incorporated into our CFG
  – E.g. Nom → Adj Nom {λx Nom.sem(x) ∧ Isa(x,Adj.sem)}
What do we do with them?
• As we did with feature structures:
  – Alter an Earley-style parser so that when a constituent is completed (dot at the end of the rule), the attached semantic function is applied and a meaning representation is created and stored with the state
• Or, let the parser run to completion and then walk through the resulting tree, running the semantic attachments bottom-up
Option 1 (Integrated Semantic Analysis)
S → NP VP {VP.sem(NP.sem)}
– VP.sem has been stored in state representing VP
– NP.sem stored with the state for NP
– When rule completed, go get value of VP.sem, go get
NP.sem, and apply VP.sem to NP.sem
– Store result in S.sem.
• As fragments of input parsed, semantic fragments
created
• Can be used to block ambiguous representations
Drawback
• You also perform semantic analysis on orphaned constituents that play no role in the final parse
• Hence, the case for a pipelined approach: do semantics after the syntactic parse (sketched below)
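A minimal sketch of that pipelined option, assuming a toy tree class (none of these names come from the lecture):

    # Walk a finished parse tree bottom-up, applying each node's semantic
    # attachment to the semantics of its children.
    class Node:
        def __init__(self, sem_or_attachment, children=()):
            self.children = list(children)
            if self.children:
                self.attachment = sem_or_attachment   # function of child semantics
            else:
                self.sem = sem_or_attachment          # lexical semantics

    def interpret(node):
        if not node.children:
            return node.sem
        child_sems = [interpret(child) for child in node.children]
        return node.attachment(*child_sems)

    # 'Dan eats': S → NP VP {VP.sem(NP.sem)}; NP passes 'Dan' up, VP.sem = λx.eat(x)
    np = Node(lambda n: n, [Node("Dan")])
    vp = Node(lambda x: f"eat({x})")
    s = Node(lambda np_sem, vp_sem: vp_sem(np_sem), [np, vp])
    print(interpret(s))   # eat(Dan)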
Non-Compositional Language
• What do we do with language whose meaning isn't derived from the meanings of its parts?
  – Metaphor: You're the cream in my coffee.
    She's the cream in George's coffee.
    The break-in was just the tip of the iceberg.
    This was only the tip of Shirley's iceberg.
  – Idioms: The old man finally kicked the bucket.
    The old man finally kicked the proverbial bucket.
• Solutions?
– Mix lexical items with special grammar rules?
Summing Up
• Hypothesis: Principle of Compositionality
– Semantics of NL sentences and phrases can be
composed from the semantics of their subparts
• Rules can be derived which map syntactic analysis
to semantic representation (Rule-to-Rule
Hypothesis)
– Lambda notation provides a way to extend FOPC to
this end
– But coming up with rule-to-rule mappings is hard
• Idioms and metaphors complicate the process