lect14

Published on June 17, 2007

Author: Crystal

Source: authorstream.com

Computational Semantics:
Dr. Björn Gambäck
SICS – Swedish Institute of Computer Science AB
Stockholm, Sweden

Last Week: Grammar Coverage:
- Coverage is never complete → add more rules…
- All grammars leak → more specific rules, add more features

General NLP System Architecture:
[architecture diagram: User Modeling, Dialogue Management, Grammar, …]

Semantics:
- Syntax: how signs are related to each other
- Semantics: how signs are related to things
- Pragmatics: how signs are related to people
- Example: "Mr. Smith is expressive"

Compositional Semantics:
- Compositional Semantics = the abstract meaning of a sentence (built from the meaning of its parts)
- Situational Semantics = adds context-dependent information ("Forget about it")
- World knowledge = knowledge about the world shared between groups of people

Computational Semantics?:
- Automating the processes of mapping natural language to semantic representations and of using logical representations to draw inferences
- Patrick Blackburn & Johan Bos (Saarbrücken, 1999): Representation and Inference for Natural Language: A First Course in Computational Semantics

Vocabularies:
- Define the basis of a conversation: the topic, the language
- { (LOVE,2), (CUSTOMER,1), (ROBBER,1), (MIA,0), (VINCENT,0), (HONEY-BUNNY,0), (PUMPKIN,0) }

Linguistic Meaning:
- Translation from linguistic form to some "language of thought" (linguistic form = grammatical / syntactic form)
- Fodor: mental states with propositional content are "computational": the mind "computes" a conclusion from the premises (beliefs, desires, etc.) on the basis of their structural characteristics
- Thus: beliefs, etc., must have a representational structure

Logical Forms should be:
- Disambiguated: alternative readings → different logical forms
- Representing literal "meanings" (truth conditions)
- A vehicle for reasoning
- A basis for generation: one logical form → several readings

First-Order Languages:
- Non-logical symbols = all symbols in the vocabulary
- Variables = x, y, z, w, … (infinitely many)
- Boolean operators: ¬ negation, → implication, ∨ disjunction, ∧ conjunction
- Quantifiers: ∀ universal, ∃ existential
- Punctuation: the parentheses ( , ) and the comma

Free and Bound Variables:
(CUSTOMER(x) ∧ ∃x(ROBBER(x) ∧ ∀y PERSON(y)))
- the first occurrence of x is free
- the second and third occurrences of x are bound
- the first and second occurrences of y are also bound
- sentence = a formula containing no free variables

Beliefs:
- Acquiring a new belief: linguistic form → mental representation
- Aristotle: deduction and inference are based on formal relations
- Circumstantial problem: accessing the language of thought via the language of speech
- Fundamental problem: falls short of explaining what language really means (we're just shifting the problem to another language)

What is Missing?:
- When we speak or think, we speak or think about something. We speak about things in the world.
- Utterances concerning the actual world may be true or false.
- The truth or falsity of an utterance depends on: the meaning of the expression uttered, and the factual constitution of its subject matter.
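The vocabulary and formulas above can be written down directly as Prolog terms. The following is a minimal sketch, not from the lecture: the encoding names forall/2, exists/2, imp/2, and/2, or/2, not/1, symbol/2 and vocabulary_ok/1 are my own assumptions, reused in the later sketches.

% Minimal sketch: the vocabulary as symbol/arity facts, and first-order
% formulas as Prolog terms with Prolog variables as object variables.
symbol(love, 2).
symbol(customer, 1).     symbol(robber, 1).
symbol(mia, 0).          symbol(vincent, 0).
symbol(honey_bunny, 0).  symbol(pumpkin, 0).

% 'Every customer loves Mia' as a term:
%   forall(X, imp(customer(X), love(X, mia)))

% vocabulary_ok(+Formula): every atomic subformula uses a vocabulary
% symbol with its declared arity.
vocabulary_ok(forall(_, F)) :- !, vocabulary_ok(F).
vocabulary_ok(exists(_, F)) :- !, vocabulary_ok(F).
vocabulary_ok(not(F))       :- !, vocabulary_ok(F).
vocabulary_ok(and(F, G))    :- !, vocabulary_ok(F), vocabulary_ok(G).
vocabulary_ok(or(F, G))     :- !, vocabulary_ok(F), vocabulary_ok(G).
vocabulary_ok(imp(F, G))    :- !, vocabulary_ok(F), vocabulary_ok(G).
vocabulary_ok(Atom)         :- functor(Atom, Name, Arity), symbol(Name, Arity).

% ?- vocabulary_ok(forall(X, imp(customer(X), love(X, mia)))).   % succeeds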
First-Order Models:
- A model is a pair (D,F)
- D = domain: the set of entities
- F = interpretation function: maps symbols in the vocabulary to entities

Model Example 1:
- D = {d1, d2, d3, d4}
- F(MIA) = d1, F(HONEY-BUNNY) = d2, F(VINCENT) = d3, F(PUMPKIN) = d4
- F(CUSTOMER) = {d1, d3}
- F(ROBBER) = {d2, d4}
- F(LOVE) = {(d4, d2), (d3, d1)}

Model Example 2:
- D = {d1, d2, d3, d4}
- F(MIA) = d2, F(HONEY-BUNNY) = d1, F(VINCENT) = d4, F(PUMPKIN) = d3
- F(CUSTOMER) = {d2, d3, d4}
- F(ROBBER) = {d1}
- F(LOVE) = {(d3, d4)}

Model-Theoretic Semantics (Montague):
- Separate the meaning of expressions from factual constitutions
- The subject matter is represented by a model
- Model = abstract structure encoding the factual information pertaining to the truth values of sentences
- State for each sentence S: in which possible models uttering S → truth, and in which possible models uttering S → falsehood

The Meaning of Sentences (Frege):
- Giving an account of linguistic meaning = describing the meanings of complete sentences
- Explaining the meaning of a sentence S = explaining under which conditions S is true
- Explaining the meanings of other units = describing how they contribute to S's meaning

Semantic Construction:
- Given a sentence of a language, is there a systematic way of constructing its semantic representation?
- Can we translate a syntactic structure into an abstract representation of its actual meaning (e.g. first-order logic)?

Compositionality, Frege's Principle:
- Meaning ultimately flows from the lexicon
- Meanings are combined by syntactic information
- The meaning of the whole is a function of the meaning of its parts ("parts" = the substructure given by syntax)

Syntactic Structure:
[parse tree for "Vincent loves Mia": S → NP VP, VP → V NP, annotated bottom-up with VINCENT, MIA, LOVES(?,?), LOVES(?,MIA), LOVES(VINCENT,MIA)]

Three Tasks:
We need to specify:
- a syntax for the language fragment
- semantic representations for the lexical items
- the translation compositionally (= specify the translation of all expressions in terms of the translation of their parts)
All in a way that is naturally implemented.

Task 1: A Context-Free Grammar:
s --> np, vp.
vp --> iv.
vp --> tv, np.
np --> pname.
np --> det, n.
pname --> [vincent].    pname --> [mia].
n --> [robber].         n --> [woman].
det --> [a].            det --> [every].
iv --> [snores].        tv --> [loves].
Montague: "I fail to see any great interest in syntax except as a preliminary to semantics."

Incomplete / Quasi-Logical Forms:
- To build representations we need to work with "incomplete" formulas that indicate where the information they lack must go
- e.g. VP: LOVES(?,MIA)

Task 2: Semantic Lexicon:
pname(sem=vincent) --> [vincent].
pname(sem=mia) --> [mia].
n(sem=(X,robber(X))) --> [robber].
n(sem=(X,woman(X))) --> [woman].
iv(sem=(X,snore(X))) --> [snores].
tv(sem=(X,Y,love(X,Y))) --> [loves].
Associating the missing information with an explicit variable.

Quantifiers / Determiners:
Every robber snores: ∀x(ROBBER(x) → SNORE(x)), i.e. forall(X, robber(X) => snore(X))
A robber snores: ∃x(ROBBER(x) ∧ SNORE(x)), i.e. exists(X, robber(X) & snore(X))
det(sem=(X,N,VP,forall(X, N => VP))) --> [every].
det(sem=(X,N,VP,exists(X, N & VP))) --> [a].
Noun contribution = restriction; VP contribution = nuclear scope.
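Before turning to the production rules, note that truth in a model (such as Model Example 1 above) can be checked mechanically. The following is a rough sketch of my own, not lecture material: the constant names d1–d4, the predicate entity/1 and satisfy/1 are assumptions, and the formula encoding is the one assumed earlier.

% Model Example 1 as Prolog facts.
entity(d1).  entity(d2).  entity(d3).  entity(d4).
customer(d1).  customer(d3).
robber(d2).    robber(d4).
love(d4, d2).  love(d3, d1).

% satisfy(+Formula): Formula is true in the model above.
satisfy(and(F, G))    :- !, satisfy(F), satisfy(G).
satisfy(or(F, G))     :- !, ( satisfy(F) ; satisfy(G) ).
satisfy(imp(F, G))    :- !, ( \+ satisfy(F) ; satisfy(G) ).
satisfy(not(F))       :- !, \+ satisfy(F).
satisfy(exists(X, F)) :- !, entity(X), satisfy(F).
satisfy(forall(X, F)) :- !, \+ ( entity(X), \+ satisfy(F) ).
satisfy(Atom)         :- call(Atom).          % atomic case: look up the fact

% ?- satisfy(exists(X, and(customer(X), exists(Y, love(X, Y))))).  % true: d3 loves d1
% ?- satisfy(forall(X, imp(robber(X), exists(Y, love(Y, X))))).    % false: nobody loves d4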
Task 3: Production Rules:
s(sem=N) --> np(sem=(X,VP,N)), vp(sem=(X,VP)).
vp(sem=(X,V)) --> iv(sem=(X,V)).
vp(sem=(X,N)) --> tv(sem=(X,Y,V)), np(sem=(Y,V,N)).
np(sem=(Name,X,X)) --> pname(sem=Name).
np(sem=(X,VP,Det)) --> det(sem=(X,N,VP,Det)), n(sem=(X,N)).

How did we do?:
- It works! The underlying intuition is pretty clear.
- Much of the work is done by the rules.
- Hard to treat the grammar in a modular way.

Lambda Calculus (Church):
- Notational extension of first-order logic
- Variable binding by an operator λ ("lambda"): λx.MAN(x)
- Variables bound by λ are "placeholders" (for missing information)
- "Lambda reduction" performs the substitutions

Functional Application & Lambda Reduction:
- Concatenation indicates "functional application" (= that we wish to perform a substitution): (λx.MAN(x)) VINCENT
- λx.MAN(x) = functor, VINCENT = argument
- Lambda reduction = perform the substitution: MAN(VINCENT)

Marking more complex kinds of information:
- Representation of "a man": λQ.∃x(MAN(x) ∧ Q)
- The variable Q indicates that some information is missing, and where this information has to be plugged in

"Every robber snores":
Step 1: assign λ-expressions to the syntactic categories
robber: λx.ROBBER(x)
snores: λx.SNORES(x)
every: λN.λVP.∀x(N(x) → VP(x))

"Every robber snores", cont.:
Step 2: associate the NP with the application that has the DET as functor and the NOUN as argument
every (DET): λN.λVP.∀x(N(x) → VP(x))
robber (N): λy.ROBBER(y)
every robber (NP): (λN.λVP.∀x(N(x) → VP(x))) (λy.ROBBER(y))

Lambda Reduction:
Step 3: perform the demanded substitutions
every (DET): λN.λVP.∀x(N(x) → VP(x))
robber (N): λy.ROBBER(y)
every robber (NP): (λN.λVP.∀x(N(x) → VP(x))) (λy.ROBBER(y))
every robber (NP): λVP.∀x((λy.ROBBER(y))(x) → VP(x))
every robber (NP): λVP.∀x(ROBBER(x) → VP(x))

"Every robber snores", final representation:
Step 4: add the VP
every (DET): λN.λVP.∀x(N(x) → VP(x))
robber (N): λy.ROBBER(y)
every robber (NP): λVP.∀x(ROBBER(x) → VP(x))
snores (V): λz.SNORES(z)
every robber snores (S): (λVP.∀x(ROBBER(x) → VP(x)))(λz.SNORES(z))
every robber snores (S): ∀x(ROBBER(x) → (λz.SNORES(z))(x))
every robber snores (S): ∀x(ROBBER(x) → SNORES(x))

Transitive Verbs:
loves: λNP.λz.NP(λx.LOVE(z,x))
- TV semantic representations take their object NP's semantic representation as argument
- Subject NP semantic representations take the VP semantic representation as argument

Quantifying Noun Phrases: "Every woman loves a man":
every woman (NP): λVP.∀w(WOMAN(w) → VP(w))
a man (NP): λVP.∃m(MAN(m) ∧ VP(m))
loves (V): λNP.λx.NP(λy.LOVE(x,y))
loves a man (VP): (λNP.λx.NP(λy.LOVE(x,y))) (λVP.∃m(MAN(m) ∧ VP(m)))
loves a man (VP): λx.(λVP.∃m(MAN(m) ∧ VP(m)))(λy.LOVE(x,y))
loves a man (VP): λx.∃m(MAN(m) ∧ (λy.LOVE(x,y))(m))
loves a man (VP): λx.∃m(MAN(m) ∧ LOVE(x,m))
every woman loves a man (S): (λVP.∀w(WOMAN(w) → VP(w)))(λx.∃m(MAN(m) ∧ LOVE(x,m)))
every woman loves a man (S): ∀w(WOMAN(w) → (λx.∃m(MAN(m) ∧ LOVE(x,m)))(w))
every woman loves a man (S): ∀w(WOMAN(w) → ∃m(MAN(m) ∧ LOVE(w,m)))

Scope Ambiguities:
Every woman loves a man
∀w(WOMAN(w) → ∃m(MAN(m) ∧ LOVE(w,m)))
"for each woman there is a man that she loves"
Second reading:
∃m(MAN(m) ∧ ∀w(WOMAN(w) → LOVE(w,m)))
"there is one man who is loved by all women"
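The step-by-step reductions above can be mechanised. The sketch below is my own simplification in the spirit of Blackburn & Bos, not their actual code: λx.F is encoded as lam(X, F), functional application as app(F, A), and β-reduction substitutes by unifying the bound Prolog variable with the argument. It ignores α-conversion, so it is only safe when every λ-term uses fresh variables.

% beta_reduce(+Term, -Reduced): normalise a lambda term encoded with
% lam/2 (abstraction) and app/2 (application).
beta_reduce(V, V) :- var(V), !.                   % unfilled placeholder variable
beta_reduce(app(F0, A), R) :- !,
    beta_reduce(F0, F),
    (   nonvar(F), F = lam(X, Body)               % a redex: substitute, keep reducing
    ->  X = A, beta_reduce(Body, R)
    ;   R = app(F, A)
    ).
beta_reduce(lam(X, B0),    lam(X, B))    :- !, beta_reduce(B0, B).
beta_reduce(forall(X, B0), forall(X, B)) :- !, beta_reduce(B0, B).
beta_reduce(exists(X, B0), exists(X, B)) :- !, beta_reduce(B0, B).
beta_reduce(imp(F0, G0), imp(F, G)) :- !, beta_reduce(F0, F), beta_reduce(G0, G).
beta_reduce(and(F0, G0), and(F, G)) :- !, beta_reduce(F0, F), beta_reduce(G0, G).
beta_reduce(T, T).                                % atomic formulas and constants

% 'Every robber snores':
% ?- Every = lam(N, lam(VP, forall(X, imp(app(N, X), app(VP, X))))),
%    beta_reduce(app(app(Every, lam(Y, robber(Y))), lam(Z, snore(Z))), Sem).
% Sem = forall(X, imp(robber(X), snore(X))).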
Construction of Semantic Representations:
Three basic principles:
- Lexicalization: try to keep semantic information lexicalized
- Compositionality: pass information up compositionally from the terminals
- Underspecification: don't make a choice unless you have to (the interpretation of ambiguous parts is left unresolved)

Underspecification:
- A meaning of a formalism L is underspecified = it represents an ambiguous sentence in a more compact manner than by a disjunction of all readings
- L is complete = L's disambiguation device produces all possible refinements of any such meaning
- Example: consider a sentence with 3 quantified NPs (with underspecified scoping relations); L must be able to represent all 2^(3!) = 64 refinements (partial and complete disambiguations) of the sentence.

Phenomena for Underspecification:
- local ambiguities, e.g. lexical ambiguities, anaphoric or deictic use of PRO
- global ambiguities, e.g. scopal ambiguities, collective-distributive readings
- ambiguous or incoherent non-semantic information, e.g. PP-attachment, number disagreement
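One simple way to cash out scope underspecification for quantifiers is to keep the quantifiers in an unordered bag and generate fully scoped readings on demand. The sketch below is my own illustration, not the slides' formalism, and it enumerates only complete disambiguations, not partial ones; the names quant/3, wrap/3 and readings/3 are assumptions.

:- use_module(library(lists)).    % permutation/2
:- use_module(library(apply)).    % foldl/4

% wrap(+Quant, +Scope, -Formula): put one quantifier on top of Scope.
wrap(quant(forall, X, Restr), Scope, forall(X, imp(Restr, Scope))).
wrap(quant(exists, X, Restr), Scope, exists(X, and(Restr, Scope))).

% readings(+Quants, +Core, -Reading): each ordering of the quantifier bag
% yields one fully scoped reading (the last quantifier ends up outermost).
readings(Quants, Core, Reading) :-
    permutation(Quants, Ordered),
    foldl(wrap, Ordered, Core, Reading).

% 'Every woman loves a man':
% ?- readings([quant(forall, W, woman(W)), quant(exists, M, man(M))],
%             love(W, M), R).
% R = exists(M, and(man(M), forall(W, imp(woman(W), love(W, M))))) ;
% R = forall(W, imp(woman(W), exists(M, and(man(M), love(W, M))))).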
