180 07 lec9


Published on November 16, 2007

Author: UpBeat

Source: authorstream.com

LINGUIST 180: Introduction to Computational Linguistics
Dan Jurafsky, Marie-Catherine de Marneffe
Lecture 9: Grammar and Parsing (I)
Thanks to Jim Martin for many of these slides!

Outline for Grammar/Parsing Week
- Context-Free Grammars and Constituency
- Some common CFG phenomena for English: sentence-level constructions; NP, PP, VP; coordination; subcategorization
- Top-down and Bottom-up Parsing
- Dynamic Programming Parsing
- Quick sketch of probabilistic parsing

Review
- Parts of Speech: basic syntactic/morphological categories that words belong to
- Part of Speech tagging: assigning parts of speech to all the words in a sentence

Syntax
- Syntax: from Greek syntaxis, "setting out together, arrangement"
- Refers to the way words are arranged together, and the relationships between them
- Distinction:
  - Prescriptive grammar: how people ought to talk
  - Descriptive grammar: how they do talk
- The goal of syntax is to model the knowledge that people unconsciously have about the grammar of their native language

Syntax: why should we care?
- Grammar checkers
- Question answering
- Information extraction
- Machine translation

Key ideas of syntax
- Constituency (we'll spend most of our time on this)
- Subcategorization
- Grammatical relations
- Plus one part we won't have time for: movement / long-distance dependency

Context-Free Grammars (CFG)
- Capture constituency and ordering
- Ordering: what are the rules that govern the ordering of words and bigger units in the language?
- Constituency: how words group into units, and how the various kinds of units behave

Constituency
- E.g., noun phrases (NPs):
  - Three parties from Brooklyn
  - A high-class spot such as Mindy's
  - The Broadway coppers
  - They
  - Harry the Horse
  - The reason he comes into the Hot Box
- How do we know these form a constituent?
Constituency (II)
- They can all appear before a verb:
  - Three parties from Brooklyn arrive…
  - A high-class spot such as Mindy's attracts…
  - The Broadway coppers love…
  - They sit…
- But individual words can't always appear before verbs:
  - *from arrive…  *as attracts…  *the is…  *spot is…
- We must be able to state generalizations like: noun phrases occur before verbs

Constituency (III)
- Preposing and postposing:
  - On September 17th, I'd like to fly from Atlanta to Denver
  - I'd like to fly on September 17th from Atlanta to Denver
  - I'd like to fly from Atlanta to Denver on September 17th
- But not:
  - *On September, I'd like to fly 17th from Atlanta to Denver
  - *On I'd like to fly September 17th from Atlanta to Denver

CFG example
  S -> NP VP
  NP -> Det NOMINAL
  NOMINAL -> Noun
  VP -> Verb
  Det -> a
  Noun -> flight
  Verb -> left

CFGs: a set of rules
- S -> NP VP says that there are units called S, NP, and VP in this language, and that an S consists of an NP followed immediately by a VP
- It doesn't say that that's the only kind of S
- Nor does it say that this is the only place that NPs and VPs occur

Generativity
- As with FSAs, you can view these rules as either analysis or synthesis machines:
  - Generate strings in the language
  - Reject strings not in the language
  - Impose structures (trees) on strings in the language
- How can we define grammatical vs. ungrammatical sentences?
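To make the "synthesis machine" view concrete, here is a minimal Python sketch (not from the lecture) that encodes the toy CFG above as a dictionary of productions and exhaustively generates the strings in its language:

```python
# The toy grammar above, as a dict from non-terminal to list of right-hand sides.
GRAMMAR = {
    "S": [["NP", "VP"]],
    "NP": [["Det", "NOMINAL"]],
    "NOMINAL": [["Noun"]],
    "VP": [["Verb"]],
    "Det": [["a"]],
    "Noun": [["flight"]],
    "Verb": [["left"]],
}

def generate(symbol):
    """Yield every terminal string derivable from `symbol`."""
    if symbol not in GRAMMAR:          # terminal: yields itself
        yield [symbol]
        return
    for rhs in GRAMMAR[symbol]:        # try each production for this non-terminal
        yield from expand(rhs)

def expand(symbols):
    """Yield every terminal string derivable from a sequence of symbols."""
    if not symbols:
        yield []
        return
    for head in generate(symbols[0]):
        for tail in expand(symbols[1:]):
            yield head + tail

print([" ".join(s) for s in generate("S")])   # → ['a flight left']
```

Since every non-terminal here has exactly one production, this grammar generates exactly one sentence; adding alternative right-hand sides (or recursive rules) grows the language, possibly to infinite size.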
Derivations
- A derivation is a sequence of rules applied to a string that accounts for that string:
  - Covers all the elements in the string
  - Covers only the elements in the string

Derivations as Trees
(figure slide)

CFGs more formally
A context-free grammar has 4 parameters ("is a 4-tuple"):
- A set of non-terminal symbols ("variables") N
- A set of terminal symbols Σ (disjoint from N)
- A set of productions P, each of the form A -> α, where A is a non-terminal and α is a string of symbols from the infinite set of strings (Σ ∪ N)*
- A designated start symbol S

Defining a CF language via derivation
- A string A derives a string B if A can be rewritten as B via some series of rule applications
- More formally: if A -> β is a production of P, and α and γ are any strings in the set (Σ ∪ N)*, then we say that αAγ directly derives αβγ, or αAγ ⇒ αβγ
- Derivation is a generalization of direct derivation:
  - Let α1, α2, …, αm be strings in (Σ ∪ N)*, m ≥ 1, such that α1 ⇒ α2, α2 ⇒ α3, …, αm-1 ⇒ αm
  - We then say that α1 derives αm, written α1 ⇒* αm
- We then formally define the language LG generated by a grammar G as the set of strings composed of terminal symbols derivable from S:
  LG = {w | w is in Σ* and S ⇒* w}

Parsing
- Parsing is the process of taking a string and a grammar and returning a (many?) parse tree(s) for that string

Context?
- The notion of "context" in CFGs has nothing to do with the ordinary meaning of the word context in language
- All it really means is that the non-terminal on the left-hand side of a rule is out there all by itself (free of context): A -> B C means that I can rewrite an A as a B followed by a C regardless of the context in which A is found

Key Constituents (English)
- Sentences
- Noun phrases
- Verb phrases
- Prepositional phrases

Sentence-Types
- Declaratives: A plane left. (S -> NP VP)
- Imperatives: Leave! (S -> VP)
- Yes-No Questions: Did the plane leave?
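The chain of direct derivations can be mechanized. Here is a small sketch (my own, using the toy grammar from earlier) that rewrites the leftmost non-terminal at each step, printing the full leftmost derivation:

```python
# Each non-terminal in this toy grammar has a single right-hand side.
RULES = {
    "S": ["NP", "VP"], "NP": ["Det", "NOMINAL"], "NOMINAL": ["Noun"],
    "VP": ["Verb"], "Det": ["a"], "Noun": ["flight"], "Verb": ["left"],
}

def leftmost_derivation(start="S"):
    """Repeatedly rewrite the leftmost non-terminal until only terminals remain."""
    steps = [[start]]
    current = [start]
    while any(sym in RULES for sym in current):
        i = next(idx for idx, sym in enumerate(current) if sym in RULES)
        current = current[:i] + RULES[current[i]] + current[i + 1:]
        steps.append(current)
    return [" ".join(step) for step in steps]

for step in leftmost_derivation():
    print(step)
# S → NP VP → Det NOMINAL VP → a NOMINAL VP → a Noun VP
#   → a flight VP → a flight Verb → a flight left
```

Each printed line directly derives the next one, and the first line derives-star (⇒*) the last, so "a flight left" is in LG.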
  (S -> Aux NP VP)
- WH Questions: When did the plane leave? (S -> WH Aux NP VP)

NPs
- NP -> Pronoun: I came, you saw it, they conquered
- NP -> Proper-Noun: Los Angeles is west of Texas; John Hennessy is the president of Stanford
- NP -> Det Noun: the president
- NP -> Nominal; Nominal -> Noun Noun: a morning flight to Denver

PPs
- PP -> Preposition NP: from LA, to the store, on Tuesday morning, with lunch

Recursion
- We'll have to deal with rules such as the following, where the non-terminal on the left also appears (directly) somewhere on the right:
  - NP -> NP PP   [[The flight] [to Boston]]
  - VP -> VP PP   [[departed Miami] [at noon]]

Recursion (II)
- Of course, this is what makes syntax interesting:
  - Flights from Denver
  - Flights from Denver to Miami
  - Flights from Denver to Miami in February
  - Flights from Denver to Miami in February on a Friday
  - Flights from Denver to Miami in February on a Friday under $300
  - Flights from Denver to Miami in February on a Friday under $300 with lunch

Recursion (III)
- [[Flights] [from Denver]]
- [[[Flights] [from Denver]] [to Miami]]
- [[[[Flights] [from Denver]] [to Miami]] [in February]]
- [[[[[Flights] [from Denver]] [to Miami]] [in February]] [on a Friday]]
- Etc.
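The nested bracketings above all come from repeated application of the single rule NP -> NP PP. A tiny illustration (my own, not from the slides) of how each application wraps the NP built so far:

```python
def attach_pps(np, pps):
    """Left-nest PP modifiers onto an NP, one NP -> NP PP application per PP."""
    tree = f"[{np}]"
    for pp in pps:
        tree = f"[{tree} [{pp}]]"   # the old NP becomes the left child of a new NP
    return tree

print(attach_pps("Flights", ["from Denver", "to Miami", "in February"]))
# → [[[[Flights] [from Denver]] [to Miami]] [in February]]
```

Because the rule can apply any number of times, the grammar licenses infinitely many NPs from a finite set of rules, which is exactly what recursion buys us.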
  (NP -> NP PP)

Implications of recursion and context-freeness
- If you have a rule like VP -> V NP, it only cares that the thing after the verb is an NP
- It doesn't have to know about the internal affairs of that NP

The point
- VP -> V NP: (I) hate…
  - flights from Denver
  - flights from Denver to Miami
  - flights from Denver to Miami in February
  - flights from Denver to Miami in February on a Friday
  - flights from Denver to Miami in February on a Friday under $300
  - flights from Denver to Miami in February on a Friday under $300 with lunch

Bracketed Notation
  [S [NP [PRO I]] [VP [V prefer] [NP [Det a] [Nom [N morning] [N flight]]]]]

Coordination Constructions
- S -> S and S: John went to NY and Mary followed him
- NP -> NP and NP
- VP -> VP and VP
- …
- In fact the right rule for English is X -> X and X (a metarule)
- However, we can say "He was longwinded and a bully."

Problems
- Agreement
- Subcategorization
- Movement (for want of a better term)

Agreement
- This dog / Those dogs
- This dog eats / Those dogs eat
- *This dogs / *Those dog
- *This dog eat / *Those dogs eats

Possible CFG Solution
- Original rules: S -> NP VP; NP -> Det Nominal; VP -> V NP; …
- Split by number:
  SgS -> SgNP SgVP
  PlS -> PlNP PlVP
  SgNP -> SgDet SgNom
  PlNP -> PlDet PlNom
  PlVP -> PlV NP
  SgVP -> SgV NP
  …

CFG Solution for Agreement
- It works and stays within the power of CFGs
- But it's ugly, and it doesn't scale all that well

Subcategorization
- Sneeze: John sneezed / *John sneezed the book
- Say: You said [United has a flight]S
- Prefer: I prefer [to leave earlier]TO-VP / *I prefer United has a flight
- Give: Give [me]NP [a cheaper fare]NP / *Give with a flight
- Help: Can you help [me]NP [with a flight]PP

Subcategorization (II)
- Subcat expresses the constraints that a predicate (a verb, for now) places on the number and syntactic types of the arguments it wants to take (occur with)

So?
- So the various rules for VPs overgenerate: they permit strings containing verbs and arguments that don't go together
- For example, given VP -> V NP, "Sneezed the book" is a VP, since "sneeze" is a verb and "the book" is a valid NP

Possible CFG Solution
- Instead of:
  VP -> V
  VP -> V NP
  VP -> V NP PP
  …
- Use:
  VP -> IntransV
  VP -> TransV NP
  VP -> TransVwPP NP PP
  …

Forward Pointer
- It turns out that verb subcategorization facts will provide a key element for semantic analysis (determining who did what to whom in an event)

Movement
- Core example: My travel agent booked the flight
  [[My travel agent]NP [booked [the flight]NP]VP]S
- I.e., "book" is a straightforward transitive verb: it expects a single NP argument within the VP, and a single NP argument as the subject

Movement (II)
- What about: Which flight do you want me to have the travel agent book?
- The direct-object argument to "book" isn't appearing in the right place; it is in fact a long way from where it's supposed to appear
- And note that it's separated from its verb by two other verbs

CFGs: a summary
- CFGs appear to be just about what we need to account for a lot of basic syntactic structure in English
- But there are problems, which can be dealt with adequately, although not elegantly, by staying within the CFG framework
- There are simpler, more elegant solutions that take us out of the CFG framework (beyond its formal power): syntactic theories such as HPSG, LFG, CCG, Minimalism, etc.
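The number-split agreement grammar from earlier is finite, so we can enumerate its entire language and check mechanically that agreement violations are excluded. A sketch (my own encoding of the SgS/PlS rules, not verbatim from the slides):

```python
from itertools import product

# Feature-split grammar: every rule is doubled, once per number value.
SPLIT = {
    "S": [["SgS"], ["PlS"]],
    "SgS": [["SgNP", "SgVP"]], "PlS": [["PlNP", "PlVP"]],
    "SgNP": [["SgDet", "SgNom"]], "PlNP": [["PlDet", "PlNom"]],
    "SgNom": [["dog"]], "PlNom": [["dogs"]],
    "SgVP": [["eats"]], "PlVP": [["eat"]],
    "SgDet": [["this"]], "PlDet": [["those"]],
}

def language(symbol):
    """Return the (finite) set of terminal strings derivable from `symbol`."""
    if symbol not in SPLIT:
        return {symbol}
    out = set()
    for rhs in SPLIT[symbol]:
        for combo in product(*(language(s) for s in rhs)):
            out.add(" ".join(combo))
    return out

sentences = language("S")
print(sorted(sentences))   # → ['this dog eats', 'those dogs eat']
```

The enumeration shows the solution works: *this dogs eat and friends never appear. It also shows why it's ugly; every rule and lexical category has been duplicated, and adding a second feature (say, person) would duplicate everything again.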
Other syntactic stuff: grammatical relations
- Subject: I booked a flight to New York / The flight was booked by my agent
- Object: I booked a flight to New York
- Complement: I said that I wanted to leave

Dependency parsing
- Word-to-word links instead of constituency
- Based on the European rather than the American tradition, but dates back to the Greeks
- The original notions of Subject and Object, and the progenitor of subcategorization (called "valence"), came out of dependency theory
- Dependency parsing is quite popular as a computational model, since relationships between words are quite useful

Dependency parsing (II)
- Parse tree: nesting of multi-word constituents
- Typed dependency parse: grammatical relations between individual words

Why are dependency parses useful?
- Example: multi-document summarization
- We need to identify sentences from different documents that each say roughly the same thing
- Phrase-structure trees of paraphrasing sentences that differ in word order can be significantly different, but their dependency representations will be very similar

Parsing
- Parsing: assigning correct trees to input strings
- Correct tree: a tree that covers all and only the elements of the input and has an S at the top
- For now: enumerate all possible trees
- A further task, disambiguation: choosing the correct tree from among all the possible trees
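A toy sketch of the summarization point above. The triples are hand-assembled for illustration (not real parser output, and the relation labels are my assumption): two word-order variants of the same sentence yield the same set of typed dependencies, so simple set overlap makes a natural similarity measure:

```python
def jaccard(a, b):
    """Set-overlap similarity between two dependency-triple sets."""
    return len(a & b) / len(a | b)

# "I'd like to fly from Atlanta to Denver on September 17th"
v1 = {("like", "nsubj", "I"), ("like", "xcomp", "fly"),
      ("fly", "prep_from", "Atlanta"), ("fly", "prep_to", "Denver"),
      ("fly", "prep_on", "September 17th")}

# "On September 17th, I'd like to fly from Atlanta to Denver" (preposed variant)
v2 = {("fly", "prep_on", "September 17th"), ("fly", "prep_from", "Atlanta"),
      ("fly", "prep_to", "Denver"), ("like", "xcomp", "fly"),
      ("like", "nsubj", "I")}

print(jaccard(v1, v2))  # 1.0: identical relations despite different word order
```

The two phrase-structure trees for these sentences would differ (the PP attaches in different places in the string), but the head-dependent relations are the same, which is exactly why dependency representations suit paraphrase detection.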
Treebanks
- Parsed corpora in the form of trees:
  - The Penn Treebank
  - The Brown corpus
  - The WSJ corpus
- Tgrep: http://www.ldc.upenn.edu/ldc/online/treebank/
- Tregex: http://www-nlp.stanford.edu/nlp/javadoc/javanlp/

Parsing involves search
- As with everything of interest, parsing involves a search, which involves making choices
- We'll start with some basic (meaning bad) methods before moving on to the one or two that you need to know

For now, assume…
- You have all the words already in some buffer
- The input isn't POS tagged
- We won't worry about morphological analysis
- All the words are known

Top-Down Parsing
- Since we're trying to find trees rooted with an S (sentences), start with the rules that give us an S
- Then work your way down from there to the words

Top-Down Space
(figure slide)

Bottom-Up Parsing
- Of course, we also want trees that cover the input words
- So start with trees that link up with the words in the right way, then work your way up from there

Bottom-Up Space
(figure slide)

Control
- In both cases we left out how to keep track of the search space and how to make choices:
  - Which node to try to expand next
  - Which grammar rule to use to expand a node

Top-Down, Depth-First, Left-to-Right Search
(worked-example slides)

Top-Down and Bottom-Up
- Top-down:
  - Only searches for trees that can be answers (i.e., S's)
  - But also suggests trees that are not consistent with the words
- Bottom-up:
  - Only forms trees consistent with the words
  - But suggests trees that make no sense globally

So combine them
- There are a million ways to combine top-down expectations with bottom-up data to get more efficient searches
- Most use one kind as the control and the other as a filter, as in top-down parsing with bottom-up filtering

Adding Bottom-Up Filtering
(figure slide)

3 problems with the TDDFLtR parser
- Left-recursion
- Ambiguity
- Inefficient reparsing of subtrees

Left-Recursion
- What happens in the following situation?
  S -> NP VP
  S -> Aux NP VP
  NP -> NP PP
  NP -> Det Nominal
  …
- With a sentence starting with: Did the flight…

Ambiguity
- "One morning I shot an elephant in my pajamas. How he got into my pajamas I don't know." (Groucho Marx)

Lots of ambiguity
- VP -> VP PP
- NP -> NP PP
- Show me the meal on flight 286 from SF to Denver: 14 parses!

Lots of ambiguity (II)
- Church and Patil (1982): the number of parses for such sentences grows at the rate of the number of parenthesizations of arithmetic expressions, which grow with the Catalan numbers:
  PPs  Parses
  1    2
  2    5
  3    14
  4    42
  5    132
  6    429
  7    1430

Avoiding Repeated Work
- Parsing is hard, and slow; it's wasteful to redo stuff over and over and over
- Consider an attempt to top-down parse the following as an NP:
  A flight from Indianapolis to Houston on TWA

Grammars and Parsing
- Context-Free Grammars and Constituency
- Some common CFG phenomena for English
- Baby parsers: top-down and bottom-up parsing
- Thursday, real parsers: dynamic programming parsing (CKY), probabilistic parsing
- Optional section: the Earley algorithm
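A minimal top-down, depth-first, left-to-right parser sketch (my own, over the toy grammar repeated from earlier). Note that, as discussed above, this naive strategy would loop forever on a left-recursive rule like NP -> NP PP, and it happily reparses the same subtrees while backtracking:

```python
GRAMMAR = {
    "S": [["NP", "VP"]],
    "NP": [["Det", "NOMINAL"]],
    "NOMINAL": [["Noun"]],
    "VP": [["Verb"]],
    "Det": [["a"]], "Noun": [["flight"]], "Verb": [["left"]],
}

def parse(symbol, words, i):
    """Yield (tree, next_i) for every way `symbol` derives a prefix of words[i:]."""
    if symbol not in GRAMMAR:                      # terminal: must match the input word
        if i < len(words) and words[i] == symbol:
            yield symbol, i + 1
        return
    for rhs in GRAMMAR[symbol]:                    # top-down: expand before looking at input
        for children, j in parse_seq(rhs, words, i):
            yield (symbol, children), j

def parse_seq(symbols, words, i):
    """Yield (child_trees, next_i) for a sequence of symbols, left to right."""
    if not symbols:
        yield [], i
        return
    for tree, j in parse(symbols[0], words, i):    # depth-first into the first symbol
        for rest, k in parse_seq(symbols[1:], words, j):
            yield [tree] + rest, k

words = "a flight left".split()
trees = [t for t, j in parse("S", words, 0) if j == len(words)]  # keep full covers only
print(trees)
```

The filter `j == len(words)` enforces "covers all and only the elements of the input"; requiring the root to be S enforces the other half of the correctness definition. The dynamic-programming parsers on Thursday (CKY, Earley) fix the repeated-work and left-recursion problems that this sketch exhibits.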
