Nancy Nov03

Published on November 20, 2007

Author: Herminia

Source: authorstream.com

Putting Meaning Into Your Trees
Martha Palmer, University of Pennsylvania
LORIA / INRIA, Nancy, France, November 14, 2003

Outline
- Introduction
- Background: WordNet, Levin classes, VerbNet
- Proposition Bank - capturing shallow semantics
- Mapping PropBank to VerbNet
- Mapping PropBank to WordNet

Word sense in Machine Translation
Different syntactic frames:
- John left the room. / Juan saiu do quarto. (Portuguese)
- John left the book on the table. / Juan deixou o livro na mesa.
Same syntactic frame?
- John left a fortune. / Juan deixou uma fortuna.

Ask Jeeves - a Q/A, IR example
What do you call a successful movie?
- Tips on Being a Successful Movie Vampire ... I shall call the police.
- Successful Casting Call & Shoot for "Clash of Empires" ... thank everyone for their participation in the making of yesterday's movie.
- Demme's casting is also highly entertaining, although I wouldn't go so far as to call it successful. This movie's resemblance to its predecessor is pretty vague...
- VHS Movies: Successful Cold Call Selling: Over 100 New Ideas, Scripts, and Examples from the Nation's Foremost Sales Trainer.
- Blockbuster

Ask Jeeves - filtering with POS tags
What do you call a successful movie?
- Tips on Being a Successful Movie Vampire ... I shall call the police.
- Successful Casting Call & Shoot for "Clash of Empires" ... thank everyone for their participation in the making of yesterday's movie.
- Demme's casting is also highly entertaining, although I wouldn't go so far as to call it successful. This movie's resemblance to its predecessor is pretty vague...
- VHS Movies: Successful Cold Call Selling: Over 100 New Ideas, Scripts, and Examples from the Nation's Foremost Sales Trainer.

Filtering out "call the police"
call(you, movie, what) ≠ call(you, police)
Syntax.

English lexical resource is required
- That provides sets of possible syntactic frames for verbs.
- And provides clear, replicable sense distinctions.
AskJeeves: Who do you call for a good electronic lexical database for English?

WordNet - call, 28 senses
1. name, call -- (assign a specified, proper name to; "They named their son David"; ...) -> LABEL
2. call, telephone, call up, phone, ring -- (get or try to get into communication (with someone) by telephone; "I tried to call you all night"; ...) -> TELECOMMUNICATE
3. call -- (ascribe a quality to or give a name of a common noun that reflects a quality; "He called me a bastard"; ...) -> LABEL
4. call, send for -- (order, request, or command to come; "She was called into the director's office"; "Call the police!") -> ORDER

WordNet - Princeton (Miller 1985, Fellbaum 1998)
- On-line lexical reference (dictionary)
- Nouns, verbs, adjectives, and adverbs grouped into synonym sets
- Other relations include hypernyms (ISA), antonyms, meronyms
Limitations as a computational lexicon:
- Contains little syntactic information
- No explicit predicate argument structures
- No systematic extension of basic senses
- Sense distinctions are very fine-grained (ITA 73%)
- No hierarchical entries
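The 28-way ambiguity of "call" can be reproduced directly from WordNet. Below is a minimal sketch, assuming NLTK and its WordNet data are installed (pip install nltk, then nltk.download('wordnet')); the exact sense count and numbering depend on the WordNet version, so it may not match the 28 senses of the database used in these slides.

```python
# List the verb senses of "call" with their lemmas and hypernyms,
# using NLTK's WordNet interface (assumed available).
from nltk.corpus import wordnet as wn

for i, synset in enumerate(wn.synsets('call', pos=wn.VERB), start=1):
    hypernyms = ', '.join(h.name() for h in synset.hypernyms())
    print(f"{i}. {synset.name()}: {synset.definition()}")
    print(f"   lemmas: {', '.join(l.name() for l in synset.lemmas())}")
    print(f"   hypernyms: {hypernyms or '(none)'}")
```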
Levin classes (Levin, 1993)
3100 verbs, 47 top-level classes, 193 second- and third-level classes.
Each class has a syntactic signature based on alternations.
- John broke the jar. / The jar broke. / Jars break easily.
- John cut the bread. / *The bread cut. / Bread cuts easily.
- John hit the wall. / *The wall hit. / *Walls hit easily.

Levin classes (Levin, 1993)
Verb class hierarchy: 3100 verbs, 47 top-level classes, 193 lower-level classes.
Each class has a syntactic signature based on alternations.
- John broke the jar. / The jar broke. / Jars break easily. (change-of-state)
- John cut the bread. / *The bread cut. / Bread cuts easily. (change-of-state, recognizable action, sharp instrument)
- John hit the wall. / *The wall hit. / *Walls hit easily. (contact, exertion of force)

Confusions in Levin classes?
- Not semantically homogeneous: {braid, clip, file, powder, pluck, etc.}
- Multiple class listings: homonymy or polysemy?
- Conflicting alternations? Carry verbs disallow the conative (*she carried at the ball), but include {push, pull, shove, kick, draw, yank, tug}, which are also in the Push/Pull class and do take the conative (she kicked at the ball).

Intersective Levin Classes
"at" ¬CH-LOC
"across the room" CH-LOC
"apart" CH-STATE
Dang, Kipper & Palmer, ACL98

Intersective Levin Classes
More syntactically and semantically coherent:
- sets of syntactic patterns
- explicit semantic components
- relations between senses
VERBNET: www.cis.upenn.edu/verbnet
Dang, Kipper & Palmer, IJCAI00, Coling00

VerbNet - Karin Kipper
Class entries:
- Capture generalizations about verb behavior
- Organized hierarchically
- Members have common semantic elements, semantic roles and syntactic frames
Verb entries:
- Refer to a set of classes (different senses)
- Each class member is linked to WN synset(s) (not all WN senses are covered)

Semantic role labels
Laurent broke the LCD projector.
break(agent(Laurent), patient(LCD-projector))
cause(agent(Laurent), broken(LCD-projector))
agent(A) -> intentional(A), sentient(A), causer(A), affector(A)
patient(P) -> affected(P), change(P), ...

Hand-built resources vs. real data
VerbNet is based on linguistic theory - how useful is it?
How well does it correspond to syntactic variations found in naturally occurring text?

Proposition Bank: From Sentences to Propositions
... When Powell met Zhu Rongji on Thursday they discussed the return of the spy plane.
meet(Powell, Zhu)
discuss([Powell, Zhu], return(X, plane))
meet(Somebody1, Somebody2)

Capturing semantic roles*
- Claire broke [ARG1 the laser pointer].
- [ARG1 The windows] were broken by the hurricane.
- [ARG1 The vase] broke into pieces when it toppled over.
(Figure labels: SUBJ marks the grammatical subject of each sentence.)
*See also FrameNet, http://www.icsi.berkeley.edu/~framenet/

English lexical resource is required
- That provides sets of possible syntactic frames for verbs with semantic role labels.
- And provides clear, replicable sense distinctions.

A TreeBanked Sentence
[Figure: Penn Treebank parse tree for a sentence beginning "Analysts ...", showing S, NP-SBJ, VP, NP, and PP-LOC nodes, the modal "would", and a *T*-1 trace.]

The same sentence, PropBanked
[Figure: the same tree with PropBank labels on "Analysts have been expecting ...": Arg0 and Arg1 of "expecting".]

Frames File Example: expect
Roles:
- Arg0: expecter
- Arg1: thing expected
Example: transitive, active: Portfolio managers expect further declines in interest rates.
- Arg0: Portfolio managers
- REL: expect
- Arg1: further declines in interest rates
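A frames-file roleset like the one for expect is easy to mirror in code. The sketch below is an illustrative data structure, not the PropBank distribution format or API; the roleset id and field names are assumptions. A ditransitive verb such as give (next slide) would simply add an Arg2 entry.

```python
# Illustrative representation of a PropBank-style roleset and one labeled example.
from dataclasses import dataclass

@dataclass
class Roleset:
    lemma: str          # e.g. "expect"
    roleset_id: str     # e.g. "expect.01" (id is an assumption here)
    roles: dict         # numbered argument -> description

EXPECT_01 = Roleset(
    lemma="expect",
    roleset_id="expect.01",
    roles={"Arg0": "expecter", "Arg1": "thing expected"},
)

def describe(roleset, spans):
    """Pair annotated argument spans with their role descriptions."""
    return {arg: (text, roleset.roles.get(arg, "modifier (ArgM)"))
            for arg, text in spans.items()}

print(describe(EXPECT_01, {
    "Arg0": "Portfolio managers",
    "Arg1": "further declines in interest rates",
}))
```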
Frames File example: give
Roles:
- Arg0: giver
- Arg1: thing given
- Arg2: entity given to
Example: double object: The executives gave the chefs a standing ovation.
- Arg0: The executives
- REL: gave
- Arg2: the chefs
- Arg1: a standing ovation

Word Senses in PropBank
Orders to ignore word sense were not feasible for 700+ verbs:
- Mary left the room.
- Mary left her daughter-in-law her pearls in her will.
Frameset leave.01 "move away from": Arg0: entity leaving, Arg1: place left
Frameset leave.02 "give": Arg0: giver, Arg1: thing given, Arg2: beneficiary
How do these relate to traditional word senses in VerbNet and WordNet?

Annotation procedure
- PTB II: extraction of all sentences with a given verb
- Create a Frame File for that verb - Paul Kingsbury (3100+ lemmas, 4400 framesets, 118K predicates); over 300 created automatically via VerbNet
- First pass: automatic tagging (Joseph Rosenzweig), http://www.cis.upenn.edu/~josephr/TIDES/index.html#lexicon
- Second pass: double-blind hand correction - Paul Kingsbury; tagging tool highlights discrepancies - Scott Cotton
- Third pass: Solomonization (adjudication) - Betsy Klipple, Olga Babko-Malaya

Trends in Argument Numbering
- Arg0 = agent
- Arg1 = direct object / theme / patient
- Arg2 = indirect object / benefactive / instrument / attribute / end state
- Arg3 = start point / benefactive / instrument / attribute
- Arg4 = end point
Per-word vs. frame level - which is more general?

Additional tags (arguments or adjuncts?)
Variety of ArgMs (Arg# > 4):
- TMP - when?
- LOC - where at?
- DIR - where to?
- MNR - how?
- PRP - why?
- REC - himself, themselves, each other
- PRD - this argument refers to or modifies another
- ADV - others

Inflection
Verbs are also marked for tense/aspect:
- Passive/Active
- Perfect/Progressive
- Third singular (is, has, does, was)
- Present/Past/Future
- Infinitives/Participles/Gerunds/Finites
Modals and negations are marked as ArgMs.

Frames: Multiple Framesets
Out of the 787 most frequent verbs:
- 1 frameset - 521
- 2 framesets - 169
- 3+ framesets - 97 (includes light verbs)
94% ITA.
Framesets are not necessarily consistent between different senses of the same verb.
Framesets are consistent between different verbs that share similar argument structures (like FrameNet).

Ergative/Unaccusative Verbs
Roles (no Arg0 for unaccusative verbs):
- Arg1 = logical subject, patient, thing rising
- Arg2 = EXT, amount risen
- Arg3* = start point
- Arg4 = end point
Sales rose 4% to $3.28 billion from $3.16 billion.
The Nasdaq composite index added 1.01 to 456.6 on paltry volume.

Actual data for leave
http://www.cs.rochester.edu/~gildea/PropBank/Sort/
Leave.01 "move away from": Arg0 rel Arg1 Arg3
Leave.02 "give": Arg0 rel Arg1 Arg2
Observed argument realization patterns (with counts):
- sub-ARG0 obj-ARG1: 44
- sub-ARG0: 20
- sub-ARG0 NP-ARG1-with obj-ARG2: 17
- sub-ARG0 sub-ARG2 ADJP-ARG3-PRD: 10
- sub-ARG0 sub-ARG1 ADJP-ARG3-PRD: 6
- sub-ARG0 sub-ARG1 VP-ARG3-PRD: 5
- NP-ARG1-with obj-ARG2: 4
- obj-ARG1: 3
- sub-ARG0 sub-ARG2 VP-ARG3-PRD: 3

PropBank/FrameNet
Buy: Arg0: buyer, Arg1: goods, Arg2: seller, Arg3: rate, Arg4: payment
Sell: Arg0: seller, Arg1: goods, Arg2: buyer, Arg3: rate, Arg4: payment
PropBank is broader, more neutral, more syntactic - it maps readily to VN, TR, FN.
Rambow et al., PMLB03

Annotator accuracy - ITA 84%
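The realization-pattern counts in the "Actual data for leave" slide above come from tallying labeled instances. Below is a minimal sketch of that tally, assuming each instance is available as a (frameset, slot-pattern) pair; the instances listed in the code are hypothetical stand-ins, not corpus data.

```python
# Tally argument-realization patterns per frameset from labeled instances.
from collections import Counter, defaultdict

# Hypothetical labeled instances: (frameset id, tuple of realized argument slots).
instances = [
    ("leave.01", ("sub-ARG0", "obj-ARG1")),
    ("leave.01", ("sub-ARG0",)),
    ("leave.02", ("sub-ARG0", "NP-ARG1-with", "obj-ARG2")),
    ("leave.01", ("sub-ARG0", "obj-ARG1")),
]

patterns = defaultdict(Counter)
for frameset, slots in instances:
    patterns[frameset][slots] += 1

for frameset, counts in sorted(patterns.items()):
    for slots, n in counts.most_common():
        print(frameset, " ".join(slots), n)
```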
English lexical resource is required
- That provides sets of possible syntactic frames for verbs with semantic role labels?
- And provides clear, replicable sense distinctions.

English lexical resource is required
- That provides sets of possible syntactic frames for verbs with semantic role labels that can be automatically assigned accurately to new text?
- And provides clear, replicable sense distinctions.

Automatic Labelling of Semantic Relations
Stochastic model. Features:
- Predicate
- Phrase type
- Parse tree path
- Position (before/after the predicate)
- Voice (active/passive)
- Head word
Gildea & Jurafsky, CL02; Gildea & Palmer, ACL02

Semantic Role Labelling Accuracy - Known Boundaries
Accuracy of semantic role prediction for known boundaries - the system is given the constituents to classify.
FrameNet examples (training/test) are hand-picked to be unambiguous.
Lower performance with unknown boundaries; higher performance with traces. It almost evens out.

Additional Automatic Role Labelers
Performance improved from 77% to 88% - Colorado (gold-standard parses, < 10 instances)
- Same features plus Named Entity tags and head-word POS
- For unseen verbs: back off to automatic verb clusters
- SVMs: role or not role; for each likely role, for each Arg#, Arg# or not; no overlapping role labels allowed
Pradhan et al., ICDM03; Surdeanu et al., ACL03; Chen & Rambow, EMNLP03; Gildea & Hockenmaier, EMNLP03

Additional Automatic Role Labelers
Performance improved from 77% to 88% - Colorado
New results - original features - 88%, 93% - Penn (gold-standard parses, < 10 instances)
- Same features plus Named Entity tags and head-word POS
- For unseen verbs: back off to automatic verb clusters
- SVMs: role or not role; for each likely role, for each Arg#, Arg# or not; no overlapping role labels allowed
Pradhan et al., ICDM03; Surdeanu et al., ACL03; Chen & Rambow, EMNLP03; Gildea & Hockenmaier, EMNLP03

Word Senses in PropBank
Orders to ignore word sense were not feasible for 700+ verbs:
- Mary left the room.
- Mary left her daughter-in-law her pearls in her will.
Frameset leave.01 "move away from": Arg0: entity leaving, Arg1: place left
Frameset leave.02 "give": Arg0: giver, Arg1: thing given, Arg2: beneficiary
How do these relate to traditional word senses in VerbNet and WordNet?

Mapping from PropBank to VerbNet

Mapping from PB to VerbNet

Mapping from PropBank to VerbNet
Overlap with PropBank framesets: 50,000 PropBank instances; < 50% of VN entries, > 85% of VN classes.
Results: MATCH - 78.63% (80.90% relaxed). (VerbNet isn't just linguistic theory!)
Benefits:
- Thematic role labels and semantic predicates
- Can extend PropBank coverage with VerbNet classes
- WordNet sense tags
Kingsbury & Kipper, NAACL03, Text Meaning Workshop
http://www.cs.rochester.edu/~gildea/VerbNet/
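The frameset-to-class mapping described above can be pictured as a small lookup table. The sketch below pairs the two leave framesets with the VerbNet classes shown later in these slides (escape-51.1-1 and future_having-13.3); the Arg-to-thematic-role correspondences are illustrative assumptions, not entries copied from the released mapping.

```python
# Hypothetical PropBank-frameset -> VerbNet-class mapping for "leave".
PB_TO_VN = {
    ("leave", "01"): {
        "vn_class": "escape-51.1-1",
        "roles": {"Arg0": "Theme", "Arg1": "Location"},
    },
    ("leave", "02"): {
        "vn_class": "future_having-13.3",
        "roles": {"Arg0": "Agent", "Arg1": "Theme", "Arg2": "Recipient"},
    },
}

def to_thematic_roles(lemma, frameset, args):
    """Translate numbered PropBank args into VerbNet thematic roles."""
    entry = PB_TO_VN[(lemma, frameset)]
    return entry["vn_class"], {entry["roles"].get(a, a): v for a, v in args.items()}

print(to_thematic_roles("leave", "01", {"Arg0": "Mary", "Arg1": "the room"}))
```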
WordNet as a WSD sense inventory
Senses unnecessarily fine-grained?
Word Sense Disambiguation bakeoffs:
- Senseval-1: Hector, ITA = 95.5%
- Senseval-2: WordNet 1.7, ITA for verbs = 71%
- Groupings of Senseval-2 verbs, ITA = 82%; used syntactic and semantic criteria

Groupings Methodology (w/ Dang and Fellbaum)
Double-blind groupings, adjudication.
Syntactic criteria (VerbNet was useful):
- Distinct subcategorization frames: call him a bastard / call him a taxi
- Recognizable alternations - regular sense extensions: play an instrument, play a song, play a melody on an instrument
SIGLEX01, SIGLEX02, JNLE04

Groupings Methodology (cont.)
Semantic criteria:
- Differences in semantic classes of arguments: abstract/concrete, human/animal, animate/inanimate, different instrument types, ...
- Differences in the number and type of arguments, often reflected in subcategorization frames: John left the room. / I left my pearls to my daughter-in-law in my will.
- Differences in entailments: change of a prior entity or creation of a new entity?
- Differences in types of events: abstract/concrete/mental/emotional/...
- Specialized subject domains

Results - averaged over 28 verbs
MX - Maximum Entropy WSD, p(sense|context)
Features: topic, syntactic constituents, semantic classes
+2.5%, +1.5 to +5%, +6%
Dang and Palmer, Siglex02; Dang et al., Coling02

Grouping improved ITA and Maxent WSD
Call: 31% of errors were due to confusion between senses within the same group:
- 1. name, call -- (assign a specified, proper name to; "They named their son David")
- call -- (ascribe a quality to or give a name of a common noun that reflects a quality; "He called me a bastard")
- call -- (consider or regard as being; "I would not call her beautiful")
75% accuracy with training and testing on grouped senses vs. 43% with training and testing on fine-grained senses.

WordNet - call, 28 senses, groups
[Figure: the 28 WordNet senses of "call" (WN1-WN28) arranged into groups labeled Loud cry, Label, Phone/radio, Bird or animal cry, Request, Call a loan/bond, Visit, Challenge, and Bid.]

WordNet - call, 28 senses, groups
[Figure repeated, with the same sense groups highlighted.]

Overlap between Groups and Framesets - 95%
[Figure: WordNet senses of "develop" (WN1-WN20) grouped under Frameset1 and Frameset2.]
Palmer, Dang & Fellbaum, NLE 2004

Sense Hierarchy
- PropBank framesets - coarse-grained distinctions
- Sense groups (Senseval-2) - intermediate level (includes Levin classes) - 95% overlap
- WordNet - fine-grained distinctions
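The sense hierarchy above can be expressed as two mappings, fine-grained senses to groups and groups to framesets, which also makes it easy to score a WSD system at each granularity. In the sketch below the group names come from the "call" grouping slides, but the specific sense-to-group assignments and frameset ids are hypothetical.

```python
# Hypothetical three-level sense hierarchy for "call": WN sense -> group -> frameset.
GROUPS = {
    "call%wn1": "Label", "call%wn3": "Label",
    "call%wn2": "Phone/radio",
    "call%wn4": "Request",
}
FRAMESETS = {
    "Label": "call.01", "Phone/radio": "call.02", "Request": "call.03",
}

def score_levels(gold_fine, predicted_fine):
    """Score a single prediction at each level of the hierarchy."""
    g_gold, g_pred = GROUPS[gold_fine], GROUPS[predicted_fine]
    return {
        "fine": gold_fine == predicted_fine,
        "group": g_gold == g_pred,
        "frameset": FRAMESETS[g_gold] == FRAMESETS[g_pred],
    }

print(score_levels("call%wn1", "call%wn3"))   # same group, so coarse levels get credit
```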
English lexical resource is available
- That provides sets of possible syntactic frames for verbs with semantic role labels that can be automatically assigned accurately to new text.
- And provides clear, replicable sense distinctions.

A Chinese Treebank Sentence
国会/Congress 最近/recently 通过/pass 了/ASP 银行法/banking law
"The Congress passed the banking law recently."
(IP (NP-SBJ (NN 国会/Congress)) (VP (ADVP (ADV 最近/recently)) (VP (VV 通过/pass) (AS 了/ASP) (NP-OBJ (NN 银行法/banking law)))))

The Same Sentence, PropBanked
通过(f2) (pass): arg0 国会 (Congress), argM 最近 (recently), arg1 银行法 (banking law)
(IP (NP-SBJ arg0 (NN 国会)) (VP argM (ADVP (ADV 最近)) (VP f2 (VV 通过) (AS 了) arg1 (NP-OBJ (NN 银行法)))))

Chinese PropBank Status (w/ Bert Xue and Scott Cotton)
- Create a Frame File for each verb - similar alternations: causative/inchoative, unexpressed object; 5000 lemmas, 2000 DONE (hired Jiang)
- First pass: automatic tagging - 2000 DONE; subcat frame matcher (Xue & Kulick, MT03)
- Second pass: double-blind hand correction - in progress (includes frameset tagging), 600 DONE; ported RATS to CATS, in use since May
- Third pass: Solomonization (adjudication)

A Korean Treebank Sentence
(S (NP-SBJ 그/NPN+은/PAU) (VP (S-COMP (NP-SBJ 르노/NPR+이/PCA) (VP (VP (NP-ADV 3/NNU 월/NNX+말/NNX+까지/PAU) (VP (NP-OBJ 인수/NNC+제의/NNC 시한/NNC+을/PCA) 갖/VV+고/ECS)) 있/VX+다/EFN+고/PAD) 덧붙이/VV+었/EPF+다/EFN) ./SFN)
그는 르노가 3 월말까지 인수제의 시한을 갖고 있다고 덧붙였다.
"He added that Renault has a deadline until the end of March for a merger proposal."

The same sentence, PropBanked
덧붙이다(그는, 르노가 3 월말까지 인수제의 시한을 갖고 있다) - add(he, that Renault has a deadline until the end of March for a merger proposal): Arg0, Arg2
갖다(르노가, 3 월말까지, 인수제의 시한을) - has(Renault, until the end of March, a deadline for a merger proposal): Arg0, ArgM, Arg1
(S Arg0 (NP-SBJ 그/NPN+은/PAU) (VP Arg2 (S-COMP (Arg0 NP-SBJ 르노/NPR+이/PCA) (VP (VP (ArgM NP-ADV 3/NNU 월/NNX+말/NNX+까지/PAU) (VP (Arg1 NP-OBJ 인수/NNC+제의/NNC 시한/NNC+을/PCA) 갖/VV+고/ECS)) 있/VX+다/EFN+고/PAD) 덧붙이/VV+었/EPF+다/EFN) ./SFN)

PropBank II
- Nominalizations (NYU): lexical frames DONE
- Event variables (including temporals and locatives)
- More fine-grained sense tagging: tagging nominalizations w/ WordNet senses; selected verbs and nouns
- Nominal coreference (not names)
- Clausal discourse connectives - a selected subset

PropBank I
Also, [Arg0 substantially lower Dutch corporate tax rates] helped [Arg1 [Arg0 the company] keep [Arg1 its tax outlay] [Arg3-PRD flat] [ArgM-ADV relative to earnings growth]].
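The PropBank I example above nests one proposition inside another: the Arg1 of "help" is itself a "keep" proposition. Below is a minimal sketch of that nesting as a recursive structure; the dictionary representation is an illustration, not PropBank's file format.

```python
# Nested predicate-argument structure for the "help ... keep ..." example.
help_prop = {
    "rel": "helped",
    "Arg0": "substantially lower Dutch corporate tax rates",
    "Arg1": {
        "rel": "keep",
        "Arg0": "the company",
        "Arg1": "its tax outlay",
        "Arg3-PRD": "flat",
        "ArgM-ADV": "relative to earnings growth",
    },
}

def flatten(prop, depth=0):
    """Print each predicate with its arguments, recursing into embedded propositions."""
    print("  " * depth + f"rel: {prop['rel']}")
    for label, value in prop.items():
        if label == "rel":
            continue
        if isinstance(value, dict):
            print("  " * depth + f"{label}:")
            flatten(value, depth + 1)
        else:
            print("  " * depth + f"{label}: {value}")

flatten(help_prop)
```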
[Figure: the nested predicate-argument structure of the preceding example - "tax rates" as Arg0 of help, with "the company keep its tax outlay flat relative to earnings growth" as its Arg1 - with event variables and nominal reference marked.]

Summary
- Shallow semantic annotation that captures critical dependencies and semantic role labels
- Supports training of supervised automatic taggers
- Methodology ports readily to other languages
- English PropBank release - spring 2004
- Chinese PropBank release - fall 2004
- Korean PropBank release - summer 2005

Summary of Multilingual TreeBanks, PropBanks

Levin class: escape-51.1-1
WordNet senses: WN 1, 5, 8
Thematic roles: Location[+concrete], Theme[+concrete]
Frames with semantics:
- Basic intransitive: "The convict escaped" - motion(during(E), Theme), direction(during(E), Prep, Theme, ~Location)
- Intransitive (+ path PP): "The convict escaped from the prison"
- Locative preposition drop: "The convict escaped the prison"

Levin class: future_having-13.3
WordNet senses: WN 2, 10, 13
Thematic roles: Agent[+animate OR +organization], Recipient[+animate OR +organization], Theme[]
Frames with semantics:
- Dative: "I promised somebody my time" - Agent V Recipient Theme - has_possession(start(E), Agent, Theme), future_possession(end(E), Recipient, Theme), cause(Agent, E)
- Transitive (+ Recipient PP): "We offered our paycheck to her" - Agent V Theme Prep(to) Recipient
- Transitive (Theme object): "I promised my house (to somebody)" - Agent V Theme

Actual data for leave
http://www.cs.rochester.edu/~gildea/PropBank/Sort/
Leave.01 "move away from": Arg0 rel Arg1 Arg3
Leave.02 "give": Arg0 rel Arg1 Arg2
Observed argument realization patterns (with counts):
- sub-ARG0 obj-ARG1: 44
- sub-ARG0: 20
- sub-ARG0 NP-ARG1-with obj-ARG2: 17
- sub-ARG0 sub-ARG2 ADJP-ARG3-PRD: 10
- sub-ARG0 sub-ARG1 ADJP-ARG3-PRD: 6
- sub-ARG0 sub-ARG1 VP-ARG3-PRD: 5
- NP-ARG1-with obj-ARG2: 4
- obj-ARG1: 3
- sub-ARG0 sub-ARG2 VP-ARG3-PRD: 3

Automatic classification
Merlo & Stevenson automatically classified 59 verbs with 69.8% accuracy:
1. unergative, 2. unaccusative, 3. object-drop
100M words automatically parsed; C5.0.
Features: transitivity, causativity, animacy, voice, POS.
EM clustering - 61%, 2669 instances, 1M words.
Using gold-standard semantic role labels:
1. float, hop/hope, jump, march, leap
2. change, clear, collapse, cool, crack, open, flood
3. borrow, clean, inherit, reap, organize, study

SENSEVAL - Word Sense Disambiguation Evaluation
NLE99, CHUM01, NLE02, NLE03
DARPA-style bakeoff: training data, testing data, scoring algorithm.
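The Merlo & Stevenson classification described above learns verb classes from per-verb corpus statistics such as transitivity, causativity, animacy, and voice. Below is a minimal sketch of the same idea, with scikit-learn's DecisionTreeClassifier standing in for C5.0 and hypothetical feature values in place of measured corpus counts.

```python
# Decision tree over per-verb usage statistics; all numbers are illustrative placeholders.
from sklearn.tree import DecisionTreeClassifier

# Each row: [transitive_ratio, causative_ratio, animate_subject_ratio, passive_ratio]
X = [
    [0.10, 0.05, 0.90, 0.02],   # hypothetical profile of an unergative verb (e.g. "jump")
    [0.45, 0.60, 0.20, 0.25],   # hypothetical profile of an unaccusative verb (e.g. "open")
    [0.70, 0.10, 0.85, 0.30],   # hypothetical profile of an object-drop verb (e.g. "study")
]
y = ["unergative", "unaccusative", "object-drop"]

clf = DecisionTreeClassifier().fit(X, y)
print(clf.predict([[0.15, 0.08, 0.88, 0.03]]))   # classify a new verb's feature profile
```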
Maximum Entropy WSD (Hoa Dang), best performer on verbs
Maximum entropy framework, p(sense|context).
Contextual linguistic features:
- Topical feature for W: keywords (determined automatically)
- Local syntactic features for W: presence of subject, complements, passive? Words in subject and complement positions, particles, prepositions, etc.
- Local semantic features for W: semantic class information from WordNet (synsets, etc.); Named Entity tag (PERSON, LOCATION, ...) for proper nouns; words within a +/- 2 word window

Best Verb Performance - Maxent-WSD (Hoa Dang)
MX - Maximum Entropy WSD, p(sense|context)
Features: topic, syntactic constituents, semantic classes
+2.5%, +1.5 to +5%, +6%
Dang and Palmer, Siglex02; Dang et al., Coling02

Role Labels & Framesets as features for WSD
Preliminary results - Jinying Chen.
Gold-standard PropBank annotation; decision tree (C5.0); groups; 5 verbs.
Features: frameset tags, Arg labels.
Comparable results to Maxent with PropBank features.
Syntactic frames and sense distinctions are inseparable.

Lexical resources provide concrete criteria for sense distinctions
- PropBank - coarse-grained sense distinctions determined by different subcategorization frames (framesets)
- Intersective Levin classes - regular sense extensions through differing syntactic constructions
- VerbNet - distinct semantic predicates for each sense (verb class)
Are these the right distinctions?

Results - averaged over 28 verbs
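The maximum-entropy WSD model above estimates p(sense|context) from topical, local syntactic, and local semantic features, and the preliminary Chen experiments add PropBank frameset and Arg-label features. Below is a minimal sketch of such a classifier, using scikit-learn's LogisticRegression as the maximum-entropy learner; the feature functions, sense-group labels, and the two training examples are hypothetical.

```python
# A toy p(sense|context) classifier over keyword, argument-structure,
# and frameset features for occurrences of "leave".
from sklearn.feature_extraction import DictVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

def features(instance):
    """Turn one occurrence of 'leave' into a sparse feature dict."""
    feats = {f"kw={w}": 1 for w in instance["keywords"]}
    feats["has_arg2"] = int("Arg2" in instance["args"])
    feats["frameset=" + instance.get("frameset", "?")] = 1
    return feats

# Hypothetical training examples, labeled with coarse sense groups.
train = [
    ({"keywords": ["room", "door"], "args": ["Arg0", "Arg1"], "frameset": "leave.01"}, "move-away"),
    ({"keywords": ["will", "pearls"], "args": ["Arg0", "Arg1", "Arg2"], "frameset": "leave.02"}, "give-transfer"),
]

model = make_pipeline(DictVectorizer(), LogisticRegression())
model.fit([features(x) for x, _ in train], [label for _, label in train])
print(model.predict([features({"keywords": ["room"], "args": ["Arg0", "Arg1"], "frameset": "leave.01"})]))
```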
