Lecture 8: Unification

Published on November 16, 2007

Author: Jacqueline

Source: authorstream.com

Features and Unification

Read J & M Chapter 11.

Solving the Agreement Problem

Number agreement:
  S → NP VP      * Mary walk.       ⟨NP NUMBER⟩ = ⟨VP NUMBER⟩
  NP → Det N     * those flight     ⟨DET NUMBER⟩ = ⟨N NUMBER⟩
Without this check, we'd have more ambiguity:
  Flying planes is dangerous.
  Flying planes are dangerous.

Solving the Agreement Problem (continued)

Subcategorization of verbs:
  VP → V VPto    * Mary decided going.    [SUBCAT VPto]
  VP → V VPing   Mary imagined going.     [SUBCAT VPing]

Subcategorization of Verbs

(figure in the original slides)

Subcategorization of Verbs – Example

  Subcat    Example
  Quo       asked [Quo "What was it like?"]
  NP        asking [NP a question]
  Swh       asked [Swh what trades you're interested in]
  Sto       ask [Sto him to tell you]
  PP        that means asking [PP at home]
  Vto       asked [Vto to see a girl called Evelyn]
  NP Sif    asked [NP him] [Sif whether he could make]
  NP NP     asked [NP myself] [NP a question]
  NP Swh    asked [NP him] [Swh why he took time off]

Specifying Control Information

  John persuaded Mary to go.
  John promised Mary to go.
Who does the going?

Subcategorization of Nouns and Adjectives

  Jane has a passion for old movies.
  Jane has an interest in old movies.

  [ORTH  passion
   CAT   N
   HEAD  [SUBCAT [CAT  PP
                  HEAD [PREP for]]]]

Reflexive Pronouns

  Mary wants to take care of herself.
  * Mary wants to take care of himself.
  * John and Mary want to take care of himself.
  Mary wants John to take care of himself.

Properties of a Good Solution

We want a solution to this problem that:
  - avoids combinatorial explosion of features;
  - is declarative, so that it can be used for both recognition and generation, and so that the linguistic facts can be reused if we change parsing algorithms to suit particular task environments;
  - has a clean, formal semantics, so that we can make correct statements about what the system will do.
So we reject simply writing code to handle the various cases.

Feature Structures

(figure in the original slides)

Reentrant Feature Structures

(figure in the original slides)

Unification

  [NUMBER SG] ⊔ [NUMBER SG]  =  [NUMBER SG]
  [NUMBER SG] ⊔ [NUMBER PL]     fails
  [NUMBER SG] ⊔ [NUMBER []]  =  [NUMBER SG]
  [NUMBER SG] ⊔ [PERSON 3]   =  [NUMBER SG
                                 PERSON 3]
Two feature structures can be unified if there is no conflict between them. (A small Python sketch of this operation appears at the end of this part of the notes.)

Unification of Reentrant Feature Structures

(figure in the original slides: two reentrant feature structures for the rule S → NP VP and their unification)

Subsumption

A less specific (more abstract) feature structure subsumes an equal or more specific one. If F subsumes G, then we write F ⊑ G.
We can define unification in terms of subsumption: F ⊔ G is the most general feature structure H such that F ⊑ H and G ⊑ H. (A sketch of a subsumption check also appears at the end of this part of the notes.)

Two Views of Subsumption

A subsumes B:  A ⊑ B
  Set theoretic: the objects satisfying B ⊆ the objects satisfying A
  Logical:       B ⇒ A
  Example:  A = [CAT Noun]      B = [CAT    Noun
                                     NUMBER SG]
Note that [] subsumes all other feature structures. It is the least element (the bottom) of the semilattice formed by the set of feature structures under ⊑; unification computes least upper bounds in this ordering. In the set-theoretic view, [] corresponds to U; in the logical view, it corresponds to T.

Analogy to Theorem Proving

We use the terms "unification" and "subsumption" here in essentially the same way in which they are used in theorem-proving systems:
  ∀x, y:  P(x, f(y)) → Q(x, y)
  f(3) = 8
  P(2, 8)
  Conclude: ?   (Q(2, 3), by matching P(2, 8) against P(x, f(y)): x = 2 and, since f(3) = 8, y = 3.)
Think of "unification" as "matching".
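
To make the unification operation above concrete, here is a minimal Python sketch. It assumes feature structures are encoded as plain nested dictionaries with atomic string values, and it ignores reentrancy (shared substructure) as well as the destructive, pointer-based representation discussed later in these notes; the function name unify and the dictionary encoding are illustrative choices, not part of the original lecture.

def unify(f, g):
    """Non-destructive unification of two feature structures.

    Feature structures are nested dicts; atomic values are strings.
    Returns the unified structure, or None if unification fails.
    Reentrancy (shared substructure) is not handled in this sketch.
    """
    if not isinstance(f, dict) or not isinstance(g, dict):
        # At least one side is atomic.
        if f == g:
            return f
        if isinstance(f, dict) and not f:   # [] unifies with anything
            return g
        if isinstance(g, dict) and not g:
            return f
        return None                         # conflicting atomic values: fail
    result = dict(f)
    for feature, g_value in g.items():
        if feature in result:
            sub = unify(result[feature], g_value)
            if sub is None:
                return None                 # conflict inside a shared feature
            result[feature] = sub
        else:
            result[feature] = g_value       # feature only in g: just add it
    return result

# The four examples from the Unification slide:
print(unify({"NUMBER": "SG"}, {"NUMBER": "SG"}))  # {'NUMBER': 'SG'}
print(unify({"NUMBER": "SG"}, {"NUMBER": "PL"}))  # None (fails)
print(unify({"NUMBER": "SG"}, {"NUMBER": {}}))    # {'NUMBER': 'SG'}
print(unify({"NUMBER": "SG"}, {"PERSON": "3"}))   # {'NUMBER': 'SG', 'PERSON': '3'}

The four calls at the end reproduce the four examples from the Unification slide.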
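
Subsumption can be checked in the same dictionary encoding. The sketch below is again only an illustration under the same simplifying assumptions (no reentrancy); subsumes(f, g) is true when every feature path and value present in f also appears in g, matching the slides' reading of F ⊑ G as "F is equally or less specific than G".

def subsumes(f, g):
    """True if feature structure f subsumes g (F ⊑ G on the slides).

    Every feature path/value present in f must also be present in g.
    Nested dicts with atomic values; reentrancy is not handled.
    """
    if not isinstance(f, dict):
        return f == g                # an atomic value only subsumes itself
    for feature, f_value in f.items():
        if not isinstance(g, dict) or feature not in g:
            return False
        if not subsumes(f_value, g[feature]):
            return False
    return True                      # in particular, [] subsumes everything

# The example from the Two Views of Subsumption slide:
A = {"CAT": "Noun"}
B = {"CAT": "Noun", "NUMBER": "SG"}
print(subsumes(A, B))   # True:  A is more general, so A subsumes B
print(subsumes(B, A))   # False: B is strictly more specific than A
print(subsumes({}, B))  # True:  [] subsumes all feature structures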
Unification Example – Conjunctions

  Mary [VP fed the cat] and [VP swept the floor].
  Mary fed the cat and Joe swept the floor.
  Mary bought the [Adj red] and [Adj green] ribbons.
  [Vtrans Feed] and [Vtrans water] the plants.
  * [Vtrans Feed] and [Vintrans cough] the plants.
  * Mary fed [NP the cat] and [Adj green].

  X0 → X1 CONJ X2
  ⟨X0 CAT⟩ = ⟨X1 CAT⟩ = ⟨X2 CAT⟩

Adding Unification to Grammar Rules

The rule S → NP VP:

  [CAT    S
   HEAD   [1]
   STRUCT [SUBJECT [A1]
           VP      [A2]]]
  →  A1: [CAT       NP
          AGREEMENT [2]: []]
     A2: [CAT       VP
          HEAD      [1]
          AGREEMENT [2]]

Note: The STRUCT feature is used here to record the tree structure of the constituents as they are produced.

Applying the Grammar Rule

This rule says that we can build an S from two components, A1 and A2. To apply this rule to two candidate components X and Y, the parser must:
  1. Copy the rule structure to create a new instance of S. (Remember that unification is destructive and we'll need to be able to reuse the rule.)
  2. Unify the feature structure of X with the specification for A1 on the right side of the rule. This will succeed if the CAT of X unifies with NP; if it succeeds, it binds [2] to the AGREEMENT structure of X and A1 to X.
  3. Unify the feature structure of Y with the specification for A2 on the right side of the rule. This will succeed if the CAT of Y unifies with VP and if the AGREEMENT structure of Y unifies with [2], namely the agreement structure of X. If it succeeds, it binds [1] to the HEAD feature of Y and A2 to Y.
(A small Python sketch of this kind of rule application appears after the last slide below.)

Example of Applying the Rule

  A1: [CAT       NP
       HEAD      dogs
       STRUCT    [NOM dogs]
       AGREEMENT [NBR  PL
                  PERS 3]]
  A2: [CAT       VP
       HEAD      ran
       STRUCT    [V ran]
       AGREEMENT []]
  →
  [CAT    S
   HEAD   ran
   STRUCT [SUBJECT [CAT       NP
                    HEAD      dogs
                    STRUCT    [NOM dogs]
                    AGREEMENT [2]: [NBR  PL
                                    PERS 3]]
           VP      [CAT       VP
                    HEAD      ran
                    STRUCT    [V ran]
                    AGREEMENT [2]]]]

Features Don't Have to Be Identical

  Mary has become [NP a lawyer] and [AdjP very rich].
If we have an algebra of feature types, then we can use it for other things, even verging into semantics:
  * [NP Mary] and [NP the potatoes] cooked.

What Features to Unify and Pass Up?

  [NPdet The dog] and [NPdet the cats] like going outside.
  [NPdet The dog] and [NPquant all the cats] like going outside.
  The dog and the cat like going outside.

  X0 → X1 CONJ X2
  ⟨X0 CAT⟩ = ⟨X1 CAT⟩ = ⟨X2 CAT⟩
  ⟨X0 AGREEMENT⟩ = ???

Implementing Unification in Parsing

The simple case: the parser is following a single path. To unify two feature structures, we destructively alter them to create a single structure. We change our representation to allow this (the original slide shows the extended, pointer-based representation of [NUMBER SG, PERSON 3]).

Adding Unification to the Earley Algorithm

  - Use unification to check constraints before applying each grammar rule.
  - Copy the feature structures before unifying them so that the chart still works: a given entry in the chart may become part of (i.e., be unified with) multiple other entries as the parser follows multiple paths.
(A sketch of why the copying matters appears below.)

What if We Need a Basic CFG?

Suppose that we need a basic CFG, possibly because:
  - run-time efficiency is critical, or
  - the parser must be embedded inside a larger system, such as a speech understanding system, that requires it.
Then we can view the unification formalism as a high-level language, useful for description, and compile a unification grammar into a standard context-free grammar. (A sketch of the idea follows below.)
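
To make the rule-application steps above concrete, here is a minimal Python sketch of applying S → NP VP with the agreement constraint. It repeats the dict-based unify in compact form so the example runs on its own; the reentrancy tags [1] and [2] are only simulated by merging and re-assigning the AGREEMENT value, not implemented by true structure sharing, and all names (apply_s_rule, the toy lexical entries) are invented for the example.

import copy

def unify(f, g):
    """Compact copy of the dict-based unify sketched earlier (no reentrancy)."""
    if not isinstance(f, dict) or not isinstance(g, dict):
        if f == g:
            return f
        if isinstance(f, dict) and not f:
            return g
        if isinstance(g, dict) and not g:
            return f
        return None
    result = dict(f)
    for feature, value in g.items():
        merged = unify(result[feature], value) if feature in result else value
        if merged is None:
            return None
        result[feature] = merged
    return result

def apply_s_rule(x, y):
    """Toy application of  S -> NP VP  with  <NP AGREEMENT> = <VP AGREEMENT>.

    Follows the steps on the 'Applying the Grammar Rule' slide: work on
    copies, check the CAT values, unify the AGREEMENT structures, and
    build the S with its HEAD taken from the VP.  The shared [2] tag is
    only simulated by writing the merged AGREEMENT back into both copies.
    """
    x, y = copy.deepcopy(x), copy.deepcopy(y)
    if unify(x.get("CAT", {}), "NP") is None or unify(y.get("CAT", {}), "VP") is None:
        return None                               # wrong categories
    agreement = unify(x.get("AGREEMENT", {}), y.get("AGREEMENT", {}))
    if agreement is None:
        return None                               # agreement clash: the rule fails
    x["AGREEMENT"] = y["AGREEMENT"] = agreement
    return {"CAT": "S", "HEAD": y.get("HEAD"),
            "STRUCT": {"SUBJECT": x, "VP": y}}

# The "dogs ran" example from the slides:
dogs = {"CAT": "NP", "HEAD": "dogs", "STRUCT": {"NOM": "dogs"},
        "AGREEMENT": {"NBR": "PL", "PERS": "3"}}
ran = {"CAT": "VP", "HEAD": "ran", "STRUCT": {"V": "ran"}, "AGREEMENT": {}}
print(apply_s_rule(dogs, ran))    # the S feature structure is built

# An agreement clash (roughly "* Mary walk", with "walk" simplified here to
# demand a plural subject) makes the rule fail:
mary = {"CAT": "NP", "HEAD": "Mary", "AGREEMENT": {"NBR": "SG", "PERS": "3"}}
walk = {"CAT": "VP", "HEAD": "walk", "AGREEMENT": {"NBR": "PL"}}
print(apply_s_rule(mary, walk))   # None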
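
The copying requirement for the Earley chart can be illustrated with a destructive variant of unification. This is only a toy demonstration of the point made on the "Adding Unification to the Earley Algorithm" slide, not the actual algorithm; unify_inplace and the chart_entry dictionaries are invented for the example.

import copy

def unify_inplace(f, g):
    """Destructive unification: merges g into f, mutating f.

    Returns f on success and None on failure.  On failure f may already
    have been partly modified -- which is exactly why chart entries must
    be copied before unification.
    """
    for feature, value in g.items():
        if feature not in f:
            f[feature] = value
        elif isinstance(f[feature], dict) and isinstance(value, dict):
            if unify_inplace(f[feature], value) is None:
                return None
        elif isinstance(f[feature], dict) and not f[feature]:
            f[feature] = value          # [] on the left: take the other value
        elif isinstance(value, dict) and not value:
            pass                        # [] on the right: keep what we have
        elif f[feature] != value:
            return None                 # conflicting atomic values
    return f

# A chart entry that several parse paths may want to extend:
chart_entry = {"CAT": "NP", "AGREEMENT": {}}

# Without copying, one path's constraints leak into the shared entry:
unify_inplace(chart_entry, {"AGREEMENT": {"NBR": "SG"}})
print(chart_entry)       # AGREEMENT is now SG -- wrong for the other paths

# With copying, the shared entry stays general and reusable:
chart_entry = {"CAT": "NP", "AGREEMENT": {}}
specialized = unify_inplace(copy.deepcopy(chart_entry), {"AGREEMENT": {"NBR": "SG"}})
print(chart_entry)       # {'CAT': 'NP', 'AGREEMENT': {}}
print(specialized)       # {'CAT': 'NP', 'AGREEMENT': {'NBR': 'SG'}}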
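
Finally, the compilation idea on the last slide can be sketched by multiplying out the finite feature values into specialized context-free categories. The encoding below is invented for the illustration and assumes the only constraint is number agreement with values SG and PL.

# A unification-style rule:  S -> NP VP  with  <NP NUMBER> = <VP NUMBER>.
# Compiling it away: enumerate the (finite) NUMBER values and emit one
# specialized context-free rule per value.
NUMBER_VALUES = ["SG", "PL"]

def compile_agreement_rule(lhs, rhs, values):
    """Expand a rule whose constituents share one agreement feature
    into plain CFG rules, one per feature value."""
    return [(f"{lhs}_{value}", [f"{cat}_{value}" for cat in rhs])
            for value in values]

print(compile_agreement_rule("S", ["NP", "VP"], NUMBER_VALUES))
# [('S_SG', ['NP_SG', 'VP_SG']), ('S_PL', ['NP_PL', 'VP_PL'])]

The cost is grammar size: each independent feature with k values can multiply the number of categories and rules by k, which is exactly the combinatorial explosion that the unification notation lets us avoid writing by hand.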
