
Understanding ME

mllewis
May 03, 2016

Transcript

  1. Modeling disambiguation in word learning via multiple probabilistic constraints. Molly Lewis & Michael C. Frank, Stanford University. The 35th Annual Cognitive Science Society Meeting, 3 August 2013.
  2. In the lexicon, each word maps to a unique concept, and each concept maps to a unique word (Clark, 1987). [Figure: words w1–w3 each linked one-to-one to concepts c1–c3.] "Everything would be fine if language did not deceive us by finding different names for the same thing in different times and places ... A word should be contained in every single thing. But it is not." – Czeslaw Milosz
  3. Disambiguation effect: children tend to map a new word (e.g., "zot") to an object they don't yet know a name for (e.g., Markman & Wachtel, 1988).
  4. What are the cognitive processes underlying disambiguation? Mutual Exclusivity (e.g., Markman & Wachtel, 1988): a constraint on the types of lexicons considered when learning the meaning of a new word; the learner is biased to consider only lexicons with a 1-1 mapping between words and objects. Pragmatic Inference account (e.g., Clark, 1987; Diesendruck & Markson, 2001): Principle of Conventionality – speakers within the same speech community use the same words to refer to the same objects; Principle of Contrast – different linguistic forms refer to different meanings.
  5. Testing theories of disambiguation. Diesendruck and Markson (2001): compare performance on learning a novel fact about an object with learning a novel referential label; the label condition ≈ the fact condition – evidence for a pragmatic mechanism? Preissler and Carey (2005): test children with autism, who have impairments in pragmatic reasoning; typically developing children ≈ children with autism on the disambiguation task – evidence for a domain-specific lexical constraint?
  6. What are the cognitive processes underlying disambiguation? 1. Mutual exclusivity constraint; 2. Pragmatic inference account; 3. … The proposal: multiple classes of theories may be describing distinct but complementary mechanisms that jointly contribute to the disambiguation effect.
  7. Goal and method: explore multiple disambiguation mechanisms within a single formal framework, which facilitates understanding the empirical consequences of our assumptions – particularly, how the mechanisms interact with each other. Here, we formally instantiate aspects of each account (and gloss over other aspects): Mutual Exclusivity as a hierarchical constraint on lexicons; Pragmatics as in-the-moment inference on the basis of intentions. Method: hierarchical Bayesian modeling.
  8. Modeling the disambiguation task: (1) define the space of lexicons (assume a world with 2 words and 2 objects); (2) observe situations; (3) determine the most likely lexicons, given the situations, using Bayes' rule. [Figure: known-word training (w1 with o1) and disambiguation test (w2 with o1 and o2).]
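The three steps on this slide can be sketched in Python. This is a minimal illustration, not the paper's actual model: lexicons are represented as sets of word–object links, the prior is uniform, and the noisy-labeling likelihood (and its noise level) is an assumption chosen for the sketch.

```python
import math
from itertools import product

WORDS = ("w1", "w2")
OBJECTS = ("o1", "o2")
LINKS = list(product(WORDS, OBJECTS))  # the 4 possible word-object links

def all_lexicons():
    """Step (1): enumerate all 2**4 = 16 lexicons as frozensets of links."""
    return [frozenset(link for link, on in zip(LINKS, bits) if on)
            for bits in product([0, 1], repeat=len(LINKS))]

def likelihood(situation, lexicon, noise=0.05):
    """P(word labels object | lexicon): a simple noisy-link likelihood.
    The noise level is illustrative, not taken from the paper."""
    return 1 - noise if situation in lexicon else noise

def posterior(situations, lexicons):
    """Step (3): Bayes' rule over lexicons, with a uniform prior."""
    scores = [math.prod(likelihood(s, lex) for s in situations)
              for lex in lexicons]
    z = sum(scores)
    return {lex: s / z for lex, s in zip(lexicons, scores)}

# Step (2): known-word training, where w1 is repeatedly used to label o1.
lexicons = all_lexicons()
post = posterior([("w1", "o1")] * 3, lexicons)
# Nearly all posterior mass lands on lexicons containing the w1-o1 link.
mass = sum(p for lex, p in post.items() if ("w1", "o1") in lex)
```

With only three labeling observations, the posterior already concentrates on lexicons that include the observed link, which is the sense in which "most likely lexicons" are determined from situations.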
  9. Instantiating a hierarchical constraint: higher-order constraints on lexicons, instantiated as constraints on the permissible classes of lexicons: 1-1, Many-1, 1-Many, Null.
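One way to sketch this class-level constraint in Python (an assumed structure, not the paper's code): classify each lexicon by its mapping type, then place a prior over classes and share each class's mass equally among its member lexicons. The class weights, and the catch-all "other" class for lexicons the slide's four labels don't cover, are illustrative assumptions.

```python
from itertools import product

WORDS = ("w1", "w2")
OBJECTS = ("o1", "o2")

def lexicon_class(lexicon):
    """Classify a lexicon (a frozenset of word-object links) into the
    classes on the slide; 'other' is an added catch-all for lexicons
    that violate the mapping constraint in both directions."""
    if not lexicon:
        return "null"
    per_word = {w: sum(1 for lw, _ in lexicon if lw == w) for w in WORDS}
    per_obj = {o: sum(1 for _, lo in lexicon if lo == o) for o in OBJECTS}
    word_ok = all(n <= 1 for n in per_word.values())  # each word names <= 1 object
    obj_ok = all(n <= 1 for n in per_obj.values())    # each object has <= 1 name
    if word_ok and obj_ok:
        return "1-1"
    if word_ok:
        return "many-1"   # several words converge on one object
    if obj_ok:
        return "1-many"   # one word spans several objects
    return "other"

# Hierarchical prior: mass assigned to a class is shared equally among its
# member lexicons (the class weights below are illustrative, not fitted).
CLASS_WEIGHT = {"1-1": 0.6, "many-1": 0.1, "1-many": 0.1, "null": 0.1, "other": 0.1}

def lexicon_prior(lexicon, lexicons):
    cls = lexicon_class(lexicon)
    members = sum(1 for l in lexicons if lexicon_class(l) == cls)
    return CLASS_WEIGHT[cls] / members

ALL_LINKS = list(product(WORDS, OBJECTS))
lexicons = [frozenset(link for link, on in zip(ALL_LINKS, bits) if on)
            for bits in product([0, 1], repeat=4)]
```

Concentrating prior weight on the 1-1 class is how a mutual-exclusivity-style bias enters the model: individual 1-1 lexicons start out more probable than individual many-to-one or one-to-many lexicons, before any situations are observed.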
  10. Disambiguation at multiple levels. [Figure: posterior probability over the lexicons of a 2-word, 2-object world, grouped into 1-1, 1-many, many-1, and null classes, across a series of observations.]
  11. Simple Probabilistic Route as Modified by Experience. [Figure: word-object mappings (w1 with o1 in training; w2 presented with o1 and o2 at test) across exposures.]
  13. Conclusion: neither disambiguation mechanism, as instantiated, is necessary to create a bias, but either is sufficient. Disambiguation is strongest when both mechanisms jointly contribute. It may be difficult to tease these two aspects apart empirically, since their weights may vary across task and person (e.g., age, language experience, pragmatic situation). The model provides a means to make precise quantitative predictions, with the potential to resolve inconsistency in the literature.