A Bayesian framework for cross-situational word-learning
Author(s)
Goodman, Noah Daniel; Tenenbaum, Joshua B; Frank, Michael C.
Terms of use
Publisher Policy: Article is made available in accordance with the publisher's policy and may be subject to US copyright law. Please refer to the publisher's site for terms of use.
Abstract
For infants, early word learning is a chicken-and-egg problem. One way to learn a word is to observe that it co-occurs with a particular referent across different situations. Another way is to use the social context of an utterance to infer the intended referent of a word. Here we present a Bayesian model of cross-situational word learning, and an extension of this model that also learns which social cues are relevant to determining reference. We test our model on a small corpus of mother-infant interaction and find it performs better than competing models. Finally, we show that our model accounts for experimental phenomena including mutual exclusivity, fast-mapping, and generalization from social cues.
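To make the cross-situational idea in the abstract concrete, here is a minimal, hypothetical sketch: across situations, a word's intended referent tends to co-occur with it more consistently than distractor objects do, so even a simple tally of word-object co-occurrences can pick out the right mapping. This toy associative learner (the `learn_associations` function, corpus, and object names are illustrative assumptions) is not the paper's Bayesian lexicon model, which scores whole lexicons by their posterior probability given the corpus and can additionally weight social cues.

```python
# Toy cross-situational learner: tally word-object co-occurrences across
# situations and normalize them into P(object | word). Illustrative only;
# NOT the Bayesian model described in the paper.

from collections import defaultdict

def learn_associations(situations):
    """situations: list of (words, objects) pairs, each a set of strings.
    Returns P(object | word) estimated from raw co-occurrence counts."""
    cooc = defaultdict(lambda: defaultdict(int))   # cooc[word][obj] = count
    word_total = defaultdict(int)                  # total pairings per word
    for words, objects in situations:
        for w in words:
            word_total[w] += len(objects)
            for o in objects:
                cooc[w][o] += 1
    return {w: {o: c / word_total[w] for o, c in objs.items()}
            for w, objs in cooc.items()}

# Hypothetical corpus: "ball" is uttered whenever BALL is present,
# while the distractor objects vary from situation to situation.
corpus = [
    ({"look", "ball"}, {"BALL", "DOG"}),
    ({"nice", "ball"}, {"BALL", "CUP"}),
    ({"the", "dog"},   {"DOG", "BALL"}),
]

assoc = learn_associations(corpus)
best = max(assoc["ball"], key=assoc["ball"].get)
print(best)  # "BALL": it co-occurs with "ball" in every situation, distractors do not
```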
Date issued
2007-12
Department
Massachusetts Institute of Technology. Department of Brain and Cognitive Sciences
Journal
Advances in Neural Information Processing Systems 20 (NIPS 2007)
Publisher
Neural Information Processing Systems Foundation
Citation
Frank, Michael C., Noah D. Goodman, and Joshua B. Tenenbaum. "A Bayesian Framework for Cross-Situational Word-Learning." Advances in Neural Information Processing Systems 20 (NIPS 2007), Vancouver, British Columbia, Canada, 3-8 December, 2007. © 2007 Neural Information Processing Systems Foundation
Version: Final published version
ISSN
1049-5258