Show simple item record

dc.contributor.advisor	Emery N. Brown.	en_US
dc.contributor.author	Le Mau, Tuan.	en_US
dc.contributor.other	Massachusetts Institute of Technology. Department of Brain and Cognitive Sciences.	en_US
dc.date.accessioned	2019-07-18T20:34:11Z
dc.date.available	2019-07-18T20:34:11Z
dc.date.copyright	2019	en_US
dc.date.issued	2019	en_US
dc.identifier.uri	https://hdl.handle.net/1721.1/121827
dc.description	Thesis: Ph. D., Massachusetts Institute of Technology, Department of Brain and Cognitive Sciences, 2019	en_US
dc.description	Cataloged from PDF version of thesis. "Some pages in the original document contain text that runs off the edge of the page. See Appendix A - pages 162-171"--Disclaimer Notice page.	en_US
dc.description	Includes bibliographical references (pages 147-159).	en_US
dc.description.abstract	It is commonly assumed that there is a reliable one-to-one mapping between a certain configuration of facial movements and the specific emotional state that it supposedly signals. One common way to test this one-to-one hypothesis is to ask people to deliberately pose the facial configurations that they believe they use to express emotions. Participants are randomly sampled, without concern for their emotional expertise, and are given a single emotion word or a single, brief statement to describe each emotion category. They then deliberately pose the facial configuration that they believe they make when expressing instances of this category. Such studies routinely find that participants from different countries show moderate to strong evidence for a one-to-one mapping between an emotion category and a single facial configuration (its presumed facial expression).	en_US
dc.description.abstract	In fact, the majority of studies designed to test the one-to-one hypothesis ask people from various cultures to judge posed configurations of facial movements, such as a scowl (the proposed facial expression for anger), a frown (the proposed expression for sadness), and so on, on the assumption that these facial configurations, as universal expressions of emotional states, co-evolved with the ability to recognize and read them. These studies routinely show participants one facial configuration posed by multiple posers for each emotion category and observe variable findings, depending on the experimental method used.	en_US
dc.description.abstract	In Study 1, we examined the facial configurations posed by emotion experts: famous actors who were provided with a diverse sample of richly described scenarios, full of context. Participants inferred the emotional meaning of the scenarios, which were then grouped into categories. Systematic coding of the facial poses for each emotion category revealed little evidence for the hypothesis that each category has a diagnostic facial expression. Instead, we observed a high degree of variability among the experts' facial poses for any given emotion category, and little specificity for any pose. Furthermore, an unsupervised statistical analysis discovered 29 novel emotion categories with moderately consistent facial poses. In Study 2, participants were asked to infer the emotional meaning of each facial pose when presented alone, or when presented in the context of its eliciting scenario. Our analyses indicated that participants' inferences about the emotional meaning of the facial poses were influenced more by the eliciting scenarios than by the physical morphology of the facial configurations.	en_US
dc.description.abstract	These findings strongly replicate emerging evidence that the emotional meaning of any set of facial movements may be much more variable and context-dependent than hypothesized by the common one-to-one view, which continues to influence the public understanding of emotion, and hence education, clinical practice, and applications in government and industry. Although more ecologically valid research on how people actually move their faces to express emotion is urgently needed, such research has been immensely difficult without the right tools to support capturing facial data in real life, automatically processing those data, and verifying and analyzing them. We developed a system of technological tools to support the investigation of facial movements during emotional episodes in naturalistic settings, using dynamic and longitudinal facial data. We then collected, pre-processed, verified, and analyzed data from YouTube using our newly developed tools.	en_US
dc.description.abstract	In particular, we examined two talk-show hosts and presented preliminary insights into questions that were previously very difficult to investigate.	en_US
dc.description.statementofresponsibility	by Tuan Le Mau.	en_US
dc.format.extent	200 pages	en_US
dc.language.iso	eng	en_US
dc.publisher	Massachusetts Institute of Technology	en_US
dc.rights	MIT theses are protected by copyright. They may be viewed, downloaded, or printed from this source but further reproduction or distribution in any format is prohibited without written permission.	en_US
dc.rights.uri	http://dspace.mit.edu/handle/1721.1/7582	en_US
dc.subject	Brain and Cognitive Sciences.	en_US
dc.title	Towards understanding facial movements in real life	en_US
dc.type	Thesis	en_US
dc.description.degree	Ph. D.	en_US
dc.contributor.department	Massachusetts Institute of Technology. Department of Brain and Cognitive Sciences	en_US
dc.identifier.oclc	1108619276	en_US
dc.description.collection	Ph.D. Massachusetts Institute of Technology, Department of Brain and Cognitive Sciences	en_US
dspace.imported	2019-07-18T20:34:07Z	en_US
mit.thesis.degree	Doctoral	en_US
mit.thesis.department	Brain	en_US

