Simple item record

dc.contributor.author: Skerry, Amy E.
dc.contributor.author: Saxe, Rebecca R.
dc.date.accessioned: 2015-06-09T15:50:57Z
dc.date.available: 2015-06-09T15:50:57Z
dc.date.issued: 2014-11
dc.date.submitted: 2014-09
dc.identifier.issn: 0270-6474
dc.identifier.issn: 1529-2401
dc.identifier.uri: http://hdl.handle.net/1721.1/97243
dc.description.abstract: Although the emotions of other people can often be perceived from overt reactions (e.g., facial or vocal expressions), they can also be inferred from situational information in the absence of observable expressions. How does the human brain make use of these diverse forms of evidence to generate a common representation of a target's emotional state? In the present research, we identify neural patterns that correspond to emotions inferred from contextual information and find that these patterns generalize across different cues from which an emotion can be attributed. Specifically, we use functional neuroimaging to measure neural responses to dynamic facial expressions with positive and negative valence and to short animations in which the valence of a character's emotion could be identified only from the situation. Using multivoxel pattern analysis, we test for regions that contain information about the target's emotional state, identifying representations specific to a single stimulus type and representations that generalize across stimulus types. In regions of medial prefrontal cortex (MPFC), a classifier trained to discriminate emotional valence for one stimulus (e.g., animated situations) could successfully discriminate valence for the remaining stimulus (e.g., facial expressions), indicating a representation of valence that abstracts away from perceptual features and generalizes across different forms of evidence. Moreover, in a subregion of MPFC, this neural representation generalized to trials involving subjectively experienced emotional events, suggesting partial overlap in neural responses to attributed and experienced emotions. These data provide a step toward understanding how the brain transforms stimulus-bound inputs into abstract representations of emotion. [en_US]
dc.description.sponsorship: National Institutes of Health (U.S.) (Grant 1R01 MH096914-01A1) [en_US]
dc.language.iso: en_US
dc.publisher: Society for Neuroscience [en_US]
dc.relation.isversionof: http://dx.doi.org/10.1523/jneurosci.1676-14.2014 [en_US]
dc.rights: Article is made available in accordance with the publisher's policy and may be subject to US copyright law. Please refer to the publisher's site for terms of use. [en_US]
dc.source: Society for Neuroscience [en_US]
dc.title: A Common Neural Code for Perceived and Inferred Emotion [en_US]
dc.type: Article [en_US]
dc.identifier.citation: Skerry, A. E., and R. Saxe. “A Common Neural Code for Perceived and Inferred Emotion.” Journal of Neuroscience 34, no. 48 (November 26, 2014): 15997–16008. [en_US]
dc.contributor.department: Massachusetts Institute of Technology. Department of Brain and Cognitive Sciences [en_US]
dc.contributor.mitauthor: Skerry, Amy E. [en_US]
dc.contributor.mitauthor: Saxe, Rebecca R. [en_US]
dc.relation.journal: Journal of Neuroscience [en_US]
dc.eprint.version: Final published version [en_US]
dc.type.uri: http://purl.org/eprint/type/JournalArticle [en_US]
eprint.status: http://purl.org/eprint/status/PeerReviewed [en_US]
dspace.orderedauthors: Skerry, A. E.; Saxe, R. [en_US]
dc.identifier.orcid: https://orcid.org/0000-0003-2377-1791
mit.license: PUBLISHER_POLICY [en_US]
mit.metadata.status: Complete
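The cross-stimulus decoding analysis described in the abstract can be illustrated with a short sketch: train a classifier to discriminate emotional valence from voxel patterns evoked by one stimulus type, then test it on patterns from the other. The Python sketch below uses synthetic data and a plain logistic-regression classifier purely for illustration; the variable names, signal model, and classifier choice are assumptions for this sketch, not the authors' actual pipeline.

```python
# Hypothetical sketch of cross-stimulus MVPA decoding, as described in the
# abstract: train a valence classifier on one stimulus type and test it on
# the other. All data here are synthetic stand-ins; the signal model and
# parameter values are illustrative, not taken from the paper.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_trials, n_voxels = 40, 200  # trials per stimulus type, voxels in an ROI

# A shared "valence axis": positive- and negative-valence trials carry a
# weak common signal across both stimulus types (faces and situations).
valence_axis = rng.standard_normal(n_voxels)

def simulate(labels, signal=0.3):
    """Return trial-by-voxel patterns with a valence-dependent signal."""
    noise = rng.standard_normal((len(labels), n_voxels))
    return noise + signal * np.outer(2 * labels - 1, valence_axis)

labels = np.repeat([0, 1], n_trials // 2)  # 0 = negative, 1 = positive
faces = simulate(labels)                   # e.g., dynamic facial expressions
situations = simulate(labels)              # e.g., animated situations

# Train on one stimulus type and test on the other, in both directions.
clf = LogisticRegression(max_iter=1000)
acc_s2f = clf.fit(situations, labels).score(faces, labels)
acc_f2s = clf.fit(faces, labels).score(situations, labels)
print(f"situations->faces: {acc_s2f:.2f}, faces->situations: {acc_f2s:.2f}")
```

Above-chance accuracy in both train/test directions is the signature the abstract describes: a valence representation that generalizes across the cues from which the emotion was attributed, rather than one tied to a single stimulus type.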

