
dc.contributor.author: Baltrusaitis, Tadas
dc.contributor.author: McDuff, Daniel Jonathan
dc.contributor.author: Banda, Ntombikayise
dc.contributor.author: Mahmoud, Marwa
dc.contributor.author: el Kaliouby, Rana
dc.contributor.author: Robinson, Peter
dc.contributor.author: Picard, Rosalind W.
dc.date.accessioned: 2011-12-06T17:57:00Z
dc.date.available: 2011-12-06T17:57:00Z
dc.date.issued: 2011-03
dc.identifier.isbn: 978-1-4244-9140-7
dc.identifier.uri: http://hdl.handle.net/1721.1/67458
dc.description.abstract: We present a real-time system for detecting facial action units and inferring emotional states from head and shoulder gestures and facial expressions. The dynamic system uses three levels of inference on progressively longer time scales. Firstly, facial action units and head orientation are identified from 22 feature points and Gabor filters. Secondly, Hidden Markov Models are used to classify sequences of actions into head and shoulder gestures. Finally, a multi-level Dynamic Bayesian Network is used to model the unfolding emotional state based on the probabilities of different gestures. The most probable state over a given video clip is chosen as the label for that clip. The average F1 score for 12 action units (AUs 1, 2, 4, 6, 7, 10, 12, 15, 17, 18, 25, 26), labelled on a frame-by-frame basis, was 0.461. The average classification rate for five emotional states (anger, fear, joy, relief, sadness) was 0.440. Sadness had the highest rate, 0.64, and anger the lowest, 0.11.
dc.description.sponsorship: Thales Research and Technology (UK)
dc.description.sponsorship: Bradlow Foundation Trust
dc.description.sponsorship: Procter & Gamble Company
dc.language.iso: en_US
dc.publisher: Institute of Electrical and Electronics Engineers
dc.relation.isversionof: http://dx.doi.org/10.1109/FG.2011.5771372
dc.rights: Creative Commons Attribution-Noncommercial-Share Alike 3.0
dc.rights.uri: http://creativecommons.org/licenses/by-nc-sa/3.0/
dc.source: Javier Hernandez Rivera
dc.title: Real-Time Inference of Mental States from Facial Expressions and Upper Body Gestures
dc.type: Article
dc.identifier.citation: Baltrusaitis, Tadas et al. “Real-time Inference of Mental States from Facial Expressions and Upper Body Gestures.” Face and Gesture 2011. Santa Barbara, CA, USA, 2011. 909-914.
dc.contributor.department: Massachusetts Institute of Technology. Media Laboratory
dc.contributor.approver: Picard, Rosalind W.
dc.contributor.mitauthor: McDuff, Daniel Jonathan
dc.contributor.mitauthor: el Kaliouby, Rana
dc.contributor.mitauthor: Picard, Rosalind W.
dc.relation.journal: 2011 IEEE International Conference on Automatic Face & Gesture Recognition and Workshops (FG 2011)
dc.eprint.version: Author's final manuscript
dc.type.uri: http://purl.org/eprint/type/ConferencePaper
dspace.orderedauthors: Baltrusaitis, Tadas; McDuff, Daniel; Banda, Ntombikayise; Mahmoud, Marwa; Kaliouby, Rana el; Robinson, Peter; Picard, Rosalind
dc.identifier.orcid: https://orcid.org/0000-0002-5661-0022
mit.license: OPEN_ACCESS_POLICY
mit.metadata.status: Complete

