
dc.contributor.author	McDuff, Daniel Jonathan
dc.contributor.author	el Kaliouby, Rana
dc.contributor.author	Kassam, Karim
dc.contributor.author	Picard, Rosalind W.
dc.date.accessioned	2011-12-06T18:03:15Z
dc.date.available	2011-12-06T18:03:15Z
dc.date.issued	2011-05
dc.date.submitted	2011-03
dc.identifier.isbn	978-1-4244-9140-7
dc.identifier.uri	http://hdl.handle.net/1721.1/67459
dc.description.abstract	Facial and head actions contain significant affective information. To date, these actions have mostly been studied in isolation because the space of naturalistic combinations is vast. Interactive visualization tools could enable new explorations of dynamically changing combinations of actions as people interact with natural stimuli. This paper describes a new open-source tool that enables navigation of and interaction with dynamic face and gesture data across large groups of people, making it easy to see when multiple facial actions co-occur and how these patterns compare and cluster across groups of participants. We share two case studies that demonstrate how the tool allows researchers to quickly view an entire corpus of data for single or multiple participants, stimuli and actions. Acume yielded patterns of actions across participants and across stimuli, and gave insight into how our automated facial analysis methods could be better designed. The results of these case studies demonstrate the efficacy of the tool. The open-source code is designed to directly address the needs of the face and gesture research community, while also being extensible and flexible enough to accommodate other kinds of behavioral data. Source code, application and documentation are available at http://affect.media.mit.edu/acume.	en_US
dc.description.sponsorship	Procter & Gamble Company	en_US
dc.language.iso	en_US
dc.publisher	Institute of Electrical and Electronics Engineers	en_US
dc.relation.isversionof	http://dx.doi.org/10.1109/FG.2011.5771464	en_US
dc.rights	Creative Commons Attribution-Noncommercial-Share Alike 3.0	en_US
dc.rights.uri	http://creativecommons.org/licenses/by-nc-sa/3.0/	en_US
dc.source	Javier Hernandez Rivera	en_US
dc.title	Acume: A New Visualization Tool for Understanding Facial Expression and Gesture Data	en_US
dc.type	Article	en_US
dc.identifier.citation	McDuff, Daniel et al. “Acume: A New Visualization Tool for Understanding Facial Expression and Gesture Data.” Face and Gesture 2011. Santa Barbara, CA, USA, 2011. 591-596.	en_US
dc.contributor.department	Massachusetts Institute of Technology. Media Laboratory	en_US
dc.contributor.mitauthor	McDuff, Daniel Jonathan
dc.contributor.mitauthor	el Kaliouby, Rana
dc.contributor.mitauthor	Picard, Rosalind W.
dc.relation.journal	2011 IEEE International Conference on Automatic Face & Gesture Recognition and Workshops (FG 2011)	en_US
dc.eprint.version	Author's final manuscript	en_US
dc.type.uri	http://purl.org/eprint/type/ConferencePaper	en_US
dspace.orderedauthors	McDuff, Daniel; Kaliouby, Rana el; Kassam, Karim; Picard, Rosalind	en
dc.identifier.orcid	https://orcid.org/0000-0002-5661-0022
mit.license	OPEN_ACCESS_POLICY	en_US
mit.metadata.status	Complete

