Measuring Voter's Candidate Preference Based on Affective Responses to Election Debates
Author(s)
McDuff, Daniel Jonathan; El Kaliouby, Rana; Kodra, Evan; Picard, Rosalind W.
Terms of use
Open Access Policy: Creative Commons Attribution-Noncommercial-Share Alike
Abstract
In this paper we present the first analysis of facial responses to electoral debates measured automatically over the Internet. We show that significantly different responses can be detected from viewers with different political preferences, and that similar expressions at significant moments can have very different meanings depending on the actions that follow. We used an Internet-based framework to collect 611 naturalistic and spontaneous facial responses to five video clips from the third presidential debate of the 2012 American presidential election campaign. Using this framework we collected over 60% of these video responses (374 videos) within one day of the live debate and over 80% within three days. No participants were compensated for taking the survey. We present and evaluate a method for predicting independent voter preference based on automatically measured facial responses and self-reported preferences from viewers. We predict voter preference with an average accuracy of over 73% (AUC 0.779).
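The abstract reports accuracy and AUC for a binary voter-preference classifier trained on automatically measured facial responses. The following is a minimal, hypothetical sketch of how such an evaluation could be set up; the feature layout, the choice of logistic regression, and the cross-validation scheme are illustrative assumptions and are not taken from the paper, whose actual features and classifier are described in the full text.

# Hypothetical sketch: evaluate a binary voter-preference classifier on
# per-viewer facial-response features, reporting the same metrics the
# abstract quotes (accuracy and AUC). Random data stands in for the
# paper's aggregated facial-expression statistics.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import accuracy_score, roc_auc_score

rng = np.random.default_rng(0)

# X: one row per viewer, columns = illustrative aggregated expression
# statistics (e.g. mean smile intensity per debate clip).
# y: 1 if the viewer reports preferring candidate A, else 0.
n_viewers, n_features = 80, 10
X = rng.normal(size=(n_viewers, n_features))
y = rng.integers(0, 2, size=n_viewers)

clf = LogisticRegression(max_iter=1000)

# Cross-validated predictions so every viewer is scored out-of-sample.
pred_labels = cross_val_predict(clf, X, y, cv=5)
pred_scores = cross_val_predict(clf, X, y, cv=5, method="predict_proba")[:, 1]

print("accuracy:", accuracy_score(y, pred_labels))
print("AUC:", roc_auc_score(y, pred_scores))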
Date issued
2013-09
Department
Massachusetts Institute of Technology. Media Laboratory; Program in Media Arts and Sciences (Massachusetts Institute of Technology)
Journal
Proceedings of the 2013 Humaine Association Conference on Affective Computing and Intelligent Interaction
Publisher
Institute of Electrical and Electronics Engineers (IEEE)
Citation
McDuff, Daniel, Rana El Kaliouby, Evan Kodra, and Rosalind Picard. “Measuring Voter’s Candidate Preference Based on Affective Responses to Election Debates.” 2013 Humaine Association Conference on Affective Computing and Intelligent Interaction (September 2013).
Version: Author's final manuscript
ISBN
978-0-7695-5048-0
ISSN
2156-8103