DSpace@MIT

Crowdsourcing affective responses for predicting media effectiveness

Author(s)
McDuff, Daniel Jonathan
Download: Full printable version (7.984 MB)
Other Contributors
Massachusetts Institute of Technology. Department of Architecture. Program in Media Arts and Sciences.
Advisor
Rosalind W. Picard.
Terms of use
M.I.T. theses are protected by copyright. They may be viewed from this source for any purpose, but reproduction or distribution in any format is prohibited without written permission. See provided URL for inquiries about permission. http://dspace.mit.edu/handle/1721.1/7582
Abstract
Emotion is key to the effectiveness of media, whether in influencing memory, likability, or persuasion. Stories and narratives, even fictional ones, can induce a genuine emotional response. However, our understanding of the role of emotions in media and advertising effectiveness has been limited by the difficulty of measuring emotions in real-life contexts. Video advertisements are a ubiquitous form of short story, usually 30-60 seconds in length, designed to influence, persuade, entertain, and engage, and they frequently rely on emotional content. This lack of understanding of the effects of emotion in advertising results in large amounts of wasted time, money, and other resources. In this thesis I present several studies measuring responses to advertising. Facial expressions, heart rate, respiration rate, and heart rate variability can inform us about the emotional valence, arousal, and engagement of a person. I demonstrate how automatically detected, naturalistic and spontaneous facial responses and physiological responses can be used to predict the effectiveness of stories. I present a framework for automatically measuring facial and physiological responses, in addition to self-report and behavioral measures, to content (e.g. video advertisements) over the Internet, in order to understand the role of emotions in story effectiveness. Specifically, I present an analysis of the first large-scale dataset of facial, physiological, behavioral, and self-report responses to video content collected "in the wild" using the cloud. I have developed models for evaluating the effectiveness of media content (e.g. likability, persuasion, and short-term sales impact) based on the automatically extracted features. This work shows success in predicting measures of story effectiveness that are useful in the creation of content, whether in copy-testing or content development.
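To make the final prediction step concrete, below is a minimal sketch of the kind of model the abstract describes: predicting a binary effectiveness label (here, self-reported likability) from per-viewer summary features of automatically detected facial and physiological responses. The feature set, the synthetic data, and the choice of logistic regression are illustrative assumptions for this sketch, not the thesis's actual implementation.

    # Hypothetical sketch: predicting ad "likability" from automatically
    # extracted affective features (facial expressions + physiology).
    # Feature names, synthetic data, and the model choice are assumptions
    # for illustration, not the thesis's actual pipeline.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)

    # One row per viewer session: summary statistics of automatically
    # detected responses over a 30-60 s ad (e.g. mean smile intensity,
    # peak brow furrow, mean heart rate, heart rate variability).
    n_sessions = 500
    X = np.column_stack([
        rng.beta(2, 5, n_sessions),     # mean smile probability
        rng.beta(2, 8, n_sessions),     # peak brow-furrow probability
        rng.normal(70, 8, n_sessions),  # mean heart rate (bpm)
        rng.normal(50, 15, n_sessions), # heart rate variability (RMSSD, ms)
    ])

    # Synthetic binary label: did the viewer report liking the ad?
    # (Generated here so that smiling correlates with likability.)
    logits = 4.0 * X[:, 0] - 3.0 * X[:, 1] + rng.normal(0, 0.5, n_sessions)
    y = (logits > logits.mean()).astype(int)

    # Standardize features, then fit a logistic-regression classifier;
    # report cross-validated ranking performance (ROC AUC).
    model = make_pipeline(StandardScaler(), LogisticRegression())
    scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
    print(f"Cross-validated ROC AUC: {scores.mean():.2f}")

In the thesis itself, the features come from automated facial-expression analysis and physiological measurement gathered over the Internet; the sketch stands in for those with synthetic data and illustrates only the supervised prediction stage.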
Description
Thesis: Ph. D., Massachusetts Institute of Technology, School of Architecture and Planning, Program in Media Arts and Sciences, 2014.
This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.
Cataloged from student-submitted PDF version of thesis.
Includes bibliographical references (pages 209-225).
 
Date issued
2014
URI
http://hdl.handle.net/1721.1/91305
Department
Program in Media Arts and Sciences (Massachusetts Institute of Technology)
Publisher
Massachusetts Institute of Technology
Keywords
Architecture. Program in Media Arts and Sciences.

Collections
  • Doctoral Theses
