
Generating annotations for how-to videos using crowdsourcing

Author(s)
Nguyen, Phu
Download
Accepted version (2.013 MB)
Terms of use
Creative Commons Attribution-Noncommercial-Share Alike http://creativecommons.org/licenses/by-nc-sa/4.0/
Abstract
How-to videos can be valuable for learning, but searching for and following along with them can be difficult. Labeled events, such as the tools used in a how-to video, could improve video indexing, searching, and browsing. We introduce a crowdsourcing annotation tool for Photoshop how-to videos with a three-stage method: (1) gathering timestamps of important events, (2) labeling each event, and (3) capturing how each event affects the task of the tutorial. Our ultimate goal is to generalize the method to other domains of how-to videos. We evaluate our annotation tool with Amazon Mechanical Turk workers to investigate the accuracy, cost, and feasibility of the three-stage method for annotating large numbers of video tutorials. Stages 1 and 3 leave room for improvement, but stage 2 produces accurate labels over 90% of the time using majority voting. We also observed that changes to the instructions and interfaces of each task can significantly improve the accuracy of the results.
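The label-aggregation step in stage 2 relies on majority voting over redundant worker responses. As a rough illustration only (the paper does not publish its aggregation code, and the event IDs, tool names, and function names below are hypothetical), a minimal sketch in Python might look like this:

```python
from collections import Counter

def aggregate_labels(worker_labels):
    """Aggregate per-event crowd labels by majority vote.

    worker_labels: dict mapping an event id (e.g. a timestamp) to the
    list of labels submitted by individual workers for that event.
    Returns a dict mapping each event id to its winning label and the
    fraction of workers who agreed with it.
    """
    results = {}
    for event_id, labels in worker_labels.items():
        counts = Counter(labels)
        label, votes = counts.most_common(1)[0]
        results[event_id] = (label, votes / len(labels))
    return results

# Example with made-up timestamps and Photoshop tool labels.
labels_by_event = {
    "00:42": ["Lasso", "Lasso", "Magic Wand"],
    "01:15": ["Brush", "Brush", "Brush"],
}
print(aggregate_labels(labels_by_event))
# {'00:42': ('Lasso', 0.67), '01:15': ('Brush', 1.0)}
```

The agreement fraction gives a simple per-event confidence signal, which is one way the reported "accurate labels over 90% of the time" figure could be checked against ground truth.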
Date issued
2013-05
URI
https://hdl.handle.net/1721.1/121444
Department
Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory
Journal
CHI '13 Extended Abstracts on Human Factors in Computing Systems (CHI EA '13)
Publisher
Association for Computing Machinery (ACM)
Citation
Nguyen, Phu. "Generating annotations for how-to videos using crowdsourcing." In CHI '13 Extended Abstracts on Human Factors in Computing Systems (CHI EA '13), Paris, France, April 27 – May 2, 2013, pp. 835-840.
Version: Author's final manuscript
ISBN
978-1-4503-1952-2

Collections
  • MIT Open Access Articles
