Tracking body and hands for gesture recognition: NATOPS aircraft handling signals database
Author(s)
Song, Yale; Demirdjian, David; Davis, Randall
Terms of use
Open Access Policy: Creative Commons Attribution-Noncommercial-Share Alike
Abstract
We present a unified framework for body and hand tracking, the output of which can be used for understanding simultaneously performed body-and-hand gestures. The framework uses a stereo camera to collect 3D images, and tracks body and hand together, combining various existing techniques to make tracking tasks efficient. In addition, we introduce a multi-signal gesture database: the NATOPS aircraft handling signals. Unlike previous gesture databases, this database requires knowledge of both body and hand in order to distinguish gestures. It is also focused on a clearly defined gesture vocabulary from a real-world scenario that has been refined over many years. The database includes 24 body-and-hand gestures, and provides both gesture video clips and the body and hand features we extracted.
Date issued
2011-03
Department
Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory; Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
Journal
Proceedings of the Automatic Face & Gesture Recognition and Workshops (FG 2011)
Publisher
Institute of Electrical and Electronics Engineers
Citation
Song, Yale, David Demirdjian, and Randall Davis. Tracking Body and Hands for Gesture Recognition: NATOPS Aircraft Handling Signals Database. In Face and Gesture 2011, 500-506. Institute of Electrical and Electronics Engineers, 2011.
Version: Author's final manuscript
Other identifiers
INSPEC Accession Number: 12007776
ISBN
978-1-4244-9140-7