Tracking body and hands for gesture recognition: NATOPS aircraft handling signals database
Author(s): Song, Yale; Demirdjian, David; Davis, Randall
We present a unified framework for body and hand tracking whose output can be used to understand simultaneously performed body-and-hand gestures. The framework uses a stereo camera to collect 3D images and tracks body and hands together, combining various existing techniques to make the tracking efficient. In addition, we introduce a multi-signal gesture database: the NATOPS aircraft handling signals. Unlike previous gesture databases, this database requires knowledge of both body and hand to distinguish gestures. It is also based on a clearly defined gesture vocabulary from a real-world scenario, one refined over many years. The database includes 24 body-and-hand gestures and provides both gesture video clips and the body and hand features we extracted.
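Because the database pairs body-pose and hand-shape features for each frame, a recognizer must fuse both modalities. The sketch below illustrates one simple fusion strategy, per-frame concatenation, in plain Python; the feature dimensions and hand-pose categories used here are illustrative assumptions, not the database's actual file format or layout.

```python
# Hypothetical sketch: fusing per-frame body and hand features into one
# vector for multi-signal gesture recognition. The specific dimensions
# (4 upper-body joints in 3D, 4 hand-shape classes per hand) are
# assumptions for illustration only.

def combine_features(body, left_hand, right_hand):
    """Concatenate body-pose and hand-shape features for one frame."""
    return list(body) + list(left_hand) + list(right_hand)

# Example frame: 3D positions of 4 upper-body joints (12 values) and
# hand-shape class probabilities for each hand (4 hypothetical poses).
body = [0.1, 0.9, 1.2, -0.2, 0.8, 1.1, 0.3, 0.5, 1.0, -0.3, 0.4, 1.0]
left_hand = [0.7, 0.1, 0.1, 0.1]    # e.g., highest probability: open palm
right_hand = [0.1, 0.1, 0.7, 0.1]

frame_vec = combine_features(body, left_hand, right_hand)
print(len(frame_vec))  # 12 + 4 + 4 = 20
```

A sequence of such per-frame vectors would then feed a temporal classifier; because some NATOPS gestures differ only in hand shape, dropping either modality from the vector would make those gesture pairs indistinguishable.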
Department: Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory; Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
Published in: Proceedings of the Automatic Face & Gesture Recognition and Workshops (FG 2011)
Publisher: Institute of Electrical and Electronics Engineers
Citation: Song, Yale, David Demirdjian, and Randall Davis. "Tracking Body and Hands for Gesture Recognition: NATOPS Aircraft Handling Signals Database." In Face and Gesture 2011, 500-506. Institute of Electrical and Electronics Engineers, 2011.
Version: Author's final manuscript
INSPEC Accession Number: 12007776