Continuous Body and Hand Gesture Recognition for Natural Human-Computer Interaction
Author(s)
Song, Yale; Demirdjian, David; Davis, Randall
Abstract
We present a new approach to gesture recognition that tracks body and hands simultaneously and recognizes gestures continuously from an unsegmented and unbounded input stream. Our system estimates 3D coordinates of upper body joints and classifies the appearance of hands into a set of canonical shapes. A novel multi-layered filtering technique with a temporal sliding window is developed to enable online sequence labeling and segmentation. Experimental results on the NATOPS dataset show the effectiveness of the approach. We also report on our recent work on multimodal gesture recognition and deep-hierarchical sequence representation learning, which achieves state-of-the-art performance on several real-world datasets.
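To make the sliding-window idea in the abstract concrete, the Python sketch below labels an unbounded stream of frames online with a fixed-length temporal window. This is only an illustration of the general technique, not the paper's multi-layered filtering algorithm; the window size, the frame format, and the classify_window stand-in are all hypothetical.

    # Illustrative sketch only: a generic temporal sliding-window labeler.
    # It is NOT the paper's multi-layered filter; all names here are hypothetical.
    from collections import Counter, deque

    def classify_window(window):
        """Hypothetical stand-in for a per-window gesture classifier.

        Here it simply takes the majority label of the (pre-labeled) frames;
        a real system would score the window with a sequence model instead.
        """
        return Counter(frame["label"] for frame in window).most_common(1)[0][0]

    def sliding_window_labels(frame_stream, window_size=15):
        """Yield one label per incoming frame once the window is full.

        The window advances frame by frame, so labeling and segmentation
        happen online, on an unsegmented and unbounded input stream.
        """
        window = deque(maxlen=window_size)
        for frame in frame_stream:
            window.append(frame)
            if len(window) == window_size:
                yield classify_window(window)

    # Toy usage with frames that already carry a label field.
    if __name__ == "__main__":
        frames = [{"label": "wave"}] * 20 + [{"label": "stop"}] * 20
        for label in sliding_window_labels(iter(frames), window_size=5):
            print(label)

In this toy run, the emitted label switches from "wave" to "stop" a few frames after the underlying gesture changes, which is the latency a fixed-length temporal window introduces in any online labeling scheme.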
Date issued
2010
Department
Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory
Citation
Song, Yale, Demirdjian, David, and Davis, Randall. 2010. "Continuous Body and Hand Gesture Recognition for Natural Human-Computer Interaction."
Version: Author's final manuscript