Decoding Surface Touch Typing from Hand-Tracking
Author(s)
Richardson, Mark; Durasoff, Matt; Wang, Robert
Publisher with Creative Commons License
Creative Commons Attribution
Terms of use
Metadata
Abstract
We propose a novel text decoding method that enables touch typing on an uninstrumented flat surface. Rather than relying on physical keyboards or capacitive touch, our method takes as input the hand motion of the typist, obtained through hand-tracking, and decodes this motion directly into text. We use a temporal convolutional network to represent a motion model that maps the hand motion, represented as a sequence of hand pose features, into text characters. To enable touch typing without the haptic feedback of a physical keyboard, we had to address the more erratic typing motion that results from finger drift. Thus, we incorporate a language model as a text prior and use beam search to efficiently combine our motion and language models to decode text from erratic or ambiguous hand motion. We collected a dataset of 20 touch typists and evaluated our model against several baselines, including contact-based text decoding and typing on a physical keyboard. Our proposed method is able to leverage continuous hand pose information to decode text more accurately than contact-based methods, and an offline study shows parity (73 WPM, 2.38% UER) with typing on a physical keyboard. Our results show that hand-tracking has the potential to enable rapid text entry in mobile environments.
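The decoding step described in the abstract — scoring each candidate character by combining a motion model's probability with a language-model prior, then keeping only the best partial hypotheses via beam search — can be sketched roughly as follows. All numbers here are illustrative assumptions: the per-keystroke distributions stand in for the paper's temporal convolutional network output, the toy bigram table stands in for its language model, and `lm_weight` is a hypothetical tuning parameter.

```python
import math

# Hypothetical per-keystroke character distributions, standing in for the
# motion model (a TCN over hand-pose features in the paper).
motion_probs = [
    {"t": 0.6, "f": 0.4},
    {"h": 0.5, "j": 0.5},
    {"e": 0.7, "w": 0.3},
]

def lm_prob(prev, ch):
    """Toy character bigram LM: P(ch | prev). A real system would use a
    much larger n-gram or neural language model."""
    bigrams = {("t", "h"): 0.9, ("h", "e"): 0.9}
    return bigrams.get((prev, ch), 0.1)

def beam_search(motion_probs, beam_width=3, lm_weight=1.0):
    # Each beam entry is (decoded_string, cumulative_log_score).
    beams = [("", 0.0)]
    for dist in motion_probs:
        candidates = []
        for text, score in beams:
            prev = text[-1] if text else "<s>"
            for ch, p_motion in dist.items():
                # Combine motion evidence with the language-model prior
                # in log space; lm_weight trades off the two models.
                s = score + math.log(p_motion) + lm_weight * math.log(lm_prob(prev, ch))
                candidates.append((text + ch, s))
        # Prune to the top-scoring hypotheses before the next keystroke.
        beams = sorted(candidates, key=lambda c: c[1], reverse=True)[:beam_width]
    return beams[0][0]

print(beam_search(motion_probs))  # → "the"
```

Note how the language model resolves ambiguity: the second keystroke is equally likely to be "h" or "j" under the motion model alone, but the bigram prior for "th" pulls the decoder toward "the".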
Description
UIST ’20, October 20–23, 2020, Virtual Event, USA
Date issued
2020-10-20
Publisher
ACM, Proceedings of the 33rd Annual ACM Symposium on User Interface Software and Technology
Citation
Richardson, Mark, Matt Durasoff, and Robert Wang. 2020. "Decoding Surface Touch Typing from Hand-Tracking."
Version: Final published version
ISBN
978-1-4503-7514-6
Collections