
dc.contributor.author: DelPreto, Joseph
dc.contributor.author: Rus, Daniela
dc.date.accessioned: 2021-11-01T17:31:18Z
dc.date.available: 2021-11-01T17:31:18Z
dc.date.issued: 2020-03-06
dc.identifier.uri: https://hdl.handle.net/1721.1/136995
dc.description.abstract: As the capacity for machines to extend human capabilities continues to grow, the communication channels used must also expand. Allowing machines to interpret nonverbal commands such as gestures can help make interactions more similar to interactions with another person. Yet to be pervasive and effective in realistic scenarios, such interfaces should not require significant sensing infrastructure or per-user setup time. The presented work takes a step towards these goals by using wearable muscle and motion sensors to detect gestures without dedicated calibration or training procedures. An algorithm is presented for clustering unlabeled streaming data in real time, and it is applied to adaptively thresholding muscle and motion signals acquired via electromyography (EMG) and an inertial measurement unit (IMU). This enables plug-and-play online detection of arm stiffening, fist clenching, rotation gestures, and forearm activation. It also augments a neural network pipeline, trained only on strategically chosen training data from previous users, to detect left, right, up, and down gestures. Together, these pipelines offer a plug-and-play gesture vocabulary suitable for remotely controlling a robot. Experiments with 6 subjects evaluate classifier performance and interface efficacy. Classifiers correctly identified 97.6% of 1,200 cued gestures, and a drone correctly responded to 81.6% of 1,535 unstructured gestures as subjects remotely controlled it through target hoops during 119 minutes of total flight time.
dc.language.iso: en
dc.publisher: ACM
dc.relation.isversionof: 10.1145/3319502.3374823
dc.rights: Creative Commons Attribution-NonCommercial-NoDerivs License
dc.rights.uri: http://creativecommons.org/licenses/by-nc-nd/4.0/
dc.source: ACM
dc.title: Plug-and-Play Gesture Control Using Muscle and Motion Sensors
dc.type: Article
dc.identifier.citation: DelPreto, Joseph and Rus, Daniela. 2020. "Plug-and-Play Gesture Control Using Muscle and Motion Sensors." ACM/IEEE International Conference on Human-Robot Interaction.
dc.contributor.department: Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory
dc.relation.journal: ACM/IEEE International Conference on Human-Robot Interaction
dc.eprint.version: Final published version
dc.type.uri: http://purl.org/eprint/type/ConferencePaper
eprint.status: http://purl.org/eprint/status/NonPeerReviewed
dc.date.updated: 2021-08-02T13:18:40Z
dspace.orderedauthors: DelPreto, J; Rus, D
dspace.date.submission: 2021-08-02T13:18:43Z
mit.license: PUBLISHER_CC
mit.metadata.status: Authority Work and Publication Information Needed
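
The abstract above describes an algorithm that clusters unlabeled streaming data in real time and uses it to adaptively threshold EMG and IMU signals. As a rough illustration of that general idea (not the paper's published algorithm), the sketch below tracks two running cluster centers for a one-dimensional signal envelope, one for rest and one for activation, and places a detection threshold between them. All names, constants, and the simulated signal are assumptions made for the demo.

import numpy as np

class OnlineTwoClusterDetector:
    """Toy sketch of online two-cluster thresholding for a 1-D signal.

    Two running centers ('rest' and 'active') are updated as unlabeled
    samples stream in; the detection threshold adapts to their midpoint.
    This illustrates the idea in the abstract, not the authors' method.
    """

    def __init__(self, learning_rate=0.05, min_gap=0.1):
        self.lr = learning_rate   # step size for the running centers
        self.min_gap = min_gap    # required cluster separation (assumed)
        self.low = None           # center of the 'rest' cluster
        self.high = None          # center of the 'active' cluster

    def update(self, x):
        """Consume one sample; return True if it looks like activation."""
        if self.low is None:
            # Seed both centers from the first sample.
            self.low = self.high = float(x)
            return False
        # Assign the sample to the nearest center and nudge that center.
        if abs(x - self.low) <= abs(x - self.high):
            self.low += self.lr * (x - self.low)
        else:
            self.high += self.lr * (x - self.high)
        # Keep the labels consistent: 'low' is always the smaller center.
        self.low, self.high = min(self.low, self.high), max(self.low, self.high)
        # Adaptive threshold halfway between the clusters; only fire once
        # the clusters have actually separated.
        threshold = 0.5 * (self.low + self.high)
        return (self.high - self.low) > self.min_gap and x > threshold

# Usage with a simulated rectified EMG envelope: relaxed muscle followed
# by a fist clench (values and noise levels are invented for the demo).
rng = np.random.default_rng(0)
signal = np.concatenate([rng.normal(0.1, 0.02, 200),   # rest
                         rng.normal(0.8, 0.05, 50)])    # clench
detector = OnlineTwoClusterDetector()
detections = [detector.update(s) for s in signal]
print(f"activation detected in {sum(detections)} of {len(signal)} samples")

The min_gap guard keeps the detector silent until the two clusters have genuinely separated, which stands in for the kind of plug-and-play behavior the abstract targets: nothing is detected during an initial all-rest stream, and no per-user calibration step is needed before activations start registering.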

