
dc.contributor.author	Hasani, Ramin M.
dc.contributor.author	Guenther, Frank H.
dc.contributor.author	DelPreto, Joseph Jeff
dc.contributor.author	Salazar Gomez, Andres Felipe
dc.contributor.author	Gil, Stephanie
dc.contributor.author	Rus, Daniela L
dc.date.accessioned	2018-11-16T15:53:18Z
dc.date.available	2018-11-16T15:53:18Z
dc.date.issued	2018-06
dc.identifier.isbn	978-0-9923747-4-7
dc.identifier.uri	http://hdl.handle.net/1721.1/119145
dc.description.abstract	Control of robots in safety-critical tasks and situations where costly errors may occur is paramount for realizing the vision of pervasive human-robot collaborations. For these cases, the ability to use human cognition in the loop can be key for recuperating safe robot operation. This paper combines two streams of human biosignals, electrical muscle and brain activity via EMG and EEG, respectively, to achieve fast and accurate human intervention in a supervisory control task. In particular, this paper presents an end-to-end system for continuous rolling-window classification of gestures that allows the human to actively correct the robot on demand, discrete classification of Error-Related Potential signals (unconsciously produced by the human supervisor’s brain when observing a robot error), and a framework that integrates these two classification streams for fast and effective human intervention. The system also allows “plug-and-play” operation, demonstrating accurate performance even with new users whose biosignals have not been used for training the classifiers. The resulting hybrid control system for safety-critical situations is evaluated with 7 untrained human subjects in a supervisory control scenario where an autonomous robot performs a multi-target selection task.	en_US
dc.description.sponsorship	Boeing Company	en_US
dc.language.iso	en_US
dc.publisher	Robotics: Science and Systems Foundation	en_US
dc.relation.isversionof	http://dx.doi.org/10.15607/RSS.2018.XIV.063	en_US
dc.rights	Creative Commons Attribution-Noncommercial-Share Alike	en_US
dc.rights.uri	http://creativecommons.org/licenses/by-nc-sa/4.0/	en_US
dc.source	DelPreto, Joseph	en_US
dc.title	Plug-and-Play Supervisory Control Using Muscle and Brain Signals for Real-Time Gesture and Error Detection	en_US
dc.type	Article	en_US
dc.identifier.citation	DelPreto, Joseph, et al. “Plug-and-Play Supervisory Control Using Muscle and Brain Signals for Real-Time Gesture and Error Detection.” Robotics: Science and Systems XIV, Robotics: Science and Systems Foundation, 2018.	en_US
dc.contributor.department	Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory	en_US
dc.contributor.department	Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science	en_US
dc.contributor.approver	DelPreto, Joseph	en_US
dc.contributor.mitauthor	DelPreto, Joseph Jeff
dc.contributor.mitauthor	Salazar Gomez, Andres Felipe
dc.contributor.mitauthor	Gil, Stephanie
dc.contributor.mitauthor	Rus, Daniela L
dc.relation.journal	Robotics: Science and Systems XIV	en_US
dc.eprint.version	Author's final manuscript	en_US
dc.type.uri	http://purl.org/eprint/type/ConferencePaper	en_US
eprint.status	http://purl.org/eprint/status/NonPeerReviewed	en_US
dspace.orderedauthors	DelPreto, Joseph; Salazar-Gomez, Andres F.; Gil, Stephanie; Hasani, Ramin M.; Guenther, Frank H.; Rus, Daniela	en_US
dspace.embargo.terms	N	en_US
dc.identifier.orcid	https://orcid.org/0000-0001-8162-5317
dc.identifier.orcid	https://orcid.org/0000-0002-3964-2049
dc.identifier.orcid	https://orcid.org/0000-0001-5473-3566
mit.license	OPEN_ACCESS_POLICY	en_US

