Show simple item record

dc.contributor.author: Katzschmann, Robert Kevin
dc.contributor.author: Araki, Brandon
dc.contributor.author: Rus, Daniela L
dc.date.accessioned: 2018-03-26T13:41:51Z
dc.date.available: 2018-03-26T13:41:51Z
dc.date.issued: 2018-03
dc.date.submitted: 2017-08
dc.identifier.issn: 1534-4320
dc.identifier.issn: 1558-0210
dc.identifier.uri: http://hdl.handle.net/1721.1/114285
dc.description.abstract: This paper presents ALVU (Array of Lidars and Vibrotactile Units), a contactless, intuitive, hands-free, and discreet wearable device that allows visually impaired users to detect low- and high-hanging obstacles, as well as physical boundaries in their immediate environment. The solution allows for safe local navigation in both confined and open spaces by enabling the user to distinguish free space from obstacles. The device presented is composed of two parts: a sensor belt and a haptic strap. The sensor belt is an array of time-of-flight distance sensors worn around the front of a user’s waist, whose pulses of infrared light provide reliable and accurate measurements of the distances between the user and surrounding obstacles or surfaces. The haptic strap communicates the measured distances through an array of vibratory motors worn around the user’s upper abdomen, providing haptic feedback. The linear vibration motors are combined with a point-loaded pretensioned applicator to transmit isolated vibrations to the user. We validated the device’s capability in an extensive user study entailing 162 trials with 12 blind users. Users wearing the device successfully walked through hallways, avoided obstacles, and detected staircases. Keywords: haptic interfaces; navigation; robot sensing systems; vibrations; belts; sensor arrays; cameras; assistive device; sightless navigation; human-robot interaction; perception; haptic feedback array. (An illustrative sketch of the distance-to-vibration mapping appears after this record.) [en_US]
dc.description.sponsorship: National Science Foundation (U.S.) (Grant IIS1226883) [en_US]
dc.language.iso: en_US
dc.publisher: Institute of Electrical and Electronics Engineers (IEEE) [en_US]
dc.relation.isversionof: http://dx.doi.org/10.1109/TNSRE.2018.2800665 [en_US]
dc.rights: Creative Commons Attribution 3.0 Unported license [en_US]
dc.rights.uri: http://creativecommons.org/licenses/by/3.0/ [en_US]
dc.source: Robert Katzschmann [en_US]
dc.title: Safe Local Navigation for Visually Impaired Users with a Time-of-Flight and Haptic Feedback Device [en_US]
dc.type: Article [en_US]
dc.identifier.citation: Katzschmann, Robert et al. “Safe Local Navigation for Visually Impaired Users with a Time-of-Flight and Haptic Feedback Device.” IEEE Transactions on Neural Systems and Rehabilitation Engineering 26, 3 (March 2018): 583-593. © 2018 Institute of Electrical and Electronics Engineers (IEEE) [en_US]
dc.contributor.department: Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory [en_US]
dc.contributor.department: Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science [en_US]
dc.contributor.department: Massachusetts Institute of Technology. Department of Mechanical Engineering [en_US]
dc.contributor.mitauthor: Katzschmann, Robert Kevin
dc.contributor.mitauthor: Araki, Brandon
dc.contributor.mitauthor: Rus, Daniela L
dc.relation.journal: IEEE Transactions on Neural Systems and Rehabilitation Engineering [en_US]
dc.eprint.version: Final published version [en_US]
dc.type.uri: http://purl.org/eprint/type/JournalArticle [en_US]
eprint.status: http://purl.org/eprint/status/PeerReviewed [en_US]
dspace.orderedauthors: Katzschmann, Robert; Araki, Brandon; Rus, Daniela [en_US]
dspace.embargo.terms: N [en_US]
dc.identifier.orcid: https://orcid.org/0000-0001-7143-7259
dc.identifier.orcid: https://orcid.org/0000-0001-5473-3566
mit.license: PUBLISHER_CC [en_US]
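
The abstract above describes the device's architecture: a belt of time-of-flight distance sensors measures the distances to surrounding obstacles, and a strap of vibration motors conveys those distances to the wearer as haptic feedback. The sketch below is a minimal, hypothetical illustration of one way such a distance-to-vibration mapping could be written; the sensor count, range limits, linear mapping, and function name are assumptions made for illustration and are not taken from the paper.

    # Hypothetical sketch (not the authors' implementation): convert an array of
    # time-of-flight distance readings into vibration intensities for a matching
    # array of vibration motors. All constants below are illustrative assumptions.

    NUM_SENSORS = 7          # assumed number of ToF sensors / motors
    MIN_RANGE_M = 0.2        # assumed closest reliable reading, in meters
    MAX_RANGE_M = 4.0        # assumed farthest distance worth signaling

    def distances_to_intensities(distances_m):
        """Map ToF distances (meters) to motor intensities in [0.0, 1.0].

        Closer obstacles produce stronger vibration; readings at or beyond
        MAX_RANGE_M produce no vibration (free space).
        """
        assert len(distances_m) == NUM_SENSORS, "one reading per sensor/motor pair"
        intensities = []
        for d in distances_m:
            # Clamp the reading into the assumed usable range.
            d = max(MIN_RANGE_M, min(d, MAX_RANGE_M))
            if d >= MAX_RANGE_M:
                intensities.append(0.0)  # free space: motor stays off
            else:
                # Linear ramp: MIN_RANGE_M -> 1.0 (strong), MAX_RANGE_M -> 0.0 (off).
                intensities.append((MAX_RANGE_M - d) / (MAX_RANGE_M - MIN_RANGE_M))
        return intensities

    # Example: an obstacle near the left-most sensor, free space elsewhere.
    print(distances_to_intensities([0.5, 1.5, 3.0, 4.5, 4.5, 4.5, 4.5]))

In the actual device, the measured distances drive linear vibration motors through a point-loaded pretensioned applicator; the sketch only illustrates the general proximity-to-intensity idea in software.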

