Show simple item record

dc.contributor.author: Katzschmann, Robert Kevin
dc.contributor.author: Rus, Daniela L
dc.contributor.author: Araki, Minoru B
dc.date.accessioned: 2018-04-27T22:47:29Z
dc.date.available: 2018-04-27T22:47:29Z
dc.date.issued: 2018-01
dc.date.submitted: 2017-12
dc.identifier.issn: 1534-4320
dc.identifier.issn: 1558-0210
dc.identifier.uri: http://hdl.handle.net/1721.1/115073
dc.description.abstract: This paper presents ALVU (Array of Lidars and Vibrotactile Units), a contactless, intuitive, hands-free, and discreet wearable device that allows visually impaired users to detect low- and high-hanging obstacles, as well as physical boundaries in their immediate environment. The solution allows for safe local navigation in both confined and open spaces by enabling the user to distinguish free space from obstacles. The device presented is composed of two parts: a sensor belt and a haptic strap. The sensor belt is an array of time-of-flight distance sensors worn around the front of a user's waist, and the pulses of infrared light provide reliable and accurate measurements of the distances between the user and surrounding obstacles or surfaces. The haptic strap communicates the measured distances through an array of vibratory motors worn around the user's upper abdomen, providing haptic feedback. The linear vibration motors are combined with a point-loaded pretensioned applicator to transmit isolated vibrations to the user. We validated the device's capability in an extensive user study entailing 162 trials with 12 blind users. Users wearing the device successfully walked through hallways, avoided obstacles, and detected staircases. (en_US)
dc.description.sponsorship: Andrea Bocelli Foundation (en_US)
dc.description.sponsorship: National Science Foundation (U.S.) (Grant NSF IIS1226883) (en_US)
dc.language.iso: en_US
dc.publisher: Institute of Electrical and Electronics Engineers (IEEE) (en_US)
dc.relation.isversionof: http://dx.doi.org/10.1109/TNSRE.2018.2800665 (en_US)
dc.rights: Creative Commons Attribution (en_US)
dc.rights.uri: http://creativecommons.org/licenses/by/3.0/ (en_US)
dc.source: Katzschmann (en_US)
dc.title: Safe Local Navigation for Visually Impaired Users With a Time-of-Flight and Haptic Feedback Device (en_US)
dc.type: Article (en_US)
dc.identifier.citation: Katzschmann, Robert K., Brandon Araki, and Daniela Rus. "Safe Local Navigation for Visually Impaired Users With a Time-of-Flight and Haptic Feedback Device." IEEE Transactions on Neural Systems and Rehabilitation Engineering 26, no. 3 (March 2018): 583–593. © 2018 IEEE. (en_US)
dc.contributor.department: Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory. Distributed Robotics Laboratory (en_US)
dc.contributor.department: Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory (en_US)
dc.contributor.department: Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science (en_US)
dc.contributor.department: Massachusetts Institute of Technology. Department of Mechanical Engineering (en_US)
dc.contributor.mitauthor: Katzschmann, Robert Kevin
dc.contributor.mitauthor: Rus, Daniela L
dc.contributor.mitauthor: Araki, Minoru B
dc.relation.journal: IEEE Transactions on Neural Systems and Rehabilitation Engineering (en_US)
dc.eprint.version: Final published version (en_US)
dc.type.uri: http://purl.org/eprint/type/JournalArticle (en_US)
eprint.status: http://purl.org/eprint/status/PeerReviewed (en_US)
dspace.orderedauthors: Katzschmann, Robert K.; Araki, Brandon; Rus, Daniela (en_US)
dspace.embargo.terms: N (en_US)
dc.identifier.orcid: https://orcid.org/0000-0001-7143-7259
dc.identifier.orcid: https://orcid.org/0000-0001-5473-3566
dc.identifier.orcid: https://orcid.org/0000-0002-3094-1587
mit.license: PUBLISHER_CC (en_US)


