dc.contributor.author | Katzschmann, Robert Kevin | |
dc.contributor.author | Rus, Daniela L | |
dc.contributor.author | Araki, Minoru B | |
dc.date.accessioned | 2018-04-27T22:47:29Z | |
dc.date.available | 2018-04-27T22:47:29Z | |
dc.date.issued | 2018-01 | |
dc.date.submitted | 2017-12 | |
dc.identifier.issn | 1534-4320 | |
dc.identifier.issn | 1558-0210 | |
dc.identifier.uri | http://hdl.handle.net/1721.1/115073 | |
dc.description.abstract | This paper presents ALVU (Array of Lidars and Vibrotactile Units), a contactless, intuitive, hands-free, and discreet wearable device that allows visually impaired users to detect low- and high-hanging obstacles, as well as physical boundaries in their immediate environment. The solution allows for safe local navigation in both confined and open spaces by enabling the user to distinguish free space from obstacles. The device presented is composed of two parts: a sensor belt and a haptic strap. The sensor belt is an array of time-of-flight distance sensors worn around the front of a user's waist; the sensors' pulses of infrared light provide reliable and accurate measurements of the distances between the user and surrounding obstacles or surfaces. The haptic strap communicates these measured distances to the user through an array of vibratory motors worn around the upper abdomen. The linear vibration motors are combined with a point-loaded pretensioned applicator to transmit isolated vibrations to the user. We validated the device's capability in an extensive user study entailing 162 trials with 12 blind users. Users wearing the device successfully walked through hallways, avoided obstacles, and detected staircases. | en_US
dc.description.sponsorship | Andrea Bocelli Foundation | en_US |
dc.description.sponsorship | National Science Foundation (U.S.) (Grant NSF IIS1226883) | en_US |
dc.language.iso | en_US | |
dc.publisher | Institute of Electrical and Electronics Engineers (IEEE) | en_US |
dc.relation.isversionof | http://dx.doi.org/10.1109/TNSRE.2018.2800665 | en_US |
dc.rights | Creative Commons Attribution | en_US |
dc.rights.uri | http://creativecommons.org/licenses/by/3.0/ | en_US |
dc.source | Katzschmann | en_US |
dc.title | Safe Local Navigation for Visually Impaired Users With a Time-of-Flight and Haptic Feedback Device | en_US |
dc.type | Article | en_US |
dc.identifier.citation | Katzschmann, Robert K., Brandon Araki, and Daniela Rus. “Safe Local Navigation for Visually Impaired Users With a Time-of-Flight and Haptic Feedback Device.” IEEE Transactions on Neural Systems and Rehabilitation Engineering 26, no. 3 (March 2018): 583–593. © 2018 IEEE. | en_US |
dc.contributor.department | Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory. Distributed Robotics Laboratory | en_US |
dc.contributor.department | Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory | en_US |
dc.contributor.department | Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science | en_US |
dc.contributor.department | Massachusetts Institute of Technology. Department of Mechanical Engineering | en_US |
dc.contributor.mitauthor | Katzschmann, Robert Kevin | |
dc.contributor.mitauthor | Rus, Daniela L | |
dc.contributor.mitauthor | Araki, Minoru B | |
dc.relation.journal | IEEE Transactions on Neural Systems and Rehabilitation Engineering | en_US |
dc.eprint.version | Final published version | en_US |
dc.type.uri | http://purl.org/eprint/type/JournalArticle | en_US |
eprint.status | http://purl.org/eprint/status/PeerReviewed | en_US |
dspace.orderedauthors | Katzschmann, Robert K.; Araki, Brandon; Rus, Daniela | en_US |
dspace.embargo.terms | N | en_US |
dc.identifier.orcid | https://orcid.org/0000-0001-7143-7259 | |
dc.identifier.orcid | https://orcid.org/0000-0001-5473-3566 | |
dc.identifier.orcid | https://orcid.org/0000-0002-3094-1587 | |
mit.license | PUBLISHER_CC | en_US |
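
The abstract describes a sensor-to-haptics pipeline: a belt of time-of-flight distance sensors measures the distance to nearby obstacles, and a strap of vibration motors conveys those distances to the wearer. The sketch below (Python) is a minimal, hypothetical illustration of such a distance-to-vibration mapping; the sensor count, range limits, and linear mapping are assumptions for illustration, not the parameters published in the paper.

    # Hypothetical sketch (not from the paper): map an array of
    # time-of-flight distance readings to vibration intensities, in the
    # spirit of the sensor-belt-to-haptic-strap pipeline the abstract
    # describes. All constants below are illustrative assumptions.

    MIN_RANGE_M = 0.2   # assumed closest reliable ToF reading (meters)
    MAX_RANGE_M = 2.0   # assumed farthest distance worth signaling (meters)

    def distance_to_intensity(distance_m: float) -> float:
        """Map a distance to a vibration intensity in [0, 1].

        Closer obstacles produce stronger vibration; anything beyond
        MAX_RANGE_M is treated as free space (no vibration).
        """
        if distance_m >= MAX_RANGE_M:
            return 0.0
        clamped = max(distance_m, MIN_RANGE_M)
        # Linear ramp: intensity 1.0 at MIN_RANGE_M, 0.0 at MAX_RANGE_M.
        return (MAX_RANGE_M - clamped) / (MAX_RANGE_M - MIN_RANGE_M)

    def update_motors(distances_m: list[float]) -> list[float]:
        """One sensor per motor: convert each belt reading to a drive level."""
        return [distance_to_intensity(d) for d in distances_m]

    if __name__ == "__main__":
        # Example readings from a hypothetical 5-sensor belt (meters).
        readings = [0.4, 1.1, 2.5, 0.9, 1.8]
        print(update_motors(readings))  # ~[0.889, 0.5, 0.0, 0.611, 0.111]

A linear ramp is only one plausible choice here; the actual device's encoding of distance into vibration is specified in the article itself (DOI above).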