Show simple item record

dc.contributor.author	Walter, Matthew R.
dc.contributor.author	Antone, Matthew
dc.contributor.author	Chuangsuwanich, Ekapol
dc.contributor.author	Correa, Andrew
dc.contributor.author	Davis, Randall
dc.contributor.author	Fletcher, Luke
dc.contributor.author	Frazzoli, Emilio
dc.contributor.author	Friedman, Yuli
dc.contributor.author	How, Jonathan P.
dc.contributor.author	Jeon, Jeong hwan
dc.contributor.author	Karaman, Sertac
dc.contributor.author	Luders, Brandon
dc.contributor.author	Roy, Nicholas
dc.contributor.author	Tellex, Stefanie
dc.contributor.author	Teller, Seth
dc.contributor.author	Glass, James R.
dc.date.accessioned	2016-04-20T19:33:26Z
dc.date.available	2016-04-20T19:33:26Z
dc.date.issued	2014-09
dc.identifier.issn	1556-4959
dc.identifier.issn	1556-4967
dc.identifier.uri	http://hdl.handle.net/1721.1/102283
dc.description.abstract	One long-standing challenge in robotics is the realization of mobile autonomous robots able to operate safely in human workplaces, and be accepted by the human occupants. We describe the development of a multiton robotic forklift intended to operate alongside people and vehicles, handling palletized materials within existing, active outdoor storage facilities. The system has four novel characteristics. The first is a multimodal interface that allows users to efficiently convey task-level commands to the robot using a combination of pen-based gestures and natural language speech. These tasks include the manipulation, transport, and placement of palletized cargo within dynamic, human-occupied warehouses. The second is the robot's ability to learn the visual identity of an object from a single user-provided example and use the learned model to reliably and persistently detect objects despite significant spatial and temporal excursions. The third is a reliance on local sensing that allows the robot to handle variable palletized cargo and navigate within dynamic, minimally prepared environments without a global positioning system. The fourth concerns the robot's operation in close proximity to people, including its human supervisor, pedestrians who may cross or block its path, moving vehicles, and forklift operators who may climb inside the robot and operate it manually. This is made possible by interaction mechanisms that facilitate safe, effective operation around people. This paper provides a comprehensive description of the system's architecture and implementation, indicating how real-world operational requirements motivated key design choices. We offer qualitative and quantitative analyses of the robot operating in real settings and discuss the lessons learned from our effort.	en_US
dc.language.iso	en_US
dc.publisher	Wiley Blackwell	en_US
dc.relation.isversionof	http://dx.doi.org/10.1002/rob.21539	en_US
dc.rights	Creative Commons Attribution-Noncommercial-Share Alike	en_US
dc.rights.uri	http://creativecommons.org/licenses/by-nc-sa/4.0/	en_US
dc.source	Prof. Davis via Phoebe Ayres	en_US
dc.title	A Situationally Aware Voice-commandable Robotic Forklift Working Alongside People in Unstructured Outdoor Environments	en_US
dc.type	Article	en_US
dc.identifier.citation	Walter, Matthew R., Matthew Antone, Ekapol Chuangsuwanich, Andrew Correa, Randall Davis, Luke Fletcher, Emilio Frazzoli, et al. "A Situationally Aware Voice-Commandable Robotic Forklift Working Alongside People in Unstructured Outdoor Environments." J. Field Robotics 32, no. 4 (September 19, 2014): 590–628.	en_US
dc.contributor.department	Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory	en_US
dc.contributor.department	Massachusetts Institute of Technology. Department of Aeronautics and Astronautics	en_US
dc.contributor.department	Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science	en_US
dc.contributor.approver	Davis, Randall	en_US
dc.contributor.mitauthor	Walter, Matthew R.	en_US
dc.contributor.mitauthor	Antone, Matthew	en_US
dc.contributor.mitauthor	Chuangsuwanich, Ekapol	en_US
dc.contributor.mitauthor	Correa, Andrew	en_US
dc.contributor.mitauthor	Davis, Randall	en_US
dc.contributor.mitauthor	Fletcher, Luke	en_US
dc.contributor.mitauthor	Frazzoli, Emilio	en_US
dc.contributor.mitauthor	Friedman, Yuli	en_US
dc.contributor.mitauthor	Glass, James R.	en_US
dc.contributor.mitauthor	How, Jonathan P.	en_US
dc.contributor.mitauthor	Jeon, Jeong hwan	en_US
dc.contributor.mitauthor	Karaman, Sertac	en_US
dc.contributor.mitauthor	Luders, Brandon	en_US
dc.contributor.mitauthor	Roy, Nicholas	en_US
dc.contributor.mitauthor	Tellex, Stefanie	en_US
dc.contributor.mitauthor	Teller, Seth	en_US
dc.relation.journal	Journal of Field Robotics	en_US
dc.eprint.version	Author's final manuscript	en_US
dc.type.uri	http://purl.org/eprint/type/JournalArticle	en_US
eprint.status	http://purl.org/eprint/status/PeerReviewed	en_US
dspace.orderedauthors	Walter, Matthew R.; Antone, Matthew; Chuangsuwanich, Ekapol; Correa, Andrew; Davis, Randall; Fletcher, Luke; Frazzoli, Emilio; Friedman, Yuli; Glass, James; How, Jonathan P.; Jeon, Jeong hwan; Karaman, Sertac; Luders, Brandon; Roy, Nicholas; Tellex, Stefanie; Teller, Seth	en_US
dc.identifier.orcid	https://orcid.org/0000-0002-3097-360X
dc.identifier.orcid	https://orcid.org/0000-0002-4896-8411
dc.identifier.orcid	https://orcid.org/0000-0001-8576-1930
dc.identifier.orcid	https://orcid.org/0000-0001-5232-7281
dc.identifier.orcid	https://orcid.org/0000-0002-2225-7275
dc.identifier.orcid	https://orcid.org/0000-0002-0505-1400
dc.identifier.orcid	https://orcid.org/0000-0002-8293-0492
dc.identifier.orcid	https://orcid.org/0000-0003-1606-0799
mit.license	OPEN_ACCESS_POLICY	en_US

