DSpace@MIT

Appearance-based object reacquisition for mobile manipulation

Author(s)
Walter, Matthew R.; Friedman, Yuli; Antone, Matthew; Teller, Seth
Download
Teller_Appearance-based.pdf (2.733 MB)
Terms of use
Article is made available in accordance with the publisher's policy and may be subject to US copyright law. Please refer to the publisher's site for terms of use.
Abstract
This paper describes an algorithm enabling a human supervisor to convey task-level information to a robot by using stylus gestures to circle one or more objects within the field of view of a robot-mounted camera. These gestures serve to segment the unknown objects from the environment. Our method's main novelty lies in its use of appearance-based object “reacquisition” to reconstitute the supervisory gestures (and corresponding segmentation hints), even for robot viewpoints spatially and/or temporally distant from the viewpoint underlying the original gesture. Reacquisition is particularly challenging within relatively dynamic and unstructured environments. The technical challenge is to realize a reacquisition capability sufficiently robust to appearance variation to be useful in practice. Whenever the supervisor indicates an object, our system builds a feature-based appearance model of the object. When the object is detected from subsequent viewpoints, the system automatically and opportunistically incorporates additional observations, revising the appearance model and reconstituting the rough contours of the original circling gesture around that object. Our aim is to exploit reacquisition in order to both decrease the user burden of task specification and increase the effective autonomy of the robot. We demonstrate and analyze the approach on a robotic forklift designed to approach, manipulate, transport and place palletized cargo within an outdoor warehouse. We show that the method enables gesture reuse over long timescales and robot excursions (tens of minutes and hundreds of meters).
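The abstract describes a build-detect-revise loop: a gesture seeds a feature-based appearance model, later viewpoints are matched against it, and successful detections fold new observations back into the model. The following Python sketch illustrates that loop only in outline; the paper's actual features, matching, and verification pipeline are not specified on this page, so OpenCV ORB descriptors and a brute-force match count stand in for them as assumptions.

# Illustrative sketch of the gesture-seeded reacquisition loop.
# Assumptions (not from the paper): ORB features, Hamming-distance
# brute-force matching, and a simple match-count detection test.
import cv2
import numpy as np

class AppearanceModel:
    """Feature-based appearance model seeded by one circling gesture."""

    def __init__(self, image, gesture_mask):
        # ORB stands in for the paper's (unspecified here) descriptors.
        self.detector = cv2.ORB_create(nfeatures=500)
        self.matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
        # Only features inside the user's circled region seed the model.
        _, self.descriptors = self.detector.detectAndCompute(image, gesture_mask)

    def reacquire(self, image, min_matches=25, max_distance=40):
        """Return True if the object is detected in this frame."""
        _, desc = self.detector.detectAndCompute(image, None)
        if desc is None or self.descriptors is None:
            return False
        matches = self.matcher.match(self.descriptors, desc)
        good = [m for m in matches if m.distance < max_distance]
        if len(good) < min_matches:
            return False
        # Opportunistically revise the model with the matched view's
        # descriptors, mirroring the incremental update in the abstract.
        new_rows = np.array([desc[m.trainIdx] for m in good], dtype=desc.dtype)
        self.descriptors = np.vstack([self.descriptors, new_rows])
        return True

In use, the supervisor's circling gesture would be rasterized into gesture_mask (a binary mask image), and reacquire would be called on subsequent camera frames; a positive detection is what lets the system redraw the stored gesture contour around the object, as the abstract describes.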
Date issued
2010-06
URI
http://hdl.handle.net/1721.1/63168
Department
Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory; Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
Journal
IEEE Computer Society Conference on Computer Vision and Pattern Recognition (2010 : San Francisco, Calif.). Workshops.
Publisher
Institute of Electrical and Electronics Engineers / IEEE Computer Society
Citation
Walter, M.R., et al. “Appearance-based object reacquisition for mobile manipulation.” 2010 IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), 2010, pp. 1-8. Copyright © 2010, IEEE
Version: Final published version
Other identifiers
INSPEC Accession Number: 11466676
ISBN
978-1-4244-7029-7

Collections
  • MIT Open Access Articles
