Show simple item record

dc.contributor.author: Hsiao, Kaijen
dc.contributor.author: Lozano-Pérez, Tomás
dc.date.accessioned: 2005-12-14T19:19:57Z
dc.date.available: 2005-12-14T19:19:57Z
dc.date.issued: 2006-01
dc.identifier.uri: http://hdl.handle.net/1721.1/30251
dc.description.abstract: Humans often learn to manipulate objects by observing other people. In much the same way, robots can use imitation learning to pick up useful skills. A system is detailed here for using imitation learning to teach a robot to grasp objects using both hand and whole-body grasps, which use the arms and torso as well as the hands. Demonstration grasp trajectories are created by teleoperating a simulated robot to pick up simulated objects. When presented with a new object, the system compares it against the objects in a stored database to pick a demonstrated grasp used on a similar object. Both objects are modeled as a combination of primitives (boxes, cylinders, and spheres), and by considering the new object to be a transformed version of the demonstration object, contact points are mapped from one object to the other. The best kinematically feasible grasp candidate is chosen with the aid of a grasp quality metric. To test the success of the chosen grasp, a full, collision-free grasp trajectory is found and an attempt is made to execute it in simulation. The implemented system successfully picks up 92 out of 100 randomly generated test objects in simulation.
dc.description.sponsorship: Singapore-MIT Alliance (SMA)
dc.format.extent: 423043 bytes
dc.format.mimetype: application/pdf
dc.language.iso: en
dc.relation.ispartofseries: Computer Science (CS)
dc.subject: Imitation learning
dc.subject: grasping
dc.subject: example-based grasping
dc.subject: whole-body grasping
dc.title: Imitation Learning of Whole-Body Grasps
dc.type: Article
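The abstract describes mapping contact points from a demonstrated grasp onto a new object by treating the new object as a transformed version of the demonstration object. A minimal sketch of that idea for the box primitive, assuming a simple per-axis scaling transform (the function name, extents convention, and uniform-scaling assumption are illustrative, not taken from the paper's implementation):

```python
import numpy as np

def map_contact_points(demo_extents, new_extents, demo_contacts):
    """Map grasp contact points from a demonstration box to a new box.

    Treats the new box as a per-axis scaled version of the demonstration
    box (extents are full side lengths along x, y, z; contacts are given
    in the box's local frame). This is an illustrative simplification of
    the paper's transform-based contact mapping, not its actual code.
    """
    scale = np.asarray(new_extents, float) / np.asarray(demo_extents, float)
    return [np.asarray(p, float) * scale for p in demo_contacts]

# Demonstration: a 2x2x2 box grasped at two opposing face centers;
# the new object is a 4x2x1 box.
demo_contacts = [(1.0, 0.0, 0.0), (-1.0, 0.0, 0.0)]
mapped = map_contact_points((2, 2, 2), (4, 2, 1), demo_contacts)
```

Under this scaling, the opposing contacts land at (±2, 0, 0) on the wider box; the paper's system would then check the kinematic feasibility of such candidates and rank them with a grasp quality metric.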
