| dc.contributor.author | Boroushaki, Tara | |
| dc.contributor.author | Leng, Junshan | |
| dc.contributor.author | Clester, Ian | |
| dc.contributor.author | Rodriguez, Alberto | |
| dc.contributor.author | Adib, Fadel | |
| dc.date.accessioned | 2022-11-21T19:42:31Z | |
| dc.date.available | 2022-11-21T19:42:31Z | |
| dc.date.issued | 2021 | |
| dc.identifier.uri | https://hdl.handle.net/1721.1/146572 | |
| dc.description.abstract | We present the design, implementation, and evaluation of RF-Grasp, a robotic
system that can grasp fully-occluded objects in unknown and unstructured
environments. Unlike prior systems that are constrained by the line-of-sight
perception of vision and infrared sensors, RF-Grasp employs RF (Radio
Frequency) perception to identify and locate target objects through occlusions,
and perform efficient exploration and complex manipulation tasks in
non-line-of-sight settings.
RF-Grasp relies on an eye-in-hand camera and batteryless RFID tags attached
to objects of interest. It introduces two main innovations: (1) an RF-visual
servoing controller that uses the RFID's location to selectively explore the
environment and plan an efficient trajectory toward an occluded target, and (2)
an RF-visual deep reinforcement learning network that can learn and execute
efficient, complex policies for decluttering and grasping.
We implemented and evaluated an end-to-end physical prototype of RF-Grasp. We
demonstrate that it improves success rate and efficiency by up to 40-50% over a
state-of-the-art baseline. We also demonstrate RF-Grasp in novel tasks such as
mechanical search of fully-occluded objects behind obstacles, opening up new
possibilities for robotic manipulation. Qualitative results (videos) are available
at rfgrasp.media.mit.edu | en_US |
| dc.language.iso | en | |
| dc.publisher | Institute of Electrical and Electronics Engineers (IEEE) | en_US |
| dc.relation.isversionof | 10.1109/ICRA48506.2021.9560956 | en_US |
| dc.rights | Creative Commons Attribution-Noncommercial-Share Alike | en_US |
| dc.rights.uri | http://creativecommons.org/licenses/by-nc-sa/4.0/ | en_US |
| dc.source | MIT web domain | en_US |
| dc.title | Robotic Grasping of Fully-Occluded Objects using RF Perception | en_US |
| dc.type | Article | en_US |
| dc.identifier.citation | Boroushaki, Tara, Leng, Junshan, Clester, Ian, Rodriguez, Alberto and Adib, Fadel. 2021. "Robotic Grasping of Fully-Occluded Objects using RF Perception." 2021 IEEE International Conference on Robotics and Automation (ICRA). | |
| dc.contributor.department | Massachusetts Institute of Technology. Media Laboratory | |
| dc.contributor.department | Massachusetts Institute of Technology. Department of Mechanical Engineering | |
| dc.contributor.department | Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science | |
| dc.relation.journal | 2021 IEEE International Conference on Robotics and Automation (ICRA) | en_US |
| dc.eprint.version | Author's final manuscript | en_US |
| dc.type.uri | http://purl.org/eprint/type/ConferencePaper | en_US |
| eprint.status | http://purl.org/eprint/status/NonPeerReviewed | en_US |
| dc.date.updated | 2022-11-21T18:31:03Z | |
| dspace.orderedauthors | Boroushaki, T; Leng, J; Clester, I; Rodriguez, A; Adib, F | en_US |
| dspace.date.submission | 2022-11-21T18:31:12Z | |
| mit.license | OPEN_ACCESS_POLICY | |
| mit.metadata.status | Authority Work and Publication Information Needed | en_US |