dc.contributor.author | Yang, Liangjing | |
dc.contributor.author | Paranawithana, Ishara | |
dc.contributor.author | Youcef-Toumi, Kamal | |
dc.contributor.author | Tan, U-Xuan | |
dc.date.accessioned | 2019-01-15T17:04:56Z | |
dc.date.available | 2019-01-15T17:04:56Z | |
dc.date.issued | 2018-10 | |
dc.identifier.issn | 1545-5955 | |
dc.identifier.issn | 1558-3783 | |
dc.identifier.uri | http://hdl.handle.net/1721.1/120060 | |
dc.description.abstract | In this paper, an automatic vision-guided micromanipulation approach to facilitate versatile deployment and portable setup is proposed. This paper is motivated by the importance of micromanipulation and the limitations of existing automation technology in this application. Despite significant advancements in micromanipulation techniques, bottlenecks remain in integrating and adopting automation for this application. An underlying reason for these gaps is the difficulty of deploying and setting up such systems. To address this, we identified two important design requirements, namely, portability and versatility of the micromanipulation platform. A self-contained vision-guided approach requiring no complicated preparation or setup is proposed. This is achieved through an uncalibrated, self-initializing workflow algorithm that is also capable of assisted targeting. The feasibility of the solution is demonstrated on a low-cost portable microscope camera and compact actuated microstages. Results suggest subpixel accuracy in localizing the tool tip during the initialization steps. The self-focus mechanism could recover from intentional blurring of the tip by autonomously manipulating it 95.3% closer to the focal plane. The average error in visual servoing is less than a pixel, with our depth compensation mechanism better maintaining the similarity score during tracking. The cell detection rate in a 1637-frame video stream is 97.7%, with subpixel localization uncertainty. Our work addresses the gaps in existing automation technology for robotic vision-guided micromanipulation and potentially contributes to the way cell manipulation is performed. Note to Practitioners - This paper introduces an automatic method for micromanipulation using visual information from microscopy. 
We design an automatic workflow, which consists of: 1) self-initialization; 2) vision-guided manipulation; and 3) assisted targeting, and demonstrate versatile deployment of the micromanipulator on a portable microscope camera setup. Unlike existing systems, our proposed method does not require any tedious calibration or expensive setup, making it mobile and low cost. This overcomes the constraints of traditional practices that confine automated cell manipulation to a laboratory setting. By extending the application beyond the laboratory environment, automated micromanipulation technology can be made more ubiquitous and readily expanded to facilitate field studies. | en_US |
dc.description.sponsorship | SUTD-MIT International Design Centre (IDC) | en_US |
dc.publisher | Institute of Electrical and Electronics Engineers (IEEE) | en_US |
dc.relation.isversionof | http://dx.doi.org/10.1109/TASE.2017.2754517 | en_US |
dc.rights | Creative Commons Attribution-Noncommercial-Share Alike | en_US |
dc.rights.uri | http://creativecommons.org/licenses/by-nc-sa/4.0/ | en_US |
dc.source | Other repository | en_US |
dc.title | Automatic Vision-Guided Micromanipulation for Versatile Deployment and Portable Setup | en_US |
dc.type | Article | en_US |
dc.identifier.citation | Yang, Liangjing, Ishara Paranawithana, Kamal Youcef-Toumi, and U-Xuan Tan. “Automatic Vision-Guided Micromanipulation for Versatile Deployment and Portable Setup.” IEEE Transactions on Automation Science and Engineering 15, no. 4 (October 2018): 1609–1620. | en_US |
dc.contributor.department | Massachusetts Institute of Technology. Department of Mechanical Engineering | en_US |
dc.contributor.mitauthor | Yang, Liangjing | |
dc.contributor.mitauthor | Youcef-Toumi, Kamal | |
dc.relation.journal | IEEE Transactions on Automation Science and Engineering | en_US |
dc.eprint.version | Author's final manuscript | en_US |
dc.type.uri | http://purl.org/eprint/type/JournalArticle | en_US |
eprint.status | http://purl.org/eprint/status/PeerReviewed | en_US |
dc.date.updated | 2019-01-15T16:14:51Z | |
dspace.orderedauthors | Yang, Liangjing; Paranawithana, Ishara; Youcef-Toumi, Kamal; Tan, U-Xuan | en_US |
dspace.embargo.terms | N | en_US |
mit.license | OPEN_ACCESS_POLICY | en_US |