
dc.contributor.author: Yang, Liangjing
dc.contributor.author: Paranawithana, Ishara
dc.contributor.author: Youcef-Toumi, Kamal
dc.contributor.author: Tan, U-Xuan
dc.date.accessioned: 2019-01-15T17:04:56Z
dc.date.available: 2019-01-15T17:04:56Z
dc.date.issued: 2018-10
dc.identifier.issn: 1545-5955
dc.identifier.issn: 1558-3783
dc.identifier.uri: http://hdl.handle.net/1721.1/120060
dc.description.abstract: In this paper, an automatic vision-guided micromanipulation approach to facilitate versatile deployment and portable setup is proposed. This work is motivated by the importance of micromanipulation and the limitations of existing automation technology for this task. Despite significant advancements in micromanipulation techniques, there remain bottlenecks in integrating and adopting automation for this application. An underlying reason for these gaps is the difficulty of deploying and setting up such systems. To address this, we identified two important design requirements, namely, portability and versatility of the micromanipulation platform. A self-contained vision-guided approach requiring no complicated preparation or setup is proposed. This is achieved through an uncalibrated, self-initializing workflow algorithm that is also capable of assisted targeting. The feasibility of the solution is demonstrated on a low-cost portable microscope camera and compact actuated microstages. Results suggest subpixel accuracy in localizing the tool tip during the initialization steps. The self-focus mechanism could recover from intentional blurring of the tip by autonomously moving it 95.3% closer to the focal plane. The average visual-servoing error is less than a pixel, and our depth compensation mechanism better maintains the similarity score during tracking. The cell detection rate in a 1637-frame video stream is 97.7%, with subpixel localization uncertainty. Our work addresses the gaps in existing automation technology for robotic vision-guided micromanipulation and potentially contributes to the way cell manipulation is performed.

Note to Practitioners - This paper introduces an automatic method for micromanipulation using visual information from microscopy. We design an automatic workflow, which consists of: 1) self-initialization; 2) vision-guided manipulation; and 3) assisted targeting, and demonstrate versatile deployment of the micromanipulator on a portable microscope camera setup. Unlike existing systems, our proposed method does not require any tedious calibration or expensive setup, making it mobile and low cost. This overcomes the constraints of traditional practices that confine automated cell manipulation to a laboratory setting. By extending the application beyond the laboratory environment, automated micromanipulation technology can be made more ubiquitous and extended readily to facilitate field studies.
dc.description.sponsorship: SUTD-MIT International Design Centre (IDC)
dc.publisher: Institute of Electrical and Electronics Engineers (IEEE)
dc.relation.isversionof: http://dx.doi.org/10.1109/TASE.2017.2754517
dc.rights: Creative Commons Attribution-Noncommercial-Share Alike
dc.rights.uri: http://creativecommons.org/licenses/by-nc-sa/4.0/
dc.source: Other repository
dc.title: Automatic Vision-Guided Micromanipulation for Versatile Deployment and Portable Setup
dc.type: Article
dc.identifier.citation: Yang, Liangjing, Ishara Paranawithana, Kamal Youcef-Toumi, and U-Xuan Tan. “Automatic Vision-Guided Micromanipulation for Versatile Deployment and Portable Setup.” IEEE Transactions on Automation Science and Engineering 15, no. 4 (October 2018): 1609–1620.
dc.contributor.department: Massachusetts Institute of Technology. Department of Mechanical Engineering
dc.contributor.mitauthor: Yang, Liangjing
dc.contributor.mitauthor: Youcef-Toumi, Kamal
dc.relation.journal: IEEE Transactions on Automation Science and Engineering
dc.eprint.version: Author's final manuscript
dc.type.uri: http://purl.org/eprint/type/JournalArticle
eprint.status: http://purl.org/eprint/status/PeerReviewed
dc.date.updated: 2019-01-15T16:14:51Z
dspace.orderedauthors: Yang, Liangjing; Paranawithana, Ishara; Youcef-Toumi, Kamal; Tan, U-Xuan
dspace.embargo.terms: N
mit.license: OPEN_ACCESS_POLICY

