Automatic Vision-Guided Micromanipulation for Versatile Deployment and Portable Setup
Author(s)
Yang, Liangjing; Paranawithana, Ishara; Youcef-Toumi, Kamal; Tan, U-Xuan
Download: tase_accepted.pdf (1.666 MB)
Open Access Policy
Terms of use
Creative Commons Attribution-Noncommercial-Share Alike
Abstract
In this paper, an automatic vision-guided micromanipulation approach that facilitates versatile deployment and portable setup is proposed. This work is motivated by the importance of micromanipulation and the limitations of existing automation technology in this domain. Despite significant advancements in micromanipulation techniques, bottlenecks remain in integrating and adopting automation for this application. An underlying reason for these gaps is the difficulty of deploying and setting up such systems. To address this, we identified two important design requirements, namely, portability and versatility of the micromanipulation platform. A self-contained vision-guided approach requiring no complicated preparation or setup is proposed. This is achieved through an uncalibrated, self-initializing workflow algorithm that is also capable of assisted targeting. The feasibility of the solution is demonstrated on a low-cost portable microscope camera and compact actuated microstages. Results suggest subpixel accuracy in localizing the tool tip during the initialization steps. The self-focus mechanism could recover from intentional blurring by autonomously driving the tip 95.3% closer to the focal plane. The average visual-servoing error is less than one pixel, and our depth-compensation mechanism better maintains the similarity score during tracking. The cell detection rate over a 1637-frame video stream is 97.7%, with subpixel localization uncertainty. Our work addresses the gaps in existing automation technology for robotic vision-guided micromanipulation and potentially changes the way cell manipulation is performed.

Note to Practitioners - This paper introduces an automatic method for micromanipulation using visual information from microscopy. We design an automatic workflow consisting of: 1) self-initialization; 2) vision-guided manipulation; and 3) assisted targeting, and demonstrate versatile deployment of the micromanipulator on a portable microscope camera setup. Unlike existing systems, our proposed method does not require any tedious calibration or expensive setup, making it mobile and low cost. This overcomes the constraints of traditional practices that confine automated cell manipulation to a laboratory setting. By extending the application beyond the laboratory environment, automated micromanipulation technology can become more ubiquitous and readily extend to field studies.
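The paper itself does not publish code, but the uncalibrated, self-initializing workflow described above can be illustrated with a minimal sketch of image-Jacobian-based visual servoing. Everything below is an assumption made for illustration: the hardware interfaces `move_stage` and `locate_tip` are hypothetical stand-ins for real stage and vision drivers, and the hidden matrix `_TRUE_J` simulates an unknown camera-to-stage mapping so the script is runnable on its own. The idea matches the abstract's outline: small probing moves estimate the image Jacobian without calibration (self-initialization), and its inverse then drives the tool tip toward a pixel target (vision-guided manipulation).

```python
import numpy as np

# --- Simulated hardware (hypothetical stand-ins for real drivers) ---
# Hidden ground-truth mapping from stage motion (um) to pixel motion;
# the controller never reads this matrix directly.
_TRUE_J = np.array([[1.8, -0.3],
                    [0.2,  2.1]])
_stage = np.zeros(2)  # current stage position (um)

def move_stage(delta_um):
    """Hypothetical stage command: relative move in micrometres."""
    global _stage
    _stage = _stage + np.asarray(delta_um, dtype=float)

def locate_tip():
    """Hypothetical vision routine: tip position in pixels, with noise."""
    return _TRUE_J @ _stage + np.random.normal(0.0, 0.05, size=2)

# --- Self-initialization: estimate the image Jacobian by probing ---
def estimate_jacobian(probe_um=5.0):
    J = np.zeros((2, 2))
    for axis in range(2):
        step = np.zeros(2)
        step[axis] = probe_um
        p0 = locate_tip()
        move_stage(step)
        J[:, axis] = (locate_tip() - p0) / probe_um  # pixels per um
        move_stage(-step)  # return to the starting pose
    return J

# --- Vision-guided servoing toward a pixel target ---
def servo_to(target_px, gain=0.5, tol_px=0.5, max_iters=50):
    J_inv = np.linalg.inv(estimate_jacobian())
    for _ in range(max_iters):
        err = target_px - locate_tip()
        if np.linalg.norm(err) < tol_px:
            break
        move_stage(gain * (J_inv @ err))  # damped corrective move
    return locate_tip()

if __name__ == "__main__":
    final = servo_to(np.array([120.0, -40.0]))
    print("tip settled at", final, "px")
```

The damped gain (rather than a full Newton step) is a common choice in this setting: it trades convergence speed for tolerance to noise in the probed Jacobian estimate, which is consistent with the subpixel servoing errors the abstract reports.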
Date issued
2018-10
Department
Massachusetts Institute of Technology. Department of Mechanical Engineering
Journal
IEEE Transactions on Automation Science and Engineering
Publisher
Institute of Electrical and Electronics Engineers (IEEE)
Citation
Yang, Liangjing, Ishara Paranawithana, Kamal Youcef-Toumi, and U-Xuan Tan. “Automatic Vision-Guided Micromanipulation for Versatile Deployment and Portable Setup.” IEEE Transactions on Automation Science and Engineering 15, no. 4 (October 2018): 1609–1620.
Version: Author's final manuscript
ISSN
1545-5955
1558-3783