Simple item record

dc.contributor.author: Yang, Liangjing
dc.contributor.author: Paranawithana, Ishara
dc.contributor.author: Youcef-Toumi, Kamal
dc.contributor.author: Tan, U-Xuan
dc.date.accessioned: 2019-01-15T16:44:52Z
dc.date.available: 2019-01-15T16:44:52Z
dc.date.issued: 2017-09
dc.identifier.isbn: 978-1-5386-2682-5
dc.identifier.uri: http://hdl.handle.net/1721.1/120055
dc.description.abstract: In this paper, we propose a workflow algorithm for timely tracking of the tool tip during cell manipulation, using a template-based approach augmented with low-level feature detection. This addresses the adverse influence of tool-cell interaction on template-based tracking while maintaining an efficient track-servo framework, a consideration that is important in developing autonomous, vision-guided robotic micromanipulators. Our method enables vision-guided micromanipulation without manual intervention, even during tool-cell interaction, by decomposing the process into four scenarios, each handled by its own operating mode. A self-initializing mode first localizes and brings into focus the region of interest (ROI) in which the tip lies. Once in focus, the tip is manipulated using a unified, template-based visual track-servo approach. A reinitialization mechanism is triggered to prevent tracking from being interrupted by partial cell occlusion of the tracking ROI; this mechanism applies the self-initializing concept, combining a motion cue with low-level feature detection to relocalize the needle tip. Following reinitialization, tracking of the needle tip is recovered by a mechanism that updates the base template. This adaptive approach ensures uninterrupted tracking even while the cell interacts with the tool and undergoes deformation. Results demonstrate that, with the newly incorporated mechanisms, the localization error improves from more than 50% to less than 10% of the specimen size. When there is no specimen in the scene, the new workflow shows no adverse effect on localization over 270 tracked frames. By incorporating reinitialization and recovery into this workflow algorithm, we hope to take the first step towards an uncalibrated, autonomous vision-guided micromanipulation process. (A sketch of this mode-switching workflow follows the record below.) [en_US]
dc.publisher: Institute of Electrical and Electronics Engineers (IEEE) [en_US]
dc.relation.isversionof: http://dx.doi.org/10.1109/IROS.2017.8202283 [en_US]
dc.rights: Creative Commons Attribution-Noncommercial-Share Alike [en_US]
dc.rights.uri: http://creativecommons.org/licenses/by-nc-sa/4.0/ [en_US]
dc.source: Other repository [en_US]
dc.title: Self-initialization and recovery for uninterrupted tracking in vision-guided micromanipulation [en_US]
dc.type: Article [en_US]
dc.identifier.citation: Yang, Liangjing, Ishara Paranawithana, Kamal Youcef-Toumi, and U-Xuan Tan. “Self-Initialization and Recovery for Uninterrupted Tracking in Vision-Guided Micromanipulation.” 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) (September 2017). [en_US]
dc.contributor.department: Massachusetts Institute of Technology. Department of Mechanical Engineering [en_US]
dc.contributor.mitauthor: Yang, Liangjing
dc.contributor.mitauthor: Youcef-Toumi, Kamal
dc.relation.journal: 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) [en_US]
dc.eprint.version: Author's final manuscript [en_US]
dc.type.uri: http://purl.org/eprint/type/ConferencePaper [en_US]
eprint.status: http://purl.org/eprint/status/NonPeerReviewed [en_US]
dc.date.updated: 2019-01-15T16:27:07Z
dspace.orderedauthors: Yang, Liangjing; Paranawithana, Ishara; Youcef-Toumi, Kamal; Tan, U-Xuan [en_US]
dspace.embargo.terms: N [en_US]
mit.license: OPEN_ACCESS_POLICY [en_US]
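
The abstract describes a four-mode workflow: self-initialization, template-based track-servo, reinitialization on partial occlusion, and recovery via base-template update. Below is a minimal Python sketch of how such a mode-switching loop could be organized. All names (Mode, TrackerState, locate_tip_by_motion_and_features, and the stub helpers) are hypothetical placeholders; the stubs stand in for the paper's actual localization, template-matching, and servoing mechanisms, which are not specified in this record.

# Hypothetical sketch of the four-mode tracking workflow; not the authors' code.
from dataclasses import dataclass
from enum import Enum, auto
from typing import Optional, Tuple

import numpy as np


class Mode(Enum):
    SELF_INIT = auto()    # localize and focus the ROI containing the tool tip
    TRACK_SERVO = auto()  # unified template-based visual track-servo
    REINIT = auto()       # relocalize the tip after partial cell occlusion
    RECOVER = auto()      # update the base template, then resume tracking


@dataclass
class TrackerState:
    mode: Mode = Mode.SELF_INIT
    template: Optional[np.ndarray] = None
    tip: Optional[Tuple[int, int]] = None


# --- Placeholder helpers (stand-ins for the paper's mechanisms) ---------------

def locate_tip_by_motion_and_features(frame: np.ndarray) -> Tuple[int, int]:
    """Stub for self-initializing localization (motion cue + low-level
    feature detection). Here: simply return the brightest pixel as a dummy tip."""
    y, x = np.unravel_index(np.argmax(frame), frame.shape)
    return int(x), int(y)


def match_template(frame: np.ndarray, template: np.ndarray):
    """Stub for template matching; returns (tip, confidence)."""
    return locate_tip_by_motion_and_features(frame), 1.0


def extract_template(frame: np.ndarray, tip: Tuple[int, int], size: int = 16):
    """Cut a patch around the tip to serve as the (base) template."""
    x, y = tip
    return frame[max(0, y - size):y + size, max(0, x - size):x + size].copy()


# --- Mode-switching loop ------------------------------------------------------

def process_frame(frame: np.ndarray, state: TrackerState) -> TrackerState:
    if state.mode is Mode.SELF_INIT:
        state.tip = locate_tip_by_motion_and_features(frame)
        state.template = extract_template(frame, state.tip)
        state.mode = Mode.TRACK_SERVO
    elif state.mode is Mode.TRACK_SERVO:
        tip, confidence = match_template(frame, state.template)
        if confidence < 0.5:              # partial occlusion suspected
            state.mode = Mode.REINIT
        else:
            state.tip = tip               # a servo command would be issued here
    elif state.mode is Mode.REINIT:
        # Reuse the self-initializing concept to relocalize the needle tip.
        state.tip = locate_tip_by_motion_and_features(frame)
        state.mode = Mode.RECOVER
    elif state.mode is Mode.RECOVER:
        # Update the base template around the relocalized tip, then resume.
        state.template = extract_template(frame, state.tip)
        state.mode = Mode.TRACK_SERVO
    return state


if __name__ == "__main__":
    state = TrackerState()
    for _ in range(5):                    # dummy frames in place of a camera feed
        frame = np.random.rand(128, 128).astype(np.float32)
        state = process_frame(frame, state)
        print(state.mode, state.tip)

The single process_frame entry point mirrors the paper's stated goal of an uninterrupted, intervention-free loop: every frame is handled by exactly one mode, and transitions (focus achieved, occlusion detected, tip relocalized, template updated) move the tracker between modes without manual resets.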

