
dc.contributor.author: Yang, Liangjing
dc.contributor.author: Paranawithana, Ishara
dc.contributor.author: Youcef-Toumi, Kamal
dc.contributor.author: Tan, U-Xuan
dc.date.accessioned: 2021-10-27T20:34:35Z
dc.date.available: 2021-10-27T20:34:35Z
dc.date.issued: 2020
dc.identifier.uri: https://hdl.handle.net/1721.1/136263
dc.description.abstract: © 2004-2012 IEEE. This article proposes a confidence-based approach for combining two visual tracking techniques to minimize the influence of unforeseen visual tracking failures and achieve uninterrupted vision-based control. Despite research efforts in vision-guided micromanipulation, existing systems are not designed to overcome visual tracking failure modes such as inconsistent illumination conditions, regional occlusion, unknown structures, and nonhomogeneous background scenes. There remains a gap in expanding current procedures beyond the laboratory environment toward practical deployment of vision-guided micromanipulation systems. A hybrid tracking method, which combines motion-cue feature detection and score-based template matching, is incorporated into an uncalibrated vision-guided workflow capable of self-initialization and recovery during micromanipulation. A weighted average is computed from the respective confidence indices of the motion-cue feature tracker and the template-based tracker; these indices are inferred from the statistical accuracy of feature locations and the similarity scores of template matches, respectively. Results suggest improved tracking performance with hybrid tracking: its mean errors are maintained at the subpixel level under adverse experimental conditions, while the original template-matching approach has mean errors of 1.53, 1.73, and 2.08 pixels. The method is also demonstrated to be robust in a nonhomogeneous scene containing an array of plant cells. By proposing a self-contained fusion method that overcomes unforeseen visual tracking failures using a purely vision-based approach, we demonstrate robustness on our low-cost micromanipulation platform. Note to Practitioners - Cell manipulation is traditionally done in highly specialized facilities and controlled environments.
Existing vision-based methods do not readily meet the unique requirements of cell manipulation, including prospective plant-cell-related applications. Robust visual tracking is needed to overcome tracking failures during automated vision-guided micromanipulation. To address the gap in maintaining continuous tracking for vision-guided micromanipulation under unforeseen visual tracking failures, we propose a purely visual, data-driven hybrid tracking approach. Our confidence-based approach combines two tracking techniques to minimize the influence of scene uncertainties and, hence, achieve uninterrupted vision-based control. Because of its readily deployable design, the method can be generalized to a wide range of vision-guided micromanipulation applications. It has the potential to significantly expand the capability of cell manipulation technology, even to prospective plant-cell applications that are yet to be explored.
dc.language.iso: en
dc.publisher: Institute of Electrical and Electronics Engineers (IEEE)
dc.relation.isversionof: 10.1109/TASE.2019.2932724
dc.rights: Creative Commons Attribution 4.0 International license
dc.rights.uri: https://creativecommons.org/licenses/by/4.0/
dc.source: IEEE
dc.title: Confidence-Based Hybrid Tracking to Overcome Visual Tracking Failures in Calibration-Less Vision-Guided Micromanipulation
dc.type: Article
dc.contributor.department: Massachusetts Institute of Technology. Department of Mechanical Engineering
dc.relation.journal: IEEE Transactions on Automation Science and Engineering
dc.eprint.version: Final published version
dc.type.uri: http://purl.org/eprint/type/JournalArticle
eprint.status: http://purl.org/eprint/status/PeerReviewed
dc.date.updated: 2020-08-14T14:30:19Z
dspace.orderedauthors: Yang, L; Paranawithana, I; Youcef-Toumi, K; Tan, U-X
dspace.date.submission: 2020-08-14T14:30:21Z
mit.journal.volume: 17
mit.journal.issue: 1
mit.license: PUBLISHER_CC
mit.metadata.status: Authority Work and Publication Information Needed
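
The abstract describes fusing two tracker outputs via a weighted average driven by per-tracker confidence indices. The sketch below is a minimal, illustrative reading of that idea only; the function name, the [0, 1] confidence convention, and the zero-confidence fallback are assumptions for illustration, not the authors' implementation, which derives its indices from the statistical accuracy of feature locations and template similarity scores.

```python
# Hypothetical sketch of confidence-weighted fusion of two 2D tracker
# estimates, as outlined in the abstract. All names and conventions here
# are illustrative assumptions, not the paper's actual code.

def fuse_estimates(p_feature, c_feature, p_template, c_template):
    """Confidence-weighted average of two (x, y) tracker estimates.

    p_feature / p_template: (x, y) positions from the motion-cue feature
        tracker and the template-matching tracker.
    c_feature / c_template: confidence indices, assumed non-negative
        (e.g. derived from feature-location spread and template
        similarity score, respectively).
    """
    total = c_feature + c_template
    if total == 0:
        # Neither tracker is trusted; fall back to the template estimate.
        return p_template
    w = c_feature / total  # weight given to the feature tracker
    x = w * p_feature[0] + (1 - w) * p_template[0]
    y = w * p_feature[1] + (1 - w) * p_template[1]
    return (x, y)
```

For example, with a high-confidence feature estimate at (10, 20) and a low-confidence template estimate at (12, 22), the fused position is pulled mostly toward the feature estimate; as the feature confidence drops (e.g. under occlusion), the template estimate dominates, which is the mechanism that keeps tracking uninterrupted.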

