
Full metadata record

DC Field Value Language
dc.contributor.author Ullah, Ihsan -
dc.contributor.author Chikontwe, Philip -
dc.contributor.author Park, Sang Hyun -
dc.date.accessioned 2019-12-12T08:48:57Z -
dc.date.available 2019-12-12T08:48:57Z -
dc.date.created 2019-11-06 -
dc.date.issued 2019-10 -
dc.identifier.citation IEEE Access, v.7, pp.159743 - 159753 -
dc.identifier.issn 2169-3536 -
dc.identifier.uri http://hdl.handle.net/20.500.11750/10958 -
dc.description.abstract Research is ongoing to stabilize cardiac surgery using thin micro-guidewires and catheter robots. Controlling the robot to a desired position and pose requires accurately tracking the robot tip in real time, but tracking and precisely delineating the thin, small tip is challenging. To address this problem, this paper proposes a novel image-analysis-based tracking method using deep convolutional neural networks (CNNs). The proposed tracker consists of two parts: (1) a detection network that roughly locates the tip position, and (2) a segmentation network that accurately delineates the tip near that position. To learn a robust real-time tracker, we extract small image patches containing the tip in successive frames and then learn informative spatial and motion features for the segmentation network. During inference, the tip bounding box is first estimated in the initial frame via the detection network; thereafter, tip delineation is performed consecutively through the segmentation network in the following frames. The proposed method enables accurate delineation of the tip in real time and automatically restarts tracking via the detection network when tracking fails in challenging frames. Experimental results show that the proposed method achieves better tracking accuracy than existing methods, at a considerable real-time speed of 19 ms per frame. -
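The two-stage pipeline described in the abstract (rough detection in the first frame, then patch-wise segmentation around the previous tip position, with re-detection on failure) can be sketched as follows. This is a minimal illustration, not the paper's implementation: `detect_tip` and `segment_tip` are hypothetical stand-ins for the detection and segmentation CNNs, and the 32-pixel patch size is an assumption not stated in the abstract.

```python
import numpy as np

PATCH = 32  # patch size around the last tip position (assumed)

def detect_tip(frame):
    """Stub for the detection network: take the brightest pixel as the tip."""
    return tuple(int(v) for v in np.unravel_index(np.argmax(frame), frame.shape))

def segment_tip(patch):
    """Stub for the segmentation network: binary mask of bright pixels."""
    return patch > 0.5

def track(frames):
    """Two-stage tracking loop: detect the tip in the initial frame, then
    delineate it in a small patch around the previous position in each
    subsequent frame; restart via detection when segmentation finds nothing."""
    tips = []
    pos = detect_tip(frames[0])  # stage 1: rough tip localization
    for frame in frames:
        r0 = max(pos[0] - PATCH // 2, 0)
        c0 = max(pos[1] - PATCH // 2, 0)
        patch = frame[r0:r0 + PATCH, c0:c0 + PATCH]
        mask = segment_tip(patch)  # stage 2: patch-wise delineation
        if not mask.any():
            # tracking failure in a challenging frame: re-run detection
            pos = detect_tip(frame)
        else:
            rs, cs = np.nonzero(mask)
            pos = (r0 + int(rs.mean()), c0 + int(cs.mean()))
        tips.append(pos)
    return tips
```

In the actual method both stages are learned CNNs; the key design point retained here is that per-frame work is confined to a small localized patch, which is what makes real-time operation feasible.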
dc.language English -
dc.publisher Institute of Electrical and Electronics Engineers Inc. -
dc.title Real-time Tracking of Guidewire Robot Tips using Deep Convolutional Neural Networks on Successive Localized Frames -
dc.type Article -
dc.identifier.doi 10.1109/ACCESS.2019.2950263 -
dc.identifier.wosid 000497167600074 -
dc.identifier.scopusid 2-s2.0-85078290453 -
dc.type.local Article(Overseas) -
dc.type.rims ART -
dc.description.journalClass 1 -
dc.citation.publicationname IEEE Access -
dc.contributor.nonIdAuthor Ullah, Ihsan -
dc.contributor.nonIdAuthor Chikontwe, Philip -
dc.identifier.citationVolume 7 -
dc.identifier.citationStartPage 159743 -
dc.identifier.citationEndPage 159753 -
dc.identifier.citationTitle IEEE Access -
dc.type.journalArticle Article -
dc.description.isOpenAccess N -
dc.subject.keywordAuthor Catheters -
dc.subject.keywordAuthor Tracking -
dc.subject.keywordAuthor Image segmentation -
dc.subject.keywordAuthor Real-time systems -
dc.subject.keywordAuthor Motion segmentation -
dc.subject.keywordAuthor Feature extraction -
dc.subject.keywordAuthor Convolutional neural networks -
dc.subject.keywordAuthor micro-robot tracking -
dc.subject.keywordAuthor guidewire tracking -
dc.subject.keywordAuthor patch-wise segmentation -
dc.subject.keywordPlus NAVIGATION -
dc.contributor.affiliatedAuthor Ullah, Ihsan -
dc.contributor.affiliatedAuthor Chikontwe, Philip -
dc.contributor.affiliatedAuthor Park, Sang Hyun -

Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.