
Gastroscopic image graph: application to noninvasive multitarget tracking under gastroscopy. (English) Zbl 1307.92244

Summary: Gastroscopic examination is one of the most common methods for diagnosing gastric disease. In this paper, a multitarget tracking approach is proposed to assist endoscopists in identifying lesions under gastroscopy. The approach analyzes numerous preobserved gastroscopic images and constructs a gastroscopic image graph, so that deformable registration between gastroscopic images can be treated as a graph search problem. During the procedure, the endoscopist marks suspicious lesions on the screen, and the graph is used to locate and display the lesions in the appropriate frames based on the computed registration model. Compared with traditional gastroscopic lesion surveillance methods (e.g., tattooing or probe-based optical biopsy), this approach is noninvasive and requires no additional instruments. To assess and quantify its performance, the approach was applied to stomach phantom data and in vivo data. The clinical experiments showed accuracies at the angularis, antrum, and stomach body of \(6.3 \pm 2.4\) mm, \(7.6 \pm 3.1\) mm, and \(7.9 \pm 1.6\) mm, respectively. The mean accuracy was 7.31 mm, the average targeting time was 56 ms, and the \(P\) value was 0.032, making the approach an attractive candidate for clinical practice. Furthermore, it offers a useful reference for endoscopic target tracking in other soft-tissue organs.
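The summary describes the core mechanism only at a high level: frames become nodes of an image graph, pairwise registrations become edges, and registering two arbitrary frames reduces to a shortest-path search whose per-edge transforms are composed along the path. The following is a minimal sketch of that idea, not the authors' implementation: it assumes pairwise transforms are available as 3x3 homographies (e.g., estimated from feature matches with RANSAC) and that edge costs reflect registration confidence; the names ImageGraph, register, and project are hypothetical.

```python
import heapq
import numpy as np

class ImageGraph:
    """Hypothetical sketch: nodes are gastroscopic frames; each edge (i, j)
    stores a 3x3 homography H mapping pixel coordinates of frame i into
    frame j, plus a cost (e.g., inversely related to RANSAC inlier count)."""

    def __init__(self):
        self.adj = {}  # node -> list of (neighbor, cost, homography)

    def add_edge(self, i, j, H, cost):
        # Store the pairwise registration i -> j and its inverse j -> i.
        H = np.asarray(H, dtype=float)
        self.adj.setdefault(i, []).append((j, cost, H))
        self.adj.setdefault(j, []).append((i, cost, np.linalg.inv(H)))

    def register(self, src, dst):
        """Compose homographies along the cheapest path src -> dst
        (Dijkstra's algorithm); returns None if dst is unreachable."""
        dist = {src: 0.0}
        acc = {src: np.eye(3)}  # accumulated src -> node transform
        heap = [(0.0, src)]
        visited = set()
        while heap:
            d, u = heapq.heappop(heap)
            if u in visited:
                continue
            visited.add(u)
            if u == dst:
                return acc[u]
            for v, cost, H in self.adj.get(u, []):
                nd = d + cost
                if nd < dist.get(v, float("inf")):
                    dist[v] = nd
                    acc[v] = H @ acc[u]  # apply (u -> v) after (src -> u)
                    heapq.heappush(heap, (nd, v))
        return None

def project(H, point):
    """Map a marked lesion (x, y) through a composed homography."""
    x, y = point
    w = H @ np.array([x, y, 1.0])
    return w[0] / w[2], w[1] / w[2]
```

Under these assumptions, a lesion marked at (120, 85) in frame 0 could be displayed in frame 7 via project(graph.register(0, 7), (120, 85)), provided the two frames are connected through a chain of registered pairs; the shortest-path cost favors chains of high-confidence pairwise registrations.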

MSC:

92C55 Biomedical imaging and signal processing
Full Text: DOI
