Abstract
Purpose
Noise-free ultrasound images are essential for organ monitoring during regional ultrasound-guided therapy. When the affected area lies under the ribs, however, an acoustic shadow is cast by the reflection of sound at hard tissues such as bone, and the resulting image lacks information in this region. In the present study, we therefore attempt to complement the image in the missing area.
Methods
The overall flow of the complementation method for generating a shadow-free composite image is as follows. First, we constructed a convolutional-neural-network-based binary classifier that detects the presence or absence of acoustic shadow in phantom kidney images. Second, we created a shadow-free composite image by searching a time-series database for a suitable image and superimposing its shadow-free region onto the missing area of the target image. In addition, we constructed and verified an automatic kidney mask generation method utilizing U-Net.
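The superimposition step described above can be sketched as a mask-based composite, assuming the shadow region has already been detected and is available as a binary mask. This is an illustrative sketch, not the authors' implementation; the function name and interface are assumptions.

```python
import numpy as np

def composite_shadow_free(target, candidate, shadow_mask):
    """Fill the shadowed pixels of `target` with the corresponding
    pixels of a shadow-free `candidate` image from the database.

    target, candidate : 2-D arrays of equal shape (B-mode intensities)
    shadow_mask       : boolean array, True where the acoustic shadow is
    """
    result = target.copy()                    # leave the original image intact
    result[shadow_mask] = candidate[shadow_mask]  # copy only the masked region
    return result
```

In practice the candidate image would first be aligned to the target (e.g., by template matching on the kidney region) so that the copied pixels correspond anatomically.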
Results
The complementation accuracy for kidney tracking could be enhanced by template matching. Zero-mean normalized cross-correlation (ZNCC) values after complementation were higher than those before complementation under three different data generation conditions: (i) changing the position of the bed of the robotic ultrasound diagnostic system in the translational direction, (ii) changing the probe angle in the translational direction, and (iii) adding rotational motion of the probe to condition (ii). Although the shape of the kidney contour varied widely in condition (iii), the proposed method improved the ZNCC value from 0.5437 to 0.5807.
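The ZNCC metric and its use in exhaustive template matching can be sketched as follows. This is a minimal illustrative sketch, not the authors' code; the function names and the brute-force search are assumptions.

```python
import numpy as np

def zncc(a, b):
    """Zero-mean normalized cross-correlation of two equal-size patches.
    Returns a value in [-1, 1]; 1 for patches identical up to an affine
    intensity change (brightness/contrast)."""
    a = a.astype(float) - a.mean()
    b = b.astype(float) - b.mean()
    denom = np.sqrt((a ** 2).sum() * (b ** 2).sum())
    if denom == 0:
        return 0.0          # a constant patch carries no correlation signal
    return float((a * b).sum() / denom)

def match_template(image, template):
    """Exhaustive template matching: slide `template` over `image` and
    return the top-left corner (row, col) of the best ZNCC score."""
    H, W = image.shape
    h, w = template.shape
    best, best_pos = -2.0, (0, 0)
    for i in range(H - h + 1):
        for j in range(W - w + 1):
            s = zncc(image[i:i + h, j:j + w], template)
            if s > best:
                best, best_pos = s, (i, j)
    return best_pos, best
```

ZNCC is insensitive to global brightness and contrast changes between frames, which is why it is a common similarity measure for tracking in ultrasound sequences; production code would typically use an optimized routine such as OpenCV's `matchTemplate` rather than this double loop.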
Conclusions
The effectiveness of the proposed method was demonstrated in phantom experiments. Verification of its effectiveness in real organs is necessary in future study.
Acknowledgements
The authors gratefully acknowledge the support of Hideyo Miyazaki (Center Hospital of the National Center for Global Health and Medicine), Hideyuki Iijima, Toshiyuki Iwai, and Hidetoshi Nagaoka (Obayashi Mfg. Co., Ltd.), as well as financial support from JSPS KAKENHI Grant Number JP20H02113 and the Saitama Prefecture New Technology and Product Development Subsidy Project.
Ethics declarations
Conflict of interest
The authors declare that they have no conflict of interest.
Ethical approval
This article does not contain any studies with human participants or animals performed by any of the authors.
Informed consent
This article does not contain patient data.
Cite this article
Matsuyama, M., Koizumi, N., Otsuka, A. et al. A novel complementation method of an acoustic shadow region utilizing a convolutional neural network for ultrasound-guided therapy. Int J CARS 17, 107–119 (2022). https://doi.org/10.1007/s11548-021-02525-8