
A comparative study of variational autoencoders, normalizing flows, and score-based diffusion models for electrical impedance tomography. (English) Zbl 07892356

Summary: Electrical Impedance Tomography (EIT) is an imaging technique widely employed in industrial inspection, geophysical prospecting, and medical imaging. However, the inherent nonlinearity and ill-posedness of EIT image reconstruction pose challenges for classical regularization techniques, notably the critical choice of regularization terms and the lack of prior knowledge. Deep generative models (DGMs) have been shown to play a crucial role in learning implicit regularizers and prior knowledge. This study investigates the potential of three DGMs (variational autoencoders, normalizing flows, and score-based diffusion models) to learn implicit regularizers in learning-based EIT imaging. We first introduce background information on EIT imaging and its inverse problem formulation. Next, we propose three algorithms for solving the EIT inverse problem, each based on one of these DGMs. Finally, we present numerical and visual experiments, which reveal that (1) no single method consistently outperforms the others across all settings, and (2) when reconstructing an object with two anomalies using a model trained on a dataset containing four anomalies, the conditional normalizing flow (CNF) model exhibits the best generalization under low-level noise, while the conditional score-based diffusion model (CSD*) exhibits the best generalization under high-level noise. We hope our preliminary efforts will encourage other researchers to assess their DGMs on EIT and other nonlinear inverse problems.
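
The EIT inverse problem mentioned above is commonly posed as a regularized output least-squares problem; the following is a standard textbook formulation (the notation is assumed here for illustration and not taken from the paper), with \(F\) the nonlinear forward map from the conductivity \(\sigma\) to the boundary voltage data \(V^{\delta}\) and \(R\) an explicit regularization term:
\[
\hat{\sigma} \in \arg\min_{\sigma} \; \tfrac{1}{2}\,\| F(\sigma) - V^{\delta} \|_2^2 + \lambda\, R(\sigma), \qquad \lambda > 0.
\]
In the learning-based approaches compared in the paper, the explicit term \(\lambda R(\sigma)\) is effectively replaced by an implicit prior encoded in a trained conditional generative model.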

MSC:

78A46 Inverse problems (including inverse scattering) in optics and electromagnetic theory
68U10 Computing methodologies for image processing
