
Leveraging joint sparsity in hierarchical Bayesian learning. (English) Zbl 1541.62066

Summary: We present a hierarchical Bayesian learning approach to infer jointly sparse parameter vectors from multiple measurement vectors. Our model uses separate conditionally Gaussian priors for each parameter vector and common gamma-distributed hyperparameters to enforce joint sparsity. The resulting joint-sparsity-promoting priors are combined with existing Bayesian inference methods to generate a new family of algorithms. Our numerical experiments, which include a multicoil magnetic resonance imaging application, demonstrate that our new approach consistently outperforms commonly used hierarchical Bayesian methods.
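The summary describes the model only at a high level. Below is a minimal NumPy sketch, not the paper's implementation, of how such a model can be turned into an algorithm: an iterative alternating (block-coordinate) MAP scheme in which all K parameter vectors share one hyperparameter vector theta, drawn componentwise from a gamma hyperprior with shape beta and scale theta_star. The function name joint_ias, the default parameter values, and the exponent eta = beta - 1 - K/2 (with the positive-root theta-update it implies) are assumptions under these conventions and may differ from the paper's parameterization.

    import numpy as np

    def joint_ias(As, ys, sigma=1e-2, beta=2.0, theta_star=1e-4, n_iter=50):
        # As, ys: lists of K forward matrices A_k (m_k x n) and data vectors y_k.
        # All K parameter vectors share one hyperparameter vector theta,
        # which is what couples them and promotes *joint* sparsity.
        K, n = len(As), As[0].shape[1]
        theta = np.full(n, theta_star)
        xs = [np.zeros(n) for _ in range(K)]
        eta = beta - 1.0 - K / 2.0  # exponent under these (assumed) conventions
        for _ in range(n_iter):
            # x-step: for fixed theta, each x_k solves a Tikhonov-type system
            for k in range(K):
                A, y = As[k], ys[k]
                lhs = A.T @ A / sigma**2 + np.diag(1.0 / theta)
                xs[k] = np.linalg.solve(lhs, A.T @ y / sigma**2)
            # theta-step: closed-form positive root of a per-component quadratic,
            # driven by squared magnitudes pooled across all K vectors
            s = sum(x**2 for x in xs)
            theta = 0.5 * theta_star * (eta + np.sqrt(eta**2 + 2.0 * s / theta_star))
            theta = np.maximum(theta, 1e-12)  # guard against exact zeros
        return xs, theta

For instance, for a two-coil MRI-type problem one would call joint_ias([A1, A2], [y1, y2]); the shared theta couples the supports of the two recovered vectors, which is the joint-sparsity mechanism the summary refers to.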

MSC:

62F15 Bayesian inference
65F22 Ill-posedness and regularization problems in numerical linear algebra
65K10 Numerical optimization and variational techniques
68T05 Learning and adaptive systems in artificial intelligence
