Sparse polynomial chaos expansions: literature survey and benchmark. (English) Zbl 1464.65008

Summary: Sparse polynomial chaos expansions (PCE) are a popular surrogate modelling method that takes advantage of the properties of PCE, the sparsity-of-effects principle, and powerful sparse regression solvers to approximate computer models with many input parameters, relying on only a few model evaluations. Within the last decade, a large number of algorithms for the computation of sparse PCE have been published in the applied math and engineering literature. We present an extensive review of the existing methods and develop a framework for classifying the algorithms. Furthermore, we conduct a unique benchmark on a selection of methods to identify which approaches work best in practical applications. Comparing their accuracy on several benchmark models of varying dimensionality and complexity, we find that the choice of sparse regression solver and sampling scheme for the computation of a sparse PCE surrogate can make a significant difference of up to several orders of magnitude in the resulting mean-squared error. Different methods seem to be superior in different regimes of model dimensionality and experimental design size.

MSC:

65C20 Probabilistic models, generic numerical methods in probability and statistics
62K20 Response surface designs
62P30 Applications of statistics in engineering and industry; control charts

References:

[1] S. Abraham, M. Raisee, G. Ghorbaniasl, F. Contino, and F. Lacor, A robust and efficient stepwise regression method for building sparse polynomial chaos expansions, J. Comput. Phys., 332 (2017), pp. 461-474. · Zbl 1384.62216
[2] N. Alemazkoor and H. Meidani, Divide and conquer: An incremental sparsity promoting compressive sampling approach for polynomial chaos expansions, Comput. Methods Appl. Mech. Engrg., 318 (2017), pp. 937-956. · Zbl 1440.60006
[3] N. Alemazkoor and H. Meidani, A near-optimal sampling strategy for sparse recovery of polynomial chaos expansions, J. Comput. Phys., 371 (2018), pp. 137-151. · Zbl 1415.94368
[4] N. Alemazkoor and H. Meidani, A preconditioning approach for improved estimation of sparse polynomial chaos expansions, Comput. Methods Appl. Mech. Engrg., 342 (2018), pp. 474-489. · Zbl 1440.94011
[5] Y. Arjoune, N. Kaabouch, H. El Ghazi, and A. Tamtaoui, Compressive sensing: Performance comparison of sparse recovery algorithms, in 2017 IEEE CCWC, IEEE, 2017, pp. 1-7.
[6] D. Babacan, MATLAB code for “Bayesian compressive sensing using Laplace priors,” IEEE Trans. Image Process., 19 (2010), pp. 53-63, http://www.dbabacan.info/software.html (accessed 28 August 2019). · Zbl 1371.94480
[7] S. Babacan, R. Molina, and A. Katsaggelos, Bayesian compressive sensing using Laplace priors, IEEE Trans. Image Process., 19 (2010), pp. 53-63. · Zbl 1371.94480
[8] R. Baptista, V. Stolbunov, and P. B. Nair, Some greedy algorithms for sparse polynomial chaos expansions, J. Comput. Phys., 387 (2019), pp. 303-325. · Zbl 1452.65012
[9] D. Baumann and K. Baumann, Reliable estimation of prediction errors for QSAR models under model uncertainty using double cross-validation, J. Cheminf., 6 (2014), 47.
[10] M. Berchier, Orthogonal Matching Pursuit for Sparse Polynomial Chaos Expansions, Semester project, ETH Zürich, 2015.
[11] M. Berveiller, Stochastic Finite Elements: Intrusive and Non-intrusive Methods for Reliability Analysis, Ph.D. thesis, Université Blaise Pascal, Clermont-Ferrand, 2005.
[12] M. Berveiller, B. Sudret, and M. Lemaire, Stochastic finite element: A non intrusive approach by regression, Eur. J. Comput. Mech., 15 (2006), pp. 81-92. · Zbl 1325.74171
[13] B. Bhattacharyya, Global sensitivity analysis: A Bayesian learning based polynomial chaos approach, J. Comput. Phys., 415 (2020), 109539. · Zbl 1440.62093
[14] E. G. Birgin, J. M. Martínez, and M. Raydan, Nonmonotone spectral projected gradient methods on convex sets, SIAM J. Optim., 10 (2000), pp. 1196-1211, https://doi.org/10.1137/S1052623497330963. · Zbl 1047.90077
[15] G. Blatman and B. Sudret, Sparse polynomial chaos expansions and adaptive stochastic finite elements using a regression approach, Comptes Rendus Mécanique, 336 (2008), pp. 518-523. · Zbl 1138.74046
[16] G. Blatman and B. Sudret, An adaptive algorithm to build up sparse polynomial chaos expansions for stochastic finite element analysis, Prob. Eng. Mech., 25 (2010), pp. 183-197.
[17] G. Blatman and B. Sudret, Efficient computation of global sensitivity indices using sparse polynomial chaos expansions, Reliab. Eng. Syst. Saf., 95 (2010), pp. 1216-1229.
[18] G. Blatman and B. Sudret, Adaptive sparse polynomial chaos expansion based on least angle regression, J. Comput. Phys., 230 (2011), pp. 2345-2367. · Zbl 1210.65019
[19] A. M. Bruckstein, D. L. Donoho, and M. Elad, From sparse solutions of systems of equations to sparse modeling of signals and images, SIAM Rev., 51 (2009), pp. 34-81, https://doi.org/10.1137/060657704. · Zbl 1178.68619
[20] E. Candès and J. Romberg, \( \ell_1\)-MAGIC, https://statweb.stanford.edu/~candes/software/l1magic/, 2005 (accessed 22 January 2020).
[21] E. Candès, J. Romberg, and T. Tao, Robust uncertainty principles: Exact signal reconstruction from highly incomplete frequency information, IEEE Trans. Inform. Theory, 52 (2006), pp. 489-509. · Zbl 1231.94017
[22] E. J. Candès and Y. Plan, A probabilistic and RIPless theory of compressed sensing, IEEE Trans. Inform. Theory, 57 (2011), pp. 7235-7254. · Zbl 1365.94174
[23] E. J. Candès and M. B. Wakin, An introduction to compressive sampling: A sensing/sampling paradigm that goes against the common knowledge in data acquisition, IEEE Signal Process. Mag., 25 (2008), pp. 21-30.
[24] E. J. Candès, M. B. Wakin, and S. P. Boyd, Enhancing sparsity by reweighted \(\ell_1\) minimization, J. Fourier Anal. Appl., 14 (2008), pp. 877-905. · Zbl 1176.94014
[25] I. Carron, Compressive Sensing: The Big Picture, https://sites.google.com/site/igorcarron2/cs, 2013 (accessed 22 January 2020).
[26] O. Chapelle, V. Vapnik, and Y. Bengio, Model selection for small sample regression, Mach. Learn., 48 (2002), pp. 9-23. · Zbl 0998.68114
[27] T. Chatterjee, S. Chakraborty, and R. Chowdhury, A critical review of surrogate assisted robust design optimization, Arch. Comput. Methods Eng., 26 (2019), pp. 245-274.
[28] K. Cheng and Z. Lu, Adaptive sparse polynomial chaos expansions for global sensitivity analysis based on support vector regression, Comput. Struct., 194 (2018), pp. 86-96.
[29] K. Cheng and Z. Lu, Sparse polynomial chaos expansion based on D-MORPH regression, Appl. Math. Comput., 323 (2018), pp. 17-30.
[30] A. Cohen and G. Migliorati, Optimal weighted least-squares methods, SMAI J. Comput. Math., 3 (2017), pp. 181-203. · Zbl 1416.62177
[31] R. D. Cook and C. J. Nachtsheim, A comparison of algorithms for constructing exact D-optimal designs, Technometrics, 22 (1980), pp. 315-324. · Zbl 0459.62061
[32] W. Dai and O. Milenkovic, Subspace pursuit for compressive sensing signal reconstruction, IEEE Trans. Inform. Theory, 55 (2009), pp. 2230-2249. · Zbl 1367.94082
[33] P. Diaz, DOPT_PCE, https://github.com/CU-UQ/DOPT_PCE, 2018 (accessed 22 January 2020).
[34] P. Diaz, A. Doostan, and J. Hampton, Sparse polynomial chaos expansions via compressed sensing and D-optimal design, Comput. Methods Appl. Mech. Engrg., 336 (2018), pp. 640-666. · Zbl 1441.65005
[35] D. Donoho, I. Drori, V. Stodden, Y. Tsaig, and M. Shahram, SparseLab-Seeking Sparse Solutions to Linear Systems of Equations, http://sparselab.stanford.edu/, 2007 (accessed 22 January 2020).
[36] D. L. Donoho, Compressed sensing, IEEE Trans. Inform. Theory, 52 (2006), pp. 1289-1306. · Zbl 1288.94016
[37] A. Doostan and H. Owhadi, A non-adapted sparse approximation of PDEs with stochastic inputs, J. Comput. Phys., 230 (2011), pp. 3015-3034. · Zbl 1218.65008
[38] V. Dubourg, Adaptive Surrogate Models for Reliability Analysis and Reliability-Based Design Optimization, Ph.D. thesis, Université Blaise Pascal, Clermont-Ferrand, France, 2011.
[39] S. Dutta and A. H. Gandomi, Design of experiments for uncertainty quantification based on polynomial chaos expansion metamodels, in Handbook of Probabilistic Models, Elsevier, 2020, pp. 369-381.
[40] O. Dykstra, The augmentation of experimental data to maximize \(|X'X|\), Technometrics, 13 (1971), pp. 682-688.
[41] B. Echard, N. Gayton, M. Lemaire, and N. Relun, A combined importance sampling and Kriging reliability method for small failure probabilities with time-demanding numerical models, Reliab. Eng. Syst. Saf., 111 (2013), pp. 232-240.
[42] B. Efron, T. Hastie, I. Johnstone, and R. Tibshirani, Least angle regression, Ann. Statist., 32 (2004), pp. 407-499. · Zbl 1091.62054
[43] O. Ernst, A. Mugler, H.-J. Starkloff, and E. Ullmann, On the convergence of generalized polynomial chaos expansions, ESAIM Math. Model. Numer. Anal., 46 (2012), pp. 317-339. · Zbl 1273.65012
[44] N. Fajraoui, S. Marelli, and B. Sudret, Sequential design of experiment for sparse polynomial chaos expansions, SIAM/ASA J. Uncertain. Quantif., 5 (2017), pp. 1061-1085, https://doi.org/10.1137/16M1103488. · Zbl 06861783
[45] A. C. Faul and M. E. Tipping, Analysis of sparse Bayesian learning, in NIPS’01: Proceedings of the 14th International Conference on Neural Information Processing Systems: Natural and Synthetic, MIT Press, 2002, pp. 383-389.
[46] V. V. Fedorov, Theory of Optimal Experiments, Elsevier, 2013.
[47] M. A. Figueiredo, Adaptive sparseness for supervised learning, IEEE Trans. Pattern Anal. Mach. Intell., 25 (2003), pp. 1150-1159.
[48] M. A. Figueiredo and R. D. Nowak, Wavelet-based image estimation: An empirical Bayes approach using Jeffrey’s noninformative prior, IEEE Trans. Image Process., 10 (2001), pp. 1322-1331. · Zbl 1037.68775
[49] A. Forrester, A. Sobester, and A. Keane, Engineering Design via Surrogate Modelling: A Practical Guide, Wiley, 2008.
[50] R. G. Ghanem and P. Spanos, Stochastic Finite Elements: A Spectral Approach, Springer-Verlag, 1991; revised edition published by Dover Publications, Inc., 2003. · Zbl 0722.73080
[51] M. Gu and S. C. Eisenstat, Efficient algorithms for computing a strong rank-revealing QR factorization, SIAM J. Sci. Comput., 17 (1996), pp. 848-869, https://doi.org/10.1137/0917055. · Zbl 0858.65044
[52] L. Guo, A. Narayan, Y. Liu, and T. Zhou, Sparse approximation of data-driven polynomial chaos expansions: An induced sampling approach, Commun. Math. Res., 36 (2020), pp. 128-153. · Zbl 1474.65008
[53] L. Guo, A. Narayan, T. Zhou, and Y. Chen, Stochastic collocation methods via \(\ell_1\) minimization using randomized quadratures, SIAM J. Sci. Comput., 39 (2017), pp. A333-A359, https://doi.org/10.1137/16M1059680. · Zbl 1359.41016
[54] M. Hadigol and A. Doostan, Least squares polynomial chaos expansion: A review of sampling strategies, Comput. Methods Appl. Mech. Engrg., 332 (2018), pp. 382-407. · Zbl 1440.65007
[55] J. H. Halton, On the efficiency of certain quasi-random sequences of points in evaluating multi-dimensional integrals, Numer. Math., 2 (1960), pp. 84-90. · Zbl 0090.34505
[56] J. Hampton and A. Doostan, Coherence motivated sampling and convergence analysis of least squares polynomial chaos regression, Comput. Methods Appl. Mech. Engrg., 290 (2015), pp. 73-97. · Zbl 1426.62174
[57] J. Hampton and A. Doostan, Compressive sampling of polynomial chaos expansions: Convergence analysis and sampling strategies, J. Comput. Phys., 280 (2015), pp. 363-386. · Zbl 1349.94110
[58] J. Hampton and A. Doostan, COH-OPT, https://github.com/CU-UQ/COH-OPT, 2017 (accessed 22 January 2020).
[59] J. Hampton and A. Doostan, Basis adaptive sample efficient polynomial chaos (BASE-PC), J. Comput. Phys., 371 (2018), pp. 20-49. · Zbl 1415.65028
[60] W. V. Harper and S. K. Gupta, Sensitivity/Uncertainty Analysis of a Borehole Scenario Comparing Latin Hypercube Sampling and Deterministic Sensitivity Approaches, Tech. report BMI/ONWI-516, Battelle Memorial Inst., Office of Nuclear Waste Isolation, Columbus, OH, 1983.
[61] T. Hastie, R. Tibshirani, and J. Friedman, The Elements of Statistical Learning: Data Mining, Inference and Prediction, Springer, 2001. · Zbl 0973.62007
[62] Y. P. Hong and C.-T. Pan, Rank-revealing QR factorizations and the singular value decomposition, Math. Comput., 58 (1992), pp. 213-232. · Zbl 0743.65037
[63] S. Hosder, R. Walters, and M. Balch, Efficient sampling for non-intrusive polynomial chaos applications with multiple uncertain input variables, in Proceedings of the 48th AIAA/ASME/ASCE/AHS/ASC Structures, Structural Dynamics, and Materials Conference, 2007, AIAA 2007-1939.
[64] C. Hu and B. D. Youn, Adaptive-sparse polynomial chaos expansion for reliability analysis and design of complex engineering systems, Struct. Multidisc. Optim., 43 (2011), pp. 419-442. · Zbl 1274.74271
[65] R. Hu and M. Ludkovski, Sequential design for ranking response surfaces, SIAM/ASA J. Uncertain. Quantif., 5 (2017), pp. 212-239, https://doi.org/10.1137/15M1045168. · Zbl 1365.62319
[66] X. Huan, C. Safta, K. Sargsyan, Z. P. Vane, G. Lacaze, J. C. Oefelein, and H. N. Najm, Compressive sensing with cross-validation and stop-sampling for sparse polynomial chaos expansions, SIAM/ASA J. Uncertain. Quantif., 6 (2018), pp. 907-936, https://doi.org/10.1137/17M1141096. · Zbl 1403.62133
[67] S. S. Isukapalli, Uncertainty Analysis of Transport-Transformation Models, Ph.D. thesis, Rutgers, The State University of New Jersey, 1999.
[68] J. D. Jakeman, M. S. Eldred, and K. Sargsyan, Enhancing \(\ell_1\)-minimization estimates of polynomial chaos expansions using basis selection, J. Comput. Phys., 289 (2015), pp. 18-34. · Zbl 1352.65026
[69] J. D. Jakeman, A. Narayan, and T. Zhou, A generalized sampling and preconditioning scheme for sparse approximation of polynomial chaos expansions, SIAM J. Sci. Comput., 39 (2017), pp. A1114-A1144, https://doi.org/10.1137/16M1063885. · Zbl 1368.65025
[70] S. Ji, Y. Xue, and L. Carin, Bayesian compressive sensing, IEEE Trans. Signal Process., 56 (2008), pp. 2346-2356. · Zbl 1390.94231
[71] J. Kiefer and J. Wolfowitz, Optimum designs in regression problems, Ann. Math. Statist., 30 (1959), pp. 271-294. · Zbl 0090.11404
[72] K. Konakli and B. Sudret, Global sensitivity analysis using low-rank tensor approximations, Reliab. Eng. Syst. Saf., 156 (2016), pp. 64-83. · Zbl 1349.60056
[73] I. A. Kougioumtzoglou, I. Petromichelakis, and A. F. Psaros, Sparse representations and compressive sampling approaches in engineering mechanics: A review of theoretical concepts and diverse applications, Prob. Eng. Mech., 61 (2020), 103082.
[74] C. Lataniotis, S. Marelli, and B. Sudret, Extending classical surrogate modelling to high dimensions through supervised dimensionality reduction: A data-driven approach, Int. J. Uncertain. Quantif., 10 (2020), pp. 55-82. · Zbl 1498.62109
[75] C. C. Li and A. Der Kiureghian, Optimal discretization of random fields, J. Eng. Mech., 119 (1993), pp. 1136-1154.
[76] G. Li and H. Rabitz, D-MORPH regression: Application to modeling with unknown parameters more than observation data, J. Math. Chem., 48 (2010), pp. 1010-1035. · Zbl 1303.62032
[77] Z. Liu, D. Lesselier, B. Sudret, and J. Wiart, Surrogate modeling based on resampled polynomial chaos expansions, Reliab. Eng. Syst. Saf., 202 (2020), 107008.
[78] Z. Liu, D. Lesselier, B. Sudret, and J. Wiart, Surrogate modeling of indoor down-link human exposure based on sparse polynomial chaos expansion, Int. J. Uncertain. Quantif., 10 (2020), pp. 145-163. · Zbl 1498.62237
[79] N. Lüthen, S. Marelli, and B. Sudret, A Benchmark of Basis-Adaptive Sparse Polynomial Chaos Expansions for Engineering Regression Problems, preprint, https://arxiv.org/abs/2009.04800, 2021.
[80] S. Marelli and B. Sudret, UQLab: A framework for uncertainty quantification in MATLAB, in Vulnerability, Uncertainty, and Risk (Proc. 2nd Int. Conf. on Vulnerability, Risk Analysis and Management (ICVRAM2014), Liverpool, UK), 2014, pp. 2554-2563.
[81] S. Marelli and B. Sudret, UQLab User Manual-Polynomial Chaos Expansions, Tech. report UQLab-V1.3-104, Chair of Risk, Safety and Uncertainty Quantification, ETH Zurich, 2019.
[82] S. Marelli, P.-R. Wagner, C. Lataniotis, and B. Sudret, Stochastic spectral embedding, Int. J. Uncertain. Quantif., 11 (2021), pp. 25-47. · Zbl 1498.65027
[83] L. Mathelin and K. Gallivan, A compressed sensing approach for partial differential equations with random input data, Commun. Comput. Phys., 12 (2012), pp. 919-954. · Zbl 1388.65018
[84] M. D. McKay, R. J. Beckman, and W. J. Conover, A comparison of three methods for selecting values of input variables in the analysis of output from a computer code, Technometrics, 21 (1979), pp. 239-245. · Zbl 0415.62011
[85] A. Mikhalev and I. V. Oseledets, Rectangular maximum-volume submatrices and their applications, Linear Algebra Appl., 538 (2018), pp. 187-211. · Zbl 1374.15016
[86] D. C. Montgomery, Design and Analysis of Experiments, John Wiley and Sons, 2004.
[87] A. Narayan, J. Jakeman, and T. Zhou, A Christoffel function weighted least squares algorithm for collocation approximations, Math. Comp., 86 (2017), pp. 1913-1947. · Zbl 1361.65009
[88] D. Needell and J. A. Tropp, CoSaMP: Iterative signal recovery from incomplete and inaccurate samples, Appl. Comput. Harmon. Anal., 26 (2009), pp. 301-321. · Zbl 1163.94003
[89] N.-K. Nguyen and A. J. Miller, A review of some exchange algorithms for constructing discrete D-optimal designs, Comput. Statist. Data Anal., 14 (1992), pp. 489-498. · Zbl 0937.62628
[90] A. B. Owen, Controlling correlations in Latin hypercube samples, J. Amer. Statist. Assoc., 89 (1994), pp. 1517-1522. · Zbl 0813.65060
[91] J. Palmer, B. D. Rao, and D. P. Wipf, Perspectives on sparse Bayesian learning, in Advances in Neural Information Processing Systems, MIT Press, 2004, pp. 249-256.
[92] I. Papaioannou, M. Ehre, and D. Straub, PLS-based adaptation for efficient PCE representation in high dimensions, J. Comput. Phys., 387 (2019), pp. 186-204. · Zbl 1452.65014
[93] Y. C. Pati, R. Rezaiifar, and P. S. Krishnaprasad, Orthogonal matching pursuit: Recursive function approximation with applications to wavelet decomposition, in Proceedings of the 27th Asilomar Conference on Signals, Systems and Computers, IEEE, 1993, pp. 40-44.
[94] J. Peng, J. Hampton, and A. Doostan, A weighted \(\ell_1\)-minimization approach for sparse polynomial chaos expansions, J. Comput. Phys., 267 (2014), pp. 92-111. · Zbl 1349.65198
[95] Z. Perkó, L. Gilli, D. Lathouwers, and J. L. Kloosterman, Grid and basis adaptive polynomial chaos techniques for sensitivity and uncertainty analysis, J. Comput. Phys., 260 (2014), pp. 54-84. · Zbl 1349.65027
[96] L. Pronzato and W. G. Müller, Design of computer experiments: Space filling and beyond, Stat. Comput., 22 (2012), pp. 681-701. · Zbl 1252.62080
[97] S. Qaisar, R. M. Bilal, W. Iqbal, M. Naureen, and S. Lee, Compressive sensing: From theory to applications, a survey, J. Commun. Netw., 15 (2013), pp. 443-456.
[98] H. Rauhut and R. Ward, Sparse Legendre expansions via \(\ell_1\)-minimization, J. Approx. Theory, 164 (2012), pp. 517-533. · Zbl 1239.65018
[99] K. Sargsyan, C. Safta, H. Najm, B. Debusschere, D. Ricciuto, and P. Thornton, Dimensionality reduction for complex models via Bayesian compressive sensing, Int. J. Uncertain. Quantif., 4 (2014), pp. 63-93. · Zbl 1513.65004
[100] R. Schöbi, B. Sudret, and J. Wiart, Polynomial-chaos-based Kriging, Int. J. Uncertain. Quantif., 5 (2015), pp. 171-193. · Zbl 1498.82031
[101] M. W. Seeger and H. Nickisch, Compressed sensing and Bayesian experimental design, in Proceedings of the 25th International Conference on Machine Learning, ACM, 2008, pp. 912-919.
[102] P. Seshadri, A. Narayan, and S. Mahadevan, Effectively subsampled quadratures for least squares polynomial approximations, SIAM/ASA J. Uncertain. Quantif., 5 (2017), pp. 1003-1023, https://doi.org/10.1137/16M1057668. · Zbl 1384.93159
[103] B. Settles, Active learning, Synthesis Lectures on Artificial Intelligence and Machine Learning, 6 (2012), pp. 1-114. · Zbl 1270.68006
[104] Q. Shao, A. Younes, M. Fahs, and T. Mara, Bayesian sparse polynomial chaos expansion for global sensitivity analysis, Comput. Methods Appl. Mech. Engrg., 318 (2017), pp. 474-496. · Zbl 1439.62088
[105] D. Shen, H. Wu, B. Xia, and D. Gan, Polynomial chaos expansion for parametric problems in engineering systems: A review, IEEE Syst. J., 14 (2020), pp. 4500-4514.
[106] M. D. Shields and J. Zhang, The generalization of Latin hypercube sampling, Reliab. Eng. Syst. Saf., 148 (2016), pp. 96-108.
[107] Y. Shin and D. Xiu, Nonadaptive quasi-optimal points selection for least squares linear regression, SIAM J. Sci. Comput., 38 (2016), pp. A385-A411, https://doi.org/10.1137/15M1015868. · Zbl 06548919
[108] Y. Shin and D. Xiu, On a near optimal sampling strategy for least squares polynomial regression, J. Comput. Phys., 326 (2016), pp. 931-946. · Zbl 1384.62249
[109] I. M. Sobol’, Distribution of points in a cube and approximate evaluation of integrals, USSR Comput. Math. Math. Phys., 7 (1967), pp. 86-112. · Zbl 0185.41103
[110] B. Sudret, Global sensitivity analysis using polynomial chaos expansions, Reliab. Eng. Syst. Saf., 93 (2008), pp. 964-979.
[111] G. Tang and G. Iaccarino, Subsampled Gauss quadrature nodes for estimating polynomial chaos expansions, SIAM/ASA J. Uncertain. Quantif., 2 (2014), pp. 423-443, https://doi.org/10.1137/130913511. · Zbl 1308.41005
[112] A. Tarakanov and A. H. Elsheikh, Regression-based sparse polynomial chaos for uncertainty quantification of subsurface flow models, J. Comput. Phys., 399 (2019), 108909. · Zbl 1453.76208
[113] R. Tipireddy and R. Ghanem, Basis adaptation in homogeneous chaos spaces, J. Comput. Phys., 259 (2014), pp. 304-317. · Zbl 1349.60058
[114] M. E. Tipping, Sparse Bayesian learning and the relevance vector machine, J. Mach. Learn. Res., 1 (2001), pp. 211-244. · Zbl 0997.68109
[115] M. E. Tipping and A. C. Faul, Fast marginal likelihood maximisation for sparse Bayesian models, in Proceedings of the 9th International Workshop on Artificial Intelligence and Statistics, 2003.
[116] J. A. Tropp and A. C. Gilbert, Signal recovery from random measurements via orthogonal matching pursuit, IEEE Trans. Inform. Theory, 53 (2007), pp. 4655-4666. · Zbl 1288.94022
[117] P. Tsilifis, X. Huan, C. Safta, K. Sargsyan, G. Lacaze, J. C. Oefelein, H. N. Najm, and R. G. Ghanem, Compressive sensing adaptation for polynomial chaos expansions, J. Comput. Phys., 380 (2019), pp. 29-47. · Zbl 1451.62032
[118] P. Tsilifis, I. Papaioannou, D. Straub, and F. Nobile, Sparse polynomial chaos expansions using variational relevance vector machines, J. Comput. Phys., 416 (2020), 109498. · Zbl 1437.62114
[119] E. van den Berg and M. P. Friedlander, Probing the Pareto frontier for basis pursuit solutions, SIAM J. Sci. Comput., 31 (2008), pp. 890-912, https://doi.org/10.1137/080714488. · Zbl 1193.49033
[120] E. van den Berg and M. P. Friedlander, SPGL1: A Solver for Sparse Least Squares, Version 2.1, https://friedlander.io/spgl1/, 2020 (accessed 11 May 2020).
[121] D. P. Wipf and B. D. Rao, Sparse Bayesian learning for basis selection, IEEE Trans. Signal Process., 52 (2004), pp. 2153-2164. · Zbl 1369.94318
[122] D. Xiu and J. S. Hesthaven, High-order collocation methods for differential equations with random inputs, SIAM J. Sci. Comput., 27 (2005), pp. 1118-1139, https://doi.org/10.1137/040615201. · Zbl 1091.65006
[123] D. Xiu and G. E. Karniadakis, The Wiener-Askey polynomial chaos for stochastic differential equations, SIAM J. Sci. Comput., 24 (2002), pp. 619-644, https://doi.org/10.1137/S1064827501387826. · Zbl 1014.65004
[124] L. Yan, L. Guo, and D. Xiu, Stochastic collocation algorithms using \(\ell_1\)-minimization, Int. J. Uncertain. Quantif., 2 (2012), pp. 279-293. · Zbl 1291.65024
[125] X. Yang and G. E. Karniadakis, Reweighted \(\ell_1\) minimization method for stochastic elliptic differential equations, J. Comput. Phys., 248 (2013), pp. 87-108. · Zbl 1349.60113
[126] X. Yang, W. Li, and A. Tartakovsky, Sliced-inverse-regression-aided rotated compressive sensing method for uncertainty quantification, SIAM/ASA J. Uncertain. Quantif., 6 (2018), pp. 1532-1554, https://doi.org/10.1137/17M1148955. · Zbl 07003645
[127] P. Yin, Y. Lou, Q. He, and J. Xin, Minimization of \(\ell_{1-2}\) for compressed sensing, SIAM J. Sci. Comput., 37 (2015), pp. A536-A563, https://doi.org/10.1137/140952363. · Zbl 1316.90037
[128] V. P. Zankin, G. V. Ryzhakov, and I. V. Oseledets, Gradient Descent-Based D-Optimal Design for the Least-Squares Polynomial Approximation, preprint, https://arxiv.org/abs/1806.06631, 2018.
[129] S. Zein, B. Colson, and F. Glineur, An efficient sampling method for regression-based polynomial chaos expansion, Commun. Comput. Phys., 13 (2013), pp. 1173-1188. · Zbl 1378.62025
[130] Z. Zhang, Y. Xu, J. Yang, X. Li, and D. Zhang, A survey of sparse representation: Algorithms and applications, IEEE Access, 3 (2015), pp. 490-530.
[131] H. Zhao, Z. Gao, F. Xu, Y. Zhang, and J. Huang, An efficient adaptive forward-backward selection method for sparse polynomial chaos expansion, Comput. Methods Appl. Mech. Engrg., 355 (2019), pp. 456-491. · Zbl 1441.65011
[132] W. Zhao and L. Bu, Global sensitivity analysis with a hierarchical sparse metamodeling method, Mech. Syst. Signal Process., 115 (2019), pp. 769-781.
[133] T. Zhou, A. Narayan, and Z. Xu, Multivariate discrete least-squares approximations with a new type of collocation grid, SIAM J. Sci. Comput., 36 (2014), pp. A2401-A2422, https://doi.org/10.1137/130950434. · Zbl 1305.41012
[134] Y. Zhou, Z. Lu, and K. Cheng, Sparse polynomial chaos expansions for global sensitivity analysis with partial least squares and distance correlation, Struct. Multidiscip. Optim., 59 (2019), pp. 229-247.
[135] Y. Zhou, Z. Lu, K. Cheng, and C. Ling, An efficient and robust adaptive sampling method for polynomial chaos expansion in sparse Bayesian learning framework, Comput. Methods Appl. Mech. Engrg., 352 (2019), pp. 654-674. · Zbl 1441.62201
[136] Y. Zhou, Z. Lu, K. Cheng, and Y. Shi, An expanded sparse Bayesian learning method for polynomial chaos expansion, Mech. Syst. Signal Process., 128 (2019), pp. 153-171.
[137] Y. Zhou, Z. Lu, J. Hu, and Y. Hu, Surrogate modeling of high-dimensional problems via data-driven polynomial chaos expansions and sparse partial least square, Comput. Methods Appl. Mech. Engrg., 364 (2020), 112906. · Zbl 1442.65437
[138] X. Zhu and B. Sudret, Replication-based emulation of the response distribution of stochastic simulators using generalized lambda distributions, Int. J. Uncertain. Quantif., 10 (2020), pp. 249-275. · Zbl 1498.60068
This reference list is based on information provided by the publisher or from digital mathematics libraries. Its items are heuristically matched to zbMATH identifiers and may contain data conversion errors. In some cases, the data have been complemented or enhanced with data from zbMATH Open. The list attempts to reflect the references in the original paper as accurately as possible, without claiming completeness or perfect matching.