
An efficient computational approach for prior sensitivity analysis and cross-validation. (English. French summary) Zbl 1190.62046

Summary: Prior sensitivity analysis and cross-validation are important tools in Bayesian statistics. However, due to the computational expense of implementing existing methods, these techniques are rarely used. The authors show how it is possible to use sequential Monte Carlo methods to create an efficient and automated algorithm to perform these tasks. They apply the algorithm to the computation of regularization path plots and to assessing sensitivity to the tuning parameter in \(g\)-prior model selection. They then demonstrate the algorithm in a cross-validation context and use it to select the shrinkage parameter in Bayesian regression.
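To make the idea concrete, the sketch below illustrates how a sequential Monte Carlo sampler can sweep a grid of prior hyperparameters by reweighting, resampling, and rejuvenating a single particle set, here for a regularization path in Bayesian ridge regression. This is a minimal illustrative sketch, not the authors' algorithm or code: the ridge target, the synthetic data, the particle count, the ESS/2 resampling rule, the random-walk Metropolis move kernel, and all names (log_post, mcmc_move, lambdas) are assumptions made for the example.

# Sketch: SMC sweep over a shrinkage parameter lambda for Bayesian ridge regression.
# Targets pi_t(theta) are posteriors under priors theta ~ N(0, I/lambda_t); moving along
# the lambda grid reuses one particle set instead of rerunning MCMC at every lambda.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data (assumed purely for illustration).
n, p, sigma = 50, 5, 1.0
X = rng.normal(size=(n, p))
beta_true = np.array([2.0, -1.0, 0.0, 0.0, 0.5])
y = X @ beta_true + sigma * rng.normal(size=n)

def log_post(theta, lam):
    """Unnormalized log posterior: Gaussian likelihood + ridge prior with precision lam."""
    resid = y - X @ theta
    return -0.5 * resid @ resid / sigma**2 - 0.5 * lam * theta @ theta

def mcmc_move(theta, lam, steps=5, step_size=0.1):
    """Random-walk Metropolis moves leaving the posterior at lam invariant."""
    lp = np.array([log_post(t, lam) for t in theta])
    for _ in range(steps):
        prop = theta + step_size * rng.normal(size=theta.shape)
        lp_prop = np.array([log_post(t, lam) for t in prop])
        accept = np.log(rng.uniform(size=len(theta))) < lp_prop - lp
        theta[accept] = prop[accept]
        lp[accept] = lp_prop[accept]
    return theta

# Grid of shrinkage parameters defining the sequence of targets.
lambdas = np.geomspace(0.01, 100.0, 40)

# Initialize particles and burn them in at the first target.
N = 500
theta = rng.normal(size=(N, p))
logw = np.zeros(N)
theta = mcmc_move(theta, lambdas[0], steps=200)

path = []  # weighted posterior means along the shrinkage path
for lam_prev, lam in zip(lambdas[:-1], lambdas[1:]):
    # Reweight: incremental weight is the ratio of unnormalized targets at the same particle.
    logw += np.array([log_post(t, lam) - log_post(t, lam_prev) for t in theta])
    w = np.exp(logw - logw.max())
    w /= w.sum()
    ess = 1.0 / np.sum(w**2)
    if ess < N / 2:  # resample when the effective sample size degrades
        idx = rng.choice(N, size=N, p=w)
        theta, logw = theta[idx].copy(), np.zeros(N)
        w = np.full(N, 1.0 / N)
    theta = mcmc_move(theta, lam)  # move step rejuvenates particle diversity
    path.append(w @ theta)

path = np.array(path)
print(path[::10])  # posterior means of the coefficients at a few lambda values

The same reweight/resample/move pattern carries over to the paper's other applications (g-prior sensitivity, cross-validation), with the sequence of targets defined by the hyperparameter grid or the held-out folds rather than by a tempering schedule.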

MSC:

62F15 Bayesian inference
65C05 Monte Carlo methods
62A09 Graphical methods in statistics
Full Text: DOI

References:

[1] Albert, Bayesian analysis of binary and polytomous response data, Journal of the American Statistical Association 88 pp 669– (1993)
[2] Alqallaf, On cross-validation of Bayesian models, Canadian Journal of Statistics 29 pp 333– (2001) · Zbl 0974.62019
[3] Besag, Bayesian computation and stochastic systems (with discussion), Statistical Science 10 pp 58– (1995)
[4] Bhattacharya, Importance re-sampling MCMC for cross-validation in inverse problems, Bayesian Analysis 2 pp 385– (2007) · Zbl 1331.86025
[5] Chopin, A sequential particle filter method for static models, Biometrika 89 pp 539– (2002) · Zbl 1036.62062
[6] Chopin, Central limit theorem for sequential Monte Carlo methods and its application to Bayesian inference, Annals of Statistics 32 pp 2385– (2004) · Zbl 1079.65006
[7] Crooks, Nonequilibrium measurements of free energy differences for microscopically reversible Markovian systems, Journal of Statistical Physics 90 pp 1481– (1998) · Zbl 0946.82029
[8] Del Moral, "Feynman-Kac formulae: Genealogical and interacting particle systems with applications," (2004)
[9] Del Moral, Sequential Monte Carlo samplers, Journal of the Royal Statistical Society: Series B 68 pp 411– (2006) · Zbl 1105.62034
[10] Doucet, "Sequential Monte Carlo Methods in Practice," (2001) · Zbl 0967.00022 · doi:10.1007/978-1-4757-3437-9
[11] Doucet, On sequential Monte Carlo sampling methods for Bayesian filtering, Statistics and Computing 10 pp 197– (2000)
[12] Efron, Least angle regression, Annals of Statistics 32 pp 407– (2004) · Zbl 1091.62054
[13] I. Epifani, S. MacEachern & M. Peruggia (2005). Case-deletion importance sampling estimators: Central limit theorems and related results. Technical Report No. 720, Department of Statistics, Ohio State University. · Zbl 1320.62046
[14] Geweke, Bayesian inference in econometric models using Monte Carlo integration, Econometrica 57 pp 1317– (1989) · Zbl 0683.62068
[15] Gilks, Following a moving target: Monte Carlo inference for dynamic Bayesian models, Journal of the Royal Statistical Society: Series B 63 pp 127– (2001) · Zbl 0976.62021
[16] Gustafson, Local sensitivity diagnostics for Bayesian inference, Annals of Statistics 23 pp 2153– (1995) · Zbl 0854.62024
[17] Gustafson, Local sensitivity of inferences to prior marginals, Journal of the American Statistical Association 91 pp 774– (1996) · Zbl 0869.62022
[18] Jasra, Stability of sequential Monte Carlo samplers via the Foster-Lyapunov condition, Statistics and Probability Letters 78 pp 3062– (2008) · Zbl 1319.60044
[19] A. M. Johansen & N. Whiteley (2009). A modern perspective on auxiliary particle filters, In Proceedings of Workshop on Inference and Estimation in Probabilistic Time Series Models. Isaac Newton Institute, June 2008.
[20] Kirkpatrick, Optimization by simulated annealing, Science 220 pp 671– (1983) · Zbl 1225.90162
[21] Kitagawa, Monte Carlo filter and smoother for Non-Gaussian, non-linear state space models, Journal of Computational and Graphical Statistics 5 pp 1– (1996)
[22] A. Lee, C. Yau, M. Giles, A. Doucet & C. Holmes (2009). On the Utility of Graphics Cards to Perform Massively Parallel Simulation of Advanced Monte Carlo Methods. arXiv:0905.2441v3.
[23] Liu, Sequential Monte Carlo methods for dynamic systems, Journal of the American Statistical Association 93 pp 1032– (1998) · Zbl 1064.65500
[24] Liu, "Monte Carlo Strategies in Scientific Computing," (2001)
[25] Marin, Bayesian Core: a practical approach to computational Bayesian statistics (2007) · Zbl 1137.62013
[26] McDonald, Instabilities of regression estimates relating air pollution to mortality, Technometrics 15 pp 463– (1973)
[27] Neal, Annealed importance sampling, Statistics and Computing 11 pp 125– (2001)
[28] Park, The Bayesian Lasso, Journal of the American Statistical Association 103 pp 681– (2008) · Zbl 1330.62292
[29] Peruggia, On the variability of case-deletion importance sampling weights in the Bayesian Linear Model, Journal of the American Statistical Association 92 pp 199– (1997) · Zbl 0889.62020
[30] Pitt, Filtering via simulation: Auxiliary particle filters, Journal of the American Statistical Association 94 pp 590– (1999) · Zbl 1072.62639
[31] Stamey, Prostate specific antigen in the diagnosis and treatment of adenocarcinoma of the prostate, Journal of Urology 141 pp 1076– (1989)
[32] Tibshirani, Regression shrinkage and selection via the Lasso, Journal of the Royal Statistical Society: Series B 58 pp 267– (1996) · Zbl 0850.62538
[33] Vidakovic, "Practical Nonparametric and Semiparametric Bayesian Statistics," pp 133– (1998) · doi:10.1007/978-1-4612-1732-9_7
[34] Zellner, On assessing prior distributions and Bayesian regression analysis with G-prior distributions, Bayesian Inference and Decision Techniques: Essays in Honor of Bruno de Finetti 6 pp 233– (1986)