
Estimating marginal likelihoods from the posterior draws through a geometric identity. (English) Zbl 1454.62091

Summary: This article develops a new estimator of the marginal likelihood that requires only a sample from the posterior distribution as input. This sample may come from any sampling scheme, such as Gibbs sampling or Metropolis-Hastings sampling. The presented approach can be implemented generically in almost any application of Bayesian modeling and significantly decreases the computational burden of marginal likelihood estimation compared to existing techniques. The method is demonstrated in the context of probit and logit regressions, on two mixtures-of-normals models, and on a high-dimensional random intercept probit model. Simulation results show that the simple approach presented here achieves excellent stability in low-dimensional models and clearly outperforms existing methods as the number of coefficients in the model increases.
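The geometric identity underlying the paper's estimator is not reproduced in this summary, so the sketch below instead illustrates the general task the paper addresses — estimating the marginal likelihood from posterior draws alone — using the classic harmonic mean estimator of Newton and Raftery [22]. The conjugate normal model, data values, and tuning choices are illustrative assumptions, not taken from the paper; the conjugate setup is chosen because the exact marginal likelihood is available in closed form (via the basic identity of Chib [5]) for comparison.

```python
import numpy as np

# Conjugate model: y_i ~ N(theta, 1), prior theta ~ N(mu0, tau0^2).
# Both the posterior and the exact marginal likelihood are then in closed form.
y = np.array([0.2, -0.5, 0.8, 0.1, 0.4, -0.1, 0.6, 0.3, -0.2, 0.5])  # illustrative data
n, mu0, tau0 = y.size, 0.0, 0.2

# Exact posterior: theta | y ~ N(mu_n, tau_n^2)
tau_n2 = 1.0 / (1.0 / tau0**2 + n)
mu_n = tau_n2 * (mu0 / tau0**2 + y.sum())

def log_lik(theta):
    """Log likelihood of the data at a scalar theta (sigma = 1)."""
    return -0.5 * n * np.log(2 * np.pi) - 0.5 * ((y - theta) ** 2).sum()

def log_norm_pdf(x, m, v):
    """Log density of N(m, v) at x."""
    return -0.5 * np.log(2 * np.pi * v) - (x - m) ** 2 / (2 * v)

# Exact log marginal likelihood via the basic marginal likelihood identity
# log p(y) = log p(y|theta) + log p(theta) - log p(theta|y), valid at any theta.
theta_star = mu_n
log_ml_exact = (log_lik(theta_star)
                + log_norm_pdf(theta_star, mu0, tau0**2)
                - log_norm_pdf(theta_star, mu_n, tau_n2))

# Harmonic mean estimate: uses only draws from the posterior,
# log p(y) ~ log S - logsumexp_s( -log p(y | theta_s) ).
rng = np.random.default_rng(1)
draws = rng.normal(mu_n, np.sqrt(tau_n2), size=100_000)
neg_ll = 0.5 * n * np.log(2 * np.pi) + 0.5 * ((y[None, :] - draws[:, None]) ** 2).sum(axis=1)
shift = neg_ll.max()  # log-sum-exp with shift for numerical stability
log_ml_hm = np.log(draws.size) - (shift + np.log(np.exp(neg_ll - shift).sum()))

print(f"exact log marginal likelihood:    {log_ml_exact:.4f}")
print(f"harmonic mean estimate (draws):   {log_ml_hm:.4f}")
```

In this well-behaved one-dimensional setting the harmonic mean estimate tracks the exact value closely; the instability and pseudo-bias of this estimator in harder problems (see Lenk [20]) is precisely what motivates alternatives such as the estimator proposed in the article.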

MSC:

62F10 Point estimation
62F15 Bayesian inference
62-08 Computational methods for problems pertaining to statistics
65R20 Numerical methods for integral equations

References:

[1] J. H. Albert and S. Chib, Bayesian analysis of binary and polychotomous response data, J. Amer. Statist. Assoc. 88 (1993), no. 422, 669-679. · Zbl 0774.62031
[2] D. Ardia, N. Baştürk, L. Hoogerheide and H. K. van Dijk, A comparative study of Monte Carlo methods for efficient evaluation of marginal likelihood, Comput. Statist. Data Anal. 56 (2012), no. 11, 3398-3414. · Zbl 1255.62158
[3] J.-M. Bernardo and A. F. M. Smith, Bayesian Theory, Probab. Math. Stat., John Wiley & Sons, Chichester, 1994. · Zbl 0796.62002
[4] G. Celeux, M. Hurn and C. P. Robert, Computational and inferential difficulties with mixture posterior distributions, J. Amer. Statist. Assoc. 95 (2000), no. 451, 957-970. · Zbl 0999.62020
[5] S. Chib, Marginal likelihood from the Gibbs output, J. Amer. Statist. Assoc. 90 (1995), no. 432, 1313-1321. · Zbl 0868.62027
[6] S. Chib and I. Jeliazkov, Marginal likelihood from the Metropolis-Hastings output, J. Amer. Statist. Assoc. 96 (2001), no. 453, 270-281. · Zbl 1015.62020
[7] P. Congdon, Applied Bayesian Modelling, Probab. Math. Stat., John Wiley & Sons, Chichester, 2003. · Zbl 1023.62026
[8] S. Frühwirth-Schnatter, Bayesian model discrimination and Bayes factors for linear Gaussian state space models, J. Roy. Statist. Soc. Ser. B 57 (1995), no. 1, 237-246. · Zbl 0809.62023
[9] S. Frühwirth-Schnatter, Fully Bayesian analysis of switching Gaussian state space models, Ann. Inst. Statist. Math. 53 (2001), 31-49. · Zbl 0995.62087
[10] S. Frühwirth-Schnatter, Markov chain Monte Carlo estimation of classical and dynamic switching and mixture models, J. Amer. Statist. Assoc. 96 (2001), no. 453, 194-209. · Zbl 1015.62022
[11] S. Frühwirth-Schnatter, Estimating marginal likelihoods for mixture and Markov switching models using bridge sampling techniques, Econom. J. 7 (2004), no. 1, 143-167. · Zbl 1053.62087
[12] S. Frühwirth-Schnatter, Finite Mixture and Markov Switching Models, Springer Ser. Statist., Springer, New York, 2006. · Zbl 1108.62002
[13] S. Frühwirth-Schnatter and R. Frühwirth, Auxiliary mixture sampling with applications to logistic models, Comput. Statist. Data Anal. 51 (2007), no. 7, 3509-3528. · Zbl 1161.62387
[14] S. Frühwirth-Schnatter and H. Wagner, Marginal likelihoods for non-Gaussian models using auxiliary mixture sampling, Comput. Statist. Data Anal. 52 (2008), no. 10, 4608-4624. · Zbl 1452.62060
[15] A. Fussl, S. Frühwirth-Schnatter and R. Frühwirth, Efficient MCMC for binomial logit models, ACM Trans. Model. Comput. Simul. 23 (2013), no. 1, Article No. 3. · Zbl 1490.62197
[16] J. Geweke, Interpretation and inference in mixture models: simple MCMC works, Comput. Statist. Data Anal. 51 (2007), no. 7, 3529-3550. · Zbl 1161.62338
[17] P. J. Green, Reversible jump Markov chain Monte Carlo computation and Bayesian model determination, Biometrika 82 (1995), no. 4, 711-732. · Zbl 0861.62023
[18] T. Hastie, R. Tibshirani and J. Friedman, The Elements of Statistical Learning. Data Mining, Inference, and Prediction, 2nd ed., Springer Ser. Statist., Springer, New York, 2009. · Zbl 1273.62005
[19] R. E. Kass and A. E. Raftery, Bayes factors, J. Amer. Statist. Assoc. 90 (1995), no. 430, 773-795. · Zbl 0846.62028
[20] P. Lenk, Simulation pseudo-bias correction to the harmonic mean estimator of integrated likelihoods, J. Comput. Graph. Statist. 18 (2009), no. 4, 941-960.
[21] X.-L. Meng and W. H. Wong, Simulating ratios of normalizing constants via a simple identity: A theoretical exploration, Statist. Sinica 6 (1996), no. 4, 831-860. · Zbl 0857.62017
[22] M. A. Newton and A. E. Raftery, Approximate Bayesian inference with the weighted likelihood bootstrap. With discussion and a reply by the authors, J. Roy. Statist. Soc. Ser. B 56 (1994), no. 1, 3-48. · Zbl 0788.62026
[23] P. E. Rossi, G. M. Allenby and R. McCulloch, Bayesian Statistics and Marketing, Probab. Math. Stat., John Wiley & Sons, Chichester, 2005. · Zbl 1094.62037
[24] A. Skrondal and S. Rabe-Hesketh, Prediction in multilevel generalized linear models, J. Roy. Statist. Soc. Ser. A 172 (2009), no. 3, 659-687.
[25] T. A. B. Snijders and R. J. Bosker, Multilevel Analysis. An Introduction to Basic and Advanced Multilevel Modeling, 2nd ed., Sage Publications, Los Angeles, 2012. · Zbl 1296.62008
[26] M. Stephens, Dealing with label switching in mixture models, J. R. Stat. Soc. Ser. B Stat. Methodol. 62 (2000), no. 4, 795-809. · Zbl 0957.62020
[27] A. Zellner, An Introduction to Bayesian Inference in Econometrics, Probab. Math. Stat., John Wiley & Sons, New York, 1971. · Zbl 0246.62098