
Bayes inference in regression models with ARMA\((p,q)\) errors. (English) Zbl 0807.62065

Summary: We develop practical and exact methods of analyzing \(\text{ARMA} (p,q)\) regression error models in a Bayesian framework using the Gibbs sampling and Metropolis-Hastings algorithms [W. K. Hastings, Biometrika 57, 97-109 (1970; Zbl 0219.65008)], and we prove that the kernel of the proposed Markov chain sampler converges to the true density. The procedures can also be applied to pure ARMA time series models and, by choosing appropriate diffuse priors, used to determine features of the likelihood function.
Our results are unconditional on the initial observations. We also show how the algorithm can be further simplified for the important special cases of stationary \(\text{AR}(p)\) and invertible \(\text{MA} (q)\) models. Recursive transformations developed in this paper to diagonalize the covariance matrix of the errors should prove useful in frequentist estimation. Examples with simulated and actual economic data are presented.
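To illustrate the kind of sampler the paper describes, the following is a minimal Metropolis-within-Gibbs sketch for the stationary AR(1) special case, where (as the summary notes) the algorithm simplifies: with a flat prior on the regression coefficients, a flat prior on \(\varphi\) truncated to the stationary region, and a Jeffreys-type prior on \(\sigma^2\), all full conditionals are available in closed form, so the Metropolis-Hastings step reduces to a rejection step enforcing stationarity. For simplicity the sketch conditions on the first observation, unlike the paper's exact, unconditional treatment; all function and variable names are illustrative, not the authors'.

```python
import numpy as np

def gibbs_ar1_regression(y, X, n_iter=2000, seed=0):
    """Gibbs sampler sketch for y = X @ beta + u with AR(1) errors
    u_t = phi * u_{t-1} + e_t, e_t ~ N(0, sigma2).

    Priors (illustrative): flat on beta; flat on phi restricted to
    |phi| < 1; p(sigma2) proportional to 1/sigma2.  Conditions on the
    first observation rather than using the exact likelihood."""
    rng = np.random.default_rng(seed)
    n, k = X.shape
    beta = np.linalg.lstsq(X, y, rcond=None)[0]  # OLS starting value
    phi, sigma2 = 0.0, 1.0
    draws = {"beta": [], "phi": [], "sigma2": []}
    for _ in range(n_iter):
        # 1. beta | phi, sigma2: quasi-difference the data
        #    (Cochrane-Orcutt transform), then draw from the
        #    conjugate multivariate normal conditional.
        ys = y[1:] - phi * y[:-1]
        Xs = X[1:] - phi * X[:-1]
        XtX = Xs.T @ Xs
        bhat = np.linalg.solve(XtX, Xs.T @ ys)
        beta = rng.multivariate_normal(bhat, sigma2 * np.linalg.inv(XtX))
        # 2. phi | beta, sigma2: normal conditional from the AR(1)
        #    regression of u_t on u_{t-1}, truncated to |phi| < 1
        #    by simple rejection (this is where the general ARMA(p,q)
        #    case would need a genuine Metropolis-Hastings step).
        u = y - X @ beta
        num, den = u[:-1] @ u[1:], u[:-1] @ u[:-1]
        while True:
            phi = rng.normal(num / den, np.sqrt(sigma2 / den))
            if abs(phi) < 1:
                break
        # 3. sigma2 | beta, phi: inverse gamma, drawn as SSR / chi-square
        e = u[1:] - phi * u[:-1]
        sigma2 = (e @ e) / rng.chisquare(len(e))
        draws["beta"].append(beta)
        draws["phi"].append(phi)
        draws["sigma2"].append(sigma2)
    return {key: np.array(val) for key, val in draws.items()}
```

Cycling through these three conditionals yields a Markov chain whose draws (after a burn-in period) approximate the joint posterior; the paper's contribution is the exact treatment of the general ARMA(p,q) case, including convergence of the sampler's kernel to the true density.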

MSC:

62M10 Time series, auto-correlation, regression, etc. in statistics (GARCH)
62F15 Bayesian inference

Citations:

Zbl 0219.65008

Software:

AS 154
Full Text: DOI

References:

[1] Albert, J.; Chib, S.: Bayesian inference for autoregressive time series with mean and variance subject to Markov jumps. Journal of business and economic statistics 11, 1-15 (1993)
[2] Ansley, C. F.: An algorithm for the exact likelihood of a mixed autoregressive-moving average process. Biometrika 66, 59-65 (1979) · Zbl 0411.62059
[3] Baumol, W. J.: Economic dynamics. (1970)
[4] Broemeling, L. D.; Shaarawy, S.: Bayesian inferences and forecasts with moving average processes. Communications in statistics, theory and methods 13, 1871-1888 (1984)
[5] Box, G. E. P.; Jenkins, G. M.: Time series analysis, forecasting and control. (1976) · Zbl 0363.62069
[6] Chib, S.: Bayes regression with autoregressive errors: A Gibbs sampling approach. Journal of econometrics 58, 275-294 (1993) · Zbl 0775.62068
[7] Chib, S.; Greenberg, E.: Estimating nonlinear latent variable models using Markov chain Monte Carlo. (1993)
[8] Chib, S.; Greenberg, E.: Hierarchical analysis of SUR models with extensions to correlated serial errors and time-varying parameter models. Journal of econometrics (1993) · Zbl 0833.62103
[9] Fuller, W.: The statistical analysis of time series. (1976) · Zbl 0353.62050
[10] Galbraith, J. W.; Zinde-Walsh, V.: The GLS transformation matrix and a semi-recursive estimator for the linear regression with ARMA errors. Econometric theory 8, 95-111 (1992)
[11] Gardner, G.; Harvey, A. C.; Phillips, G. D. A.: Algorithm AS 154: an algorithm for exact maximum likelihood estimation of autoregressive-moving average models by means of Kalman filtering. Applied statistics 29, 311-322 (1980) · Zbl 0471.62098
[12] Gelfand, A. E.; Smith, A. F. M.: Sampling based approaches to calculating marginal densities. Journal of the American statistical association 85, 398-409 (1990) · Zbl 0702.62020
[13] Gelman, A.; Rubin, D. B.: Inference from iterative simulation using multiple sequences (with discussion). Statistical science 7, 457-511 (1992) · Zbl 1386.65060
[14] Geweke, J.: Evaluating the accuracy of sampling-based approaches to the calculation of posterior moments. Bayesian statistics 4, 169-193 (1992)
[15] Harvey, A.: Time series models. (1981) · Zbl 0464.62087
[16] Hastings, W. K.: Monte Carlo sampling methods using Markov chains and their applications. Biometrika 57, 97-109 (1970) · Zbl 0219.65008
[17] Jacquier, E.; Polson, N. G.; Rossi, P. E.: Bayesian analysis of stochastic volatility models. (1992) · Zbl 1082.62103
[18] Marriott, J.; Ravishanker, N.; Gelfand, A.; Pai, J.: Bayesian analysis of ARMA processes: complete sampling based inference under full likelihoods. (1992)
[19] McCulloch, R. E.; Tsay, R. S.: Bayesian analysis of autoregressive time series via the Gibbs sampler. (1991) · Zbl 0800.62549
[20] Metropolis, N.; Rosenbluth, A. W.; Rosenbluth, M. N.; Teller, A. H.; Teller, E.: Equation of state calculations by fast computing machines. Journal of chemical physics 21, 1087-1092 (1953)
[21] Monahan, J. F.: Fully Bayesian analysis of ARMA time series models. Journal of econometrics 21, 307-331 (1983) · Zbl 0509.62084
[22] Müller, P.: A generic approach to posterior integration and Gibbs sampling. Journal of the American statistical association (1993)
[23] Otto, M. C.; Bell, W. R.; Burman, J. P.: An iterative GLS approach to maximum likelihood estimation of regression models with ARIMA errors. (1987)
[24] Pagan, A. R.; Nicholls, D. F.: Exact maximum likelihood estimation of regression models with finite order moving average errors. Review of economic studies 43, 383-388 (1976) · Zbl 0358.62046
[25] Ripley, B.: Stochastic simulation. (1987) · Zbl 0613.65006
[26] Ritter, C.; Tanner, M.: The Gibbs stopper and the griddy Gibbs sampler. Journal of the American statistical association 87, 861-868 (1992)
[27] Roberts, G. O.; Smith, A. F. M.: Simple conditions for the convergence of the Gibbs sampler and Metropolis-Hastings algorithms. (1992) · Zbl 0803.60067
[28] Smith, A. F. M.; Gelfand, A. E.: Bayesian statistics without tears: A sampling-resampling perspective. The American statistician 46, 84-88 (1992)
[29] Tanner, M.; Wong, W. H.: The calculation of posterior distributions by data augmentation. Journal of the American statistical association 82, 528-549 (1987) · Zbl 0619.62029
[30] Tierney, L.: Markov chains for exploring posterior distributions. Annals of statistics (1993) · Zbl 0829.62080
[31] Zellner, A.: An introduction to Bayesian inference in econometrics. (1971) · Zbl 0246.62098
[32] Zellner, A.; Min, C.: Gibbs sampler convergence criteria (GSC2). (1992) · Zbl 0842.62018