
Time-varying additive model with autoregressive errors for locally stationary time series. (English) Zbl 07706271

Summary: In this article, we study the time-varying additive model with a time-varying autoregressive (tvAR) error in the locally stationary context and propose a two-step estimation procedure for it. The computationally efficient B-spline method is adopted to obtain initial estimators of the trend function and the additive components. The structure of the autoregressive error is then estimated by ULASSO, and its consistency and asymptotic normality are proved. Finally, combining the initial estimators with the estimated error structure, improved estimators of the trend function and the additive components are derived by local linear smoothing, and their asymptotic normality and oracle property are established. Simulation studies validate the properties of the proposed estimators. A real data application illustrates that the proposed model is applicable and more appropriate than the classical additive model in the presence of locally stationary regressors.
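A schematic form of such a model may help fix ideas; the notation below is illustrative only and is not taken verbatim from the paper:
\[
Y_{t,n} = m_0\Big(\tfrac{t}{n}\Big) + \sum_{j=1}^{d} m_j\Big(\tfrac{t}{n}, X_{t,n}^{(j)}\Big) + e_{t,n}, \qquad
e_{t,n} = \sum_{k=1}^{p} a_k\Big(\tfrac{t}{n}\Big)\, e_{t-k,n} + \sigma\Big(\tfrac{t}{n}\Big)\,\varepsilon_t,
\]
where \(m_0\) denotes the trend function, the \(m_j\) the additive components in the locally stationary regressors \(X_{t,n}^{(j)}\), and the \(a_k(\cdot)\) the coefficient functions of the tvAR(\(p\)) error. In this reading, the first step pre-estimates \(m_0\) and the \(m_j\) by B-splines, ULASSO then selects and estimates the nonzero \(a_k(\cdot)\) from the resulting residuals, and the second step re-estimates \(m_0\) and the \(m_j\) by local linear smoothing while accounting for the fitted error structure.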

MSC:

62-XX Statistics

Software:

ITSM2000; itsmr
Full Text: DOI

References:

[1] Bellegem, S. V.; Dahlhaus, R., Semiparametric estimation by model selection for locally stationary processes, Journal of the Royal Statistical Society: Series B (Statistical Methodology), 68, 5, 721-46 (2006) · Zbl 1110.62119 · doi:10.1111/j.1467-9868.2006.00564.x
[2] Bontemps, C.; Simioni, M.; Surry, Y., Semiparametric hedonic price models: Assessing the effects of agricultural nonpoint source pollution, Journal of Applied Econometrics, 23, 6, 825-42 (2008) · doi:10.1002/jae.1022
[3] Bosq, D., Lecture Notes in Statistics, 110, Nonparametric statistics for stochastic processes: Estimation and prediction (1998), Springer-Verlag · Zbl 0902.62099
[4] Box, G. E. P.; Pierce, D. A., Distribution of residual autocorrelations in the regression model with autoregressive-moving average errors, Journal of the American Statistical Association, 65, 332, 1509-26 (1970) · Zbl 0224.62041 · doi:10.1080/01621459.1970.10481180
[5] Brockwell, P. J.; Davis, R. A., Introduction to time series and forecasting (2016), New York: Springer-Verlag, New York · Zbl 1355.62001
[6] Brockwell, P.; Davis, R., Time series: Theory and methods (1991), New York: Springer, New York · Zbl 0709.62080
[7] Chitturi, R. V., Distribution of residual autocorrelations in multiple autoregressive schemes, Journal of the American Statistical Association, 69, 348, 928-34 (1974) · Zbl 0296.62057 · doi:10.1080/01621459.1974.10480230
[8] Dahlhaus, R., Asymptotic statistical inference for nonstationary processes with evolutionary spectra, Lecture Notes in Statistics, 115, 145-59 (1996)
[9] De Boor, C., A practical guide to splines, 27 (1978), New York: Springer-Verlag, New York · Zbl 0406.41003
[10] DeVore, R. A.; Lorentz, G. G., Constructive approximation, 303 (1993) · Zbl 0797.41016
[11] Fan, J.; Zhang, C.; Zhang, J., Generalized likelihood ratio statistics and Wilks phenomenon, The Annals of Statistics, 29, 1, 153-93 (2001) · doi:10.1214/aos/996986505
[12] Hansen, B. E., Uniform convergence rates for kernel estimation with dependent data, Econometric Theory, 24, 3, 726-48 (2008) · Zbl 1284.62252 · doi:10.1017/S0266466608080304
[13] Hastie, T.; Tibshirani, R., Generalized additive models, 43 (1990) · Zbl 0747.62061
[14] Hu, J.; Ding, H.; Liu, L.; Feng, J., Statistical inference of locally stationary functional coefficient models, Journal of Statistical Planning and Inference, 209, 27-43 (2020) · Zbl 1441.62099 · doi:10.1016/j.jspi.2020.02.006
[15] Hu, L.; Huang, T.; You, J., Supplementary material to “Two-step estimation of time-varying additive model for locally stationary time series”, Computational Statistics & Data Analysis, 130, 94-110 (2019) · Zbl 1469.62083 · doi:10.1016/j.csda.2018.08.023
[16] Hu, L.; Huang, T.; You, J., Two-step estimation of time-varying additive model for locally stationary time series, Computational Statistics & Data Analysis, 130, 94-110 (2019) · Zbl 1469.62083 · doi:10.1016/j.csda.2018.08.023
[17] Kim, W. (2001)
[18] Kulperger, R. J., Some remarks on regression with autoregressive errors and their residual processes, Journal of Applied Probability, 24, 3, 668-78 (1987) · Zbl 0633.62088 · doi:10.2307/3214098
[19] Li, J.; Li, T., Some theoretical results concerning time-varying nonparametric regression with local stationary regressors and error, Acta Mathematica Sinica (2020)
[20] Ma, S.; Yang, L., Spline-backfitted kernel smoothing of partially linear additive model, Journal of Statistical Planning and Inference, 141, 1, 204-19 (2011) · Zbl 1197.62130 · doi:10.1016/j.jspi.2010.05.028
[21] Mammen, E.; Linton, O.; Nielsen, J. P., The existence and asymptotic properties of a backfitting projection algorithm under weak conditions, The Annals of Statistics, 27, 5, 1443-90 (1999) · Zbl 0986.62028 · doi:10.1214/aos/1017939138
[22] Pei, Y.; Huang, T.; You, J., Nonparametric fixed effects model for panel data with locally stationary regressors, Journal of Econometrics, 202, 2, 286-305 (2018) · Zbl 1394.62125 · doi:10.1016/j.jeconom.2017.06.023
[23] Pierce, D. A., Distribution of residual autocorrelations in the regression model with autoregressive-moving average errors, Journal of the Royal Statistical Society: Series B, 33, 140-46 (1971) · Zbl 0221.62017
[24] Pierce, D. A., Residual correlations and diagnostic checking in dynamic-disturbance time series models, Journal of the American Statistical Association, 67, 339, 636-40 (1972) · Zbl 0245.62086 · doi:10.1080/01621459.1972.10481266
[25] Qiu, D.; Shao, Q.; Yang, L., Efficient inference for autoregressive coefficients in the presence of trends, Journal of Multivariate Analysis, 114, 40-53 (2013) · Zbl 1255.62279 · doi:10.1016/j.jmva.2012.07.016
[26] Schröder, A. L.; Fryzlewicz, P., Adaptive trend estimation in financial time series via multiscale change-point-induced basis recovery, Statistics and Its Interface (2013) · Zbl 1326.91035
[27] Shao, Q.; Yang, L., Oracally efficient estimation and consistent model selection for auto-regressive moving average time series with trend, Journal of the Royal Statistical Society: Series B (Statistical Methodology), 79, 2, 507-24 (2017) · Zbl 1414.62380 · doi:10.1111/rssb.12170
[28] Stone, C. J., Additive regression and other nonparametric models, The Annals of Statistics, 13, 2, 689-705 (1985) · Zbl 0605.62065 · doi:10.1214/aos/1176349548
[29] Stone, C. J., The use of polynomial splines and their tensor products in multivariate function estimation, The Annals of Statistics, 22, 1, 118-71 (1994) · Zbl 0827.62038
[30] Truong, Y. K., Nonparametric curve estimation with time series errors, Journal of Statistical Planning and Inference, 28, 2, 167-83 (1991) · Zbl 0734.62047 · doi:10.1016/0378-3758(91)90024-9
[31] Vogt, M., Nonparametric regression for locally stationary time series, The Annals of Statistics, 40, 5, 2601-33 (2012) · Zbl 1373.62459 · doi:10.1214/12-AOS1043
[32] Wan, A. T. K.; You, J.; Zhang, R., A seemingly unrelated nonparametric additive model with autoregressive errors, Econometric Reviews, 35, 5, 894-928 (2016) · Zbl 1491.62032 · doi:10.1080/07474938.2014.998149
[33] Wang, D.; Kulasekera, K., Parametric component detection and variable selection in varying-coefficient partially linear models, Journal of Multivariate Analysis, 112, 117-29 (2012) · Zbl 1273.62093 · doi:10.1016/j.jmva.2012.05.006
[34] Wang, L.; Yang, L., Spline-backfitted kernel smoothing of nonlinear additive autoregression model, The Annals of Statistics, 35, 6, 2474-503 (2007) · Zbl 1129.62038 · doi:10.1214/009053607000000488
[35] Yao, J., Semi-parametric examination of industry risk: The Australian evidence, Australian Economic Papers, 51, 4, 228-46 (2012) · doi:10.1111/1467-8454.12003