
Exploring the sources of uncertainty: why does bagging for time series forecasting work? (English) Zbl 1403.62169

Summary: In [“Bagging exponential smoothing methods using STL decomposition and Box-Cox transformation”, Int. J. Forecast. 32, No. 2, 303–312 (2016; doi:10.1016/j.ijforecast.2015.07.002)], the third author et al. successfully employed a bootstrap aggregation (bagging) technique to improve the performance of exponential smoothing. Each series is Box-Cox transformed and decomposed by seasonal and trend decomposition using loess (STL); bootstrapping is then applied to the remainder series before the trend and seasonality are added back and the transformation reversed, creating bootstrapped versions of the series. Subsequently, they apply automatic exponential smoothing to the original series and to the bootstrapped versions of the series, with the final forecast being the equal-weight combination across all forecasts. In this study, we attempt to address the question: why does bagging for time series forecasting work? We assume three sources of uncertainty (model uncertainty, data uncertainty, and parameter uncertainty), and we separately explore the benefits of bagging for time series forecasting for each one of them. Our analysis considers 4004 time series (from the M- and M3-competitions) and two families of models. The results show that the benefits of bagging predominantly originate from the model uncertainty: the fact that different models may be selected as optimal for different bootstrapped series. As such, a weighted combination of the most suitable models should be preferred to selecting a single model.
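
A minimal sketch of the procedure summarised above, using the forecast package for R cited in the references ([13], [28]); it assumes the bld.mbb.bootstrap() and baggedETS() interface of forecast 8.x, and the series (AirPassengers), the number of bootstrap replicates (100) and the forecast horizon (12) are illustrative choices, not values taken from the paper:

library(forecast)

set.seed(42)
y <- AirPassengers                        # illustrative monthly series with trend and seasonality

# Box-Cox transform, STL decomposition, moving block bootstrap of the remainder,
# then re-assembly and back-transformation (Bergmeir et al., 2016)
boot <- bld.mbb.bootstrap(y, num = 100)   # list of bootstrapped series (the original is included)

# Automatic exponential smoothing (ets) fitted to each series,
# with the forecasts combined using equal weights
fit <- baggedETS(y, bootstrapped_series = boot)
fc  <- forecast(fit, h = 12)
fc$mean                                   # equal-weight combination of the point forecasts

Fitting ets() to the original series alone would correspond to selecting a single model; the paper's finding is that the gains of the bagged, equal-weight combination stem mainly from the different model forms selected across the bootstrapped series.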

MSC:

62M10 Time series, auto-correlation, regression, etc. in statistics (GARCH)
91B84 Economic time series analysis

References:

[1] Ahmadi-Javid, A.; Jalali, Z.; Klassen, K. J., Outpatient appointment systems in healthcare: A review of optimization studies, European Journal of Operational Research, 258, 1, 3-34 (2017) · Zbl 1380.90106
[2] Bergmeir, C.; Hyndman, R. J.; Benítez, J. M., Bagging exponential smoothing methods using STL decomposition and Box-Cox transformation, International Journal of Forecasting, 32, 303-312 (2016)
[3] Box, G. E.P.; Cox, D. R., An analysis of transformations, Journal of the Royal Statistical Society, Series B, 26, 2, 211-252 (1964) · Zbl 0156.40104
[4] Box, G. E.P.; Draper, N. R., Empirical model-building and response surfaces (1987), John Wiley & Sons: John Wiley & Sons New York · Zbl 0614.62104
[5] Breiman, L., Bagging predictors, Machine Learning, 24, 2, 123-140 (1996) · Zbl 0858.68080
[6] Bühlmann, P., Sieve bootstrap for time series, Bernoulli, 3, 2, 123-148 (1997) · Zbl 0874.62102
[7] Cleveland, R. B.; Cleveland, W. S.; McRae, J. E.; Terpenning, I., STL: A seasonal-trend decomposition procedure based on loess, Journal of Official Statistics, 6, 3-73 (1990)
[8] Cleveland, W. S.; Grosse, E.; Shyu, W. M., Local regression models, in: Statistical models in S (1992), Chapman & Hall/CRC
[9] Cordeiro, C.; Neves, M., Forecasting time series with BOOT.EXPOS procedure, REVSTAT - Statistical Journal, 7, 2, 135-149 (2009)
[10] Fildes, R.; Petropoulos, F., Simple versus complex selection rules for forecasting many time series, Journal of Business Research, 68, 8, 1692-1701 (2015)
[11] Guerrero, V. M., Time-series analysis supported by power transformations, Journal of Forecasting, 12, 37-48 (1993)
[12] Hastie, T.; Tibshirani, R.; Friedman, J., The elements of statistical learning: Data mining, inference, and prediction (2009), Springer · Zbl 1273.62005
[13] Hyndman, R. J. (2017). forecast: Forecasting functions for time series and linear models, R package version 8.0, http://github.com/robjhyndman/forecast
[14] Hyndman, R. J.; Athanasopoulos, G., Forecasting: principles and practice (2014), OTexts: OTexts Melbourne, Australia
[15] Hyndman, R. J.; Khandakar, Y., Automatic time series forecasting: The forecast package for R, Journal of Statistical Software, 27, 3, 1-22 (2008)
[16] Hyndman, R. J.; Koehler, A. B., Another look at measures of forecast accuracy, International Journal of Forecasting, 22, 4, 679-688 (2006)
[17] Jakubovskis, A., Strategic facility location, capacity acquisition, and technology choice decisions under demand uncertainty: Robust vs. non-robust optimization approaches, European Journal of Operational Research, 260, 3, 1095-1104 (2017) · Zbl 1403.90477
[18] Kolassa, S., Combining exponential smoothing forecasts using Akaike weights, International Journal of Forecasting, 27, 2, 238-251 (2011)
[19] Künsch, H. R., The jackknife and the bootstrap for general stationary observations, Annals of Statistics, 17, 3, 1217-1241 (1989) · Zbl 0684.62035
[20] Ma, S.; Fildes, R., A retail store SKU promotions optimization model for category multi-period profit maximization, European Journal of Operational Research, 260, 2, 680-692 (2017) · Zbl 1403.90454
[21] Makridakis, S.; Andersen, A.; Carbone, R.; Fildes, R.; Hibon, M.; Lewandowski, R., The accuracy of extrapolation (time series) methods: Results of a forecasting competition, Journal of Forecasting, 1, 2, 111-153 (1982)
[22] Makridakis, S.; Hibon, M., The M3-competition: Results, conclusions and implications, International Journal of Forecasting, 16, 4, 451-476 (2000)
[23] McMurry, T.; Politis, D. N., Banded and tapered estimates of autocovariance matrices and the linear process bootstrap, Journal of Time Series Analysis, 31, 471-482 (2010) · Zbl 1226.60052
[24] Nikolopoulos, K.; Petropoulos, F., Forecasting for big data: Does sub-optimality matter?, Computers & Operations Research (2018) · Zbl 1392.62347
[25] Petropoulos, F. (2014). Guest post: On the robustness of bagging exponential smoothing. Nikolaos Kourentzes, Forecasting Research Blog, http://kourentzes.com/forecasting/2014/10/31/
[26] Petropoulos, F.; Makridakis, S.; Assimakopoulos, V.; Nikolopoulos, K., ‘Horses for Courses’ in demand forecasting, European Journal of Operational Research, 237, 152-163 (2014) · Zbl 1304.62117
[27] Politis, D. N.; Romano, J. P., A circular block-resampling procedure for stationary data, Technical Report No. 370 (1991), Department of Statistics, Stanford University, Stanford, California, USA
[28] R Core Team (2017). R: A language and environment for statistical computing. R Foundation for Statistical Computing, Vienna, Austria, http://www.R-project.org/
[29] Syntetos, A. A.; Nikolopoulos, K.; Boylan, J. E., Judging the judges through accuracy-implication metrics: The case of inventory forecasting, International Journal of Forecasting, 26, 1, 134-143 (2010)
[30] Tashman, L. J., Out-of-sample tests of forecasting accuracy: An analysis and review, International Journal of Forecasting, 16, 4, 437-450 (2000)