
Weighted LAD-LASSO method for robust parameter estimation and variable selection in regression. (English) Zbl 1243.62029

Summary: The weighted least absolute deviation (WLAD) regression estimation method and the adaptive least absolute shrinkage and selection operator (LASSO) are combined to achieve robust parameter estimation and variable selection in regression simultaneously. Compared with the LAD-LASSO method, the weighted LAD-LASSO (WLAD-LASSO) method is resistant to heavy-tailed errors and to outliers in the explanatory variables. Properties of the WLAD-LASSO estimators are investigated. A small simulation study and an example demonstrate the superiority of the WLAD-LASSO method over the LAD-LASSO method in the presence of outliers in the explanatory variables and heavy-tailed error distributions.
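
To fix ideas, the combined criterion can be sketched as follows (the precise choices of the observation weights, of the adaptive penalty levels, and of the penalty normalization follow the paper): the WLAD-LASSO estimator minimizes
\[
Q(\beta) = \sum_{i=1}^{n} w_i \bigl| y_i - x_i^{\top}\beta \bigr| + n \sum_{j=1}^{p} \lambda_j |\beta_j|,
\]
where the weights \(w_i\) downweight observations with outlying explanatory variables, as in weighted LAD regression, and the \(\lambda_j\) are data-driven, coefficient-specific tuning levels, as in the adaptive LASSO. As with the LAD-LASSO, the penalty terms can be absorbed as pseudo-observations, so the criterion can be minimized with standard (weighted) LAD software.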

MSC:

62F12 Asymptotic properties of parametric estimators
62J05 Linear regression; mixed models
62F35 Robustness and adaptive procedures (parametric inference)
65C60 Computational problems in statistics (MSC2010)
62Q05 Statistical tables
Full Text: DOI

References:

[1] Ellis, S.; Morgenthaler, S., Leverage and breakdown in \(L_1\) regression, Journal of the American Statistical Association, 87, 143-148 (1992) · Zbl 0781.62101
[2] Fan, J.; Li, R., Variable selection via nonconcave penalized likelihood and its oracle properties, Journal of the American Statistical Association, 96, 1348-1360 (2001) · Zbl 1073.62547
[3] Gao, X., Penalized methods for high-dimensional least absolute deviations regression (2008), unpublished Ph.D. Thesis, University of Iowa
[4] Giloni, A.; Simonoff, J.; Sengupta, B., Robust weighted LAD regression, Computational Statistics \(\&\) Data Analysis, 50, 3124-3140 (2006) · Zbl 1445.62163
[5] Giloni, A.; Sengupta, B.; Simonoff, J., A Mathematical Programming Approach for Improving the Robustness of Least Sum of Absolute Deviations Regression (2006), Wiley InterScience · Zbl 1127.62060
[6] Hawkins, D. M.; Bradu, D.; Kass, G. V., Location of several outliers in multiple regression using elemental subsets, Technometrics, 26, 197-208 (1984)
[7] Hurvich, C. M.; Tsai, C. L., Regression and time series model selection in small samples, Biometrika, 76, 297-307 (1989) · Zbl 0669.62085
[8] Hurvich, C. M.; Tsai, C. L., Model selection for least absolute deviations regressions in small samples, Statistics and Probability Letters, 9, 259-265 (1990)
[9] Hesterberg, T.; Choi, N. H.; Meier, L.; Fraley, C., Least angle and \(L_1\) penalized regression: a review, Statistics Surveys, 2, 61-93 (2008) · Zbl 1189.62070
[10] Hubert, M.; Rousseeuw, P., Robust regression with both continuous and binary regressors, Journal of Statistical Planning and Inference, 57, 153-163 (1997) · Zbl 0900.62174
[11] Miller, A., Subset Selection in Regression (2002), Chapman & Hall · Zbl 1051.62060
[12] Owen, A. B., A robust hybrid of lasso and ridge regression, Contemporary Mathematics, 443, 59-71 (2007) · Zbl 1134.62047
[13] Ronchetti, E.; Staudte, R. G., A robust version of Mallows \(C_p\), Journal of the American Statistical Association, 89, 550-559 (1994) · Zbl 0803.62026
[14] Rosset, S.; Zhu, J., Least angle regression: discussion, The Annals of Statistics, 32, 469-475 (2004)
[15] Rousseeuw, P. J.; Leroy, A. M., Robust Regression and Outlier Detection (1987), Wiley: Wiley New York · Zbl 0711.62030
[16] Shao, J., An asymptotic theory for linear model selection, Statistica Sinica, 7, 221-264 (1997) · Zbl 1003.62527
[17] Shi, P.; Tsai, C. L., Regression model selection: a residual likelihood approach, Journal of the Royal Statistical Society, Series B, 64, 237-252 (2002) · Zbl 1059.62074
[18] Shi, P.; Tsai, C. L., A joint regression variable and autoregressive order selection criterion, Journal of Time Series Analysis, 25, 923-941 (2004) · Zbl 1063.62098
[19] Tibshirani, R., Regression shrinkage and selection via the lasso, Journal of the Royal Statistical Society, Series B, 58, 267-288 (1996) · Zbl 0850.62538
[20] Wang, H.; Leng, C., Unified lasso estimation by least squares approximation, Journal of the American Statistical Association, 102, 1039-1048 (2007) · Zbl 1306.62167
[21] Wang, H.; Li, G.; Jiang, G., Robust regression shrinkage and consistent variable selection through the LAD-Lasso, Journal of Business & Economic Statistics, 25, 347-355 (2007)
[22] Xu, J., 2005, Parameter estimation, model selection and inference in \(L_1\)
[23] Xu, J.; Ying, Z., Simultaneous estimation and variable selection in median regression using Lasso-type penalty, Annals of the Institute of Statistical Mathematics, 62, 487-514 (2010) · Zbl 1440.62280
[24] Zou, H., The adaptive lasso and its oracle properties, Journal of the American Statistical Association, 101, 1418-1429 (2006) · Zbl 1171.62326