
On central matrix based methods in dimension reduction. (English. French summary) Zbl 1273.62094

Summary: Dimension reduction for regression analysis has been one of the most popular topics of the past two decades. It has seen much progress since the introduction of inverse regression, centered on two key methods: sliced inverse regression (SIR) and sliced average variance estimation (SAVE). It is well known that SIR performs poorly when the inverse conditional expectation \(E(X|Y)\) is close to being nonrandom. SAVE and its many generalizations, which do not suffer from this drawback, lag behind SIR in many other circumstances. Usually a certain weighted hybrid of SIR and SAVE is necessary to improve overall performance. However, it is difficult to find the optimal mixture weights in a hybrid, and most such hybrid methods, as well as SAVE itself, require the restrictive constant (conditional) variance condition. We propose a much weaker condition and a new accompanying algorithm. This enables us to construct several new central matrices that compare very favourably with existing central matrix based methods without resorting to hybrids.
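To make the central-matrix idea concrete, the following is a minimal sketch of the classical SIR estimator (the baseline the paper builds on), not the authors' proposed method. It standardizes the predictors, slices on the response, and extracts directions from the eigenvectors of the candidate matrix \(\mathrm{Cov}(E[Z|Y])\); the function name `sir_directions` and all parameter names are illustrative.

```python
import numpy as np

def sir_directions(X, y, n_slices=5, n_dirs=1):
    """Sliced inverse regression (Li, 1991): estimate dimension-reduction
    directions from eigenvectors of the SIR central matrix Cov(E[Z|Y])."""
    n, p = X.shape
    # Standardize the predictors: Z = Sigma^{-1/2} (X - mean)
    mu = X.mean(axis=0)
    Sigma = np.cov(X, rowvar=False)
    evals, evecs = np.linalg.eigh(Sigma)
    Sigma_inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = (X - mu) @ Sigma_inv_sqrt

    # Slice the data by the order statistics of y
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)

    # Candidate (central) matrix: weighted outer products of slice means of Z
    M = np.zeros((p, p))
    for idx in slices:
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)

    # Leading eigenvectors of M, mapped back to the original X scale
    w, v = np.linalg.eigh(M)
    B = Sigma_inv_sqrt @ v[:, ::-1][:, :n_dirs]
    return B / np.linalg.norm(B, axis=0)
```

For a single-index model \(y = \beta^\top X + \varepsilon\), the leading estimated direction should align closely with \(\beta\); when \(E(X|Y)\) is nearly nonrandom (e.g. \(y = (\beta^\top X)^2\) with symmetric \(X\)), the slice means degenerate and SIR fails, which is exactly the regime where SAVE-type central matrices are needed.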

MSC:

62G08 Nonparametric regression and quantile regression
62H12 Estimation in multivariate analysis
62G05 Nonparametric estimation
65C60 Computational problems in statistics (MSC2010)

References:

[1] Cook, R. D. (1998). Regression Graphics: Ideas for Studying Regressions Through Graphics, New York, Wiley. · Zbl 0903.62001
[2] Cook, R. D. & Forzani, L. (2009). Likelihood‐based sufficient dimension reduction. Journal of the American Statistical Association, 104, 197-208. · Zbl 1388.62041
[3] Cook, R. D. & Li, B. (2002). Dimension reduction for conditional mean in regression. The Annals of Statistics, 30, 455-474. · Zbl 1012.62035
[4] Cook, R. D. & Ni, L. (2005). Sufficient dimension reduction via inverse regression: A minimum discrepancy approach. Journal of the American Statistical Association, 100, 410-428. · Zbl 1117.62312
[5] Cook, R. D. & Weisberg, S. (1991). Discussion of ‘Sliced inverse regression for dimension reduction’. Journal of the American Statistical Association, 86, 28-33.
[6] Duan, N. & Li, K. C. (1991). Slicing regression: A link‐free regression method. The Annals of Statistics, 19, 505-530. · Zbl 0738.62070
[7] Ferré, L. & Yao, A‐F. (2005). Smoothed functional inverse regression. Statistica Sinica, 15, 665-683. · Zbl 1086.62054
[8] Friedman, J. H. & Stuetzle, W. (1981). Projection pursuit regression. Journal of the American Statistical Association, 76, 817-823.
[9] Hsing, T. & Carroll, R. J. (1992). An asymptotic theory for sliced inverse regression. The Annals of Statistics, 20, 1040-1061. · Zbl 0821.62019
[10] Johnson, M. E. (1987). Multivariate Statistical Simulation, John Wiley & Sons, New York. · Zbl 0604.62056
[11] Li, B., Zha, H., & Chiaromonte, F. (2005). Contour regression: A general approach to dimension reduction. The Annals of Statistics, 33, 1580-1616. · Zbl 1078.62033
[12] Li, B., & Wang, S. (2007). On directional regression for dimension reduction. Journal of the American Statistical Association, 102, 997-1008. · Zbl 1469.62300
[13] Li, K. C. (1991a). Sliced inverse regression for dimension reduction. Journal of the American Statistical Association, 86, 316-327. · Zbl 0742.62044
[14] Li, K. C. (1991b). Rejoinder to ‘Sliced inverse regression for dimension reduction’. Journal of the American Statistical Association, 86, 337-342. · Zbl 1353.62040
[15] Li, K. C. (1992). On principal Hessian directions for data visualization and dimension reduction: Another application of Stein’s lemma. Journal of the American Statistical Association, 87, 1025-1039. · Zbl 0765.62003
[16] Li, L. & Yin, X. (2008). Sliced inverse regression with regularizations. Biometrics, 64, 124-131. · Zbl 1139.62055
[17] Scrucca, L. (2011). Model‐based SIR for dimension reduction. Computational Statistics & Data Analysis, 55, 3010-3026. · Zbl 1218.62037
[18] Wang, H. & Xia, Y. (2008). Sliced regression for dimension reduction. Journal of the American Statistical Association, 103, 811-821. · Zbl 1306.62168
[19] Wu, H‐M. (2008). Kernel sliced inverse regression with applications to classification. Journal of Computational and Graphical Statistics, 17, 590-610.
[20] Wu, Q., Liang, F., & Mukherjee, S. (2010). Localized sliced inverse regression. Journal of Computational and Graphical Statistics, 19, 843-860.
[21] Xia, Y., Tong, H., Li, W. K., & Zhu, L.‐X. (2002). An adaptive estimation of dimension reduction space. Journal of the Royal Statistical Society, Series B, 64, 363-410. · Zbl 1091.62028
[22] Ye, Z. & Weiss, R. E. (2003). Using the bootstrap to select one of a new class of dimension reduction methods. Journal of the American Statistical Association, 98, 968-979. · Zbl 1045.62034
[23] Yin, X. & Cook, R. D. (2002). Dimension reduction for the conditional kth moment in regression. Journal of the Royal Statistical Society, Series B, 64, 159-175. · Zbl 1067.62042
[24] Zhu, L. X., Ohtaki, M., & Li, Y. X. (2007). On hybrid methods of inverse regression based algorithms. Computational Statistics & Data Analysis, 51, 2621-2635. · Zbl 1161.62332
[25] Zhu, L., Wang, T., Zhu, L., & Ferré, L. (2010). Sufficient dimension reduction through discretization‐expectation estimation. Biometrika, 97, 295-304. · Zbl 1205.62048
[26] Zhu, L. P., Zhu, L. X., & Feng, Z. H. (2010). Dimension reduction in regressions through cumulative slicing estimation. Journal of the American Statistical Association, 105, 1455-1466. · Zbl 1388.62121
[27] Zhu, L. X. & Fang, K. T. (1996). Asymptotics for kernel estimate of sliced inverse regression. The Annals of Statistics, 24, 1053-1068. · Zbl 0864.62027
[28] Zhu, L. X., Miao, B., & Peng, H. (2006). On sliced inverse regression with high‐dimensional covariates. Journal of the American Statistical Association, 101, 630-643. · Zbl 1119.62331
[29] Zhu, M. & Hastie, T. J. (2003). Feature extraction for nonparametric discriminant analysis. Journal of Computational and Graphical Statistics, 12, 101-120.