
On dimension folding of matrix- or array-valued statistical objects. (English) Zbl 1183.62091

Summary: We consider dimension reduction for regression or classification in which the predictors are matrix- or array-valued. This type of predictor arises when measurements are obtained for each combination of two or more underlying variables, for example the voltage measured at different channels and times in electroencephalography data. For these applications, it is desirable to preserve the array structure of the reduced predictor (e.g., time versus channel), but this cannot be achieved within the conventional dimension reduction formulation.
We introduce a dimension reduction method, to be called dimension folding, for matrix- and array-valued predictors that preserves the array structure. In an application of dimension folding to an electroencephalography data set, we correctly classify 97 out of 122 subjects in a cross-validation sample as alcoholic or nonalcoholic based on their electroencephalography measurements.
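In the matrix case, preserving the array structure amounts to reducing a predictor X by separate row and column projections, so the reduced predictor is itself a smaller matrix B'XA rather than an arbitrary projection of vec(X); in vectorized form the projection then has Kronecker structure (A ⊗ B)'vec(X). The following is a minimal numerical sketch of that structural idea only, not the authors' estimation procedure; the dimensions, variable names, and random bases are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch (not the authors' estimator): a matrix-valued predictor
# X of size p_L x p_R (e.g., channels x time points) is reduced by separate
# row and column projections to B' X A, which is itself a small matrix, in
# contrast with an unstructured projection of vec(X).  All dimensions and
# the random bases below are hypothetical.

rng = np.random.default_rng(0)
p_L, p_R, d_L, d_R = 8, 16, 2, 3           # illustrative sizes only

X = rng.standard_normal((p_L, p_R))        # one matrix-valued observation

# Semi-orthogonal bases standing in for estimated reduction directions.
B, _ = np.linalg.qr(rng.standard_normal((p_L, d_L)))
A, _ = np.linalg.qr(rng.standard_normal((p_R, d_R)))

# Structure-preserving ("folded") reduction: a d_L x d_R matrix.
X_folded = B.T @ X @ A

# Equivalent vectorized form with Kronecker structure:
# vec(B' X A) = (A kron B)' vec(X), using column-major vec.
vec_X = X.reshape(-1, order="F")
X_vec = np.kron(A, B).T @ vec_X

assert np.allclose(X_folded.reshape(-1, order="F"), X_vec)
print(X_folded.shape)                      # (2, 3): the reduced predictor stays a matrix
```

The vectorized identity makes the contrast explicit: conventional dimension reduction applied to vec(X) would estimate an arbitrary projection, whereas the folded form constrains it to a Kronecker product, which is what retains the row-versus-column (e.g., time versus channel) interpretation of the reduced predictor.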

MSC:

62H12 Estimation in multivariate analysis
62G08 Nonparametric regression and quantile regression
62A09 Graphical methods in statistics
92C55 Biomedical imaging and signal processing
62H99 Multivariate analysis
