
Reduced rank ridge regression and its kernel extensions. (English) Zbl 07260307

Summary: In multivariate linear regression, it is often assumed that the response matrix is intrinsically of lower rank. This could be because of the correlation structure among the predictor variables or because the coefficient matrix is itself of low rank. To accommodate both, we propose a reduced rank ridge regression for multivariate linear regression. Specifically, we combine the ridge penalty with the reduced rank constraint on the coefficient matrix to obtain a computationally straightforward algorithm. Numerical studies indicate that the proposed method consistently outperforms relevant competitors. A novel extension of the proposed method to the reproducing kernel Hilbert space (RKHS) set-up is also developed.
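One common way to combine a ridge penalty with a rank constraint, in the spirit of the summary above, is to compute the ridge solution first and then project its fitted values onto their leading singular subspace. The sketch below (NumPy; function name `reduced_rank_ridge` and all details are illustrative assumptions, not necessarily the paper's exact algorithm) shows this two-step idea:

```python
import numpy as np

def reduced_rank_ridge(X, Y, lam, rank):
    """Illustrative sketch: ridge regression followed by a rank-`rank`
    projection of the fitted values. This is one standard way to combine
    the two constraints; the paper's algorithm may differ in detail."""
    n, p = X.shape
    # Ridge coefficient matrix: (X'X + lam * I)^{-1} X'Y
    B_ridge = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ Y)
    Y_hat = X @ B_ridge
    # Projection onto the top-`rank` right singular vectors of the fits
    _, _, Vt = np.linalg.svd(Y_hat, full_matrices=False)
    P = Vt[:rank].T @ Vt[:rank]   # q x q orthogonal projection matrix
    return B_ridge @ P            # coefficient matrix of rank <= `rank`
```

Because the second step is a closed-form SVD projection, the whole procedure costs little more than ordinary ridge regression, which matches the summary's claim of a computationally straightforward algorithm.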

MSC:

62-XX Statistics
68-XX Computer science

Software:

SDPT3

References:

[1] W. Massy, Principal components regression in exploratory statistical research, J Am Stat Assoc 60 (1965), 234-246.
[2] H. Wold, Soft modelling by latent variables: the non-linear iterative partial least squares approach, In Perspectives in Probability and Statistics, Papers in Honour of M. S. Bartlett, J. Gani, ed. New York, Academic Press, 1975. · Zbl 0331.62058
[3] H. Hotelling, The most predictable criterion, J Ed Psychol 26 (1935), 139-142.
[4] T. Anderson, Estimating linear restrictions on regression coefficients for multivariate normal distributions, Ann Math Stat 22 (1951), 327-351. · Zbl 0043.13902
[5] A. Izenman, Reduced-rank regression for the multivariate linear model, J Multivariate Anal 5(2) (1975), 248-264. · Zbl 0313.62042
[6] P. Davies, and M. Tso, Procedures for reduced-rank regression, Appl Stat 31(3) (1982), 244-255.
[7] G. Reinsel, and R. Velu, Multivariate Reduced-Rank Regression: Theory and Applications, New York, Springer, 1998. · Zbl 0909.62066
[8] M. Yuan, A. Ekici, Z. Lu, and R. Monteiro, Dimension reduction and coefficient estimation in multivariate linear regression, J R Stat Soc B 69(3) (2007), 329-346. · Zbl 07555355
[9] R. Tütüncü, K. Toh, and M. Todd, Solving semidefinite-quadratic-linear programs using SDPT3, Math Program 95 (2003), 189-217. · Zbl 1030.90082
[10] L. Breiman, and J. Friedman, Predicting multivariate responses in multiple linear regression, J R Stat Soc B 59 (1997), 3-37. · Zbl 0897.62068
[11] A. Hoerl, and R. Kennard, Ridge regression: biased estimation for nonorthogonal problems, Technometrics 12 (1970), 55-67. · Zbl 0202.17205
[12] R. Tibshirani, Regression shrinkage and selection via the Lasso, J R Stat Soc B 58 (1996), 267-288. · Zbl 0850.62538
[13] H. Zou, and T. Hastie, Regularization and variable selection via the elastic net, J R Stat Soc B 67 (2005), 301-320. · Zbl 1069.62054
[14] B. Turlach, W. Venables, and S. Wright, Simultaneous variable selection, Technometrics 47(3) (2005), 349-363.
[15] J. Peng, J. Zhu, A. Bergamaschi, W. Han, D. Noh, J. Pollack, and P. Wang, Regularized multivariate regression for identifying master predictors with application to integrative genomics study of breast cancer, Ann Appl Stat 4(1) (2010), 53-77. · Zbl 1189.62174
[16] B. Skagerberg, J. MacGregor, and C. Kiparissides, Multivariate data analysis applied to low density polyethylene reactors, Chemom Intell Lab Syst 14 (1992), 341-356.