Dimensionality reduction approach to multivariate prediction. (English) Zbl 1429.62283

Summary: Dimensionality reduction methods used for prediction can be cast into a general framework by deriving them from a common objective function. Such a function yields a continuum of different solutions, including all the known ones. Least-squares and maximum likelihood estimation of the model underlying dimensionality reduction methods for prediction lead to an additive objective function. By letting this additive function be any convex linear combination of the two addends, another objective function is obtained, from which a continuum of solutions can be derived.
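The mechanism behind the continuum can be illustrated with a toy sketch (not the paper's actual criterion): take a convex combination of two quadratic objectives and trace how its minimiser moves as the mixing weight varies. The design matrix `X`, responses `y1`/`y2`, and the helper `continuum_solution` below are all illustrative assumptions, standing in for the least-squares and maximum likelihood addends of the paper's objective.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: two least-squares problems standing in for the two addends
# (LS and ML terms) of an additive objective. Purely illustrative.
X = rng.normal(size=(50, 3))
y1 = X @ np.array([1.0, 0.0, -1.0]) + 0.1 * rng.normal(size=50)
y2 = X @ np.array([0.0, 2.0, 1.0]) + 0.1 * rng.normal(size=50)

def continuum_solution(alpha):
    """Minimiser of alpha*||y1 - X w||^2 + (1 - alpha)*||y2 - X w||^2.

    The normal equations of the convex combination give a closed form:
        w(alpha) = (X'X)^{-1} X'(alpha*y1 + (1 - alpha)*y2),
    so the minimisers trace a continuum between the two endpoint fits.
    """
    rhs = alpha * (X.T @ y1) + (1 - alpha) * (X.T @ y2)
    return np.linalg.solve(X.T @ X, rhs)

# Endpoints recover the two individual least-squares fits.
w_ls1 = np.linalg.lstsq(X, y1, rcond=None)[0]
w_ls2 = np.linalg.lstsq(X, y2, rcond=None)[0]
assert np.allclose(continuum_solution(1.0), w_ls1)
assert np.allclose(continuum_solution(0.0), w_ls2)

# Intermediate weights interpolate smoothly between the two solutions.
path = np.array([continuum_solution(a) for a in np.linspace(0, 1, 5)])
```

In this simplified setting the continuum is literally a line segment between the two endpoint estimators; in the paper's framework the two addends are different criteria (least-squares and likelihood terms), so the path of solutions is generally nonlinear, but the same convex-combination idea indexes it.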

MSC:

62J05 Linear regression; mixed models
62H25 Factor analysis and principal components; correspondence analysis

References:

[1] Anderson, T. W., An Introduction to Multivariate Statistical Analysis (1958), Wiley: Wiley New York · Zbl 0083.14601
[2] Brooks, R.; Stone, M., Joint continuum regression for multiple predictands, J. Amer. Statist. Assoc, 89, 428, 1374-1379 (1994) · Zbl 0825.62638
[3] Burnham, A. J.; Viveros, R.; MacGregor, J. F., Frameworks for latent variable multivariate regression, J. Chemom, 10, 31-45 (1996)
[4] Burnham, A. J.; Viveros, R.; MacGregor, J. F., Latent variable multivariate regression modelling, Chemom. Intell. Lab. Systems, 48, 167-180 (1999)
[5] Burnham, A. J.; MacGregor, J. F.; Viveros, R., Interpretation of regression coefficients under a latent variable regression model, J. Chemom, 15, 265 (2001)
[6] de Jong, S., SIMPLS: an alternative approach to partial least squares regression, Chemom. Intell. Lab. Systems, 18, 251-263 (1993)
[7] de Jong, S.; Kiers, H. A.L., Principal covariates regression. Part I. Theory, Chemom. Intell. Lab. Systems, 14, 155-164 (1992)
[8] Geladi, P.; Kowalski, B. R., Partial least-squares regression: a tutorial, Anal. Chim. Acta, 185, 1-17 (1986)
[9] Hotelling, H., 1935. The most predictable criterion. J. Educ. Psychol. 26, 139-142. Also in Bryant and Atchley, 1975.
[10] Izenman, A. J., Reduced-rank regression for the multivariate bilinear model, J. Multivariate Anal, 5, 248-264 (1975) · Zbl 0313.62042
[11] Kourti, T.; MacGregor, J. F., Multivariate SPC methods for process and product monitoring, J. Quality Tech, 28, 4, 409-428 (1996)
[12] Magnus, J. R.; Neudecker, H., Matrix Differential Calculus with Applications in Statistics and Econometrics (1988), Wiley: Wiley New York · Zbl 0651.15001
[13] Merola, G.M., 1998. Dimensionality reduction methods in multivariate prediction. Ph.D. Thesis, Department of Statistics and Actuarial Science, University of Waterloo, Waterloo, Canada.
[14] Merola, G. M.; Abraham, B., Dimensionality reduction approach to multivariate prediction, Canad. J. Statist, 29, 2, 191-200 (2001) · Zbl 0974.62053
[15] Merola, G. M.; Abraham, B., Dimension reduction methods used in industry, (Khattree, R.; Rao, C. R., Statistics in Industry. Statistics in Industry, Handbook of Statistics, Vol. 22 (2003), Elsevier: Elsevier Amsterdam) · Zbl 1174.62440
[16] Phatak, A.; Reilly, P. M.; Penlidis, A., The geometry of 2-block partial least squares regression, Comm. Statist. Part A-Theory Methods, 21, 1517-1553 (1992) · Zbl 0775.62175
[17] Schmidli, H., 1995. Reduced rank regression. Contributions to Statistics. Physica-Verlag, Heidelberg. · Zbl 0857.62065
[18] Skagerberg, B.; MacGregor, J. F.; Kiparissides, C., Multivariate data analysis applied to low-density polyethylene reactors, Chemom. Intell. Lab. Systems, 14, 341-356 (1992)
[19] Stone, M., 1974. Cross-validatory choice and assessment of statistical predictions (with discussion). J. Roy. Statist. Soc. B 36, 111-133. · Zbl 0308.62063
[20] Stone, M.; Brooks, R. J., Continuum regression: cross-validated sequentially constructed prediction embracing ordinary least squares, partial least squares and principal components regression, J. Roy. Statist. Soc. B, 52, 2, 237-269 (1990) · Zbl 0708.62054
[21] Tenenhaus, M., 1998. La régression PLS: théorie et pratique. Editions Technip, Paris (in French). · Zbl 0923.62058
[22] Van den Wollenberg, R., Redundancy analysis: an alternative for canonical correlation analysis, Psychometrika, 42, 207-219 (1977) · Zbl 0354.92050
[23] Wold, S., Cross-validatory estimation of the number of components in factor and principal components models, Technometrics, 20, 4, 397-405 (1978) · Zbl 0403.62032
[24] Wold, H., 1982. Soft modelling, the basic design and some extensions. In: Jöreskog, K.G., Wold, H. (Eds.), Systems Under Indirect Observation, Vol. II. Wiley, New York, pp. 589-591.
[25] Wold, H., 1984. Partial least squares. In: Encyclopedia of Statistical Sciences. Wiley, New York, pp. 581-591.