Sharp detection in PCA under correlations: all eigenvalues matter. (English) Zbl 1486.62182

Summary: Principal component analysis (PCA) is a widely used method for dimension reduction. In high-dimensional data, the “signal” eigenvalues corresponding to weak principal components (PCs) do not necessarily separate from the bulk of the “noise” eigenvalues. Therefore, popular tests based on the largest eigenvalue have little power to detect weak PCs. In the special case of the spiked model, certain tests asymptotically equivalent to linear spectral statistics (LSS) – averaging effects over all eigenvalues – were recently shown to achieve some power.
We consider a “local alternatives” model for the spectrum of covariance matrices that allows a general correlation structure. We develop new tests to detect PCs in this model. While the top eigenvalue alone contains little information, the strong correlations between the eigenvalues allow us to detect weak PCs by averaging over all eigenvalues using LSS. We show that it is possible to find the optimal LSS by solving a certain integral equation. To solve this equation, we develop efficient algorithms that build on our recent method for computing the limit empirical spectrum [the author, Random Matrices Theory Appl. 4, No. 4, Article ID 1550019, 36 p. (2014; Zbl 1330.65029)]. The solvability of this equation also presents a new perspective on phase transitions in spiked models.
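To make the notion of a linear spectral statistic concrete, the following is a minimal sketch (not the paper's method): an LSS is simply the average of a test function f over all eigenvalues of the sample covariance matrix. The choice f(x) = log(x) below is an illustrative assumption; the paper's contribution is deriving the optimal f by solving an integral equation.

```python
import numpy as np

def linear_spectral_statistic(X, f):
    """Average the test function f over the eigenvalues of the
    sample covariance of X (an illustrative LSS, not the optimal one).

    X : (n, p) data matrix, rows are observations (mean assumed zero).
    f : scalar function applied elementwise to the eigenvalues.
    """
    n, p = X.shape
    S = X.T @ X / n                    # sample covariance matrix
    eigvals = np.linalg.eigvalsh(S)    # all p eigenvalues, not just the largest
    return np.mean(f(eigvals))

# Pure-noise data with aspect ratio p/n = 0.25, to illustrate the null case.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 50))
lss = linear_spectral_statistic(X, np.log)
print(lss)
```

A test would compare such a statistic against its null distribution; using all eigenvalues is what lets it aggregate the weak, correlated signal that the top eigenvalue alone misses.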

MSC:

62H25 Factor analysis and principal components; correspondence analysis
62H15 Hypothesis testing in multivariate analysis
45B05 Fredholm integral equations

Citations:

Zbl 1330.65029