How to apply PCA correctly?

975 views (last 30 days)
Sepp on 12 Dec 2015
Commented: the cyclist on 31 Mar 2022
I'm currently struggling with PCA and MATLAB. Let's say we have a data matrix X and a response y (classification task). X consists of 12 rows and 4 columns. The rows are the data points, the columns are the predictors (features).
Now, I can do PCA with the following command:
[coeff, score] = pca(X);
As I understood from the MATLAB documentation, coeff contains the loadings and score contains the principal components in its columns. That means the first column of score contains the first principal component (the one associated with the highest variance), and the first column of coeff contains the loadings for the first principal component.
Is this correct?
But if this is correct, why is X * coeff then not equal to score?
  1 Comment
DrJ on 11 Dec 2019
@Sepp your doubt can be clarified by this tutorial (even though it is in another program context), especially after 5 minutes in.
the cyclist: fabulous and generous explanation.


Accepted Answer

the cyclist on 12 Dec 2015
Edited: the cyclist on 18 Apr 2020
Maybe this script will help.
rng 'default'
M = 7; % Number of observations
N = 5; % Number of variables observed
X = rand(M,N);
% De-mean
X = bsxfun(@minus,X,mean(X));
% Do the PCA
[coeff,score,latent] = pca(X);
% Calculate eigenvalues and eigenvectors of the covariance matrix
covarianceMatrix = cov(X);
[V,D] = eig(covarianceMatrix);
% "coeff" are the principal component vectors.
% These are the eigenvectors of the covariance matrix.
% Compare the columns of coeff and V.
% (Note that the columns are not necessarily in the same *order*,
% and they might be *slightly* different from each other
% due to floating-point error. Corresponding columns may also
% differ by a sign flip, which does not change the component.)
coeff
V
% Multiply the original data by the principal component vectors
% to get the projections of the original data on the
% principal component vector space. This is also the output "score".
% Compare these two outputs.
dataInPrincipalComponentSpace = X*coeff
score
% The columns of X*coeff are orthogonal to each other.
% This is shown by the (near-)zero off-diagonal entries of the
% correlation matrix:
corrcoef(dataInPrincipalComponentSpace)
% The variances of these vectors are the eigenvalues of the covariance
% matrix, and are also the output "latent". Compare these three outputs:
var(dataInPrincipalComponentSpace)'
latent
sort(diag(D),'descend')
the cyclist on 31 Mar 2022
You skipped the step where the means are subtracted:
% De-mean
X = bsxfun(@minus,X,mean(X)); % <------ YOU MISSED THIS STEP
[coeff,score,latent,~,explained] = pca(X);
Pca_space_Dat = 4×3
  -13.2773   -8.0194   -1.3334
   27.3728   -1.9973    2.0290
    8.2020    5.6244   -2.9099
  -22.2975    4.3923    2.2144
score = 4×3
  -13.2773   -8.0194   -1.3334
   27.3728   -1.9973    2.0290
    8.2020    5.6244   -2.9099
  -22.2975    4.3923    2.2144
The reason for this step is mentioned in the comments above. Also, in more recent versions of MATLAB, you can do
X = X - mean(X);
rather than
X = bsxfun(@minus,X,mean(X));
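For reference, the extra `explained` output requested above is each component's share of the total variance, in percent; it is equal to `latent` normalized by its sum. A minimal sketch of that relationship (the variable `pct` is illustrative):
```matlab
rng default
X = rand(8,3);
X = X - mean(X);                      % de-mean (R2016b or later syntax)
[coeff,score,latent,~,explained] = pca(X);
% "explained" is each eigenvalue as a percentage of the total variance
pct = 100*latent/sum(latent);
[pct explained]                       % the two columns should match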


More Answers (2)

Yaser Khojah on 17 Apr 2019
Dear the cyclist, thanks for showing this example. I have a question regarding the order of the columns of COEFF, since they are different from the columns of V. Is there any way to see the order of these columns? In other words, which variable does each column correspond to?
the cyclist on 26 Dec 2020
Sorry it took me a while to see this question.
If you do
[coeff,score] = pca(X);
it is true that pca() will internally de-mean the data. So, score is derived from de-meaned data.
But it does not mean that X itself [outside of pca()] has been de-meaned. So, if you are trying to re-create what happens inside pca(), you need to manually de-mean X first.
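A minimal sketch of that point (the variable name `manual` is illustrative; `rng default` just makes the run repeatable):
```matlab
rng default
X = rand(6,4);                        % raw data, NOT de-meaned
[coeff,score] = pca(X);
manual = (X - mean(X))*coeff;         % de-mean first, then project
max(abs(manual(:) - score(:)))        % ~0, up to floating-point error
max(abs(X*coeff - score), [], 'all')  % generally far from 0
```
So projecting the raw X reproduces score only after you subtract the column means yourself, which is exactly what pca() does internally.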


Greg Heath
Greg Heath on 13 Dec 2015
Hope this helps.
Thank you for formally accepting my answer
