How does the normalisation in corrcoef(x) influence the actual correlation?

Hi all,
I'm trying to find the degree of correlation between 4 vectors (namely MIMO channels). Using the cov(x) function seems to go with the books but it doesn't return a matrix with 1s on its diagonal. So my first question would be why (I assume it has to do with level of fading, i.e. variance of each specific channel)? The function corrcoef(x) is the normalised cov(x) in the following way:
"if C = COV(X), then CORRCOEF(X) is normalised as C(i,j)/SQRT(C(i,i)*C(j,j))".
Can someone help me understand how the normalisation influences the result (apart from the obvious division factor) and why it is normalised in this particular way?
It seems intuitively better, since it returns coefficients equal to 1 for perfectly correlated variables (and along the diagonal).
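For concreteness, here is a small sketch of what I mean, written in Python/NumPy since I can share it runnable here (np.cov and np.corrcoef mirror MATLAB's cov(x) and corrcoef(x) when observations are in rows and variables in columns); the random 100x4 matrix just stands in for my four channel vectors:

```python
import numpy as np

# Four random "channel" vectors as columns (rows = observations).
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 4))

C = np.cov(X, rowvar=False)       # covariance matrix: diagonal = variances
R = np.corrcoef(X, rowvar=False)  # correlation matrix: diagonal = 1s

# corrcoef is cov normalised elementwise by sqrt(C(i,i)*C(j,j)),
# exactly as the documentation quote says:
d = np.sqrt(np.diag(C))
R_manual = C / np.outer(d, d)

print(np.allclose(R, R_manual))       # True
print(np.allclose(np.diag(R), 1.0))   # True
```

So the diagonal of cov(x) holds each channel's variance (its fading level), and dividing row i and column j by the corresponding standard deviations rescales every entry into [-1, 1], which is why the diagonal of corrcoef(x) is exactly 1.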
Thanks!

2 Comments

Yep, it was actually obvious... It seems I just had a bit of a mixup in terminology... but thanks!


Answers (0)

Asked: 17 Jan 2019
Commented: 18 Jan 2019
