Uncorrelatedness (probability theory)


In probability theory and statistics, two real-valued random variables, X, Y, are said to be uncorrelated if their covariance, cov[X, Y] = E[XY] − E[X] E[Y], is zero. If two variables are uncorrelated, there is no linear relationship between them.

Uncorrelated random variables have a Pearson correlation coefficient of zero, when it exists, except in the trivial case when either variable has zero variance (is a constant); in that case the correlation is undefined.
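A quick numerical sketch of this, assuming NumPy is available (the sampled variables are illustrative, not from the text): two independent samples give a Pearson coefficient near zero, while a constant variable has zero variance and the coefficient comes out undefined (NaN).

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000
x = rng.standard_normal(n)
y = rng.standard_normal(n)  # independent of x, hence uncorrelated

r = np.corrcoef(x, y)[0, 1]  # Pearson coefficient, close to zero
print(r)

# A constant variable has zero variance, so the coefficient is 0/0:
c = np.ones(n)
with np.errstate(invalid="ignore", divide="ignore"):
    r_const = np.corrcoef(x, c)[0, 1]
print(np.isnan(r_const))
```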

In general, uncorrelatedness is not the same as orthogonality, except in the special case where at least one of the two random variables has an expected value of 0. In this case, the covariance is the expectation of the product, and X and Y are uncorrelated if and only if E[XY] = 0.
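To illustrate the distinction, a small NumPy sketch (an assumed example using independent unit-mean Gaussians): the two variables below are uncorrelated, yet E[XY] is close to 1 rather than 0, so they are not orthogonal.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
# Independent Gaussians with nonzero mean (illustrative assumption):
x = rng.normal(loc=1.0, scale=1.0, size=n)
y = rng.normal(loc=1.0, scale=1.0, size=n)

cov_xy = np.mean(x * y) - np.mean(x) * np.mean(y)  # near 0: uncorrelated
e_xy = np.mean(x * y)                              # near 1: not orthogonal
print(cov_xy, e_xy)
```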

If X and Y are independent, with finite second moments, then they are uncorrelated. However, not all uncorrelated variables are independent.
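A standard counterexample can be checked numerically (a sketch assuming NumPy): take X uniform on {−1, 0, 1} and Y = X². Then cov[X, Y] = E[X³] − E[X] E[X²] = 0, so the pair is uncorrelated, yet Y is a deterministic function of X and the two are clearly not independent.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.choice([-1, 0, 1], size=100_000)  # symmetric, so E[X] = E[X^3] = 0
y = x**2                                  # deterministic function of X

cov_xy = np.mean(x * y) - np.mean(x) * np.mean(y)
print(cov_xy)  # close to zero: uncorrelated

# Dependence: Y is fixed once X is known, e.g. Y = 0 whenever X = 0,
# even though P(Y = 0) = 1/3 overall.
print(bool(np.all(y[x == 0] == 0)))
```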

Two random variables X, Y are called uncorrelated if their covariance cov[X, Y] = E[(X − E[X])(Y − E[Y])] is zero. Formally:

X, Y uncorrelated ⟺ cov[X, Y] = E[XY] − E[X] E[Y] = 0

Two complex random variables Z, W are called uncorrelated if both their covariance and their pseudo-covariance are zero, i.e.

K_ZW = E[(Z − E[Z])(W − E[W])*] = 0 and J_ZW = E[(Z − E[Z])(W − E[W])] = 0,

where * denotes the complex conjugate.
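A numerical sketch of both conditions (an assumed example using two independent circularly symmetric complex Gaussians): for such a pair, both the covariance and the pseudo-covariance come out near zero.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200_000
# Two independent circularly symmetric complex Gaussians (illustrative):
z = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)
w = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)

zc = z - z.mean()
wc = w - w.mean()
cov = np.mean(zc * np.conj(wc))  # covariance K_ZW, near 0
pcov = np.mean(zc * wc)          # pseudo-covariance J_ZW, near 0
print(abs(cov), abs(pcov))
```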
