
When we take the expected value, or average, of a random process, we learn something important about how the process behaves in general. Often, however, we have several random processes measuring different aspects of a system, and the relationships between these processes are just as important. The covariance and correlation are two tools for quantifying these relationships. Below we go into more detail about what these terms mean and how they are used. Note that much of the following discussion refers simply to random variables, but keep in mind that these variables can represent random signals or random processes.

Covariance

To begin with, when dealing with more than one random process, it is useful to have a single number that quickly tells us how similar the processes are. For this we use the covariance, which is analogous to the variance of a single variable.

Covariance
A measure of how much the deviations of two or more variables or processes match.
For two processes, X and Y, if they are not closely related then the covariance will be small, and if they are similar then the covariance will be large. Let us clarify this statement by describing what we mean by "related" and "similar." Two processes are "closely related" if their distribution spreads are almost equal and they are around the same, or only a very slightly different, mean.

Mathematically, covariance is often written as σ_xy and is defined as

    cov(X, Y) = σ_xy = E[(X − E[X])(Y − E[Y])]

This can also be reduced and rewritten in the following two forms:

    σ_xy = E[XY] − E[X] E[Y]

    σ_xy = ∫∫ (x − E[X])(y − E[Y]) f(x, y) dx dy

Useful properties

  • If X and Y are independent and uncorrelated, or one of them has zero mean, then σ_xy = 0
  • If X and Y are orthogonal, then σ_xy = −E[X] E[Y]
  • The covariance is symmetric: cov(X, Y) = cov(Y, X)
  • Basic covariance identity: cov(X + Y, Z) = cov(X, Z) + cov(Y, Z)
  • Covariance of equal variables: cov(X, X) = Var(X)
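These properties are easy to verify numerically. The chapter's own code examples use Matlab; the following is a small Python sketch (the `cov` helper and the sample data are ours, chosen only for illustration) that checks the shortcut form, symmetry, the additivity identity, and cov(X, X) = Var(X) on concrete numbers:

```python
from statistics import fmean

def cov(a, b):
    """Population covariance: E[(A - E[A])(B - E[B])]."""
    ma, mb = fmean(a), fmean(b)
    return fmean((x - ma) * (y - mb) for x, y in zip(a, b))

X = [3, 1, 6, 3, 4]
Y = [1, 5, 3, 4, 3]
Z = [2, 2, 5, 1, 0]

# Shortcut form: cov(X, Y) = E[XY] - E[X]E[Y]
shortcut = fmean(x * y for x, y in zip(X, Y)) - fmean(X) * fmean(Y)
assert abs(cov(X, Y) - shortcut) < 1e-12

# Symmetry: cov(X, Y) = cov(Y, X)
assert abs(cov(X, Y) - cov(Y, X)) < 1e-12

# Additivity: cov(X + Y, Z) = cov(X, Z) + cov(Y, Z)
XplusY = [x + y for x, y in zip(X, Y)]
assert abs(cov(XplusY, Z) - (cov(X, Z) + cov(Y, Z))) < 1e-12

# Equal variables: cov(X, X) = Var(X)
var_x = fmean((x - fmean(X)) ** 2 for x in X)
assert abs(cov(X, X) - var_x) < 1e-12
```

Note that this is the population form (dividing by N), matching the definitions above rather than the sample (N − 1) convention.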

Correlation

For anyone with a statistical background, it should be apparent that the idea of dependence and independence among variables and signals plays an important role when dealing with random processes. Because of this, the correlation of two variables provides us with a measure of how the two variables affect one another.

Correlation
A measure of how much one random variable depends upon the other.
This measure of association between the variables will provide us with a clue as to how well the value of one variable can be predicted from the value of the other. The correlation is equal to the average of the product of two random variables and is defined as

    cor(X, Y) = E[XY] = ∫∫ x y f(x, y) dx dy
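For sampled data, the discrete analogue of this definition is simply the average of the elementwise products. A quick Python sketch (using the same data sets that appear in the worked example later in this section):

```python
from statistics import fmean

X = [3, 1, 6, 3, 4]
Y = [1, 5, 3, 4, 3]

# Correlation as the average of the product: E[XY]
corr = fmean(x * y for x, y in zip(X, Y))
print(corr)  # 10.0
```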

Correlation coefficient

It is often useful to express the correlation of random variables on a fixed scale, like a percentage. For a given set of variables, we use the correlation coefficient to measure the linear relationship between our variables. The correlation coefficient of two variables is defined in terms of their covariance and standard deviations, denoted by σ_x, as seen below

    ρ = cov(X, Y) / (σ_x σ_y)

where we will always have −1 ≤ ρ ≤ 1. This provides us with a quick and easy way to view the correlation between our variables. If there is no relationship between the variables then the correlation coefficient will be zero, and if there is a perfect positive match it will be one. If there is a perfect inverse relationship, where one variable increases while the other decreases, then the correlation coefficient will be negative one. This type of correlation is often referred to more specifically as the Pearson's Correlation Coefficient, or Pearson's Product Moment Correlation.
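The behavior at the endpoints is easy to demonstrate. Below is a Python sketch (the `pearson` helper is ours, not part of the text) that shows ρ = 1 for perfectly matched data, ρ = −1 for a perfect inverse relationship, and a value strictly between the bounds for weakly related data:

```python
from math import sqrt
from statistics import fmean

def pearson(a, b):
    """Pearson correlation coefficient: cov(A, B) / (sigma_a * sigma_b)."""
    ma, mb = fmean(a), fmean(b)
    cov = fmean((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = sqrt(fmean((x - ma) ** 2 for x in a))
    sb = sqrt(fmean((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

# Perfect positive linear match -> +1
assert abs(pearson([1, 2, 3], [2, 4, 6]) - 1.0) < 1e-9

# Perfect inverse relationship -> -1
assert abs(pearson([1, 2, 3], [6, 4, 2]) + 1.0) < 1e-9

# Weakly (inversely) related data stays inside the bounds
rho = pearson([3, 1, 6, 3, 4], [1, 5, 3, 4, 3])
assert -1.0 <= rho <= 1.0
print(round(rho, 3))  # -0.408
```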

[Figure: Types of Correlation — scatter plots illustrating positive correlation, negative correlation, and uncorrelated (no correlation) data.]

So far we have dealt with correlation simply as a number relating any two variables. However, since our goal is to relate random processes to each other, which deals with signals as functions of time, we will continue this study by looking at correlation functions.

Example

Now let us take a moment to look at a simple example that involves calculating the covariance and correlation of two sets of random numbers. We are given the following data sets:

    x = {3, 1, 6, 3, 4}
    y = {1, 5, 3, 4, 3}

To begin with, for the covariance we will need to find the expected value, or mean, of x and y:

    E[x] = (1/5)(3 + 1 + 6 + 3 + 4) = 3.4
    E[y] = (1/5)(1 + 5 + 3 + 4 + 3) = 3.2
    E[xy] = (1/5)(3 + 5 + 18 + 12 + 12) = 10

Next we will solve for the standard deviations of our two sets using σ = √(E[(X − E[X])²]):

    σ_x = √((1/5)(0.16 + 5.76 + 6.76 + 0.16 + 0.36)) = 1.625
    σ_y = √((1/5)(4.84 + 3.24 + 0.04 + 0.64 + 0.04)) = 1.327

Now we can finally calculate the covariance using one of the two formulas found above. Since we have already calculated the three means, we will use the form σ_xy = E[xy] − E[x]E[y], since it is much simpler:

    σ_xy = 10 − (3.4)(3.2) = −0.88

And for our last calculation, we will solve for the correlation coefficient, ρ:

    ρ = −0.88 / ((1.625)(1.327)) ≈ −0.408

Matlab code for example

The above example can be easily calculated using Matlab. Below I have included the code to find all of the values above.

    x = [3 1 6 3 4];
    y = [1 5 3 4 3];

    mx  = mean(x)
    my  = mean(y)
    mxy = mean(x.*y)

    % Standard dev. from built-in Matlab functions (normalized by N)
    std(x,1)
    std(y,1)

    % Standard dev. from the equation above (same result as std(?,1))
    sqrt( 1/5 * sum((x-mx).^2) )
    sqrt( 1/5 * sum((y-my).^2) )

    cov(x,y,1)
    corrcoef(x,y)





Source:  OpenStax, Intro to digital signal processing. OpenStax CNX. Jan 22, 2004 Download for free at http://cnx.org/content/col10203/1.4
