
In order to study the characteristics of a random process, let us look at some of its basic properties and operations. Below we will focus on the operations that apply to the random signals that compose our random processes. We will denote a random process by X and a random variable from a random process or signal by x.

Mean value

Finding the average value of a set of random signals or random variables is probably the most fundamental concept we use in evaluating random processes through any sort of statistical method. The mean of a random process is the average of all realizations of that process. In order to find this average, we must look at a random signal over a range of time (possible values) and determine our average from this set of values. The mean, or average, of a random process, $x(t)$, is given by the following equation:

$$m_x(t) = \bar{x}(t) = \bar{X} = E[X] = \int_{-\infty}^{\infty} x f(x)\,dx$$
This equation may seem quite cluttered at first glance, but we want to introduce you to the various notations used to represent the mean of a random signal or process. Throughout texts and other readings, remember that these will all equal the same thing. The symbols $\bar{x}(t)$ and $\bar{X}$ (the X with a bar over it) are often used as a short-hand to represent an average, so you might see them in certain textbooks. The other important notation used is $E[X]$, which represents the "expected value of X" or the mathematical expectation. This notation is very common and will appear again.

If the random variables that make up our random process are discrete or quantized values, such as in a binary process, then the integrals become summations over all the possible values of the random variable. In this case, our expected value becomes

$$E[x[n]] = \sum_{x} x\,P(x)$$
If we have two random signals or variables, their averages can reveal how the two signals interact. If the average of the product of the two signals equals the product of the two individual averages of both signals, then the two signals are said to be linearly independent, also referred to as uncorrelated.
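This is easy to check numerically. The following is a small sketch (assuming NumPy, with hypothetical Gaussian signals that are not part of the original text) comparing the average of a product against the product of the averages:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two hypothetical signals: Y is generated independently of X,
# while Z is deliberately built to depend on X.
X = rng.normal(size=200_000)
Y = rng.normal(size=200_000)
Z = X + rng.normal(size=200_000)

# Uncorrelated pair: E[XY] is approximately E[X] E[Y].
print(np.mean(X * Y) - np.mean(X) * np.mean(Y))  # close to 0

# Correlated pair: E[XZ] differs from E[X] E[Z] by Cov(X, Z).
print(np.mean(X * Z) - np.mean(X) * np.mean(Z))  # close to Var(X) = 1
```

With finite sample sizes the first difference is never exactly zero, only small; that distinction matters when testing real data for correlation.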

In the case where we have a random process in which only one sample can be viewed at a time, we will often not have all the information available to calculate the mean using the density function as shown above. In this case we must estimate the mean through the time-average mean, discussed later. For fields such as signal processing that deal mainly with discrete signals and values, these are the averages most commonly used.

Properties of the mean

  • The expected value of a constant, $\alpha$, is the constant:
    $$E[\alpha] = \alpha$$
  • Adding a constant, $\alpha$, to each term increases the expected value by that constant:
    $$E[X + \alpha] = E[X] + \alpha$$
  • Multiplying the random variable by a constant, $\alpha$, multiplies the expected value by that constant:
    $$E[\alpha X] = \alpha E[X]$$
  • The expected value of the sum of two or more random variables is the sum of each individual expected value:
    $$E[X + Y] = E[X] + E[Y]$$
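The four properties above can be verified numerically. This is a minimal sketch (assuming NumPy; the constant and the distributions are arbitrarily chosen for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(loc=3.0, scale=1.0, size=100_000)  # hypothetical random variable
Y = rng.uniform(low=0.0, high=2.0, size=100_000)  # a second, independent one
a = 5.0                                           # an arbitrary constant

# E[a] = a: the average of a constant is the constant itself.
print(np.mean(np.full(100, a)))                    # exactly 5.0

# E[X + a] = E[X] + a: adding a constant shifts the mean by that constant.
print(np.mean(X + a) - (np.mean(X) + a))           # essentially 0

# E[aX] = a E[X]: scaling the variable scales the mean.
print(np.mean(a * X) - a * np.mean(X))             # essentially 0

# E[X + Y] = E[X] + E[Y]: expectation is additive.
print(np.mean(X + Y) - (np.mean(X) + np.mean(Y)))  # essentially 0
```

Note that the last property requires no independence assumption; additivity of the mean holds for any pair of random variables.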

Mean-square value

If we look at the second moment of the term (we now look at $x^2$ in the integral), then we will have the mean-square value of our random process. As you would expect, this is written as

$$\overline{X^2} = E[X^2] = \int_{-\infty}^{\infty} x^2 f(x)\,dx$$
This quantity is also often referred to as the average power of a process or signal.

Variance

Now that we have an idea about the average value or values that a random process takes, we are often interested in seeing just how spread out the different random values might be. To do this, we look at the variance, which is a measure of this spread. The variance, often denoted by $\sigma^2$, is written as follows:

$$\sigma^2 = \mathrm{Var}(X) = E\!\left[(X - E[X])^2\right] = \int_{-\infty}^{\infty} \left(x - \bar{X}\right)^2 f(x)\,dx$$
Using the rules for the expected value, we can rewrite this formula as the following form, which is commonly seen:
$$\sigma^2 = E[X^2] - \left(E[X]\right)^2 = \overline{X^2} - \bar{X}^2$$

Standard deviation

Another common statistical tool is the standard deviation. Once you know how to calculate the variance, the standard deviation is simply the square root of the variance, or $\sigma = \sqrt{\mathrm{Var}(X)}$.

Properties of variance

  • The variance of a constant, $\alpha$, equals zero:
    $$\mathrm{Var}(\alpha) = 0$$
  • Adding a constant, $\alpha$, to a random variable does not affect the variance because the mean increases by the same value:
    $$\mathrm{Var}(X + \alpha) = \mathrm{Var}(X)$$
  • Multiplying the random variable by a constant, $\alpha$, increases the variance by the square of the constant:
    $$\mathrm{Var}(\alpha X) = \alpha^2\,\mathrm{Var}(X)$$
  • The variance of the sum of two random variables only equals the sum of the variances if the variables are independent:
    $$\mathrm{Var}(X + Y) = \mathrm{Var}(X) + \mathrm{Var}(Y)$$
    Otherwise, if the random variables are not independent, then we must also include the covariance of the two variables as follows:
    $$\mathrm{Var}(X + Y) = \mathrm{Var}(X) + 2\,\mathrm{Cov}(X, Y) + \mathrm{Var}(Y)$$
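These variance properties can be verified the same way. The following sketch (assuming NumPy; the constant and distributions are chosen arbitrarily for illustration) checks each rule, including the covariance correction for a dependent pair:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=200_000)            # hypothetical random variable
Y = rng.normal(size=200_000)            # generated independently of X
Z = 0.5 * X + rng.normal(size=200_000)  # deliberately dependent on X
a = 4.0                                 # an arbitrary constant

# Var(a) = 0: a constant has no spread.
print(np.var(np.full(100, a)))               # 0.0

# Var(X + a) = Var(X): shifting does not change the spread.
print(np.var(X + a) - np.var(X))             # essentially 0

# Var(aX) = a^2 Var(X).
print(np.var(a * X) - a**2 * np.var(X))      # essentially 0

# Independent X and Y: Var(X + Y) is approximately Var(X) + Var(Y).
print(np.var(X + Y) - (np.var(X) + np.var(Y)))            # close to 0

# Dependent X and Z: the 2 Cov(X, Z) term is required.
cov = np.cov(X, Z, bias=True)[0, 1]
print(np.var(X + Z) - (np.var(X) + 2 * cov + np.var(Z)))  # essentially 0
```

The dependent-pair identity holds exactly for sample statistics when the biased estimators are used consistently (hence `bias=True` to match `np.var`'s default normalization).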

Time averages

In the case where we cannot view the entire ensemble of the random process, we must use time averages to estimate the values of the mean and variance for the process. Generally, this will only give us acceptable results for independent and ergodic processes, meaning those processes in which each signal or member of the process seems to have the same statistical behavior as the entire process. The time averages will also only be taken over a finite interval since we will only be able to see a finite part of the sample.

Estimating the mean

For the ergodic random process $x(t)$, we will estimate the mean using the time-averaging function defined as

$$\bar{X} = E[X] \approx \frac{1}{T}\int_{0}^{T} X(t)\,dt$$
However, for most real-world situations we will be dealing with discrete values in our computations and signals. We will represent this mean as
$$\bar{X} = E[X] \approx \frac{1}{N}\sum_{n=1}^{N} X[n]$$

Estimating the variance

Once the mean of our random process has been estimated, we can simply use those values in the following variance equation (introduced in one of the above sections):

$$\sigma_x^2 = \overline{X^2} - \bar{X}^2$$
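Putting the two estimates together, here is a sketch of estimating both the mean and the variance from a single finite realization. The process itself is hypothetical (a DC level in white Gaussian noise, assuming NumPy):

```python
import numpy as np

rng = np.random.default_rng(2)

# One finite realization of a hypothetical ergodic process:
# a DC level of 3 buried in white Gaussian noise of standard deviation 2.
N = 50_000
x = 3.0 + rng.normal(scale=2.0, size=N)

# Time-average estimate of the mean: (1/N) * sum over X[n].
mean_est = np.sum(x) / N

# Variance estimate: mean-square value minus the squared mean estimate.
mean_square_est = np.sum(x**2) / N
var_est = mean_square_est - mean_est**2

print(mean_est)  # should be close to the true mean, 3
print(var_est)   # should be close to the true variance, 2^2 = 4
```

Because the process here is ergodic, averaging one long realization over time converges to the same answer as averaging across the ensemble.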

Example

Let us now look at how some of the formulas and concepts above apply to a simple example. We will just look at a single, continuous random variable for this example, but the calculations and methods are the same for a random process. For this example, we will consider a random variable having the probability density function described below and shown in the figure.

$$f(x) = \begin{cases} \frac{1}{10} & 10 \le x \le 20 \\ 0 & \text{otherwise} \end{cases}$$

[Figure: Probability density function — a uniform probability density function.]

First, we will use the mean equation above to solve for the mean value:

$$\bar{X} = \int_{10}^{20} x\,\frac{1}{10}\,dx = \frac{1}{10}\left[\frac{x^2}{2}\right]_{10}^{20} = \frac{1}{10}(200 - 50) = 15$$

Using the mean-square formula, we can obtain the mean-square value for the above density function:

$$\overline{X^2} = \int_{10}^{20} x^2\,\frac{1}{10}\,dx = \frac{1}{10}\left[\frac{x^3}{3}\right]_{10}^{20} = \frac{1}{10}\left(\frac{8000}{3} - \frac{1000}{3}\right) = 233.33$$

And finally, let us solve for the variance of this function:

$$\sigma^2 = \overline{X^2} - \bar{X}^2 = 233.33 - 15^2 = 8.33$$
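As a sanity check, the three results of this example can be reproduced numerically. This sketch (assuming NumPy) approximates the integrals with a simple Riemann sum over the support [10, 20]:

```python
import numpy as np

# Uniform density from the example: f(x) = 1/10 on 10 <= x <= 20.
x = np.linspace(10.0, 20.0, 1_000_001)
dx = x[1] - x[0]
f = np.full_like(x, 0.1)

mean = np.sum(x * f) * dx             # approximates E[X] = 15
mean_square = np.sum(x**2 * f) * dx   # approximates E[X^2] = 233.33
variance = mean_square - mean**2      # approximates sigma^2 = 8.33

print(round(mean, 2), round(mean_square, 2), round(variance, 2))
```

The same pattern works for any density: multiply the integrand by the density, sum, and scale by the grid spacing.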





Source:  OpenStax, Intro to digital signal processing. OpenStax CNX. Jan 22, 2004 Download for free at http://cnx.org/content/col10203/1.4
