
The two most common expectations are the mean μ_X and variance σ_X², defined by

μ_X = E[X] = ∫_{−∞}^{∞} x f_X(x) dx

σ_X² = E[(X − μ_X)²] = ∫_{−∞}^{∞} (x − μ_X)² f_X(x) dx .
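These integrals can be checked numerically for any given density. As a quick illustration (a Python sketch, not part of the lab's Matlab exercises), the code below approximates E[g(X)] with a midpoint Riemann sum and verifies it on an exponential density f_X(x) = λe^{−λx}, x ≥ 0, whose mean is 1/λ and variance is 1/λ².

```python
import math

def expectation(f, g, lo, hi, n=200000):
    """Approximate E[g(X)] = integral of g(x) f(x) dx with a midpoint Riemann sum."""
    dx = (hi - lo) / n
    return sum(g(lo + (i + 0.5) * dx) * f(lo + (i + 0.5) * dx)
               for i in range(n)) * dx

lam = 2.0
f = lambda x: lam * math.exp(-lam * x)  # exponential density on x >= 0

# Integrate far enough into the tail that the truncation error is negligible.
mu = expectation(f, lambda x: x, 0.0, 50.0)                # expect 1/lam = 0.5
var = expectation(f, lambda x: (x - mu) ** 2, 0.0, 50.0)   # expect 1/lam^2 = 0.25
```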

A very important type of random variable is the Gaussian or normal random variable. A Gaussian random variable has a density function of the following form:

f_X(x) = 1/(√(2π) σ_X) · exp( −(x − μ_X)² / (2σ_X²) ) .

Note that a Gaussian random variable is completely characterized by its mean and variance. This is not necessarily the case for other types of distributions. Sometimes, the notation X ∼ N(μ, σ²) is used to identify X as being Gaussian with mean μ and variance σ².
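As a sanity check on the density formula (a Python sketch, since the lab itself uses Matlab), the function below evaluates f_X(x) directly; at x = μ_X the density attains its peak value 1/(√(2π) σ_X).

```python
import math

def gaussian_pdf(x, mu=0.0, sigma=1.0):
    """N(mu, sigma^2) density evaluated at x."""
    z = (x - mu) / sigma
    return math.exp(-0.5 * z * z) / (math.sqrt(2.0 * math.pi) * sigma)

peak = gaussian_pdf(0.0)  # peak of the standard normal: 1/sqrt(2*pi)
```

The density is symmetric about its mean, so gaussian_pdf(a) equals gaussian_pdf(-a) for the standard normal.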

Samples of a random variable

Suppose some random experiment may be characterized by a random variable X whose distribution is unknown. For example, suppose we are measuring a deterministic quantity v, but our measurement is subject to a random measurement error ε. We can then characterize the observed value, X, as a random variable, X = v + ε.

If the distribution of X does not change over time, we may gain further insight into X by making several independent observations {X_1, X_2, …, X_N}. These observations X_i, also known as samples, will be independent random variables and have the same distribution F_X(x). In this situation, the X_i's are referred to as i.i.d., for independent and identically distributed. We also sometimes refer to {X_1, X_2, …, X_N} collectively as a sample, or observation, of size N.

Suppose we want to use our observation {X_1, X_2, …, X_N} to estimate the mean and variance of X. Two estimators which should already be familiar to you are the sample mean and sample variance, defined by

μ̂_X = (1/N) ∑_{i=1}^{N} X_i

σ̂_X² = (1/(N−1)) ∑_{i=1}^{N} (X_i − μ̂_X)² .
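The two estimators translate directly into code. The lab asks you to write these yourself in Matlab, so the Python sketch below is only for illustration; note the N − 1 denominator in the sample variance.

```python
def sample_mean(xs):
    """mu_hat = (1/N) * sum of the X_i."""
    return sum(xs) / len(xs)

def sample_variance(xs):
    """sigma_hat^2 = 1/(N-1) * sum of (X_i - mu_hat)^2."""
    m = sample_mean(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
```

For the data [1, 2, 4, 7], the sample mean is 3.5 and the sample variance is 21/3 = 7.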

It is important to realize that these sample estimates are functions of random variables, and are therefore themselves random variables. Therefore we can also talk about the statistical properties of the estimators. For example, we can compute the mean and variance of the sample mean μ̂_X.

E[μ̂_X] = E[ (1/N) ∑_{i=1}^{N} X_i ] = (1/N) ∑_{i=1}^{N} E[X_i] = μ_X

Var[μ̂_X] = Var[ (1/N) ∑_{i=1}^{N} X_i ] = (1/N²) Var[ ∑_{i=1}^{N} X_i ] = (1/N²) ∑_{i=1}^{N} Var[X_i] = σ_X² / N

In both derivations above we have used the i.i.d. assumption. We can also show that E[σ̂_X²] = σ_X².

An estimate â for some parameter a which has the property E[â] = a is said to be an unbiased estimate. An estimator such that Var[â] → 0 as N → ∞ is said to be consistent. These two properties are highly desirable because they imply that if a large number of samples are used, the estimate will be close to the true parameter.
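Both properties can be observed empirically. The Python sketch below (an illustration only; the parameters N and the number of trials are arbitrary choices, not from the lab) repeatedly computes the sample mean of N standard-normal draws: the average of those sample means lands near μ_X = 0, and their spread lands near σ_X²/N.

```python
import random
import statistics

random.seed(0)  # fixed seed so the experiment is reproducible
sigma2, N, trials = 1.0, 25, 4000

# Compute the sample mean of N standard-normal draws, many times over.
means = [statistics.fmean(random.gauss(0.0, 1.0) for _ in range(N))
         for _ in range(trials)]

avg_of_means = statistics.fmean(means)      # near mu_X = 0 (unbiasedness)
var_of_means = statistics.pvariance(means)  # near sigma2 / N = 0.04 (consistency)
```

Increasing N shrinks var_of_means in proportion to 1/N, which is exactly the Var[μ̂_X] = σ_X²/N result derived above.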

Suppose X is a Gaussian random variable with mean 0 and variance 1. Use the Matlab function random or randn to generate 1000 samples of X, denoted as X_1, X_2, …, X_1000. See the online help for the random function. Plot them using the Matlab function plot. We will assume our generated samples are i.i.d.

Write Matlab functions to compute the sample mean and sample variance defined above without using the predefined mean and var functions. Use these functions to compute the sample mean and sample variance of the samples you just generated.

Inlab report

  1. Submit the plot of samples of X .
  2. Submit the sample mean and the sample variance that you calculated. Why are they not equal to the true mean and true variance?

Source:  OpenStax, Purdue digital signal processing labs (ece 438). OpenStax CNX. Sep 14, 2009 Download for free at http://cnx.org/content/col10593/1.4