
Questions or comments concerning this laboratory should be directed to Prof. Charles A. Bouman, School of Electrical and Computer Engineering, Purdue University, West Lafayette IN 47907; (765) 494-0340; bouman@ecn.purdue.edu

Introduction

Many of the phenomena that occur in nature have uncertainty and are best characterized statistically as random processes. For example, the thermal noise in electronic circuits, radar detection, and games of chance are best modeled and analyzed in terms of statistical averages.

This lab will cover some basic methods of analyzing random processes. "Random Variables" reviews some basic definitions and terminology associated with random variables, observations, and estimation. "Estimating the Cumulative Distribution Function" investigates a common estimate of the cumulative distribution function. "Generating Samples from a Given Distribution" discusses the problem of transforming a random variable so that it has a given distribution, and lastly, "Estimating the Probability Density Function" illustrates how the histogram may be used to estimate the probability density function.

Note that this lab assumes an introductory background in probability theory. Some review is provided, but it is not feasible to develop the theory in detail. A secondary reference such as [link] is strongly encouraged.

Random variables

The following section contains an abbreviated review of some of the basic definitions associated with random variables. Then we will discuss the concept of an observation of a random event, and introduce the notion of an estimator.

Basic definitions

A random variable is a function that maps a set of possible outcomes of a random experiment into a set of real numbers. The probability of an event can then be interpreted as the probability that the random variable will take on a value in a corresponding subset of the real line. This allows a fully numerical approach to modeling probabilistic behavior.
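As a concrete illustration (not part of the original lab), the short Python sketch below maps the outcomes of a coin-flip experiment onto the real line and estimates an event probability as the fraction of observations that land in a subset of the real line. NumPy, the seed, and the variable names are assumptions made here for illustration.

import numpy as np

rng = np.random.default_rng(0)

# Random experiment: flip a fair coin N times (illustrative choice).
# The random variable X maps the outcome "heads" to 1.0 and "tails" to 0.0.
N = 10000
outcomes = rng.choice(["heads", "tails"], size=N)
X = np.where(outcomes == "heads", 1.0, 0.0)

# The event "heads" corresponds to X falling in the subset (0.5, 1.5];
# estimate its probability by the fraction of observations in that interval.
p_heads = np.mean((X > 0.5) & (X <= 1.5))
print(p_heads)   # should be close to 0.5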

A very important function used to characterize a random variable is the cumulative distribution function (CDF), defined as

$$F_X(x) = P(X \leq x), \qquad x \in (-\infty, \infty).$$

Here, $X$ is the random variable, and $F_X(x)$ is the probability that $X$ will take on a value in the interval $(-\infty, x]$. It is important to realize that $x$ is simply a dummy variable for the function $F_X(x)$, and is therefore not random at all.
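As a quick numerical illustration of this definition, the sketch below estimates $F_X(x)$ at a single point by the fraction of observations satisfying $X \leq x$, and compares it with the known CDF. The use of a standard normal random variable, NumPy, and SciPy are assumptions for illustration, not part of the original lab (a formal treatment of this estimator appears in the section "Estimating the Cumulative Distribution Function").

import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
samples = rng.standard_normal(1000)   # observations of a standard normal X

# F_X(x) = P(X <= x): estimate it by the fraction of observations <= x.
x = 0.5
F_hat = np.mean(samples <= x)
print(F_hat, norm.cdf(x))   # empirical estimate vs. the true CDF value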

The derivative of the cumulative distribution function, if it exists, is known as the probability density function, denoted as $f_X(x)$. By the fundamental theorem of calculus, the probability density has the following property:

$$\int_{t_0}^{t_1} f_X(x)\,dx = F_X(t_1) - F_X(t_0) = P(t_0 < X \leq t_1).$$

Since the probability that $X$ lies in the interval $(-\infty, \infty)$ equals one, the entire area under the density function must also equal one.
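The following minimal sketch checks both of these properties numerically for one example density. The choice of the standard normal density and the SciPy routines are assumptions made here for illustration.

import numpy as np
from scipy.stats import norm
from scipy.integrate import quad

t0, t1 = -1.0, 2.0

# Integrate the density over (t0, t1] and compare with F_X(t1) - F_X(t0),
# which equals P(t0 < X <= t1).
area, _ = quad(norm.pdf, t0, t1)
print(area, norm.cdf(t1) - norm.cdf(t0))   # the two values agree

# The entire area under the density function must equal one.
total, _ = quad(norm.pdf, -np.inf, np.inf)
print(total)   # approximately 1.0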

Expectations are fundamental quantities associated with random variables. The expected value of some function of $X$, call it $g(X)$, is defined by the following:

$$E[g(X)] = \int_{-\infty}^{\infty} g(x)\, f_X(x)\,dx \quad \text{(for $X$ continuous)}$$
$$E[g(X)] = \sum_{x=-\infty}^{\infty} g(x)\, P(X = x) \quad \text{(for $X$ discrete)}$$

Note that the expected value of $g(X)$ is a deterministic number. Note also that, due to the properties of integration, expectation is a linear operator.
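To make the definition concrete, the sketch below approximates $E[g(X)]$ for one example by a sample mean, compares it with the defining integral, and checks linearity. The standard normal distribution, the choice $g(x) = x^2$, and the NumPy/SciPy routines are assumptions for illustration only.

import numpy as np
from scipy.stats import norm
from scipy.integrate import quad

rng = np.random.default_rng(2)
X = rng.standard_normal(100000)   # observations of a standard normal X

def g(x):
    return x ** 2   # example choice of g

# Sample-mean approximation of E[g(X)] ...
mc_estimate = np.mean(g(X))

# ... versus the defining integral of g(x) f_X(x) over the real line.
integral, _ = quad(lambda x: g(x) * norm.pdf(x), -np.inf, np.inf)
print(mc_estimate, integral)   # both close to 1.0 for g(x) = x^2

# Linearity of expectation: E[a*g(X) + b] = a*E[g(X)] + b.
a, b = 3.0, 2.0
print(np.mean(a * g(X) + b), a * mc_estimate + b)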

Source: OpenStax, Purdue digital signal processing labs (ece 438). OpenStax CNX, Sep. 14, 2009. Download for free at http://cnx.org/content/col10593/1.4