For simple, real-valued random variables, the expectation is the probability-weighted average of the values taken on. It may be viewed as the center of mass for the probability mass distribution on the line.

Introduction

The probability that a real random variable $X$ takes a value in a set $M$ of real numbers is interpreted as the likelihood that the observed value $X(\omega)$ on any trial will lie in $M$. Historically, this idea of likelihood is rooted in the intuitive notion that if the experiment is repeated enough times, the probability is approximately the fraction of times the value of $X$ will fall in $M$. Associated with this interpretation is the notion of the average of the values taken on. We incorporate the concept of mathematical expectation into the mathematical model as an appropriate form of such averages. We begin by studying the mathematical expectation of simple random variables, then extend the definition and properties to the general case. In the process, we note the relationship of mathematical expectation to the Lebesgue integral, which is developed in abstract measure theory. Although we do not develop this theory, which lies beyond the scope of this study, identification of this relationship provides access to a rich and powerful set of properties which have far-reaching consequences in both application and theory.

Expectation for simple random variables

The notion of mathematical expectation is closely related to the idea of a weighted mean, used extensively in the handling of numerical data. Consider the arithmetic average $\bar{x}$ of the following ten numbers: 1, 2, 2, 2, 4, 5, 5, 8, 8, 8, which is given by

$$\bar{x} = \frac{1}{10}(1 + 2 + 2 + 2 + 4 + 5 + 5 + 8 + 8 + 8)$$

Examination of the ten numbers to be added shows that five distinct values are included. One of the ten, or the fraction 1/10 of them, has the value 1; three of the ten, or the fraction 3/10 of them, have the value 2; 1/10 has the value 4; 2/10 have the value 5; and 3/10 have the value 8. Thus, we could write

$$\bar{x} = 0.1 \cdot 1 + 0.3 \cdot 2 + 0.1 \cdot 4 + 0.2 \cdot 5 + 0.3 \cdot 8$$

The pattern in this last expression can be stated in words: multiply each possible value by the fraction of the numbers having that value, then sum these products. The fractions are often referred to as the relative frequencies. A sum of this sort is known as a weighted average.
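The two computations above can be checked numerically. The following sketch (plain Python, using only the standard library) computes the direct arithmetic average and the relative-frequency weighted average for the same ten numbers:

```python
from collections import Counter

data = [1, 2, 2, 2, 4, 5, 5, 8, 8, 8]

# Direct arithmetic average: sum all ten numbers and divide by ten.
direct = sum(data) / len(data)

# Weighted average: multiply each distinct value by its relative frequency.
counts = Counter(data)  # {1: 1, 2: 3, 4: 1, 5: 2, 8: 3}
weighted = sum(t * (f / len(data)) for t, f in counts.items())

print(direct, weighted)  # both equal 4.5
```

Both routes give 4.5, illustrating that the weighted average is just a regrouping of the ordinary sum.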

In general, suppose there are $n$ numbers $\{x_1, x_2, \ldots, x_n\}$ to be averaged, with $m \le n$ distinct values $\{t_1, t_2, \ldots, t_m\}$. Suppose $f_1$ of them have the value $t_1$, $f_2$ have the value $t_2$, $\ldots$, $f_m$ have the value $t_m$. The $f_i$ must add to $n$. If we set $p_i = f_i/n$, then the fraction $p_i$ is called the relative frequency of those numbers in the set which have the value $t_i$, $1 \le i \le m$. The average $\bar{x}$ of the $n$ numbers may be written

$$\bar{x} = \frac{1}{n}\sum_{i=1}^{n} x_i = \sum_{j=1}^{m} t_j p_j$$
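The identity above, that the arithmetic mean equals the relative-frequency weighted sum, holds for any finite list of numbers. A minimal sketch (the helper name `relative_frequencies` is ours, not from the text):

```python
from collections import Counter

def relative_frequencies(xs):
    """Return (t_j, p_j) pairs: distinct values with their relative frequencies."""
    n = len(xs)
    return [(t, f / n) for t, f in sorted(Counter(xs).items())]

xs = [3, 1, 4, 1, 5, 9, 2, 6, 5, 3, 5]
pairs = relative_frequencies(xs)

mean_direct = sum(xs) / len(xs)          # (1/n) * sum of x_i
mean_weighted = sum(t * p for t, p in pairs)  # sum of t_j * p_j

# The p_j sum to 1, and both computations give the same mean.
assert abs(sum(p for _, p in pairs) - 1.0) < 1e-9
print(mean_direct, mean_weighted)
```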

In probability theory, we have a similar averaging process in which the relative frequencies of the various possible values of $X$ are replaced by the probabilities that those values are observed on any trial.

Definition. For a simple random variable $X$ with values $\{t_1, t_2, \ldots, t_n\}$ and corresponding probabilities $p_i = P(X = t_i)$, the mathematical expectation, designated $E[X]$, is the probability-weighted average of the values taken on by $X$. In symbols

$$E[X] = \sum_{i=1}^{n} t_i P(X = t_i) = \sum_{i=1}^{n} t_i p_i$$
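In code, this expectation is the same weighted sum as before, with probabilities in place of relative frequencies. A sketch assuming the distribution is supplied as (value, probability) pairs (the function name and representation are our choice):

```python
def expectation(dist):
    """E[X] for a simple random variable given as (value, probability) pairs."""
    total_p = sum(p for _, p in dist)
    assert abs(total_p - 1.0) < 1e-9, "probabilities must sum to 1"
    return sum(t * p for t, p in dist)

# A fair six-sided die: each face 1..6 has probability 1/6, so E[X] = 21/6 = 3.5.
die = [(k, 1 / 6) for k in range(1, 7)]
print(expectation(die))
```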

Source: OpenStax, Applied probability. OpenStax CNX. Aug 31, 2009. Download for free at http://cnx.org/content/col10708/1.6