
Estimates for identical parameters are heavily dependent on the assumed underlying probability densities. To understand this sensitivity better, consider the following variety of problems, each of which asks for estimates of quantities related to variance. Determine the bias and consistency in each case.

Compute the maximum a posteriori and maximum likelihood estimates of $\theta$ based on $L$ statistically independent observations of a Maxwellian random variable $r$:
$$p_{r|\theta}(r|\theta) = \sqrt{\frac{2}{\pi}}\,\theta^{-3/2}\,r^2\,e^{-r^2/2\theta},\qquad r \ge 0,\ \theta > 0.$$
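As a numerical sanity check on the ML half of this problem: zeroing the derivative of the Maxwellian log-likelihood suggests the candidate $\hat\theta_{\text{ML}} = \frac{1}{3L}\sum_l r_l^2$. The Python sketch below (all parameter values are assumed for illustration) samples Maxwellian data as the norm of a 3-D Gaussian vector, which matches the density above, and verifies that the candidate hovers near the true $\theta$.

```python
import numpy as np

# Minimal Monte Carlo check (assumed values: theta = 2.0, L = 1000).
# A Maxwellian variable is the norm of a 3-D Gaussian vector whose
# components have variance theta.
rng = np.random.default_rng(0)
theta, L, trials = 2.0, 1000, 500

estimates = np.empty(trials)
for t in range(trials):
    r = np.linalg.norm(rng.normal(0.0, np.sqrt(theta), size=(L, 3)), axis=1)
    # Candidate ML estimate from zeroing the log-likelihood derivative:
    # theta_hat = sum(r^2) / (3 L).
    estimates[t] = np.sum(r**2) / (3 * L)

print("mean of theta_hat:", estimates.mean())   # should hover near theta
print("spread across trials:", estimates.std())
```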

Find the maximum a posteriori estimate of the variance $\sigma^2$ from $L$ statistically independent observations having the exponential density
$$p_r(r) = \frac{1}{\sqrt{\sigma^2}}\,e^{-r/\sqrt{\sigma^2}},\qquad r \ge 0,$$
where the variance is uniformly distributed over the interval $[0, \sigma_{\max}^2]$.
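A sketch, not the worked solution: with a flat prior on $\sigma^2$, the posterior peak coincides with the likelihood peak clipped into $[0, \sigma_{\max}^2]$, and for this density the unconstrained likelihood peak falls at the squared sample mean. The snippet below uses assumed values for $\sigma$, $L$, and $\sigma_{\max}^2$.

```python
import numpy as np

# Sketch of the MAP estimate under the stated uniform prior (assumed
# values: sigma = 1.5, L = 200, prior upper limit sigma2_max = 4.0).
rng = np.random.default_rng(1)
sigma, L, sigma2_max = 1.5, 200, 4.0

r = rng.exponential(scale=sigma, size=L)   # mean sigma, variance sigma^2
# With a flat prior on sigma^2, MAP reduces to clipping the unconstrained
# likelihood peak (the squared sample mean here) into [0, sigma2_max].
sigma2_map = min(np.mean(r)**2, sigma2_max)
print("MAP estimate of the variance:", sigma2_map)
```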

Find the maximum likelihood estimate of the variance of $L$ identically distributed, but dependent, Gaussian random variables. Here, the covariance matrix is written $\mathbf{K}_r = \sigma^2 \widetilde{\mathbf{K}}_r$, where the normalized covariance matrix has trace $\operatorname{tr}[\widetilde{\mathbf{K}}_r] = L$.
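Zeroing the derivative of the Gaussian log-likelihood with respect to $\sigma^2$ suggests the candidate $\hat\sigma^2 = \mathbf{r}^T \widetilde{\mathbf{K}}_r^{-1}\mathbf{r}/L$. The sketch below checks this numerically against an assumed AR(1)-style normalized covariance, chosen because its unit diagonal makes its trace equal $L$ as the problem requires.

```python
import numpy as np

# Check of the candidate ML form sigma2_hat = r' Ktilde^{-1} r / L
# (assumed setup: L = 50, AR(1)-style normalized covariance with unit
# diagonal so tr(Ktilde) = L, and true sigma^2 = 3.0).
rng = np.random.default_rng(2)
L, sigma2, rho = 50, 3.0, 0.8

idx = np.arange(L)
Ktilde = rho ** np.abs(idx[:, None] - idx[None, :])   # unit diagonal
chol = np.linalg.cholesky(sigma2 * Ktilde)
Kinv = np.linalg.inv(Ktilde)

trials = 2000
est = np.empty(trials)
for t in range(trials):
    r = chol @ rng.standard_normal(L)          # r ~ N(0, sigma2 * Ktilde)
    est[t] = r @ Kinv @ r / L

print("mean of sigma2_hat:", est.mean())       # should land near sigma2
```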

Imagine yourself idly standing on the corner in a large city when you note the serial number of a passing beer truck. Because you are idle, you wish to estimate (guess may be more accurate here) how many beer trucks the city has from this single observation.

Making appropriate assumptions, the beer truck's number is drawn from a uniform probability density ranging between zero and some unknown upper limit. Find the maximum likelihood estimate of the upper limit.

Show that this estimate is biased.

In one of your extraordinarily idle moments, you observe throughout the city $L$ beer trucks. Assuming them to be independent observations, now what is the maximum likelihood estimate of the total?

Is this estimate of the total biased? Asymptotically biased? Consistent?
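A simulation sketch of this serial-number problem (assumed true total $N = 100$, continuous uniform model): a natural candidate estimate is the largest observed number, whose mean for a continuous uniform is $\frac{L}{L+1}N$, so the downward bias $-N/(L+1)$ shrinks as $L$ grows.

```python
import numpy as np

# Bias of the max-based estimate of the uniform upper limit
# (assumed values: true total N = 100 beer trucks).
rng = np.random.default_rng(3)
N, trials = 100.0, 5000

for L in (1, 5, 50, 500):
    est = rng.uniform(0.0, N, size=(trials, L)).max(axis=1)
    # E[max] = L/(L+1) * N, so the bias -N/(L+1) vanishes with L:
    # biased for finite L, asymptotically unbiased, and consistent.
    print(f"L={L:4d}  mean estimate={est.mean():7.2f}  bias={est.mean()-N:6.2f}")
```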

We make $L$ observations $r_1, \ldots, r_L$ of a parameter $\theta$ corrupted by additive noise ($r_l = \theta + n_l$). The parameter $\theta$ is a Gaussian random variable [$\theta \sim \mathcal{N}(0, \sigma_\theta^2)$] and the $n_l$ are statistically independent Gaussian random variables [$n_l \sim \mathcal{N}(0, \sigma_n^2)$].

Find the MMSE estimate of $\theta$.

Find the maximum a posteriori estimate of $\theta$.

Compute the resulting mean-squared error for each estimate.
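For jointly Gaussian $\theta$ and $r_l$, the posterior is Gaussian, so the MAP estimate coincides with the MMSE (posterior-mean) estimate. The Monte Carlo sketch below (assumed values for $\sigma_\theta^2$, $\sigma_n^2$, $L$) compares the empirical squared error of the candidate estimate $\hat\theta = \sigma_\theta^2 \sum_l r_l / (L\sigma_\theta^2 + \sigma_n^2)$ against the closed-form error $\sigma_\theta^2\sigma_n^2/(L\sigma_\theta^2+\sigma_n^2)$.

```python
import numpy as np

# Monte Carlo check of the candidate MMSE form and its error variance
# (assumed values: sigma_theta^2 = 2.0, sigma_n^2 = 1.0, L = 10).
rng = np.random.default_rng(4)
s2_theta, s2_n, L, trials = 2.0, 1.0, 10, 20000

theta = rng.normal(0.0, np.sqrt(s2_theta), size=trials)
r = theta[:, None] + rng.normal(0.0, np.sqrt(s2_n), size=(trials, L))

# Posterior mean for jointly Gaussian theta and r; MAP coincides with
# MMSE here because the Gaussian posterior peaks at its mean.
theta_hat = s2_theta * r.sum(axis=1) / (L * s2_theta + s2_n)

mse_empirical = np.mean((theta_hat - theta)**2)
mse_predicted = s2_theta * s2_n / (L * s2_theta + s2_n)
print(mse_empirical, mse_predicted)   # the two should agree closely
```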

Consider an alternate procedure based on the same observations $r_l$. Using the MMSE criterion, we estimate $\theta$ immediately after each observation. This procedure yields the sequence of estimates $\hat\theta_1(r_1), \hat\theta_2(r_1, r_2), \ldots, \hat\theta_L(r_1, \ldots, r_L)$. Express $\hat\theta_l$ as a function of $\hat\theta_{l-1}$, $\sigma_{l-1}^2$, and $r_l$. Here, $\sigma_l^2$ denotes the variance of the estimation error of the $l$th estimate. Show that
$$\frac{1}{\sigma_l^2} = \frac{1}{\sigma_\theta^2} + \frac{l}{\sigma_n^2}.$$
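One way to realize the sequential procedure is the scalar innovation-gain recursion sketched below (a sketch under the assumed seed $\hat\theta_0 = 0$, $\sigma_0^2 = \sigma_\theta^2$, not a unique answer). The error-variance update telescopes to the stated identity, and the final sequential estimate matches the batch MMSE estimate.

```python
import numpy as np

# Sequential MMSE update, seeded with theta_hat_0 = 0 and
# sigma_0^2 = sigma_theta^2 (values carried over from the sketch above).
s2_theta, s2_n, L = 2.0, 1.0, 10
rng = np.random.default_rng(5)
theta = rng.normal(0.0, np.sqrt(s2_theta))
r = theta + rng.normal(0.0, np.sqrt(s2_n), size=L)

theta_hat, s2 = 0.0, s2_theta
for l in range(L):
    gain = s2 / (s2 + s2_n)                    # weight on the innovation
    theta_hat += gain * (r[l] - theta_hat)     # new estimate from old one
    s2 = s2 * s2_n / (s2 + s2_n)               # error-variance update

# The variance recursion 1/s2_new = 1/s2_old + 1/s2_n telescopes:
print(1.0 / s2, 1.0 / s2_theta + L / s2_n)     # equal
batch = s2_theta * r.sum() / (L * s2_theta + s2_n)
print(theta_hat, batch)                        # sequential == batch MMSE
```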

Although the maximum likelihood estimation procedure was not clearly defined until early in the 20th century, Gauss showed in 1805 that the Gaussian density (it wasn't called the Gaussian density in 1805; this result is one of the reasons why it is) was the sole density for which the maximum likelihood estimate of the mean equaled the sample average. Let $r_0, \ldots, r_{L-1}$ be a sequence of statistically independent, identically distributed random variables.

What equation defines the maximum likelihood estimate $\hat m_{\text{ML}}$ of the mean $m$ when the common probability density function of the data has the form $p(r - m)$?

The sample average is, of course, $\frac{1}{L}\sum_{l} r_l$. Show that it minimizes the mean-squared error $\sum_{l} (r_l - m)^2$.
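A one-line check of this minimization: differentiating the squared-error sum with respect to $m$ and setting the result to zero,
$$\frac{d}{dm}\sum_{l}(r_l - m)^2 = -2\sum_{l}(r_l - m) = 0 \quad\Longrightarrow\quad m = \frac{1}{L}\sum_{l} r_l,$$
and the second derivative $2L > 0$ confirms that this stationary point is a minimum.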

Equating the sample average to $\hat m_{\text{ML}}$, combine this equation with the maximum likelihood equation to show that the Gaussian density uniquely satisfies the equations.

Because both equations equal 0, they can be equated. Use the fact that they must hold for all $L$ to derive the result. Gauss thus showed that mean-squared error and the Gaussian density were closely linked, presaging ideas from modern robust estimation theory.
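A sketch of the argument being asked for: writing $g(x) = p'(x)/p(x)$, the two conditions
$$\sum_{l=0}^{L-1} g(r_l - \hat m) = 0 \qquad\text{and}\qquad \sum_{l=0}^{L-1} (r_l - \hat m) = 0$$
must hold for every data set and every $L$, which forces $g(x) = c\,x$ for some constant $c < 0$; integrating gives $\ln p(x) = c\,x^2/2 + \text{const}$, which is the Gaussian density.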

Source: OpenStax, Statistical Signal Processing. OpenStax CNX, Dec. 5, 2011. Download for free at http://cnx.org/content/col11382/1.1