
The calculations, as in [link], are simple but can be tedious. We have an m-procedure called bayes to perform the calculations easily. The probabilities P(A_i) are put into a matrix PA and the conditional probabilities P(E|A_i) are put into a matrix PEA. The desired probabilities P(A_i|E) and P(A_i|E^c) are calculated and displayed.

Matlab calculations for [link]

>> PEA = [0.10 0.02 0.06];
>> PA = [0.2 0.5 0.3];
>> bayes
Requires input PEA = [P(E|A1) P(E|A2) ... P(E|An)]
 and PA = [P(A1) P(A2) ... P(An)]
Determines PAE = [P(A1|E) P(A2|E) ... P(An|E)]
 and PAEc = [P(A1|Ec) P(A2|Ec) ... P(An|Ec)]
Enter matrix PEA of conditional probabilities  PEA
Enter matrix PA of probabilities  PA
 P(E) = 0.048
    P(E|Ai)     P(Ai)    P(Ai|E)   P(Ai|Ec)
     0.1000    0.2000    0.4167    0.1891
     0.0200    0.5000    0.2083    0.5147
     0.0600    0.3000    0.3750    0.2962
Various quantities are in the matrices PEA, PA, PAE, PAEc, named above

The procedure displays the results in tabular form, as shown. In addition, the various quantities are in the workspace in the matrices named above, so that they may be used in further calculations without recopying.
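For reference, here is a minimal sketch of the calculation bayes performs. This is an assumed reconstruction from the displayed output, not the distributed m-file; the variable names PEA, PA, PAE, PAEc follow the text.

PEA = [0.10 0.02 0.06];            % P(E|Ai)
PA  = [0.2 0.5 0.3];               % P(Ai)
PE   = PEA*PA';                    % P(E) by the law of total probability: 0.048
PAE  = (PEA.*PA)/PE;               % P(Ai|E) by Bayes' rule
PAEc = ((1 - PEA).*PA)/(1 - PE);   % P(Ai|Ec), since P(Ai Ec) = (1 - P(E|Ai))P(Ai)
disp([PEA' PA' PAE' PAEc'])        % the table displayed above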


The following variation of Bayes' rule is applicable in many practical situations.

(CP3*) Ratio form of Bayes' rule

$$\frac{P(A|C)}{P(B|C)} = \frac{P(AC)}{P(BC)} = \frac{P(C|A)}{P(C|B)} \cdot \frac{P(A)}{P(B)}$$

The left-hand member is called the posterior odds, which is the odds after knowledge of the occurrence of the conditioning event. The second fraction in the right-hand member is the prior odds, which is the odds before knowledge of the occurrence of the conditioning event C. The first fraction in the right-hand member is known as the likelihood ratio. It is the ratio of the probabilities (or likelihoods) of C for the two different probability measures P(·|A) and P(·|B).
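As a quick check, the ratio form can be applied to the bayes run above, taking A = A1, B = A2, and C = E; the posterior odds from the table agree (up to display rounding) with the likelihood ratio times the prior odds:

>> (0.10/0.02)*(0.2/0.5)    % likelihood ratio times prior odds: 2
>> 0.4167/0.2083            % posterior odds from the table: 2.0005 (rounding in the display)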

A performance test

As a part of a routine maintenance procedure, a computer is given a performance test. The machine seems to be operating so well that the prior odds it is satisfactory are taken to be ten to one. The test has probability 0.05 of a false positive and 0.01 of a false negative. A test is performed. The result is positive. What are the posterior odds the device is operating properly?

SOLUTION

Let S be the event the computer is operating satisfactorily and let T be the event the test is favorable. The data are P(S)/P(S^c) = 10, P(T|S^c) = 0.05, and P(T^c|S) = 0.01. Then by the ratio form of Bayes' rule

$$\frac{P(S|T)}{P(S^c|T)} = \frac{P(T|S)}{P(T|S^c)} \cdot \frac{P(S)}{P(S^c)} = \frac{0.99}{0.05} \cdot 10 = 198$$

so that P(S|T) = 198/199 = 0.9950.
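The arithmetic may be checked directly in MATLAB (a transcription of the computation above, not part of the bayes procedure):

>> odds = (0.99/0.05)*10    % posterior odds: 198
>> PST = odds/(1 + odds)    % P(S|T) = 198/199 = 0.9950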

The following property serves, in the chapters on "Independence of Events" and "Conditional Independence", to establish a number of important properties for the concepts of independence and conditional independence of events.

(CP4) Some equivalent conditions. If 0 < P(A) < 1 and 0 < P(B) < 1, then

P(A|B) * P(A) iff P(B|A) * P(B) iff P(AB) * P(A)P(B)

and

P(AB) * P(A)P(B) iff P(A^cB^c) * P(A^c)P(B^c) iff P(AB^c) ⋄ P(A)P(B^c)

where * is <, ≤, =, ≥, or >, and ⋄ is >, ≥, =, ≤, or <, respectively.

Because of the role of this property in the theory of independence and conditional independence, we examine the derivation of these results.

VERIFICATION of (CP4)

  1. P(AB) * P(A)P(B) iff P(A|B) * P(A) (divide by P(B); may exchange A and A^c)
  2. P(AB) * P(A)P(B) iff P(B|A) * P(B) (divide by P(A); may exchange B and B^c)
  3. P(AB) * P(A)P(B) iff [P(A) - P(AB^c)] * P(A)[1 - P(B^c)] iff -P(AB^c) * -P(A)P(B^c) iff P(AB^c) ⋄ P(A)P(B^c)
  4. We may use 3 to get P(AB) * P(A)P(B) iff P(AB^c) ⋄ P(A)P(B^c) iff P(A^cB^c) * P(A^c)P(B^c)
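A numerical spot check may help fix the pattern; the values below are chosen for illustration and are not from the text. With P(A) = 0.3, P(B) = 0.4, and P(AB) = 0.1 we have P(AB) < P(A)P(B), so * is < and ⋄ is >:

PA = 0.3; PB = 0.4; PAB = 0.1;     % P(AB) = 0.10 < 0.12 = P(A)P(B)
PAB/PB                             % P(A|B)    = 0.2500 < 0.30 = P(A)
PAB/PA                             % P(B|A)    = 0.3333 < 0.40 = P(B)
PA - PAB                           % P(AB^c)   = 0.2000 > 0.18 = P(A)P(B^c): the ⋄ relation
1 - PA - PB + PAB                  % P(A^cB^c) = 0.4000 < 0.42 = P(A^c)P(B^c)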

A number of important and useful propositions may be derived from these.

  1. P(A|B) + P(A^c|B) = 1, but, in general, P(A|B) + P(A|B^c) ≠ 1.
  2. P(A|B) > P(A) iff P(A|B^c) < P(A).
  3. P(A^c|B) > P(A^c) iff P(A|B) < P(A).
  4. P(A|B) > P(A) iff P(A^c|B^c) > P(A^c).

VERIFICATION — Exercises (see problem set)
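Pending the exercises, the propositions can be spot-checked numerically with the same illustrative values used above:

PA = 0.3; PB = 0.4; PAB = 0.1;
PAB/PB + (PB - PAB)/PB             % P(A|B) + P(A^c|B) = 1, as in proposition 1
PAB/PB + (PA - PAB)/(1 - PB)       % but P(A|B) + P(A|B^c) = 0.5833, not 1
% Proposition 2: P(A|B) = 0.25 < P(A) = 0.3, while P(A|B^c) = 0.3333 > 0.3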

Repeated conditioning

Suppose conditioning by the event C has occurred. Additional information is then received that event D has occurred. We have a new conditioning event CD. There are two possibilities:

  1. Reassign the conditional probabilities: P_C(A) becomes
     $$P_C(A|D) = \frac{P_C(AD)}{P_C(D)} = \frac{P(ACD)}{P(CD)}$$
  2. Reassign the total probabilities: P(A) becomes
     $$P_{CD}(A) = P(A|CD) = \frac{P(ACD)}{P(CD)}$$

Basic result: P_C(A|D) = P(A|CD) = P_D(A|C). Thus repeated conditioning by two events may be done in any order, or may be done in one step. This result extends easily to repeated conditioning by any finite number of events. This result is important in extending the concept of "Independence of Events" to "Conditional Independence". These conditions are important for many problems of probable inference.
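A minimal numerical illustration (the joint probabilities here are assumed for the sketch): with P(ACD) = 0.06, P(CD) = 0.12, and P(C) = 0.4, conditioning in one step or in two steps gives the same value:

PACD = 0.06; PCD = 0.12; PC = 0.4;   % assumed values
PACD/PCD                             % one step:  P(A|CD) = 0.5
(PACD/PC)/(PCD/PC)                   % two steps: PC(AD)/PC(D) = 0.5, the same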

Source:  OpenStax, Applied probability. OpenStax CNX. Aug 31, 2009 Download for free at http://cnx.org/content/col10708/1.6