If H is the event a hypothetical condition exists and E is the event the evidence occurs, the probabilities available are usually P(H) (or an odds value), P(E|H), and P(E|Hc). What is desired is P(H|E). We simply use Bayes' rule to reverse the direction of conditioning. No conditional independence is involved. Suppose there are two “independent” bits of evidence. Obtaining this evidence may be “operationally” independent, but if the items both relate to the hypothesized condition, then they cannot really be independent. The condition usually assumed is conditional independence, given H, and similarly, given Hc. Several cases representative of practical problems are considered. These ideas are applied to a classification problem. A population consists of members of two subgroups. It is desired to formulate a battery of questions to aid in identifying the subclass membership of randomly selected individuals in the population. The questions are designed so that for each individual the answers are independent, in the sense that the answers to any subset of these questions are not affected by and do not affect the answers to any other subset of the questions. The answers are, however, affected by the subgroup membership. Thus, our treatment of conditional independence suggests that it is reasonable to suppose the answers are conditionally independent, given the subgroup membership. These results are used to determine which subclass is more likely.

Some patterns of probable inference

We are concerned with the likelihood of some hypothesized condition. In general, we have evidence for the condition which can never be absolutely certain. We are forced to assess probabilities (likelihoods) on the basis of the evidence. Some typical examples:

HYPOTHESIS EVIDENCE
Job success Personal traits
Presence of oil Geological structures
Operation of a device Physical condition
Market condition Test market condition
Presence of a disease Tests for symptoms

If H is the event the hypothetical condition exists and E is the event the evidence occurs, the probabilities available are usually P(H) (or an odds value), P(E|H), and P(E|Hc). What is desired is P(H|E) or, equivalently, the odds P(H|E)/P(Hc|E). We simply use Bayes' rule to reverse the direction of conditioning.

P(H|E)/P(Hc|E) = [P(E|H)/P(E|Hc)] · [P(H)/P(Hc)]

No conditional independence is involved in this case.
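The odds form of Bayes' rule translates directly into code. The sketch below (Python, with function names of our own choosing, not from the text) multiplies the prior odds by the likelihood ratio and converts the result back to a probability:

```python
def posterior_odds(prior_odds, p_e_given_h, p_e_given_hc):
    """Odds form of Bayes' rule:
    P(H|E)/P(Hc|E) = [P(E|H)/P(E|Hc)] * [P(H)/P(Hc)]."""
    return (p_e_given_h / p_e_given_hc) * prior_odds

def odds_to_probability(odds):
    """Convert odds in favor of H to P(H) = odds / (1 + odds)."""
    return odds / (1 + odds)

# Example: prior odds 2/1, one piece of evidence with
# P(E|H) = 0.95 and P(E|Hc) = 0.10.
post = posterior_odds(2, 0.95, 0.10)   # likelihood ratio 9.5, so odds 19/1
p_h_given_e = odds_to_probability(post)
```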

Independent evidence for the hypothesized condition

Suppose there are two “independent” bits of evidence. Now obtaining this evidence may be “operationally” independent, but if the items both relate to the hypothesized condition, then they cannot really be independent. The condition assumed is usually of the form P(E1|H) = P(E1|HE2): if H occurs, then knowledge of E2 does not affect the likelihood of E1. Similarly, we usually have P(E1|Hc) = P(E1|HcE2). Thus {E1, E2} ci|H and {E1, E2} ci|Hc.

Independent medical tests

Suppose a doctor thinks the odds are 2/1 that a patient has a certain disease. She orders two independent tests. Let H be the event the patient has the disease and E1 and E2 be the events the tests are positive. Suppose the first test has probability 0.1 of a false positive and probability 0.05 of a false negative. The second test has probabilities 0.05 and 0.08 of false positive and false negative, respectively. If both tests are positive, what is the posterior probability that the patient has the disease?

Solution

Assuming {E1, E2} ci|H and ci|Hc, we work first in terms of the odds, then convert to probability.

P(H|E1E2)/P(Hc|E1E2) = [P(H)/P(Hc)] · [P(E1E2|H)/P(E1E2|Hc)] = [P(H)/P(Hc)] · [P(E1|H)P(E2|H)] / [P(E1|Hc)P(E2|Hc)]

The data are

P(H)/P(Hc) = 2, P(E1|H) = 0.95, P(E1|Hc) = 0.10, P(E2|H) = 0.92, P(E2|Hc) = 0.05

Substituting values, we get

P(H|E1E2)/P(Hc|E1E2) = 2 · (0.95 · 0.92)/(0.10 · 0.05) = 1748/5 = 349.6, so that P(H|E1E2) = 1748/1753 = 1 − 5/1753 ≈ 1 − 0.0029
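The calculation can be checked numerically. This short Python sketch (variable names are our own) uses exact rational arithmetic so the fractions 1748/5 and 1748/1753 come out exactly; the key step is that conditional independence given H and given Hc lets the two likelihood ratios multiply:

```python
from fractions import Fraction

# Prior odds and per-test conditional probabilities from the example.
prior_odds = Fraction(2)                                # P(H)/P(Hc) = 2
p1_h, p1_hc = Fraction(95, 100), Fraction(10, 100)      # P(E1|H), P(E1|Hc)
p2_h, p2_hc = Fraction(92, 100), Fraction(5, 100)       # P(E2|H), P(E2|Hc)

# Conditional independence given H and given Hc means the
# likelihood ratios for the two tests simply multiply.
post_odds = prior_odds * (p1_h / p1_hc) * (p2_h / p2_hc)
post_prob = post_odds / (1 + post_odds)

print(post_odds)   # 1748/5
print(post_prob)   # 1748/1753
```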





Source:  OpenStax, Applied probability. OpenStax CNX. Aug 31, 2009 Download for free at http://cnx.org/content/col10708/1.6
