
In the examples considered so far, it has been reasonable to assume conditional independence given an event C, and conditional independence given the complementary event. But there are cases in which the effect of the conditioning event is asymmetric. We consider several examples.

  • Two students are working on a term paper. They work quite separately. They both need to borrow a certain book from the library. Let C be the event the library has two copies available. If A is the event the first completes on time and B the event the second is successful, then it seems reasonable to assume {A, B} ci |C. However, if only one book is available, then the two conditions would not be conditionally independent. In general P(B|AC^c) < P(B|C^c), since if the first student completes on time, then he or she must have been successful in getting the book, to the detriment of the second.
  • If the two contractors of the example above both need material which may be in scarce supply, then successful completion would be conditionally independent, given an adequate supply, whereas the completions would not be conditionally independent, given a short supply.
  • Two students in the same course take an exam. If they prepared separately, the events of their getting good grades should be conditionally independent. If they study together, then the likelihoods of good grades would not be conditionally independent. With neither cheating nor collaborating on the test itself, if one does well, the other should also.

Since conditional independence is ordinary independence with respect to a conditional probability measure, it should be clear how to extend the concept to larger classes of sets.

Definition. A class {A_i : i ∈ J}, where J is an arbitrary index set, is conditionally independent, given event C, denoted {A_i : i ∈ J} ci |C, iff the product rule holds for every finite subclass of two or more.

As in the case of simple independence, the replacement rule extends.

The replacement rule

If the class { A i : i J } ci | C , then any or all of the events A i may be replaced by their complements and still have a conditionally independent class.
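The replacement rule can be checked numerically on a small example. The joint distribution below is a hypothetical construction (not from the text): A and B are made conditionally independent given C by design, and we then verify that the product rule, given C, survives replacing B by its complement.

```python
from itertools import product

# Hypothetical joint distribution over atoms (a, b, c):
# A and B are conditionally independent given C (and given C^c) by construction.
PC = 0.4
pA = {True: 0.8, False: 0.3}   # P(A | C), P(A | C^c)
pB = {True: 0.6, False: 0.5}   # P(B | C), P(B | C^c)

def p(a, b, c):
    """Probability of the atom (a, b, c)."""
    pc = PC if c else 1 - PC
    pa = pA[c] if a else 1 - pA[c]
    pb = pB[c] if b else 1 - pB[c]
    return pc * pa * pb

def cond(event, given):
    """P(event | given); events are predicates on atoms (a, b, c)."""
    atoms = list(product([True, False], repeat=3))
    num = sum(p(*w) for w in atoms if event(*w) and given(*w))
    den = sum(p(*w) for w in atoms if given(*w))
    return num / den

C = lambda a, b, c: c
# Product rule for {A, B} ci | C ...
assert abs(cond(lambda a, b, c: a and b, C)
           - cond(lambda a, b, c: a, C) * cond(lambda a, b, c: b, C)) < 1e-12
# ... still holds with B replaced by its complement (the replacement rule).
assert abs(cond(lambda a, b, c: a and not b, C)
           - cond(lambda a, b, c: a, C) * cond(lambda a, b, c: not b, C)) < 1e-12
```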

The use of independence techniques

Since conditional independence is ordinary independence with respect to a conditional probability measure, we may use independence techniques in the solution of problems. We consider two types of problems: an inference problem and a conditional Bernoulli sequence.

Use of independence techniques

Sharon is investigating a business venture which she thinks has probability 0.7 of being successful. She checks with five “independent” advisers. If the prospects are sound, the probabilities are 0.8, 0.75, 0.6, 0.9, and 0.8 that the advisers will advise her to proceed; if the venture is not sound, the respective probabilities are 0.75, 0.85, 0.7, 0.9, and 0.7 that the advice will be negative. Given the quality of the project, the advisers are independent of one another in the sense that no one is affected by the others. Of course, they are not independent, for they are all related to the soundness of the venture. We may reasonably assume conditional independence of the advice, given that the venture is sound and also given that the venture is not sound. If Sharon goes with the majority of advisers, what is the probability she will make the right decision?

Solution

If the project is sound, Sharon makes the right choice if three or more of the five advisers are positive. If the venture is unsound, she makes the right choice if three or more of the five advisers are negative. Let H = the event the project is sound, F = the event three or more advisers are positive, G = F^c = the event three or more are negative, and E = the event of the correct decision. Then

P(E) = P(FH) + P(GH^c) = P(F|H)P(H) + P(G|H^c)P(H^c)

Let E_i be the event the i-th adviser is positive. Then P(F|H) is the sum of probabilities of the form P(M_k|H), where the M_k are minterms generated by the class {E_i : 1 ≤ i ≤ 5}. Because of the assumed conditional independence,

P(E_1 E_2^c E_3^c E_4 E_5 | H) = P(E_1|H) P(E_2^c|H) P(E_3^c|H) P(E_4|H) P(E_5|H)

with similar expressions for each P(M_k|H) and P(M_k|H^c). This means that if we want the probability of three or more successes, given H, we can use ckn with the matrix of conditional probabilities. The following MATLAB solution of the investment problem is indicated.

P1 = 0.01*[80 75 60 90 80];
P2 = 0.01*[75 85 70 90 70];
PH = 0.7;
PE = ckn(P1,3)*PH + ckn(P2,3)*(1 - PH)
PE =    0.9255
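The same computation can be sketched in Python. The helper below is my reconstruction of what the toolbox function ckn computes, based on its use here: the probability of k or more successes in independent trials with the given individual probabilities.

```python
def ckn(p, k):
    """P(k or more successes) for independent trials with success probs p.
    (Reconstruction of the toolbox function ckn, inferred from its use.)"""
    dist = [1.0]                       # dist[j] = P(exactly j successes so far)
    for pi in p:
        dist = [(dist[j] if j < len(dist) else 0.0) * (1 - pi)
                + (dist[j - 1] if j >= 1 else 0.0) * pi
                for j in range(len(dist) + 1)]
    return sum(dist[k:])

P1 = [0.80, 0.75, 0.60, 0.90, 0.80]    # P(positive advice | sound)
P2 = [0.75, 0.85, 0.70, 0.90, 0.70]    # P(negative advice | not sound)
PH = 0.7
PE = ckn(P1, 3) * PH + ckn(P2, 3) * (1 - PH)
print(round(PE, 4))                    # 0.9255, agreeing with the MATLAB result
```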

Often a Bernoulli sequence is related to some conditioning event H. In this case it is reasonable to assume the sequence {E_i : 1 ≤ i ≤ n} ci |H and ci |H^c. We consider a simple example.

Test of a claim

A race track regular claims he can pick the winning horse in any race 90 percent of the time. In order to test his claim, he picks a horse to win in each of ten races. There are five horses in each race. If he is simply guessing, the probability of success on each race is 0.2. Consider the trials to constitute a Bernoulli sequence. Let H be the event he is correct in his claim. If S is the number of successes in picking the winners in the ten races, determine P(H|S = k) for various numbers k of correct picks. Suppose it is equally likely that his claim is valid or that he is merely guessing. We assume two conditional Bernoulli trials:

Claim is valid:       Ten trials, probability p = P(E_i|H) = 0.9.

Guessing at random: Ten trials, probability p = P(E_i|H^c) = 0.2.

Let S = number of correct picks in ten trials. Then

P(H|S = k) / P(H^c|S = k) = [P(H) / P(H^c)] · [P(S = k|H) / P(S = k|H^c)],   0 ≤ k ≤ 10

Giving him the benefit of the doubt, we suppose P ( H ) / P ( H c ) = 1 and calculate the conditional odds.

k = 0:10;
Pk1 = ibinom(10,0.9,k);    % Probability of k successes, given H
Pk2 = ibinom(10,0.2,k);    % Probability of k successes, given H^c
OH  = Pk1./Pk2;            % Conditional odds -- assumes P(H)/P(H^c) = 1
e   = OH > 1;              % Selects favorable odds
disp(round([k(e);OH(e)]'))
           6           2      % Needs at least six to have credibility
           7          73      % Seven would be creditable,
           8        2627      % even if P(H)/P(H^c) = 0.1
           9       94585
          10     3405063
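The odds computation translates directly to Python. The function ibinom below is a reimplementation of the toolbox's binomial pmf; note that the binomial coefficients cancel in the ratio, so the odds reduce to (0.9/0.2)^k (0.1/0.8)^(10-k).

```python
from math import comb

def ibinom(n, p, k):
    """Binomial pmf P(S = k) for n trials with success probability p.
    (Reimplementation of the toolbox function ibinom.)"""
    return comb(n, k) * p**k * (1 - p)**(n - k)

n = 10
# Conditional odds for each k, assuming prior odds P(H)/P(H^c) = 1
odds = {k: ibinom(n, 0.9, k) / ibinom(n, 0.2, k) for k in range(n + 1)}
for k, oh in odds.items():
    if oh > 1:                 # keep only the favorable cases
        print(k, round(oh))    # matches the MATLAB table above
```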

Under these assumptions, he would have to pick at least seven correctly to give reasonable validation of his claim.






Source:  OpenStax, Applied probability. OpenStax CNX. Aug 31, 2009 Download for free at http://cnx.org/content/col10708/1.6
