Inside Collection: Applied Finite Mathematics
Collection by: Rupinder Sekhon

More Probability

Module by: Rupinder Sekhon

Summary: This chapter covers additional principles of probability. After completing this chapter students should be able to: find the probability of a binomial experiment; find the probabilities using Bayes' Formula; find the expected value or payoff in a game of chance; find the probabilities using tree diagrams.

Chapter Overview

In this chapter, you will learn to:

1. Find the probability of a binomial experiment.
2. Find probabilities using Bayes' Formula.
3. Find the expected value or payoff in a game of chance.
4. Find probabilities using tree diagrams.

Binomial Probability

In this section, we will consider problems that involve a sequence of trials, where each trial has only two outcomes, a success or a failure. These trials are independent; that is, the outcome of one does not affect the outcome of any other trial. Furthermore, the probability of success, p, and the probability of failure, 1 - p, remain the same throughout the experiment. These problems are called binomial probability problems. Since these problems were researched by the Swiss mathematician Jacques Bernoulli around 1700, they are also referred to as Bernoulli trials.

We give the following definition:

Binomial Experiment

A binomial experiment satisfies the following four conditions:

1. There are only two outcomes, a success or a failure, for each trial.
2. The same experiment is repeated several times.
3. The trials are independent; that is, the outcome of a particular trial does not affect the outcome of any other trial.
4. The probability of success remains the same for every trial.

The probability model that we are about to investigate will give us the tools to solve many real-life problems like the ones given below.

1. If a coin is flipped 10 times, what is the probability that it will fall heads 3 times?
2. If a basketball player makes 3 out of every 4 free throws, what is the probability that he will make 7 out of 10 free throws in a game?
3. If a medicine cures 80% of the people who take it, what is the probability that among the ten people who take the medicine, 6 will be cured?
4. If a microchip manufacturer claims that only 4% of his chips are defective, what is the probability that among the 60 chips chosen, exactly three are defective?
5. If a telemarketing executive has determined that 15% of the people contacted will purchase the product, what is the probability that among the 12 people who are contacted, 2 will buy the product?

We now consider the following example to develop a formula for finding the probability of k successes in n Bernoulli trials.

Example 1

Problem 1

A baseball player has a batting average of .300. If he bats four times in a game, find the probability that he will have

1. four hits
2. three hits
3. two hits
4. one hit
5. no hits.
Solution

Let us suppose S denotes that the player gets a hit, and F denotes that he does not get a hit.

This is a binomial experiment because it meets all four conditions. First, there are only two outcomes, S or F, for each at-bat. Clearly the experiment is repeated four times. Lastly, if we assume that the player's ability to get a hit does not change from one at-bat to the next, the trials are independent, with a probability of .3 of getting a hit on each trial.

We draw a tree diagram to show all situations.

Let us first find the probability of getting, for example, two hits. We will have to consider the six possibilities SSFF, SFSF, SFFS, FSSF, FSFS, and FFSS, as shown in the above tree diagram. We list the probability of each below.

P(SSFF) = (.3)(.3)(.7)(.7) = (.3)^2(.7)^2

P(SFSF) = (.3)(.7)(.3)(.7) = (.3)^2(.7)^2

P(SFFS) = (.3)(.7)(.7)(.3) = (.3)^2(.7)^2

P(FSSF) = (.7)(.3)(.3)(.7) = (.3)^2(.7)^2

P(FSFS) = (.7)(.3)(.7)(.3) = (.3)^2(.7)^2

P(FFSS) = (.7)(.7)(.3)(.3) = (.3)^2(.7)^2

Since the probability of each of these six outcomes is (.3)^2(.7)^2, the probability of obtaining two successes is 6(.3)^2(.7)^2.

The probability of getting one hit can be obtained in the same way. Since each such outcome has one S and three F's, there are four of them: SFFF, FSFF, FFSF, and FFFS.

And since the probability of each of the four outcomes is (.3)(.7)^3, the probability of getting one hit is 4(.3)(.7)^3.
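The counting argument above can be checked by brute force. The following sketch (not part of the original text) enumerates all 2^4 sequences of hits and misses for four at-bats, assuming independent trials with P(hit) = .3 as in the example, and sums the probabilities of the sequences with exactly two hits:

```python
from itertools import product

# Enumerate every sequence of hits (S) and misses (F) over four at-bats,
# assuming independent trials with P(S) = .3 and P(F) = .7.
p_hit, p_miss = 0.3, 0.7

count = 0
prob_two_hits = 0.0
for seq in product("SF", repeat=4):
    if seq.count("S") == 2:
        count += 1
        prob = 1.0
        for outcome in seq:
            prob *= p_hit if outcome == "S" else p_miss
        prob_two_hits += prob

print(count)                    # 6 sequences: SSFF, SFSF, SFFS, FSSF, FSFS, FFSS
print(round(prob_two_hits, 4))  # 6(.3)^2(.7)^2 = 0.2646
```

The enumeration confirms that all six two-hit sequences carry the same probability, (.3)^2(.7)^2, so they can simply be counted and multiplied.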

The table below lists the probabilities for all cases, and shows a comparison with the binomial expansion of fourth degree. Again, p denotes the probability of success, and q = 1 - p the probability of failure.

Outcome       Four Hits   Three Hits    Two Hits        One Hit       No Hits
Probability   (.3)^4      4(.3)^3(.7)   6(.3)^2(.7)^2   4(.3)(.7)^3   (.7)^4

This gives us the following theorem:

Theorem 1

Binomial Probability Theorem

The probability of obtaining k successes in n independent Bernoulli trials is given by

P(n, k; p) = nCk p^k q^(n-k)
(1)

where p denotes the probability of success and q = 1 - p the probability of failure.
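The theorem translates directly into a short function. The sketch below (the name `binomial_prob` is ours, not the text's) uses Python's `math.comb` for nCk and reproduces the baseball table above:

```python
from math import comb

def binomial_prob(n, k, p):
    """P(n, k; p) = nCk * p^k * q^(n-k): the probability of exactly
    k successes in n independent Bernoulli trials with success probability p."""
    q = 1 - p
    return comb(n, k) * p**k * q**(n - k)

# The baseball example: n = 4 at-bats, p = .3 chance of a hit.
for k in range(4, -1, -1):
    print(k, round(binomial_prob(4, k, 0.3), 4))
# k = 2 gives 6(.3)^2(.7)^2 = 0.2646, matching the table above.
```

Note that the five probabilities sum to 1, as they must, since they are the terms of the expansion of (p + q)^4.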

We use the above formula to solve the following examples.

Example 2

Problem 1

If a coin is flipped 10 times, what is the probability that it will fall heads 3 times?

Solution

Let S denote the event of obtaining a head, and F the event of obtaining a tail.

Clearly, n = 10, k = 3, p = 1/2, and q = 1/2.

Therefore,

b(10, 3; 1/2) = 10C3 (1/2)^3 (1/2)^7 = .1172
(2)

Example 3

Problem 1

If a basketball player makes 3 out of every 4 free throws, what is the probability that he will make 6 out of 10 free throws in a game?

Solution

The probability of making a free throw is 3/4. Therefore, p = 3/4, q = 1/4, n = 10, and k = 6.

Therefore,

b(10, 6; 3/4) = 10C6 (3/4)^6 (1/4)^4 = .1460
(3)

Example 4

Problem 1

If a medicine cures 80% of the people who take it, what is the probability that of the eight people who take the medicine, 5 will be cured?

Solution

Here p = .80, q = .20, n = 8, and k = 5.

b(8, 5; .80) = 8C5 (.80)^5 (.20)^3 = .1468
(4)

Example 5

Problem 1

If a microchip manufacturer claims that only 4% of his chips are defective, what is the probability that among the 60 chips chosen, exactly three are defective?

Solution

If S denotes the event that a chip is defective, and F the event that it is not, then p = .04, q = .96, n = 60, and k = 3.

b(60, 3; .04) = 60C3 (.04)^3 (.96)^57 = .2138
(5)

Example 6

Problem 1

If a telemarketing executive has determined that 15% of the people contacted will purchase the product, what is the probability that among the 12 people who are contacted, 2 will buy the product?

Solution

If S denotes the event that a person buys the product, and F the event that the person does not, then p = .15, q = .85, n = 12, and k = 2.

b(12, 2; .15) = 12C2 (.15)^2 (.85)^10 = .2924

Bayes' Formula

In this section, we will develop and use Bayes' formula to solve an important type of probability problem. Bayes' formula is a method of calculating the conditional probability P(F | E) from P(E | F). The ideas involved here are not new, and most of these problems can be solved using a tree diagram. However, Bayes' formula does provide us with a tool with which we can solve these problems without a tree diagram.

We begin with an example.

Example 7

Problem 1

Suppose you are given two jars. Jar I contains 1 black and 4 white marbles, and Jar II contains 4 black and 6 white marbles. If a jar is selected at random and a marble is chosen,

1. What is the probability that the marble chosen is a black marble?
2. If the chosen marble is black, what is the probability that it came from Jar I?
3. If the chosen marble is black, what is the probability that it came from Jar II?
Solution

Let JI be the event that Jar I is chosen, JII the event that Jar II is chosen, B the event that a black marble is chosen, and W the event that a white marble is chosen.

We illustrate using a tree diagram.

1. The probability that a black marble is chosen is P(B) = 1/10 + 2/10 = 3/10.

2. To find P(JI | B), we use the definition of conditional probability, and we get

P(JI | B) = P(JI ∩ B) / P(B) = (1/10) / (3/10) = 1/3
(6)

3. Similarly, P(JII | B) = P(JII ∩ B) / P(B) = (2/10) / (3/10) = 2/3

In parts b and c, the reader should note that the denominator is the sum of all probabilities of all branches of the tree that produce a black marble, while the numerator is the branch that is associated with the particular jar in question.
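The tree computation for this example can be written out exactly as described: multiply along each branch, add the branches that produce a black marble, and divide. A minimal sketch, using exact fractions so the answers match the text's:

```python
from fractions import Fraction as F

# Two-jar example: each jar is chosen with probability 1/2;
# Jar I holds 1 black of 5 marbles, Jar II holds 4 black of 10.
p_JI, p_JII = F(1, 2), F(1, 2)
p_B_given_JI = F(1, 5)    # black, given Jar I
p_B_given_JII = F(4, 10)  # black, given Jar II

# Denominator: sum of all branches of the tree that produce a black marble.
p_B = p_JI * p_B_given_JI + p_JII * p_B_given_JII

# Numerator: the branch associated with the jar in question.
p_JI_given_B = (p_JI * p_B_given_JI) / p_B
p_JII_given_B = (p_JII * p_B_given_JII) / p_B

print(p_B)            # 3/10
print(p_JI_given_B)   # 1/3
print(p_JII_given_B)  # 2/3
```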

We will soon discover that this is a statement of Bayes' formula .

Let us first visualize the problem.

We are given a sample space S and two mutually exclusive events JI and JII. That is, the two events JI and JII divide the sample space into two parts such that JI ∪ JII = S. Furthermore, we are given an event B that has elements in both JI and JII, as shown in the Venn diagram below.

From the Venn diagram, we can see that

B = (B ∩ JI) ∪ (B ∩ JII)

and

P(B) = P(B ∩ JI) + P(B ∩ JII)

But the product rule gives us

P(B ∩ JI) = P(JI) · P(B | JI)   and   P(B ∩ JII) = P(JII) · P(B | JII)

Substituting into the equation for P(B) above, we get

P(B) = P(JI) · P(B | JI) + P(JII) · P(B | JII)

The conditional probability formula gives us

P(JI | B) = P(JI ∩ B) / P(B)

Therefore,

P(JI | B) = P(JI) · P(B | JI) / P(B)

or,

P(JI | B) = [P(JI) · P(B | JI)] / [P(JI) · P(B | JI) + P(JII) · P(B | JII)]

The last statement is Bayes' Formula for the case where the sample space is divided into two partitions. The following is the generalization of this formula for n partitions.

Let S be a sample space that is divided into n partitions, A1, A2, . . ., An. If E is any event in S, then

P(Ai | E) = P(Ai) P(E | Ai) / [P(A1) P(E | A1) + P(A2) P(E | A2) + · · · + P(An) P(E | An)]
(7)
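The general formula can be sketched as a small function (a sketch of ours; the name `bayes` and the list-based interface are assumptions, not the text's notation):

```python
def bayes(priors, likelihoods, i):
    """P(A_i | E) for a sample space partitioned into A_1, ..., A_n.

    priors[j]      = P(A_j)      -- the partition probabilities
    likelihoods[j] = P(E | A_j)  -- the conditional probabilities of E
    """
    total = sum(p * l for p, l in zip(priors, likelihoods))  # P(E)
    return priors[i] * likelihoods[i] / total

# Two-jar example: priors 1/2 each, P(B | JI) = 1/5, P(B | JII) = 4/10.
print(round(bayes([0.5, 0.5], [0.2, 0.4], 0), 4))  # P(JI | B) = 1/3 = 0.3333
```

The denominator `total` is exactly the sum over all branches of the tree that produce the event E, as noted in the jar example.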

We now consider further examples.

Example 9

Problem 1

A department store buys 50% of its appliances from Manufacturer A, 30% from Manufacturer B, and 20% from Manufacturer C. It is estimated that 6% of Manufacturer A's appliances, 5% of Manufacturer B's appliances, and 4% of Manufacturer C's appliances need repair before the warranty expires. An appliance is chosen at random. If the appliance chosen needed repair before the warranty expired, what is the probability that the appliance was manufactured by Manufacturer A? Manufacturer B? Manufacturer C?

Solution

Let A, B, and C be the events that the appliance is manufactured by Manufacturer A, Manufacturer B, and Manufacturer C, respectively. Further, let R be the event that the appliance needs repair before the warranty expires.

We need to find P(A | R), P(B | R), and P(C | R).

We will do this problem both by using a tree diagram and by using Bayes' formula.

We draw a tree diagram.

The probability P(A | R), for example, is a fraction whose denominator is the sum of the probabilities of all branches of the tree that result in an appliance needing repair before the warranty expires, and whose numerator is the probability of the branch associated with Manufacturer A. P(B | R) and P(C | R) are found in the same way. We list them as follows:

P(A | R) = .030 / (.030 + .015 + .008) = .030/.053 = .566

P(B | R) = .015/.053 = .283 and P(C | R) = .008/.053 = .151.

Alternatively, using Bayes' formula,

P(A | R) = P(A) P(R | A) / [P(A) P(R | A) + P(B) P(R | B) + P(C) P(R | C)]
         = .030 / (.030 + .015 + .008) = .030/.053 = .566
(8)

P(B | R) and P(C | R) can be determined in the same manner.

Example 10

Problem 1

There are five Jacy's department stores in San Jose. The number of employees and the proportion of women employees at each store are given in the table below.

Store Number   Number of Employees   Percent of Women Employees
1              300                   .40
2              150                   .65
3              200                   .60
4              250                   .50
5              100                   .70
Total          1000

If an employee chosen at random is a woman, what is the probability that the employee works at Store 3?

Solution

For k = 1, 2, . . ., 5, let k be the event that the employee works at store k, and let W be the event that the employee is a woman. Since there are a total of 1000 employees at the five stores,

P(1) = .30   P(2) = .15   P(3) = .20   P(4) = .25   P(5) = .10
(9)

Using Bayes' formula,

P(3 | W) = P(3) P(W | 3) / [P(1) P(W | 1) + P(2) P(W | 2) + P(3) P(W | 3) + P(4) P(W | 4) + P(5) P(W | 5)]
         = (.20)(.60) / [(.30)(.40) + (.15)(.65) + (.20)(.60) + (.25)(.50) + (.10)(.70)]
         = .2254
(10)
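One point worth making explicit is that the priors here are derived from the employee counts. The sketch below (variable names are ours) carries out the same computation:

```python
# Example 10 sketch: priors come from the employee counts,
# likelihoods are the proportion of women at each store.
counts = [300, 150, 200, 250, 100]
women = [0.40, 0.65, 0.60, 0.50, 0.70]

total = sum(counts)
priors = [c / total for c in counts]               # P(store k)
denom = sum(p * w for p, w in zip(priors, women))  # P(W), summed over all stores
p_store3_given_W = priors[2] * women[2] / denom    # Bayes' formula for Store 3

print(round(p_store3_given_W, 4))  # 0.2254
```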

Expected Value

An expected gain or loss in a game of chance is called Expected Value. The concept of expected value is closely related to a weighted average. Consider the following situations.

1. Suppose you and your friend play a game that consists of rolling a die. Your friend offers you the following deal: If the die shows any number from 1 to 5, he will pay you the face value of the die in dollars; that is, if the die shows a 4, he will pay you $4. But if the die shows a 6, you will have to pay him $18.

Before you play the game you decide to find the expected value. You analyze as follows.

Since a die will show each number from 1 to 6 with an equal probability of 1/6, your chance of winning $1 is 1/6, of winning $2 is 1/6, and so on up to the face value of 5. But if the die shows a 6, you will lose $18. You write the expected value:

E = $1(1/6) + $2(1/6) + $3(1/6) + $4(1/6) + $5(1/6) - $18(1/6) = -$.50
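The arithmetic above can be checked with a few lines of code, using exact fractions so no rounding intrudes:

```python
from fractions import Fraction as F

# The die game: win the face value for faces 1-5, lose $18 on a 6,
# each face occurring with probability 1/6.
payoffs = [1, 2, 3, 4, 5, -18]
expected = sum(F(x, 6) for x in payoffs)  # each payoff weighted by 1/6

print(expected)  # -1/2, an expected loss of 50 cents per game
```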

This means that every time you play this game, you can expect to lose 50 cents. In other words, if you play this game 100 times, theoretically you will lose $50. Obviously, it is not in your interest to play.

2. Suppose that of the ten quizzes you took in a course, you scored 80 on eight quizzes and 90 on two. You wish to find the average of the ten quizzes. The average is

A = [(80)(8) + (90)(2)] / 10 = (80)(8/10) + (90)(2/10) = 82
(11)

It should be observed that it would be incorrect to take the average of 80 and 90, because you scored 80 on eight quizzes and 90 on only two of them. Therefore, you take a "weighted average" of 80 and 90; that is, the average of 8 parts of 80 and 2 parts of 90, which is 82.

In the first situation, to find the expected value, we multiplied each payoff by the probability of its occurrence, and then added up the amounts calculated for all possible cases. In the second situation, if we consider the test score a payoff, we did the same. This leads us to the following definition.
Definition 1: Expected Value

If an experiment has the following probability distribution,

Payoff        x1      x2      x3      · · ·   xn
Probability   p(x1)   p(x2)   p(x3)   · · ·   p(xn)

then the expected value of the experiment is

Expected Value = x1 p(x1) + x2 p(x2) + x3 p(x3) + · · · + xn p(xn)

Example 11

Problem 1

In a town, 10% of the families have three children, 60% of the families have two children, 20% of the families have one child, and 10% of the families have no children. What is the expected number of children in a family?

Solution

We list the information in the following table.

Number of Children   3     2     1     0
Probability          .10   .60   .20   .10

Expected Value = x1 p(x1) + x2 p(x2) + x3 p(x3) + x4 p(x4)
(12)

E = 3(.10) + 2(.60) + 1(.20) + 0(.10) = 1.7
(13)

So on average, there are 1.7 children in a family.

Example 12

Problem 1

To sell an average house, a real estate broker spends $1200 on advertising expenses. If the house sells in three months, the broker makes $8,000. Otherwise, the broker loses the listing. If there is a 40% chance that the house will sell in three months, what is the expected payoff for the real estate broker?

Solution

The broker makes $8,000 with a probability of .40, but he loses $1200 whether the house sells or not.

E = ($8,000)(.40) - $1200 = $2,000

Alternatively, the broker makes $(8,000 - 1,200) = $6,800 with a probability of .40, but loses $1200 with a probability of .60. Therefore,

E = ($6,800)(.40) - ($1200)(.60) = $2,000.

Example 13

Problem 1

In a town, the attendance at a football game depends on the weather. On a sunny day the attendance is 60,000, on a cold day the attendance is 40,000, and on a stormy day the attendance is 30,000. If for the next football season the weatherman has predicted that 30% of the days will be sunny, 50% of the days will be cold, and 20% of the days will be stormy, what is the expected attendance for a single game?

Solution

Using the expected value formula, we get

E = (60,000)(.30) + (40,000)(.50) + (30,000)(.20) = 44,000
(14)

Example 14

Problem 1

A lottery consists of choosing 6 numbers from a total of 51 numbers. The person who matches all six numbers wins $2 million. If the lottery ticket costs $1, what is the expected payoff?

Solution

Since there are 51C6 = 18,009,460 combinations of six numbers from a total of 51 numbers, the chance of choosing the winning numbers is 1 out of 18,009,460. So the expected payoff is

E = ($2 million)(1/18,009,460) - $1 = -$0.89
(15)

This means that every time a person spends \$1 to buy a ticket, he or she can expect to lose 89 cents.
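The lottery figures are easy to verify with `math.comb`. A minimal sketch:

```python
from math import comb

# Lottery example: match 6 of 51 numbers to win $2,000,000; a ticket costs $1.
ways = comb(51, 6)                    # number of possible 6-number picks
expected = 2_000_000 * (1 / ways) - 1  # win probability times prize, minus cost

print(ways)                # 18009460
print(round(expected, 2))  # -0.89
```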

Probability Using Tree Diagrams

As we have already seen, tree diagrams play an important role in solving probability problems. A tree diagram helps us not only visualize, but also list, all possible outcomes in a systematic fashion. Furthermore, when we list the various outcomes of an experiment and their corresponding probabilities on a tree diagram, we gain a better understanding of when probabilities are multiplied and when they are added. The meanings of the words "and" and "or" become clear when we learn to multiply probabilities horizontally across branches and add probabilities vertically down the tree.

Although tree diagrams are not practical in situations where the number of possible outcomes is large, they are a significant tool for breaking a problem down schematically. We consider some examples that may seem difficult at first, but with the help of a tree diagram they can easily be solved.

Example 15

Problem 1

A person has four keys, and only one fits the lock of a door. What is the probability that the locked door can be unlocked in at most three tries?

Solution

Let U be the event that the door has been unlocked and L the event that the door has not been unlocked. We illustrate with a tree diagram.

The probability of unlocking the door on the first try = 1/4
(16)
The probability of unlocking the door on the second try = (3/4)(1/3) = 1/4
(17)
The probability of unlocking the door on the third try = (3/4)(2/3)(1/2) = 1/4
(18)

Therefore, the probability of unlocking the door in at most three tries=1/4+1/4+1/4=3/4the probability of unlocking the door in at most three tries=1/4+1/4+1/4=3/4 size 12{"the probability of unlocking the door in at most three tries"=1/4+1/4+1/4=3/4} {}
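The three branch probabilities above can be verified exactly with Python's `fractions` module; this is a quick check, not part of the original module.

```python
from fractions import Fraction as F

# Probability of unlocking the door on each of the first three tries
p1 = F(1, 4)                        # succeed on first try
p2 = F(3, 4) * F(1, 3)              # fail first, succeed on second
p3 = F(3, 4) * F(2, 3) * F(1, 2)    # fail twice, succeed on third

print(p1, p2, p3)        # 1/4 1/4 1/4
print(p1 + p2 + p3)      # 3/4
```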

Example 16

Problem 1

A jar contains 3 black and 2 white marbles. We continue to draw marbles one at a time until two black marbles are drawn. If a white marble is drawn, the outcome is recorded and the marble is put back in the jar before drawing the next marble. What is the probability that we will get exactly two black marbles in at most three tries?

Solution

We illustrate using a tree diagram.

The probability that we will get two black marbles in the first two tries is listed adjacent to the lowest branch, and it equals 3/10.

The probability of getting first black, second white, and third black = 3/20.

Similarly, the probability of getting first white, second black, and third black = 3/25.

Therefore, the probability of getting exactly two black marbles in at most three tries = 3/10 + 3/20 + 3/25 = 57/100.
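The three branch probabilities can again be checked exactly in Python. Recall the rule of the experiment: a drawn black marble stays out of the jar, while a drawn white marble is put back before the next draw.

```python
from fractions import Fraction as F

# Jar starts with 3 black and 2 white marbles.
p_bb  = F(3, 5) * F(2, 4)             # black, then black
p_bwb = F(3, 5) * F(2, 4) * F(2, 4)   # black, white (replaced), black
p_wbb = F(2, 5) * F(3, 5) * F(2, 4)   # white (replaced), black, black

print(p_bb, p_bwb, p_wbb)       # 3/10 3/20 3/25
print(p_bb + p_bwb + p_wbb)     # 57/100
```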

Example 17

Problem 1

A circuit consists of three resistors, R1, R2, and R3, joined in series. If one of the resistors fails, the circuit stops working. If the probabilities that resistors R1, R2, and R3 will fail are .07, .10, and .08, respectively, what is the probability that at least one of the resistors will fail?

Solution

Clearly, the probability that at least one of the resistors fails = 1 − the probability that none of the resistors fails.

It is quite easy to find the probability of the event that none of the resistors fails. We don't even need to draw a tree because we can visualize the only branch of the tree that assures this outcome.

The probabilities that R1, R2, and R3 will not fail are .93, .90, and .92, respectively. Therefore, the probability that none of the resistors fails = (.93)(.90)(.92) ≈ .77.

Thus, the probability that at least one of them will fail = 1 − .77 = .23.
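The complement computation generalizes to any number of independent components in series; a minimal sketch, assuming the failure probabilities are independent as the example implies:

```python
# Failure probabilities for the three resistors in series
p_fail = [0.07, 0.10, 0.08]

# The circuit survives only if every resistor survives
p_none_fail = 1.0
for p in p_fail:
    p_none_fail *= 1 - p

# At least one failure is the complement of no failures
print(round(1 - p_none_fail, 2))  # 0.23
```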
