McGraw Hill Integrated II, 2012
4. Simulations

Exercise 22 Page 913

Notice that we use probabilities to calculate an expected value.

A probability measures the chance of a single event occurring, while an expected value is the weighted average of all possible outcomes of an experiment.

Practice makes perfect

To describe the difference between an expected value and a probability, let's begin by comparing the definitions of these two concepts.

Probability: A chance that an event will happen. It must be a number between 0 and 1, inclusive, and is often expressed as a decimal or a percent.

Expected Value: A weighted average of all possible outcomes. It can be any number and does not have to be a possible outcome.
Calculating an expected value therefore involves the probabilities, because the probabilities serve as the weights in the weighted average. Let's consider an example to better illustrate this relationship.
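To make this relationship concrete, here is a minimal Python sketch. It is not part of the textbook exercise; the function name and the example payouts and probabilities are illustrative assumptions. It computes an expected value as a probability-weighted average of outcomes.

# Minimal sketch: an expected value is a probability-weighted average.
# The function name and the example game below are illustrative assumptions,
# not part of the textbook exercise.

def expected_value(outcomes, probabilities):
    """Return the sum of each outcome times its probability (a weighted average)."""
    return sum(x * p for x, p in zip(outcomes, probabilities))

# A hypothetical game that pays $0, $5, or $20 with the given chances.
payouts = [0, 5, 20]
chances = [0.5, 0.4, 0.1]   # each probability is between 0 and 1; they sum to 1

print(expected_value(payouts, chances))   # 4.0 -- not itself a possible payout

Notice that the result, 4.0, is not one of the possible payouts, which illustrates the point from the comparison above: an expected value does not have to be a possible outcome.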

Example

Assume that we roll a die one time.
If the die is fair, then each number in the set of possible outcomes {1, 2, 3, 4, 5, 6} is equally likely to be rolled. Since there are 6 possible outcomes, the probability of rolling any one chosen number is 1/6. To find the expected value of rolling the die, we add the products of each outcome and its corresponding probability of occurring.
Expected value = 1 * 1/6 + 2 * 1/6 + 3 * 1/6 + 4 * 1/6 + 5 * 1/6 + 6 * 1/6

Simplify the right-hand side:
Expected value = 1/6 + 2/6 + 3/6 + 4/6 + 5/6 + 6/6
               = (1 + 2 + 3 + 4 + 5 + 6)/6
               = 21/6
               = 3.5
The expected value of rolling a fair die is 3.5. Note that 3.5 is not itself a possible outcome of a single roll, which again shows that an expected value does not have to be a possible outcome.
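Since this exercise comes from the Simulations chapter, we can also check this value experimentally. The following Python sketch is an assumed setup, not the textbook's method; the function name, number of rolls, and seed are illustrative. It simulates many rolls of a fair die and compares the average result to 3.5.

import random

# Minimal simulation sketch (assumed setup, not the textbook's method):
# roll a fair die many times and compare the average roll to the
# theoretical expected value of 3.5.

def simulate_average_roll(num_rolls=100_000, seed=1):
    rng = random.Random(seed)                                # seeded for repeatability
    total = sum(rng.randint(1, 6) for _ in range(num_rolls))
    return total / num_rolls

print(simulate_average_roll())   # typically prints a value close to 3.5

The more rolls we simulate, the closer the average tends to get to the expected value of 3.5.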