Week 13

Topics

This is the second of three weeks devoted to elementary probability and statistics. This week, we will cover the second half of chapter 8.

Overview of the Sections

Section 8-3 deals with conditional probability. The conditional probability P(A|B) (pronounced "P of A given B") is the probability that event A happens given that event B is known to happen. The conditional probability may or may not be the same as P(A).

Example: If you roll one die, the probability that you will get an even number is 1/2. If you know that the number is greater than 3, the probability of rolling an even number changes to 2/3 (there are three possible outcomes left [4, 5, and 6], and two of them are even). This is the conditional probability of rolling an even number given that the number is greater than 3.

You can calculate a conditional probability either with the Conditional Probability formula (1) in Section 8-3, or by drawing a tree diagram or Venn diagram and reading off the probabilities.
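To make this concrete, here is a small Python sketch (my own illustration, not from the textbook) that applies the conditional probability formula, P(A|B) = P(A and B)/P(B), to the die example above by listing the outcomes and counting.

    from fractions import Fraction

    # Sample space for one roll of a fair die; all six outcomes are equally likely.
    outcomes = [1, 2, 3, 4, 5, 6]

    A = {n for n in outcomes if n % 2 == 0}   # event A: the number is even
    B = {n for n in outcomes if n > 3}        # event B: the number is greater than 3

    def P(event):
        # Equally likely outcomes: probability = (# favorable) / (# total)
        return Fraction(len(event), len(outcomes))

    # Conditional probability formula: P(A|B) = P(A and B) / P(B)
    P_A_given_B = P(A & B) / P(B)

    print(P(A))          # 1/2
    print(P_A_given_B)   # 2/3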

Two events A and B are independent if P(A|B) is the same as P(A). This also implies that P(B|A) is the same as P(B). Whether B happens or not has no influence on A, and vice versa.
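As a quick check with an example of my own (not from the book): roll one die, let A be "the number is even" and B be "the number is at most 4". Then P(A|B) = 2/4 = 1/2 = P(A), so A and B are independent. A short sketch that verifies this by counting:

    from fractions import Fraction

    outcomes = [1, 2, 3, 4, 5, 6]
    A = {n for n in outcomes if n % 2 == 0}   # even
    B = {n for n in outcomes if n <= 4}       # at most 4

    def P(event):
        return Fraction(len(event), len(outcomes))

    print(P(A & B) / P(B) == P(A))    # True -> A and B are independent
    print(P(A & B) == P(A) * P(B))    # True -> the equivalent product test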

Section 8-4 deals with Bayes' formula. This formula lets you go back and forth between P(A|B) and P(B|A). The book explains the formula, but doesn't do much with it. By the time you have drawn a tree diagram to go along with a problem, you can read off what you need without using the formula.
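Still, if you want to see what going back and forth means, here is a sketch with hypothetical numbers (made up for illustration, not a textbook problem). Starting from P(A|B), P(A|B'), and P(B), it computes the reversed probability P(B|A), exactly the two branches you would read off a tree diagram.

    from fractions import Fraction

    # Hypothetical numbers for illustration only:
    # B = "has the condition", A = "test comes back positive".
    P_B            = Fraction(1, 100)    # P(B)
    P_A_given_B    = Fraction(95, 100)   # P(A|B)
    P_A_given_notB = Fraction(10, 100)   # P(A|B')

    # Total probability of A (the two branches of the tree diagram):
    P_A = P_A_given_B * P_B + P_A_given_notB * (1 - P_B)

    # Bayes' formula: reverse the conditioning to get P(B|A).
    P_B_given_A = P_A_given_B * P_B / P_A

    print(P_B_given_A)   # 19/217, roughly 0.088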

Section 8-5 is very important if you plan to take a statistics course later. It introduces some basic concepts you will see over and over again later.

A random variable is a mapping that assigns a number to each elementary outcome in a sample space. This produces a new sample space consisting of numbers, with a probability distribution inherited from the original. The book uses a capital P for the probability distribution on the original space, and a small p for the probability distribution on the new space. The probability of each outcome in the new sample space is obtained by adding up the probabilities of all outcomes in the original sample space that get mapped into that outcome.

Example: You roll two dice. The original sample space has 36 elements (1,1),...,(6,6). Let X be the sum of the two numbers. That is a random variable. The new sample space consists of the numbers from 2 to 12. The original outcomes (1,4), (2,3), (3,2) and (4,1) are the ones that get mapped into 5, so p(X=5) = P(1,4) + P(2,3) + P(3,2) + P(4,1) = 4/36.

Example: Let Y be the larger of the two numbers you rolled. That is a different random variable on the same original sample space. This one has a new sample space of {1,...,6}, and p(Y=5) = P(1,5) + P(5,1) + P(2,5) + P(5,2) + P(3,5) + P(5,3) + P(4,5) + P(5,4) + P(5,5) = 9/36.
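Both of these probabilities can be checked by brute force. The Python sketch below (my own illustration) lists all 36 original outcomes and adds up the probabilities of the outcomes that get mapped to each value of X and of Y.

    from fractions import Fraction
    from itertools import product
    from collections import Counter

    # Original sample space: 36 equally likely outcomes (1,1), ..., (6,6).
    rolls = list(product(range(1, 7), repeat=2))

    def distribution(random_variable):
        # Add up the original probabilities of all outcomes mapped to each value.
        counts = Counter(random_variable(a, b) for a, b in rolls)
        return {value: Fraction(n, 36) for value, n in sorted(counts.items())}

    p_X = distribution(lambda a, b: a + b)       # X = sum of the two numbers
    p_Y = distribution(lambda a, b: max(a, b))   # Y = larger of the two numbers

    print(p_X[5])   # 4/36 = 1/9
    print(p_Y[5])   # 9/36 = 1/4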

So what's the point? The point is that the new sample space consists of numbers, and with numbers you can do calculations. If you draw a card from a deck, your sample space consists of 52 playing cards. How would you calculate an average, for example? With random variables, your sample space consists of numbers, and now you can add them, average them, whatever. Doing calculations with random variables is what probabilists and statisticians do most of the time.

When you take a statistics course, that is mostly what you will be doing, too, but that is a later course. In Math 150 there is only one calculation we do with random variables: computing the expected value or expectation, written E(X). This is the average outcome over many repetitions of the experiment. The formula is in the definition of Expected Value of a Random Variable in Section 8-5.
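For instance, applying that formula (E(X) is the sum of x times p(x) over all values x) to the dice-sum example above gives E(X) = 7. A minimal sketch:

    from fractions import Fraction
    from itertools import product
    from collections import Counter

    rolls = list(product(range(1, 7), repeat=2))
    counts = Counter(a + b for a, b in rolls)            # X = sum of the two dice
    p = {x: Fraction(n, 36) for x, n in counts.items()}

    # Expected value: E(X) = sum over all values x of x * p(x)
    E_X = sum(x * p_x for x, p_x in p.items())
    print(E_X)   # 7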

For example, if the experiment consists of playing a casino game, and the random variable X is the amount of money you win or lose, the expected value is the average amount you win (or lose) per play. Most likely this will be a negative amount; otherwise the casino would go broke.
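One standard illustration (not a textbook problem): a $1 straight-up bet in American roulette wins $35 with probability 1/38 and loses the $1 with probability 37/38, so on average you lose about 5 cents per play.

    from fractions import Fraction

    # $1 straight-up bet in American roulette (38 pockets, pays 35 to 1).
    # X = net winnings per play.
    values_and_probs = [
        (35, Fraction(1, 38)),    # your number comes up
        (-1, Fraction(37, 38)),   # any other pocket
    ]

    E_X = sum(x * p for x, p in values_and_probs)
    print(E_X, float(E_X))   # -1/19, about -0.0526: the house edge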

Assignments

Read the textbook and do the homework assignment HW 10.


Last Updated: Wednesday, August 5, 2015