# When Do I Add or Multiply in Probability?

Probability is a subject in which it can be quite difficult to see at a glance what method to use. There are many ways in which its ideas can be combined. Two principal ways are “or” and “and”: What is the probability that I draw an ace or a spade? How about an ace and a spade? When do I add probabilities, when do I multiply them, and when do I have to do something else entirely?

Dr. Mitteldorf gave the main ideas in 2001:

When to Add, When to Multiply?

I am studying probability at high school. When do you add and when multiply to compute probability?

He began,

You're looking for rules that will get you through. It's not going to work. In probability more than in any other field, you have to understand the situation, and have a feel for it. Doing lots of problems will help, even if you feel insecure trying to work them at first. It helps especially if you do your best to think about the problems on your own. Try this and that - see what makes sense to you. The calculation is the easy part; knowing when to add and when to multiply is what it's all about.

In his answer, he explains that the probability that two events will both occur (A and B) is typically found by multiplying:

Remember that probabilities are all numbers less than 1. When you multiply them together, they get smaller. When you multiply several together, they can rapidly get a lot smaller. Now think about unlikely events: If it's unlikely that one such event occurs, it's "doubly" unlikely that two will occur. You multiply probabilities together to compute the unlikeliness of this coincidence - two unlikely events both occurring together.
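The shrinking effect he describes is easy to see in numbers. A minimal sketch using independent rolls of a fair die (my own example, not from the original answer):

```python
from fractions import Fraction

# Independent rolls of a fair die: each extra "and" multiplies in
# another factor below 1, so the combined probability shrinks fast.
p_six = Fraction(1, 6)
print(p_six)       # 1/6
print(p_six ** 2)  # 1/36 -- a six AND a second six
print(p_six ** 3)  # 1/216 -- three sixes in a row
```

Exact fractions are used so the shrinking is visible without rounding.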

But the probability that either event will occur (A or B) is typically found by adding:

When you're looking for the probability that two events, A and B, will BOTH occur, the probability of this coincidence is small, and you multiply the separate probabilities of A and B to get a smaller number. When you don't care which happens - either A or B - you can add the separate probabilities to find the probability that one or the other will happen.

So now I've done it. First I told you that there are no rules for telling you when to multiply and when to add; then I gave you a rule that tells you when to multiply and when to add.

He goes on to talk about the issue of mutually exclusive events, and how the rule has to be modified when the events can both happen.

But a full understanding requires more details. Several of those were brought out by the following question in 2012, when I helped with a specific problem by demonstrating all the important ideas:

Independent Outcomes OR (AND?) Dependent Ones

... I am confused about which formula I am supposed to use for this problem:

P(A)*P(B)      or      P(A) + P(B) - P(A and B)?

And why?
...
Then I thought that maybe the formula P(A and B)=P(A)*P(B) only works when there are two separate events, such as roll a 6, and then roll a 5. In the original problem, there is only one event -- namely, whether the ONE employee is a college graduate and has more than 10 years of experience. If this problem were to ask if you pick two college employees, and what is the probability that the first is a college graduate and the second has more than 10 years of experience, then would you use the formula I mentioned above?

There are several issues raised here. Millicent knows that the first rule is used for “and”, and the second for “or”, but her problem (which I won’t be discussing here) involves both. And examples she’s seen for the “or” formula typically involve two separate things happening, rather than just one; is that important? These questions were things I’d been wanting to write about anyway. So here are my comments. First, we need to see each of the rules stated clearly, including the conditions under which it applies:

If A and B are ANY events, then
P(A and B) = P(A) * P(B | A)      [you may not have seen this yet]

If A and B are INDEPENDENT events, then
P(A and B) = P(A) * P(B)             [the convenient special case]

If A and B are ANY events, then
P(A or B) = P(A) + P(B) - P(A and B)            [the general case]

If A and B are MUTUALLY EXCLUSIVE events, then
P(A or B) = P(A) + P(B)    [the special case where P(A and B) = 0]

Each formula has a general case, which can always be applied, and a special case in which a particular relationship between the events simplifies things. The first pair are used when there is an “and”, and the second when there is an “or”.
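These four rules can be sanity-checked by brute-force counting. Here is a sketch using one draw from a standard deck, with A = "the card is an ace" and B = "the card is a spade"; the deck encoding (rank 0 = ace, suit 0 = spades) is my own illustration, not from the post:

```python
from fractions import Fraction

# One draw from a 52-card deck.
deck = [(rank, suit) for rank in range(13) for suit in range(4)]
A = {c for c in deck if c[0] == 0}   # the four aces
B = {c for c in deck if c[1] == 0}   # the thirteen spades

def P(event):
    return Fraction(len(event), len(deck))

# General "and" rule: P(A and B) = P(A) * P(B | A)
P_B_given_A = Fraction(len(A & B), len(A))
assert P(A & B) == P(A) * P_B_given_A

# Here A and B happen to be independent, so the special case holds too:
assert P(A & B) == P(A) * P(B)

# General "or" rule: P(A or B) = P(A) + P(B) - P(A and B)
assert P(A | B) == P(A) + P(B) - P(A & B)

print(P(A & B), P(A | B))  # 1/52 4/13
```

Since every card is equally likely, probabilities reduce to counting, and exact fractions avoid any rounding questions.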

But her problem involved both “or” and “and”! The reason, it turns out, is that this problem is more like an algebra problem, where we have to write an equation that, rather than directly giving the result we want, just describes the situation, and can then be solved. The third formula above (general case for “or”) involves both “or” and “and”, so we write that out, and can solve for the remaining unknown.

After setting her on the road to solving her problem, I turned to the side question, distinguishing events from experiments, which is not always made clear in textbooks:

Also, there ARE two events here (two kinds of outcomes); that's not the same as two EXPERIMENTS (such as picking two different people). It is true that in simple problems, "and" tends to appear in compound events involving two separate experiments (because that's the easiest way for two events to be independent), whereas "or" often involves a single experiment (e.g., picking one card that is either an ace or a spade). But you can also have "and" problems involving independent events from only one experiment (e.g., picking one card that is BOTH an ace AND a spade), and "or" problems involving two experiments (e.g., picking two cards and asking whether at least one is a spade -- though there's an easier way to do this one than the "or" formula).

So you can't go by the number of selections, but only by whether the events are independent, and whether you are looking for an "and" only, or a relationship between "and" and "or."

To illustrate more fully what I am saying here, consider these four compound events:

1. One experiment, two events, OR: draw one card, which is either an ace OR a spade.
2. Two experiments, two events, OR: draw two cards; the first is an ace OR the second is a spade.
3. One experiment, two events, AND: draw one card, which is both an ace AND a spade.
4. Two experiments, two events, AND: draw two cards; the first is an ace AND the second is a spade.

Problems 1 and 4 are probably more common, especially in introductory material, but all four are valid questions, and each is different.

For the record, the probabilities are:

1. P(Ace or Spade) = P(Ace) + P(Spade) – P(Ace of spades) = 4/52 + 13/52 – 1/52 = 16/52 = 4/13
2. P(First Ace or Second Spade) = P(First Ace) + P(Second Spade) – P(First Ace and Second Spade) = 4/52 + 13/52 – (4/52)(13/52) = 832/2704 = 4/13
3. P(Ace and Spade) = P(Ace) * P(Spade) = (4/52)(13/52) = 52/2704 = 1/52
4. P(First Ace and Second Spade) = P(First Ace) * P(Second Spade) = 1/13 * 1/4 = 1/52

The last two happen to be the same because a deck of cards is arranged so that suits and values are independent. In general, they would be different, because drawing the first card would change the situation for the second – which I’ll get to in a moment!
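The two-card cases can be confirmed exactly by brute force. A sketch (same hypothetical deck encoding as before: rank 0 = ace, suit 0 = spades) that enumerates every ordered pair of distinct cards, i.e. drawing without replacement:

```python
from fractions import Fraction
from itertools import permutations

# All ordered two-card draws without replacement: 52 * 51 = 2652.
deck = [(rank, suit) for rank in range(13) for suit in range(4)]
pairs = list(permutations(deck, 2))

def P(event):
    return Fraction(len(event), len(pairs))

both   = [p for p in pairs if p[0][0] == 0 and p[1][1] == 0]
either = [p for p in pairs if p[0][0] == 0 or p[1][1] == 0]

print(P(both))    # 1/52 -- first ace AND second spade
print(P(either))  # 4/13 -- first ace OR second spade
```

Notably, the without-replacement count agrees with the calculations above, because by symmetry the second card is still a spade with probability 1/4.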

Just a quick question that is sort of related to my earlier question: can you still use P(A and B) = P(A) * P(B) for dependent events, like P(picking 3 aces from a deck), or can you only use that for independent events?

Here I was able to explain the first formula I’d given, which I suspected she might not have been taught yet, because many textbooks, when they introduce the multiplication property, sweep this detail under the rug (to avoid having to introduce conditional probability too early).

In the example of picking two aces (to keep it as easy as possible to write), you do this:

P(two aces) = P(ace first and ace second)
= P(ace first) * P(ace second | ace first)

Calculate "P(B)" not as it is initially, but as it is after picking the first ace:

= 4/52 * 3/51
= 1/13 * 1/17 = 1/221

Again, the 3/51 isn't really the probability of picking an ace (the second not being independent of the first), but is a conditional probability.

So with this caveat, you can use the product rule for dependent events.
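The 1/221 result can be verified by the same kind of enumeration as before (my own encoding, rank 0 = ace), counting all ordered two-card draws without replacement:

```python
from fractions import Fraction
from itertools import permutations

# Verify P(two aces) = 4/52 * 3/51 = 1/221 by direct counting.
deck = [(rank, suit) for rank in range(13) for suit in range(4)]
draws = list(permutations(deck, 2))   # 2652 ordered draws

two_aces = [d for d in draws if d[0][0] == 0 and d[1][0] == 0]
p_two_aces = Fraction(len(two_aces), len(draws))
print(p_two_aces)  # 1/221
```

There are 4 * 3 = 12 favorable ordered draws out of 52 * 51 = 2652, and 12/2652 reduces to 1/221.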

I was originally going to dig a little deeper into the reasons for the rules, but I’ll save that for next time: WHY do we add? WHY do we multiply?

### 4 thoughts on “When Do I Add or Multiply in Probability?”

1. Hello. I am confused as to when you use P(AandB)= P(A)*P(B)
vs P(AandB)= P(A)*P(A|B). Could someone please provide some clarification on this. Thank you.

1. Hi, Evan.

You probably mean “P(A and B) = P(A)*P(B) vs P(A and B) = P(A)*P(B|A)”.

The difference is that the former applies only when events A and B are independent. The latter (which I mentioned in the post, saying “You may not have seen this yet”) always applies; it says that we multiply the probability of A by the probability that B will happen, given that we know A has happened. When A and B are independent, P(B) is the same regardless of whether A happened, so we can replace P(B|A) with P(B).

In the original answer I quoted in the post, Independent Outcomes OR (AND?) Dependent Ones, I said the following that didn’t make it into the post:

I included in my list of rules one that you may not have seen, and which is sometimes taught as if it were just P(A and B) = P(A) * P(B), but applies even to dependent events:

If A and B are ANY events, then
P(A and B) = P(A) * P(B | A)

Here P(B | A) is read as “the probability of B given A”, and means that we are finding the probability that B will happen, assuming that A has happened. Sometimes this distinction is skipped over in introductory classes, so they would write it as P(A and B) = P(A) * P(B).
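The replacement of P(B | A) by P(B) in the independent case can be made concrete. A small sketch (my own deck encoding, rank 0 = ace, suit 0 = spades) contrasting an independent pair of events with a dependent one:

```python
from fractions import Fraction

deck = [(rank, suit) for rank in range(13) for suit in range(4)]

# One draw: A = "ace", B = "spade". Independent, so P(B | A) == P(B)
# and either form of the product rule gives the same answer.
aces = [c for c in deck if c[0] == 0]
p_spade_given_ace = Fraction(sum(1 for c in aces if c[1] == 0), len(aces))
p_spade = Fraction(sum(1 for c in deck if c[1] == 0), len(deck))
print(p_spade_given_ace, p_spade)  # 1/4 1/4 -- equal: independent

# Two draws without replacement: A = "first card is an ace",
# B = "second card is an ace". Dependent, so only the general
# rule with P(B | A) applies.
p_second_ace_given_first = Fraction(3, 51)  # 3 aces left among 51 cards
p_second_ace = Fraction(4, 52)              # by symmetry, same as the first card
print(p_second_ace_given_first, p_second_ace)  # 1/17 1/13 -- unequal: dependent
```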
