*Alex Homer*

# Classic Monty Hall

You are on a gameshow hosted by a man named Monty Hall, where you are faced with three doors: behind two of them are goats and behind one is a car. You choose a door; you will win whatever is behind it when it is opened. Before it’s opened, though, Monty (who knows what is where) opens one of the other two doors. You know he will open a door with a goat behind it; if both of the remaining doors hide goats, Monty picks one at random. You then have the option to stick with your door or to switch. What should you do?
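Before turning to the answer, here’s a quick Monte Carlo sketch you could run to estimate how often each strategy wins (my own illustration, not from the article):

```python
import random

def play(switch: bool) -> bool:
    """Play one round of classic Monty Hall; return True if the player wins the car."""
    doors = [0, 1, 2]
    car = random.choice(doors)
    pick = random.choice(doors)
    # Monty opens a door that is neither the player's pick nor the car
    # (choosing at random when both of the other doors hide goats).
    monty = random.choice([d for d in doors if d != pick and d != car])
    if switch:
        # Switch to the one remaining closed door.
        pick = next(d for d in doors if d != pick and d != monty)
    return pick == car

trials = 100_000
stick = sum(play(False) for _ in range(trials)) / trials
switch = sum(play(True) for _ in range(trials)) / trials
print(f"stick wins about {stick:.3f} of the time, switch about {switch:.3f}")
```

Running this gives a good empirical hint at the answer discussed in the first article.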

See the first article ‘Monty Hall Revisited’ for the answer to the classic problem (and a bonus extension problem where Monty always opens the left door).

# Heads or Tails

Now there are only two doors. The producers flip a coin, and if it’s heads they put a goat behind the first door; if tails, a car. They flip again for the second door. You are blindfolded; Monty opens both doors, shuts them again, and tells you that there’s a goat behind one of the doors. What is the probability that there’s a car behind the other door? What if, instead, Monty just opens one door, and tells you there’s a goat behind it?

## Answer

The obvious answer is that we know that one of the doors hides a goat, but that doesn’t give us any information about the other: the choices were made independently, so there’s an even chance of a goat or a car behind the other.

Fortunately for us, that’s not true: it’s twice as likely as not that there’s a car behind the other door. The reason the argument above doesn’t work is that we don’t know which door Monty is talking about: although the doors don’t affect each other, the information we’ve been given is a fact about both of them combined—one of them hides a goat, but we don’t know which.

Let’s apply Bayes’ Theorem to the case where Monty opens both doors: the probability we’re interested in is the probability that there is one of each, given that Monty saw a goat. We’ll start with the probability that there’s one of each, before any doors were opened: that’s

$$P(\text{one of each}) = \frac{1}{2}.$$

Why? There are four equally likely possibilities: car then car, goat then goat, goat then car and car then goat. (Or you can think about it in terms of the coin tosses instead.) Since two of those give one of each prize, we get a probability of 2 ÷ 4, or ½.
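The four equally likely arrangements can be enumerated directly; here’s a quick sketch (not part of the original argument):

```python
from itertools import product

# One coin flip per door: each door independently hides a goat or a car.
arrangements = list(product(["goat", "car"], repeat=2))
one_of_each = [a for a in arrangements if set(a) == {"goat", "car"}]
print(len(one_of_each), "/", len(arrangements))  # 2 / 4, i.e. probability 1/2
```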

Now, what’s the probability that Monty tells us there’s a goat behind one of the doors, given that there’s one of each? We’ll assume Monty doesn’t lie (after all, even though we can’t ask the audience, we’ll assume they’d call him out on the lie), though we could easily extend the problem to the case where there’s some probability that he’s fibbing. In that case, we have that

$$P(\text{Monty says goat} \mid \text{one of each}) = 1.$$

Now we just need the probability that Monty says there’s a goat. This is the same as the probability that there *is* a goat for him to tell us about, which happens in three out of the four cases above. So we have that

$$P(\text{Monty says goat}) = \frac{3}{4}.$$

Then, using Bayes’ Theorem, we get that

$$P(\text{one of each} \mid \text{Monty says goat}) = \frac{P(\text{Monty says goat} \mid \text{one of each}) \times P(\text{one of each})}{P(\text{Monty says goat})} = \frac{1 \times \frac{1}{2}}{\frac{3}{4}} = \frac{2}{3}.$$

Now what if Monty only opened one door? We don’t know which door he opened, or how likely he was to pick each one. It doesn’t matter, though: if we knew it was the left, there’d be a fifty-fifty chance of a car on the right, and vice versa (because of the coin toss). So either way, there remains a fifty-fifty chance of a car behind the other door—and that would have been true if he’d seen a car too.

So if Monty has seen both doors and told us there’s at least one goat, there’s a greater chance of there being a car than if he’s only opened one door. How can that make a difference to the odds? The difference comes when we think about how likely it was that Monty would say “goat” in the first place. In the first case, where he sees both doors, if there’s a goat he’ll always say so. In the second case, where he opens only one door, sometimes there’s a goat he doesn’t see, and those are necessarily the cases where there’s one goat and one car. So the event we’re conditioning on (Monty saying “goat”) happens less often in exactly the case whose probability we’re interested in.
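Both variants can be checked by simulation. The sketch below assumes that, in the one-door case, Monty picks which door to open uniformly at random (the text notes the answer doesn’t depend on this choice):

```python
import random

def sample():
    """Flip a fair coin for each door: goat or car."""
    return [random.choice(["goat", "car"]) for _ in range(2)]

trials = 100_000

# Case 1: Monty sees both doors and says "goat" whenever there is at least one.
hits = total = 0
for _ in range(trials):
    doors = sample()
    if "goat" in doors:                    # Monty says "goat"
        total += 1
        hits += doors.count("car") == 1    # a car is behind the other door
print("both doors seen:", hits / total)

# Case 2: Monty opens one door at random and reports what he sees.
hits2 = total2 = 0
for _ in range(trials):
    doors = sample()
    opened = random.randrange(2)
    if doors[opened] == "goat":            # he happens to say "goat"
        total2 += 1
        hits2 += doors[1 - opened] == "car"
print("one door seen:", hits2 / total2)
```

The first estimate comes out near ⅔ and the second near ½, matching the two calculations above.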

# Double or Half

Now Monty dispenses with the goats. There are still two doors, behind each of which is a pile of counters, each worth one point. (There may also be fractions of counters.) The only thing you know is that one door hides twice as many counters as the other; the producers choose the amounts, and then choose at random which way around they’ll go. You can open one door, and then can choose to keep that or switch to whatever’s behind the other door. You’re going to play this game lots of times, to try and accumulate the most points (because points mean prizes). Is there a good strategy for this game?

## Answer

The clue is in the way the question is phrased: “Is there a good strategy?” The answer is “no”—or, at least, not that we can determine from the information we have. Before either door is opened, of course there is a fifty-fifty chance of the larger pile being behind either door. Afterwards: well…

An answer that’s sometimes presented (as a claimed paradox) is that it’s better to switch whichever door you pick first. Why? Well, the claim goes, suppose you find 100 counters when you open the first door. Then, there is a probability of ½ that this is the larger amount, so there are 50 behind the other door. And there is a probability of ½ that this is the smaller amount, so there are 200 counters behind the other door. That means that your “expected amount” from switching (the average amount you’d get, if you played out this exact scenario lots of times) is 50 × ½ + 200 × ½, which amounts to 125—and so more than the 100 you get when you stick with the door you opened. Since the same argument can be made however many counters you saw, you should always switch.

Your first choice clearly doesn’t matter, so suppose you initially pick the left door each time. Then, with this strategy, you will always end up with the amount behind the right door. But then you get the same number of counters as if you’d picked the right door each time and stuck, which we’ve just argued is the worst strategy. Quite the paradox, isn’t it?
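This symmetry can be checked with a quick simulation. The producers’ distribution is an assumption of mine (the problem doesn’t specify one); here the smaller pile is a whole number chosen uniformly from 1 to 100:

```python
import random

def round_score(always_switch: bool) -> int:
    """One round: producers pick the piles, we open the left door, then stick or switch."""
    small = random.randint(1, 100)   # hypothetical producers' choice of smaller pile
    piles = [small, 2 * small]
    random.shuffle(piles)            # random left/right assignment
    seen, other = piles
    return other if always_switch else seen

trials = 200_000
stick = sum(round_score(False) for _ in range(trials)) / trials
switch = sum(round_score(True) for _ in range(trials)) / trials
print(f"always stick averages {stick:.1f}, always switch averages {switch:.1f}")
```

Both averages come out the same (about 75.75 here), in line with the symmetry argument: always switching from the left door is just always taking the right door.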

Well, not if Mr Bayes has anything to say about it. The problem is in the calculation of the expected amount behind the other door. Our paradox-proposing pal has told us, correctly, that the two possible amounts of counters behind the other door are 50 and 200, given that we saw 100. But if we’re trying to get an expected amount based on that, we’ll also need the probabilities of finding each of those amounts, *given that we saw 100*. And that’s where things get murkier.

Let’s look at Bayes’ Theorem again. Suppose we chose to open the left door (like I said, it doesn’t make a difference to the calculations).

$$P(\text{left is larger} \mid \text{we see } 100) = \frac{P(\text{we see } 100 \mid \text{left is larger}) \times P(\text{left is larger})}{P(\text{we see } 100)}.$$

It’s definitely true that the initial probability that the left door’s amount is the larger one is ½. But now we need the probability of seeing 100 given that we know this is the larger amount, which is a statement about how the producers chose the amounts. And there’s not enough information in the question to determine this.

In fact, we can take some simple examples to show how the producers’ choices affect our conditional probability, without needing to use Bayes’ Theorem. For instance, suppose we knew that the producers only had a stock of 200 counters. Then, given that we saw 100, ours cannot be the smaller pile: the other door would have to hide 200, needing 300 counters in total, and they don’t have enough; so we should stick. Similarly, if we knew that the smaller pile always contained at least 75 counters, then seeing 100 would tell us the other door must hide 200 rather than 50, and we should switch.

If we don’t know anything for certain, what we could do is use our beliefs about what numbers of counters we think the producers will pick to assign probabilities to them. For instance, we might decide that we think the larger number of counters will always be a whole number, and that it’s going to be somewhere between 1 and 1000, being equally likely to be any of those numbers. This is called choosing a **prior distribution**; we then use Bayes’ theorem to update this given the information we have, forming a **posterior distribution**. A major field of statistics, called **Bayesian statistics**, does just that—but usually with more important things than fictional game shows…
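Taking the uniform prior suggested above (the larger pile is a whole number, equally likely to be anything from 1 to 1000), here is a sketch of the posterior calculation; the helper function and its name are my own illustration:

```python
from fractions import Fraction

N = 1000  # assumed prior: the larger pile is a whole number, uniform on 1..N

def p_other_is_larger(x: Fraction) -> Fraction:
    """Posterior probability that the unopened door hides the larger pile,
    given that we saw x counters behind the opened door."""
    # We saw the smaller pile iff the larger pile is 2x (which must be a
    # whole number in 1..N); we saw the larger pile iff x itself is a whole
    # number in 1..N. Each case carries prior weight 1/N from the producers'
    # choice times 1/2 from the random left/right assignment.
    w_smaller = Fraction(1, 2) if (2 * x).denominator == 1 and 1 <= 2 * x <= N else Fraction(0)
    w_larger = Fraction(1, 2) if x.denominator == 1 and 1 <= x <= N else Fraction(0)
    return w_smaller / (w_smaller + w_larger)

print(p_other_is_larger(Fraction(100)))  # 1/2: switching gains on average here
print(p_other_is_larger(Fraction(600)))  # 0: 1200 exceeds N, so ours is the larger pile
```

Note how the posterior depends on what we saw: for small amounts switching looks good, but anything over 500 must itself be the larger pile, so the advantages balance out over the whole prior, just as the symmetry argument demands.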

To learn more about the history of the problem, click here.

For an explanation of the original problem, click here.
