
Reese, Kevin, and The Monty Hall Problem

"There's a car behind one of those three doors... and goats behind the other two. Pick the door you think the car is behind. OK, now before I open the door you chose, I'll open one of the other doors... [door opens revealing a goat] and you can see there's no car behind that one. But I'll give you a chance to switch your door for the remaining closed door. What do you say? Do you want to switch or keep the one you originally chose?"

And that's one way of looking at the Monty Hall problem that's been inciting/confusing/intriguing/pissing people off for years.
(You can try it yourself through this applet.)

In my recent blog on Seduction and Curiosity I made up a kind of variant on the Monty Hall problem, only this time the problem was for Kevin to pick the business card that had Reese's phone number on the back, from a pile of three business cards. After Kevin made his choice, Reese turned over a blank card, and asked if Kevin wanted to stick with the one he had, or switch for her remaining card. He declined, thinking, "Each card started with a 1-in-3 chance, and nothing could change that." But when he didn't switch, she criticized him for not realizing that switching would have improved his odds... something his friend Manny confirmed by saying, "Your odds would have gone from 1 in 3 to 2 in 3."

I wasn't really thinking when I posted the problem... it was just the first thing that came to me when I thought about the puzzles that have made me curious, so I used it as an example. (And you can definitely count me among the huge group of Those Who Did Not Get It when I first saw it.)

And I intentionally left a few things up for speculation that people have wondered and argued about in the Monty Hall problem, including: "Does the motivation of the host matter?" and "Does he ALWAYS offer the choice to switch, or does it depend on whether the host (Monty) knows that the contestant got lucky and picked the winning door?"

And then when I posted my Java code (in the comments to the original post) that repeated the Reese/Kevin thing 500 times, Daniel pointed out that there was the potential for another issue if my code was trying to model the real-life scenario of Reese and Kevin. The problem was that in my code Reese had a very specific algorithm--she always turned over the "A" card to show Kevin (revealing that it was blank) unless it happened to be the winning card, in which case she turned over the "C" card (Kevin always picked "B"). So that meant that if Kevin wasn't too drunk at this point, he could eventually figure it out. Think about it... if Kevin is really paying attention, and he sees that most times Reese turns over the "A" card, then those times when she does turn over the "C" card, he thinks, "Oh... that must mean that the winning card is "A", because this time she didn't turn it over..." It would be just like if, in the Monty Hall problem, Monty always looked at the doors from, say, left to right, and always revealed the left-most of his two doors unless that's where the car was. When the contestant saw him skip the left-most door and open the other one instead, the contestant would have new information he could use.

So, in my problem, the code was just meant to show you that if the EXACT same scenario were repeated 500 times, with Kevin always picking "B", the winning card randomly distributed among the three cards, and Reese ALWAYS revealing a blank card and offering Kevin the chance to switch, then indeed Kevin's chances go up to 2 in 3 instead of his original 1 in 3. In other words, in this example, Kevin should always switch. (And this is exactly how Marilyn Vos Savant answered the question in the infamous article that started the controversy--she said the contestant should always switch.)
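If you want to try it yourself, here's a minimal sketch of that 500-trial simulation. (This is a reconstruction, not the exact code I posted in the comments--the class name and card indexes are just for illustration. Reese follows the "reveal A unless it wins, otherwise reveal C" rule that Daniel spotted.)

```java
import java.util.Random;

// Cards: 0 = "A", 1 = "B", 2 = "C". Kevin always picks "B".
public class ReeseKevinSim {
    public static void main(String[] args) {
        Random rng = new Random();
        int trials = 500, stickWins = 0, switchWins = 0;
        for (int i = 0; i < trials; i++) {
            int winner = rng.nextInt(3);        // winning card placed at random
            int kevin = 1;                      // Kevin always picks "B"
            // Reese reveals a blank card from her two: "A" unless it wins.
            int revealed = (winner == 0) ? 2 : 0;
            // The switch card is whichever is neither Kevin's nor revealed.
            int switched = 3 - kevin - revealed;
            if (winner == kevin)    stickWins++;
            if (winner == switched) switchWins++;
        }
        System.out.println("Stick wins:  " + stickWins + " / " + trials);
        System.out.println("Switch wins: " + switchWins + " / " + trials);
    }
}
```

Run it a few times--sticking should win roughly a third of the rounds, and switching roughly two thirds.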

But as promised, here is my attempt at an explanation. But with huge disclaimers:

1) I don't know or care where the math gene is... I was born without it. (Proving that good-at-programming doesn't always imply good-at-math). So, my explanation might be completely ridiculous.

2) This is not a subject I have ever taught--or will ever try to teach or write about--since it's WAY outside my knowledge comfort zone.

But if you're still reading and still wondering how switching could possibly improve the odds when all three business cards started with a 1 in 3 chance of being the right card... here's my biggest hint:

It's not about three individual cards. It's about two stacks.

Here's another way of thinking about it. Imagine I have, say, 50 cards divided into two stacks, and only one of the 50 cards has the winning phone number on the back. Stack X has just one card while stack Y has 49 cards. And now I pose the question, "Which stack do you want?" Hmmmm... which stack is more likely to hold the winning card? ; ) Of COURSE you choose stack Y, the big one. The odds of the winning card being in the bigger stack are 49 in 50. A hell of a lot better than the 1 in 50 odds you get if you pick the stack-of-one, stack X.

The point of the Monty Hall/Reese-and-Kevin thing is that it was always about two things, not three. It was always about the host's big stack (two doors/cards, etc.) vs. the contestant's small stack (one door/card).

The person running the show (Monty or Reese) starts out with a bigger stack!

Reese has the bigger stack... the one with a 2 in 3 chance instead of a 1 in 3 chance.
And by offering you the chance to switch, Reese is really giving you the chance to swap your SMALL stack for her BIG stack. It was always about Reese's bigger stack!

Her turning over a card made it look as though it came down to just the two cards--her remaining one and Kevin's original pick. But in reality, she was simply giving you the chance to switch for her bigger stack, and the fact that she revealed one of the cards in her stack doesn't change the fact that the winning card was always twice as likely to be in Reese's bigger stack.

But again, there's the important assumption in my explanation that says, "Reese always offers Kevin a chance to switch, no matter how many times they might repeat this." So Kevin doesn't have to wonder whether Reese is giving him a chance to switch only because she already knows he DID pick the winning card.

Finally, here's one more way to look at why switching ups the odds... the point is that if Kevin sticks to his original pick ("B") and doesn't switch, the only way he can win is if he got lucky the first time and "B" was the winning card--the 1 in 3 chance. But if he does switch, then he doubles his chances, because Reese always turns over a blank card. If the winning card is "A", Reese turns over "C" and gives Kevin a chance to pick "A". If the winning card is "C", Reese turns over "A" and gives Kevin a chance to pick "C". If Kevin always picks "B" and always switches, he has two chances to win instead of one. If Kevin always switches, then the only time he loses is when he happened to pick the winning card the first time (the 1 in 3 chance that "B" is the winning card). But if the winning card was either "A" or "C", then Kevin wins if he switches.
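That case analysis is small enough to check exhaustively--no randomness needed. Here's a tiny sketch (class name and card indexes are just illustrative) that walks the three equally likely positions of the winning card:

```java
// Exhaustively check all three equally likely positions of the winning card.
// Cards: 0 = "A", 1 = "B", 2 = "C". Kevin always picks "B" (index 1).
public class SwitchEnumeration {
    public static void main(String[] args) {
        int kevin = 1;
        int stickWins = 0, switchWins = 0;
        for (int winner = 0; winner < 3; winner++) {
            int revealed = (winner == 0) ? 2 : 0;  // Reese shows a blank card
            int switched = 3 - kevin - revealed;   // the card Kevin can switch to
            if (winner == kevin) stickWins++;      // sticking wins only if "B" won
            else                 switchWins++;     // otherwise switching wins
        }
        System.out.println("Sticking wins in " + stickWins + " of 3 cases");
        System.out.println("Switching wins in " + switchWins + " of 3 cases");
    }
}
```

Because the three positions are equally likely, "2 of 3 cases" is exactly the 2-in-3 chance for switching.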


OK math heads and stats folks -- or anyone who has another way to think about this, please feel free to post your comments (and thanks so much to those who did in the original, with special mention to Brad Corbin, Matt Moran, and Woolstar). And if my explanation is just completely lame and makes no sense, feel free to say that too. I'm so out of my area of expertise here it's astonishing : )

And now, back to somewhat less geeky blogs...

Posted by Kathy on April 4, 2005 | Permalink

TrackBack

TrackBack URL for this entry:
http://www.typepad.com/services/trackback/6a00d83451b44369e200d83458163b69e2

Listed below are links to weblogs that reference Reese, Kevin, and The Monty Hall Problem:

» Book Review: The Curious Incident of the Dog in the Night-Time from Mostly Muppet Dot Com
I was given Mark Haddon's The Curious Incident of the Dog in the Night-Time as an Easter gift from my mother-in-law. She had read the book and thought I might find it interesting. It came highly recommended and she let me know that she'd given it as... [Read More]

Tracked on Apr 8, 2005 10:26:49 AM

Comments

You might want to think about taking an elementary statistics class. You are way off. You are applying a past event to present circumstances and thinking that it somehow affects the current result. It is as if you think that because the coin came up heads 5 times in a row that it must come up heads again. The fact is that the card you picked the first time is irrelevant. It means nothing!!! No matter what card you pick, one will be tossed out and you will get to pick again. Therefore the prior pick means nothing. You are faced with a simple choice of two cards... pick one. 50-50. Yes, my odds of picking the right card the first time were 1 in 3. But once you remove a card, I am faced with two cards, pick one. Since I am picking again, the prior pick means nothing. Let's change it around so that it becomes clearer. I ask you to pick a number 1, 2, or 3. You pick 3. I tell you that it wasn't 2. Now pick a number 1 or 3. Explain to me how picking 1 can be a better pick than picking 3. Yes, it was a 1 in 3 chance when there were three numbers. But now there are only two numbers so either number becomes a 1 in 2 shot.

Posted by: Tom | Apr 4, 2005 12:29:10 PM

Think about this... I am sitting with Bert in the kitchen and show him three cards and ask him to pick one. After he picks one, I throw a wrong one away, put the two remaining cards in my pocket, and bring them to you in the living room. I ask you to pick a card. By your logic, one of the cards has a 2/3 chance of being right and the other has only a 1/3 chance of being right. If that is the case, then there is no way to ever make a rational choice about anything because we can never know all the past events that led to the current situation. Statistics goes out the window.

Posted by: Tom | Apr 4, 2005 12:42:29 PM

OK, after thinking it over I see that in my example, you aren't picking just one number when you switch, you are picking two numbers. Imagine this scenario. I show you 1 million cards and ask you to pick one. I then throw away every card except the card you chose and one other. You would have to be crazy not to switch since clearly your chance of having picked the right card the first time is minuscule. By switching, you are in effect getting 999,999 picks.

OK, so how does this work if as in the second example I don't let you know what happened? I have Bert pick a card and then I throw away 999,998 wrong picks. You then come in. Picking the card that Bert didn't pick is the right way to go but to you it appears that each card has an even chance since you don't know anything about Bert's pick. Makes you wonder if there are situations in real life that duplicate this... that appear to give us even choices but are stacked one way or the other.

Posted by: Tom | Apr 4, 2005 1:07:25 PM

Ahem...Simple way to think of it.

When Kevin picks the card, he has a 2/3 chance of being wrong. That means that when the blank card is revealed, it is 2/3 likely that it is "the other blank" card. So switching makes it 2/3 likely that you are correct after the switch.

Posted by: GBGames | Apr 4, 2005 2:24:44 PM

Hey Kathy,

Thanks for posting the solution. You're right - my mind has been curious all week, and now I feel like I (sort of) understand. I ran the Java you wrote, and it's pretty hard to deny the results... My mind doesn't want to believe, because it's a little weird, but I can see the logic.

Thanks for having an interesting site...

Posted by: Mike | Apr 4, 2005 2:27:01 PM

GBGames: Well, if you want to be all simple and clear about it, sure, then I guess you could say it like that... ; )

(One more reason why you won't see *me* writing a stats book.)

Tom: you got sucked in the way just about everyone on the planet is at first (including all the 10,000 PhDs who wrote in after the Ask Marilyn column and said she had it wrong). It took Bert, like, a week to convince me. Your "million cards" example is exactly the right way to help someone understand it. As for the other scenarios you mentioned, well, then you've changed the game so it's a completely different problem.

Dan Steinberg has cautioned that it's VERY important to be precise when describing the problem, because these puzzles are like games. And when you change it even a little, it could change the rules of the game. I left it a little fuzzy in the beginning because I was trying to get people to speculate not just on the solution, but on the very nature of it, so that THEY would come up with questions like, "Does Reese's motivation matter?" and "Would she offer this same option to everyone, regardless of whether they got lucky on their first pick?" But sure enough, Daniel spotted something in my code that could indeed have *changed the rules* (changed the odds) if Kevin had been watching carefully.

Whew! I'm just glad it's over. I'll think a little longer next time before I throw a can of worms out like that... but I hope some of you enjoyed it ; )

Posted by: Kathy Sierra | Apr 4, 2005 3:01:52 PM

One way I try to explain it (it doesn't always work) is this.

Let's assume there are 1 million combinations of numbers in a Lotto (in lotto, you pick combinations of numbers and try to match the combination generated by a machine). Let's say my set of numbers is 1,3,10,15,18.

Bob looks in the newspaper and sees the winning Lotto numbers. He then comes over to my house and looks at my slip and sees what I guessed (I haven't read the newspaper). He then starts chanting out 999,998 combinations of numbers - every set of numbers EXCEPT my choice and one other combination.

He then says:

"There are 1 million combinations (doors) and I've opened 999,998 of them for you. Only two remain. The first is 1,3,10,15,18. The second is 2,4,11,12,25."

Now the question is - do I have a 50% chance of winning Lotto now?

If you answer yes - then it would seem very easy to win Lotto. All you need is someone who can read (both the newspaper and your slip) and chant a lot of numbers. If they did it enough (maybe a week or two) you are BOUND to win - since each time there is a 50% chance. Do you really think winning is that easy?

If you answer no - then it is inconsistent to think that the probabilities in the Monty Hall problem are 50/50.

Posted by: Anon | Apr 4, 2005 5:09:30 PM

The Monty Hall problem is also clearly explained by Christopher, the autistic protagonist of 'The Curious Incident of the Dog in the Nighttime' by Mark Haddon. He explains why you should always change your choice of doors despite the fact that this seems counterintuitive, and he concludes:

"And this shows the intuition can sometimes get things wrong. And intuition is what people use in life to make decisions. But logic can help you work out the right answer."

What about that ?

Posted by: Peter FJ | Apr 4, 2005 9:22:04 PM

Tom: when I read your replies, I initially agreed 100% with you. None of the explanations here were making sense to me so I googled "monty hall problem".

Suppose there are three cards, and instead of being told "choose A, B, or C", you were told to choose between either JUST A, or {B or C}. Obviously if you chose a group of two cards instead of just one, you have a better chance. So you effectively choose {B or C} by choosing A first and then trading it.

Anon: who is this "Bob" guy, and why don't you call the cops?

Posted by: Keith Handy | Apr 4, 2005 10:06:15 PM

P.S. I realize now that Tom *did* figure it out. Only I didn't understand it when I read his third comment. Now that I do understand it, Kathy's explanation seems crystal clear, but for some reason I just wasn't getting that either (initially).

I am digging this blog! Keep it up.

Posted by: Keith Handy | Apr 4, 2005 10:12:29 PM

A or {B or C}

Thank you Keith, epiphany ensued.

-Matt

Posted by: Matt | Apr 5, 2005 1:30:42 PM

Interestingly enough, I checked out "The Curious Incident of the Dog in the Nighttime" from the library for my wife to read, but took a look at it myself. Couldn't put it down after that.

I really like how the chapters are numbered: 1 2 3 5 7 11 13 ...

--derek

Posted by: woolstar | Apr 5, 2005 5:29:35 PM

I've heard a few different versions of this problem, but my favourite is the one that has three condemned men waiting in separate cells for their execution. None of the prisoners knows his cell number, when one of the guards wanders in and announces that the prisoner in cell 235 will be pardoned.

If you are one of those prisoners then you have a 1/3 chance of being pardoned.

Then the guards come in, open a cell and lead one of the other prisoners off and execute him. Your chance of being pardoned is now 1/2.

Then the guy in the next cell breaks down the wall between your two cells and asks you if you'd like to swap cells. I guess he's read this blog and thinks by switching he'd improve his chance to 2/3, and you might think the same. Sadly you both can't be right. Your chance of being pardoned remains at 50/50.

Posted by: Mark | Apr 7, 2005 3:00:21 AM

Isn't this Bayes Theorem?

Posted by: ? | Apr 7, 2005 7:21:23 AM

To follow up on my 4 worder:

Michael Wood, Portsmouth Business School, 23 February 2000

For example:
James is being tried for a murder. The only evidence against him is the forensic evidence - blood and tissue samples taken from the scene of the crime match James's. The chance of such a match happening if James were in fact innocent and the match were just a coincidence is calculated (by the prosecution's expert witness) as 1 in 10,000 (0.0001).

If you were on the jury would you find him guilty? What do you think is the probability of his guilt? (If you think the answer is 0.9999 you are wrong: this is known as the prosecutor's fallacy.)

The defence's expert witness, however, points out that James was found through a systematic examination of everyone in the local community. There are 40,000 such people, and no evidence (except the forensic evidence) to associate any of them with the crime.

Do you still think James is guilty?

This situation can be described by Bayes Theorem.

H1 (the first hypothesis) = James is guilty
H2 (the second hypothesis) = James is innocent (not guilty)

These are the two hypotheses between which we have to decide.

E (the evidence) = The forensic evidence of a match

We can assume that
Prob(E given H1) = 1
since if he is guilty, the forensic evidence is bound to show a match, and that
Prob(E given H2) = 1/10,000 = 0.0001
since if he is innocent the probability of a match is 0.0001.

These two probabilities are known as likelihoods. Notice that they are both "Evidence given hypothesis". Bayes theorem reverses these conditional probabilities and tells us the probability of the hypothesis given the evidence.

To use Bayes theorem we also need to know P(H1) and P(H2). These are the prior probabilities of guilt and innocence. They are "prior" in the sense that these are the initial estimates before the evidence is considered. If we assume that the murder must have been committed by someone in the local community, then

P(guilty) = 1/40,000 = 0.000025
P(innocent) = 1 - 0.000025 = 0.999975

Bayes theorem says:
Prob(H1 given E) = Prob(E given H1) x P(H1) / [Prob(E given H1) x P(H1) + Prob(E given H2) x P(H2)]

which now gives:
Prob(guilty given evidence) = (1 x 0.000025) / (1 x 0.000025 + 0.0001 x 0.999975)
= 0.2 = 20%

which means that the probability of James's guilt is only 20%. This is probably a lot less than your initial estimate!
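The arithmetic above can be sketched in a few lines of code as well (this is just an illustration of the calculation--the class and variable names are made up):

```java
// A sketch of the Bayes theorem calculation for the James example.
public class ProsecutorsFallacy {
    public static void main(String[] args) {
        double pGuilty = 1.0 / 40_000;             // prior: one of 40,000 locals
        double pInnocent = 1.0 - pGuilty;          // prior probability of innocence
        double pMatchGivenGuilty = 1.0;            // a guilty James must match
        double pMatchGivenInnocent = 1.0 / 10_000; // coincidental match rate
        // Bayes theorem: Prob(guilty given evidence)
        double pGuiltyGivenMatch = (pMatchGivenGuilty * pGuilty)
                / (pMatchGivenGuilty * pGuilty + pMatchGivenInnocent * pInnocent);
        System.out.printf("Prob(guilty given evidence) = %.2f%n", pGuiltyGivenMatch);
        // prints: Prob(guilty given evidence) = 0.20
    }
}
```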

Posted by: ? | Apr 7, 2005 7:28:52 AM

see my post:

http://www.livejournal.com/users/not_hothead_yet/400413.html

Posted by: smibbo | Jan 17, 2006 1:52:15 PM

I am going to try to teach this to a yr 10 class. How?

Posted by: sarah Chamings | Mar 2, 2006 5:29:31 AM

The comments to this entry are closed.