Three Doors on a Gameshow Problem

Posers and Puzzles

w · Chocolate Expert · Cocoa Mountains · Joined 26 Nov 06 · Moves 19249 · 21 Sep 09

Could someone please explain to me the concept behind the following problem:

You're on a gameshow and you have to choose between three doors A, B, and C, one of which has a car behind it. You choose door C, but instead of the gameshow host's showing you whether you were correct, he opens door A and shows you that it has nothing behind it, and lets you choose again between B and C. Should you reconsider and choose door B, or should you stay with door C?

I know that statistically you should switch and choose door B, but I don't understand why. Could someone explain this concept? I don't mind if you explain it mathematically, or logically, or using a combination of the two, but I simply don't get it.

Thanks in advance for your time.

f · Defend the Universe · 127.0.0.1 · Joined 18 Dec 03 · Moves 16687 · 21 Sep 09

Originally posted by wittywonka
Could someone please explain to me the concept behind the following problem:

You're on a gameshow and you have to choose between three doors A, B, and C, one of which has a car behind it. You choose door C, but instead of the gameshow host's showing you whether you were correct, he opens door A and shows you that it has nothing behind it, and lets you ...[text shortened]... ng a combination of the two, but I simply don't get it.

Thanks in advance for your time.
There are other threads on this subject, but it basically boils down to the fact that the "host" always opens a door with nothing in it.

If you expand it out to 100 doors, where you choose one and the host opens 98 doors with nothing in them, it becomes more apparent that if you stick with your door, you keep your original 1-in-n chance, but if you switch, the odds in favour of winning become (1 + number of opened doors) to (n - 1 - number of opened doors), i.e. 99 to 1 in the 100-door case.
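
A quick Monte Carlo check of forkedknight's 100-door picture, as a minimal Python sketch, assuming a host who knows where the car is and always opens every other losing door (the door and trial counts are arbitrary):

```python
import random

def play_round(n_doors, switch):
    """One round: the host, who knows where the car is, opens every door
    except the player's pick and one other, never revealing the car."""
    car = random.randrange(n_doors)
    pick = random.randrange(n_doors)
    if pick == car:
        # Player picked the car: the host leaves a random losing door closed.
        other = random.choice([d for d in range(n_doors) if d != pick])
    else:
        # Player missed: the host must leave the car's door closed.
        other = car
    final = other if switch else pick
    return final == car

trials = 100_000
for n in (3, 100):
    stick = sum(play_round(n, switch=False) for _ in range(trials)) / trials
    swap = sum(play_round(n, switch=True) for _ in range(trials)) / trials
    print(f"{n} doors: stick ~ {stick:.3f}, switch ~ {swap:.3f}")
```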

w · Chocolate Expert · Cocoa Mountains · Joined 26 Nov 06 · Moves 19249 · 21 Sep 09 · 1 edit

Originally posted by forkedknight
There are other threads on this subject, but it basically boils down to the fact that the "host" always opens a door with nothing in it.

If you expand it out to 100 doors, where you choose one and the host opens 98 doors with nothing in them, it becomes more apparent that if you stick with your door, you keep your original 1-in-n chance, but if you switch, the odds in favour of winning become (1 + number of opened doors) to (n - 1 - number of opened doors), i.e. 99 to 1 in the 100-door case.
That concept actually makes some sense, about expanding your initial game field to 100 doors, but where did you get that formula for the probability after switching? Taking the 100-door scenario, wouldn't it be a .01 probability (staying with the door you chose originally) and a .5 probability (switching)?

edit- Never mind; you were calculating odds, not probabilities. Thanks.

Already mated · Omaha, Nebraska, USA · Joined 04 Jul 06 · Moves 1116284 · 21 Sep 09

Try this to make it clearer... maybe.

You pick a door (one of three). You will always have a one in three chance of winning.

Your host always takes away one of the losing doors. You don't know which.

It looks like it's now a one-in-two (50:50) chance, but it's not. Here's why:

Your host removed a loser, but there were two chances to your one that the car was behind one of the other doors. It's still that way. You always have a one-in-three chance of guessing right and a two-in-three chance of guessing wrong.

You switch and join the two-out-of-three group. The removal of one loser didn't change those odds.

Does this help?
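
The same two-in-three argument can be verified by plain enumeration of the three equally likely places the car could be; a minimal Python sketch, taking door C as the player's fixed first pick as in the original question:

```python
from fractions import Fraction

doors = ["A", "B", "C"]
pick = "C"  # the player's first choice, as in the puzzle

# The car is equally likely to be behind each door. The host then opens
# the remaining losing door, so switching wins exactly when pick != car.
stick_wins = sum(car == pick for car in doors)
switch_wins = sum(car != pick for car in doors)

print("P(win by sticking)  =", Fraction(stick_wins, len(doors)))   # 1/3
print("P(win by switching) =", Fraction(switch_wins, len(doors)))  # 2/3
```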

d · Joined 31 May 07 · Moves 696 · 21 Sep 09

If you want help believing it, though not necessarily understanding it, think of this more extreme scenario:
A person gets you to pick one card out of a deck, saying you're trying to get the ace of spades. You pick one, but instead of turning it over, he reveals 50 of the 51 cards left in the deck, none of them the ace of spades, leaving one card in the deck that hasn't been turned over, plus the one you chose to start with. It's fairly clear that the card still in the deck is more likely to be the ace of spades.
But anyway, the correct answer to the gameshow problem is not that you should switch, but that you should stick. Some assumptions need addressing:
1) The gameshow host knows which door the car is behind. If he doesn't, and merely happened to open an empty door, then the probability IS 50/50 (see the sketch below).
2) The gameshow host always opens a door. If this is not true, then one must consider:
3) Whether the gameshow host wants you to win. If he does, there's a 100% chance of winning by switching; if he doesn't, there's a 100% chance of winning by sticking.
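
Assumption 1 is the one that surprises people, and it is easy to check numerically: if the host does not know where the car is, opens one of the other two doors at random, and merely happens to reveal nothing, then sticking and switching are indeed 50/50. A minimal Python sketch of that variant, discarding the rounds where the ignorant host exposes the car:

```python
import random

trials = 200_000
kept = switch_wins = 0
for _ in range(trials):
    car = random.randrange(3)
    pick = random.randrange(3)
    opened = random.choice([d for d in range(3) if d != pick])  # ignorant host
    if opened == car:
        continue  # the host accidentally revealed the car; round doesn't count
    kept += 1
    remaining = next(d for d in range(3) if d not in (pick, opened))
    switch_wins += (remaining == car)

print("P(switch wins | an empty door was shown) ~", round(switch_wins / kept, 3))
```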

P · Upward Spiral · Halfway · Joined 02 Aug 04 · Moves 8702 · 21 Sep 09

I usually use the example forkedknight gave, but another way of thinking about it is to consider the probability of winning by NOT switching in a sort of frequentist way (imagine you repeat the experiment many times).

If you NEVER switch, then how many times will you win? Exactly 1/3 of the time.
If you ALWAYS switch, then how many times will you win? It must be the remainder, as there are no other options, so it must be 2/3 of the time.

PS: All this again assumes he always opens a door without a prize, as other comments mention.
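
That repeated-experiment framing translates directly into a tally: never-switch wins exactly when the first pick finds the car, and always-switch wins the remaining rounds. A minimal Python sketch under the same always-opens-a-losing-door assumption:

```python
import random

trials = 60_000
# A round is won by "never switch" exactly when the first pick finds the car.
first_pick_right = sum(random.randrange(3) == random.randrange(3)
                       for _ in range(trials))

print("never switch wins  ~", first_pick_right / trials)      # about 1/3
print("always switch wins ~", 1 - first_pick_right / trials)  # about 2/3
```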

A · Joined 02 Mar 06 · Moves 17881 · 21 Sep 09

I also like to think of it this way:

If there are n doors, then your initial chance of guessing right was 1/n. The host then opens all of the other doors except one, every one of them a loser, and asks you to choose between your initial choice and the door that is left.

In other words, he's now asking you to consider this question: "Did you guess RIGHT the first time, or did you guess WRONG the first time?" The probability you guessed right was 1/n, so the probability you guessed wrong was (n-1)/n. Switching doors is akin to saying "my initial choice was wrong."

And in the classic version of this problem, n=3, so your initial guess has a probability of 1/3 of being right and 2/3 of being wrong... so switching doubles your chances of winning.
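
For the n-door version of this argument, the stick and switch probabilities, and the factor gained by switching, can be tabulated exactly; a minimal Python sketch using exact fractions, with arbitrary example values of n:

```python
from fractions import Fraction

for n in (3, 4, 10, 100):
    p_stick = Fraction(1, n)       # first guess was right
    p_switch = Fraction(n - 1, n)  # first guess was wrong
    print(f"n={n}: stick={p_stick}, switch={p_switch}, "
          f"switching multiplies your chances by {p_switch / p_stick}")
```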

Immigration Central · tinyurl.com/muzppr8z · Joined 23 Aug 04 · Moves 26701 · 21 Sep 09

It's called the Monty Haul problem. Look it up.

P · Upward Spiral · Halfway · Joined 02 Aug 04 · Moves 8702 · 22 Sep 09

Originally posted by AThousandYoung
It's called the Monty Hall problem. Look it up.
Fixed.

d · Joined 05 Jan 04 · Moves 45179 · 22 Sep 09

Originally posted by AThousandYoung
It's called the Monty Haul problem. Look it up.
Let's make a deel.

Quiz Master · RHP Arms · Joined 09 Jun 07 · Moves 48793 · 22 Sep 09

Monty Haul

A derogatory term for a tabletop RPG that is far too easy and therefore poses no challenge to its players; usually a 'Monty Haul' game quickly becomes boring once its players become the most powerful things in the game world.
Named for the host of "the Price is Right", a TV quiz show.

http://www.urbandictionary.com/define.php?term=monty%20haul

P · Bananarama · False berry · Joined 14 Feb 04 · Moves 28719 · 22 Sep 09

Originally posted by wolfgang59
Named for the host of "the Price is Right", a TV quiz show
Monty Hall never asked America to spay or neuter its pets; he was the host of "Let's Make A Deal".

d · Joined 05 Jan 04 · Moves 45179 · 23 Sep 09

Originally posted by PBE6
Monty Hall never asked America to spay or neuter its pets; he was the host of "Let's Make A Deal".
And it's Hall, not Haul.

I · King of slow · Joined 12 Oct 06 · Moves 14424 · 23 Sep 09

Originally posted by Palynka
I usually use the example forkedknight gave, but another way of thinking about it is to consider the probability of winning by NOT switching in a sort of frequentist way (imagine you repeat the experiment many times).

If you NEVER switch, then how many times will you win? Exactly 1/3 of the time.
If you ALWAYS switch, then how many times ...[text shortened]... PS: All this again assumes he always opens a door without a prize, as other comments mention.
Of all the explanations I've heard of why you should always switch, this is the most straightforward. Not sure why I've never seen it before now.

P · Upward Spiral · Halfway · Joined 02 Aug 04 · Moves 8702 · 24 Sep 09

Originally posted by Ichibanov
Of all the explanations I've heard of why you should always switch, this is the most straightforward. Not sure why I've never seen it before now.
Glad you liked it.