|
International Skeptics Forum |
20th October 2008, 09:26 AM | #1 |
Anti-homeopathy illuminati member
Join Date: Feb 2004
Posts: 5,363
|
Dumb probability question?
OK, I'll await the howls of derisive laughter, but bear with me...
John Maynard Keynes is much in the news at the moment as we try to spend our way out of financial catastrophe. An anecdote about him, reported in a news bulletin a few days ago, said that he was staying in Monte Carlo and heard that one of the casinos had removed the zero from the roulette wheel, so it had become a completely fair bet; because of that, he went off to spend the evening gambling.
Am I right in thinking that this was a rational thing to do, because if a roulette bet is completely fair then you can always win? I don't think I'm being completely stupid, and I think I am agreeing with Keynes' actions. If a bet is completely fair, but you know that you will stop once you have achieved some specified winnings, then you must win: although the long-run tendency is a tie between you and the casino, if you stop playing at the point where there is a sufficient excursion from the tie, you can permanently bank that win. So, in my estimation, it's not actually a fair competition between you and the casino, because the punter has an extra piece of information: the knowledge that he will stop once a specified excursion from a tie has been achieved.
But if I can do that, does it mean the casino will be broken by other individuals adopting the same strategy? As far as I can see, the only problems are deciding how big an excursion from a tie to aim at, calculating how long on average that excursion will take to appear, and having enough liquid funds to cover excursions in the casino's direction that occur before your winning point is reached. I think that seems correct.
Where my head begins to hurt is if I ask what happens if I ever return to the roulette table. Common sense says I can't repeat my tactic, because at some stage the long-run fairness must mean that the tendency is for me to tie with the casino. Yet we know that the gambler's fallacy is just that, a fallacy: the wheel has no memory of past outcomes, so if I come back for a new series of spins, their outcomes are independent of my past play and I should be able to repeat my strategy of waiting for my predetermined excursion. That does not seem right, so there must be a flaw in my argument. |
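The stopping strategy described in this post can be simulated directly. A minimal sketch (my own, not from the thread; the 50-unit bankroll and 5-unit target are arbitrary) of betting one unit on a fair wheel until you are up by the target or broke:

```python
import random

def play_until_target(bankroll, target, rng):
    """Bet 1 unit on a fair even-money wheel until we are up by `target`
    units or the bankroll is gone. Returns the final net profit."""
    profit = 0
    while -bankroll < profit < target:
        profit += 1 if rng.random() < 0.5 else -1
    return profit

rng = random.Random(42)
trials = 10_000
results = [play_until_target(bankroll=50, target=5, rng=rng) for _ in range(trials)]
wins = sum(1 for r in results if r == 5)

# Gambler's ruin says P(reach +5 before -50) = 50/55, about 0.91: the
# strategy usually wins, yet the rare 50-unit losses drag the mean to zero.
print(wins / trials)          # ≈ 0.91
print(sum(results) / trials)  # ≈ 0.0
```

This is exactly the tension raised above: the punter almost always banks the small win, but the expected profit per punter stays zero.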
__________________
"i'm frankly surprised homeopathy does as well as placebo" Anonymous homeopath. "Alas, to wear the mantle of Galileo it is not enough that you be persecuted by an unkind establishment; you must also be right." (Robert Park) Is the pen mightier than the sword? Its effectiveness as a weapon is certainly enhanced if it is sharpened properly and poked in the eye of your opponent. |
|
20th October 2008, 10:02 AM | #2 |
Philosopher
Join Date: Sep 2006
Posts: 6,985
|
|
20th October 2008, 10:13 AM | #3 |
Critical Thinker
Join Date: Sep 2008
Posts: 378
|
The flaw in your argument is to assume that at some point the profits must be even in order to tend to what you would expect by chance.
The table has no memory, whether it is continuous play or not. You can play for a very long time before your "luck" changes. Every outcome has a probability associated with it: if you play long enough, the outcome of you winning a million dollars will eventually happen, and so will the outcome of you losing another million. Although any outcome with non-zero probability can happen, it is safer to say that the outcomes most expected by chance will most likely happen. Even if you won a million times more than you lost, you could still be within what is expected by chance if it took a trillion plays to get there. |
__________________
"If it sounds too stupid to be true, probably some one already made a job out of it" |
|
20th October 2008, 10:21 AM | #4 |
Anti-homeopathy illuminati member
Join Date: Oct 2003
Posts: 28,209
|
That assumes the casino pays out at odds directly equivalent to the true probabilities, but let's say it does. Let's also suppose everyone stops the moment they have made a profit.
To simplify, the game becomes a straight 50:50 and all wagers are 1 currency unit (CU). Start with 128 people.
Round 1: 64 are now up one CU and walk away; 64 are one CU down.
Round 2: 64 have left; 32 are level and 32 are two CU down.
Round 3: 64 have left; 16 more reach one CU up and walk away, none are level, 32 are one CU down and 16 are three CU down.
Round 4: 80 have left; 16 are level, none are one CU down, 24 are two CU down and 8 are four CU down.
Round 5: 80 have left; 8 more reach one CU up and walk away, 20 are one CU down, 16 are three CU down and 4 are five CU down.
So after 5 rounds, 88 are in profit by one CU each, 20 are one CU down, 16 are three CU down and 4 are five CU down, and the trend by this stage is fairly clear. The casino is level, having paid out 88 but taken in 20 + (16×3) + (4×5) = 88. The average amount made per person is zero. So in effect, by following the strategy most people will eventually win, but in order to get a high chance of winning a small amount you have to follow a strategy that risks significant losses. |
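The round-by-round bookkeeping above can be checked mechanically. A short sketch (mine, not the poster's) that replays the idealized "exactly half win each round" scheme:

```python
from collections import Counter

def run_rounds(players=128, rounds=5):
    """Idealized replay of the example: every remaining player bets 1 CU at
    even odds, exactly half win each round, and anyone reaching +1 CU leaves."""
    at = Counter({0: players})      # net position -> players still at the table
    left_in_profit = 0
    for _ in range(rounds):
        nxt = Counter()
        for pos, n in at.items():
            nxt[pos + 1] += n // 2  # half win this round
            nxt[pos - 1] += n // 2  # half lose
        left_in_profit += nxt.pop(1, 0)  # winners at +1 CU walk away
        at = nxt
    return left_in_profit, dict(at)

gone, still_in = run_rounds()
print(gone)      # 88 players have left, each up 1 CU
print(still_in)  # 20 at -1, 16 at -3, 4 at -5
print(sum(-pos * n for pos, n in still_in.items()))  # 88: the casino is level
```

The totals match the post: 88 one-unit winners, exactly balanced by 20 + 48 + 20 = 88 units of losses among those still playing.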
20th October 2008, 10:29 AM | #5 |
Philosopher
Join Date: Oct 2007
Posts: 8,613
|
You have this more or less backwards. The asymmetry is not in information; it's in the availability of funds. Assuming the casino has infinite funds available and you don't, you will always lose in the long run no matter what betting strategy you follow.
Think of it like this. Since each bet is independent of every other bet, and since each bet is at even odds, your expected earnings are zero no matter what strategy you follow. But, you object, I can quit while I'm ahead, and only when I'm ahead. That's true, but the odds are just as good that you'll end up behind. If you keep playing, you might get back into the black, or you might sink lower, and at some point you are out of money and you've lost everything. A good way to phrase this mathematically is the gambler's ruin formula, which states that in a fair game with one opponent, played until either you or your opponent is out of money, your odds of winning all the money are a/(a+b), where a is your total funds at the beginning and b is hers. Note that this goes to zero as b goes to infinity. There is a related result, the Kelly criterion, that tells you the optimal amount to bet given the odds and your stake. You must have better than even odds to have positive expected earnings (for the reason above), and moreover you don't want to bet too much; otherwise the odds are that a bad run will come along and wipe you out. |
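The a/(a+b) formula is easy to check numerically. A small sketch (my own, with arbitrary small stakes; not code from the thread):

```python
import random

def win_probability(a, b):
    """Fair unit-bet game: chance the player (stake a) takes the opponent's
    whole stake b before going broke; the gambler's ruin result a/(a+b)."""
    return a / (a + b)

def simulate(a, b, trials, rng):
    """Monte Carlo check of the formula with made-up small stakes."""
    wins = 0
    for _ in range(trials):
        stake = a
        while 0 < stake < a + b:  # play until one side is broke
            stake += 1 if rng.random() < 0.5 else -1
        wins += stake == a + b
    return wins / trials

print(win_probability(3, 7))                     # 0.3
print(simulate(3, 7, 20_000, random.Random(0)))  # ≈ 0.3
# As the casino's stake b grows, a/(a+b) -> 0: against effectively
# unlimited house funds, the chance of breaking the bank vanishes.
```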
20th October 2008, 11:05 AM | #6 |
Penultimate Amazing
Join Date: Mar 2004
Posts: 21,629
|
The Drunkard's Walk theorem says that will happen.
More accurately, the Drunkard's Walk theorem (the recurrence of the one-dimensional random walk) says that at some point the profits will, with probability 1, take EVERY possible value. At some point you'll be even, you'll be up a million, you'll be up a billion, and you'll be down seven bucks. The problem is which happens first: you run out of money, or the casino does. (Alternatively, which happens first: you hit your upside limit where you walk away, or you hit your downside limit where you walk away.) The relevant math says that the amount you win (in all the cases where you win) exactly balances the amount you lose (in those cases). Are you willing to risk losing your house to win $10? Apparently Keynes was. If you are, too, then go for it. |
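The "wins exactly balance losses" claim can be made concrete. Under the same fair-game, unit-bet assumptions, stopping at either an upside or a downside limit gives exactly zero expectation; a sketch (mine, with made-up numbers for the "risk the house to win $10" trade-off):

```python
from fractions import Fraction

def two_sided_stop(up, down):
    """Fair unit-bet walk from 0 that stops at +up or -down.
    P(hit +up first) = down/(up + down), so the expectation is exactly 0."""
    p_win = Fraction(down, up + down)
    expected = p_win * up - (1 - p_win) * down
    return p_win, expected

# "Risk the house to win $10": tiny upside, huge downside.
p, ev = two_sided_stop(up=10, down=100_000)
print(float(p))  # ≈ 0.9999, you almost always pocket the $10 ...
print(ev)        # 0, ... but the rare wipeout cancels it exactly
```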
20th October 2008, 12:04 PM | #7 |
Critical Thinker
Join Date: Sep 2008
Posts: 378
|
Certain and almost-certain outcomes are not exactly the same.
Not even the Drunkard's Walk theorem gives me an absolute guarantee that this will happen in finite time (although its probability converges to 1). But that was not the point of my statement; the point was that converging to the expected proportions is not the same as having zero distance from them. |
__________________
"If it sounds too stupid to be true, probably some one already made a job out of it" |
|
20th October 2008, 01:18 PM | #8 |
Illuminator
Join Date: Jan 2005
Posts: 4,289
|
This is often referred to as the Martingale betting system, which has been around for centuries.
|
21st October 2008, 06:29 AM | #9 |
Penultimate Amazing
Join Date: Jul 2006
Posts: 18,774
|
|
21st October 2008, 06:38 AM | #10 |
Guest
Join Date: Apr 2007
Posts: 13,797
|
|
21st October 2008, 07:00 AM | #11 |
Penultimate Amazing
Join Date: Mar 2004
Posts: 21,629
|
|
21st October 2008, 09:04 AM | #12 |
Guest
Join Date: Apr 2007
Posts: 13,797
|
|
21st October 2008, 09:11 AM | #13 |
Guest
Join Date: Mar 2008
Posts: 7,149
|
Makes perfect sense to me. Assume an infinite budget on the part of the house and the player. Now assume a finite amount of money the player wishes to win. Now assume infinite time.
It is an absolute certainty that the imbalance will be in the player's favor at some point in time. Now change that to a finite budget on the player's part and finite time. It's very simple to calculate an acceptable risk of ruin and an acceptable time scale, and to choose a sensible amount to try to win given your budget.
The house, in this case, is counting on the fact that most gamblers don't have an amount of money they want to win; they have an amount of money they're willing to lose. And given infinite time (which the house has) and effectively infinite money (which the house has, for all intents and purposes), this is a total winner in their book.
Also remember they're counting on other factors too. Roulette is an incredibly BORING activity. Many roulette players will choose to also play other games, games which tilt in the house's favor. If they make winnings on the roulette wheel, they will then tend to play other games with their 'free money.'
P.S. Note that the casino will not lose out to sensible players at the roulette wheel, because eventually those sensible players will hit their 'risk of ruin' percentile, and given the budget needed for a sensible risk of ruin, those losses will counter all the smaller amounts won.
P.P.S. The house always wins, unless you're counting the cards. |
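The "acceptable risk of ruin" calculation mentioned above is a one-liner under the fair-game, unit-bet assumptions used earlier in the thread (the function name and figures are mine):

```python
def bankroll_needed(target, risk_of_ruin):
    """Fair unit-bet game, playing until +target or bust.
    Gambler's ruin gives P(ruin) = target / (bankroll + target);
    solving for the bankroll gives the stake a 'sensible' player must risk."""
    return target * (1 - risk_of_ruin) / risk_of_ruin

# Winning 10 units with only a 1% chance of busting first
# means being prepared to lose about 990 units.
print(round(bankroll_needed(10, 0.01)))  # 990
```

This is why the house is happy: the budget required for a small risk of ruin dwarfs the amount each "sensible" player walks away with.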
21st October 2008, 09:56 AM | #14 |
Illuminator
Join Date: Feb 2008
Posts: 3,432
|
I have seen payout charts for slot machines (I was a bench tech for the largest casino in the U.S.) where the machine started out paying lower than expected, which happens a lot, and never got higher than expected until finally reaching the expected payout. That means that if someone had started playing that machine the day it was put in, they would never have been at a point where they were ahead, even though at the end the machine was at the expected payout. Many machines would go above expected, then below, then above again until settling down at expected. This is not a guarantee though, as shown by the other examples.
The same applies to your roulette scenario, where the expected payout is even money. You could lose your first bet and then never reach a point where you come out ahead, and it might take a year just to get back to breaking even. I would say that the chances of this happening are slim, but so are the chances of making any substantial amount of money. |
25th October 2008, 11:18 AM | #15 |
Anti-homeopathy illuminati member
Join Date: Feb 2004
Posts: 5,363
|
Thanks for all the various comments.
Clearly I made some implicit assumptions, specifically infinite time and money. So, as I see it now, I was right that if you have limitless resources and set a finite target win, at which point you will walk away with the money in your pocket, you will always achieve that win at some point. But given finite resources on the part of each player and the casino, some players will fail to achieve their target win before they run out of money and will have to walk away. The number of players out of a population who fail is, I would guess, related to the ratio between the players' resources and the casino's. Effectively, the proportion who fail is a way of measuring the relative size of the players' and the casino's resources. Is that a fair summary? |
__________________
"i'm frankly surprised homeopathy does as well as placebo" Anonymous homeopath. "Alas, to wear the mantle of Galileo it is not enough that you be persecuted by an unkind establishment; you must also be right." (Robert Park) Is the pen mightier than the sword? Its effectiveness as a weapon is certainly enhanced if it is sharpened properly and poked in the eye of your opponent. |
|
25th October 2008, 12:43 PM | #16 |
Philosopher
Join Date: Feb 2006
Posts: 9,774
|
|
29th October 2008, 01:18 PM | #17 |
Illuminator
Join Date: Feb 2008
Posts: 3,432
|
No. Some people here have said that, but it is untrue. The only thing close to a certainty is that you will break even eventually, but that doesn't mean that at any given time you are guaranteed to be above the break-even point.
I have seen a roulette table's whole display of recent results come up a single colour (I know it is anecdotal, but still plausible, right?), which means some players were winning big and some were losing big. The losers will win some eventually, but may never win enough to get back over the expected break-even point. |
29th October 2008, 03:29 PM | #18 |
Philosopher
Join Date: Sep 2006
Posts: 6,985
|
If there is no table limit, then with unlimited resources you can eventually win any amount you choose by betting red or black and doubling the bet every time you lose.
If you are betting constant amounts, it is not close to a certainty that you will break even eventually. The distance from even as a percentage of the total bets will tend to decrease, but the absolute distance from even will not. |
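This distinction between relative and absolute distance from even is easy to see numerically. A sketch (my own; the checkpoint sizes and trial count are arbitrary) estimating the mean absolute deviation of a fair coin walk:

```python
import math
import random

# For a fair +1/-1 walk S_n, the relative gap |S_n|/n tends to 0, but the
# typical absolute gap grows like sqrt(2n/pi) and never settles at zero:
# "converging to 50/50" in percentage terms doesn't force a return to even.
rng = random.Random(1)
checkpoints = (100, 2_500, 10_000)
trials = 500
sums = {n: 0 for n in checkpoints}

for _ in range(trials):
    s = 0
    for step in range(1, 10_001):
        s += 1 if rng.random() < 0.5 else -1
        if step in sums:
            sums[step] += abs(s)

for n in checkpoints:
    mean_abs = sums[n] / trials
    # estimated mean |S_n|, theoretical sqrt(2n/pi), and the shrinking ratio to n
    print(n, round(mean_abs, 1), round(math.sqrt(2 * n / math.pi), 1), round(mean_abs / n, 4))
```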
29th October 2008, 04:08 PM | #19 |
Penultimate Amazing
Join Date: Mar 2004
Posts: 21,629
|
The Drunkard's Walk theorem says that (with an infinite bankroll and infinite time) you will, with probability 1, achieve EVERY finite return. You will, with probability 1, eventually win as much as you like.
If you're trying to make some subtle argument about the difference between "always" and "probability 1", you're failing to express it. If you are asserting that there is a finite, non-zero probability that you will never win some arbitrary amount, you're simply wrong. |
29th October 2008, 08:20 PM | #20 |
Philosopher
Join Date: Oct 2007
Posts: 8,613
|
Think about what you said: it's certain you'll break even eventually, but not certain that you'll ever be above even. That makes no sense. If you'll be at even once, you'll be at even arbitrarily many times (remember, we have an infinite amount of time and an infinite amount of money). So start from one of those times: half the time you'll go up, half the time you'll go down.
As drkitten says, you'll hit every amount with probability 1. |
29th October 2008, 11:18 PM | #21 |
Guest
Join Date: Mar 2008
Posts: 7,149
|
|
30th October 2008, 11:36 AM | #22 |
Critical Thinker
Join Date: Dec 2006
Posts: 476
|
Isn't Keynes also the one who said "In the long run, we're all dead"?
I've never heard this roulette story about Keynes. Was the punchline of it that Keynes thought that he could win, or that he at least wouldn't be wasting his money (it being a fair game and all)? |
26th November 2008, 02:05 PM | #23 |
Illuminator
Join Date: Feb 2008
Posts: 3,432
|
Modified, no one claimed that there was no table limit. This is a real casino we are talking about. What do you mean by the absolute distance? How is that different from the regular distance?
drkitten, I never used the word "always", nor did I use the term "probability 1", so I fail to see how I was making that argument. I was simply sticking to the scenario at hand. OK, sure, I will agree that if there were no table limit, and I was not doing anything else for the next 10 years, and I had all the money in the world, I would eventually see myself up by n units or down by n units, for any finite n. But when there are table limits, even with unlimited funds, this wouldn't be the case.
Sol, if you look at it from the perspective of the return on such a game being 100%, you can lose the first 10 bets, and after that the running return will slowly get closer to 100%, which I am calling breaking even. I was wrong when I said you will eventually break even; I should have said you will get close to it. |
26th November 2008, 06:57 PM | #24 |
Penultimate Amazing
Join Date: Mar 2004
Posts: 21,629
|
Only in the sense that if the table closes after losing more than 10,000 units, you will never be able to win more than 10,000 units at that table.
But the probability of your winning 9,999 units remains 1 -- even if you only ever bet one unit at a time.
No, you were right the first time. It is a proven theorem of mathematics. If you simply bet on (fair) coin flips, one dollar per flip, and you are limited neither by your own bankroll nor by the amount of time you and your opponent are willing to play the game, then the game will always end when your opponent hits his loss limit or goes bankrupt. (Think about it: that's the only way the game can end.) In particular, you are guaranteed to eventually be ahead of the game by any amount that the casino is willing to let you win. If you call breaking even "100%", then it's a proven theorem that you will eventually be above 100%. |
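For anyone who wants numbers rather than the theorem: under the fair coin-flip model, the chance of having been ahead by a given amount within n flips can be computed exactly via the reflection principle (this is my own sketch, not code posted in the thread):

```python
from math import comb

def prob_hit(k, n):
    """P(a fair +1/-1 walk is ever k ahead within n steps), computed exactly
    via the reflection principle: P(max S_i >= k) = 2*P(S_n > k) + P(S_n = k)."""
    def p_eq(j):
        # P(S_n = j): zero unless n and j have the same parity.
        return comb(n, (n + j) // 2) / 2**n if (n + j) % 2 == 0 else 0.0
    p_gt = sum(p_eq(j) for j in range(k + 1, n + 1))
    return 2 * p_gt + p_eq(k)

# The chance of ever having been 10 units up creeps toward 1 as play
# continues, even though the expected waiting time for that is infinite.
for n in (100, 1_000, 10_000):
    print(n, round(prob_hit(10, n), 3))
```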
27th November 2008, 07:05 AM | #25 |
Muse
Join Date: Nov 2004
Posts: 557
|
Your strategy determines the probability distribution over the possible end results, but in a fair game the expected change in money is always $0, no matter what strategy you use. Note that you must stop after *some* amount of time, since you have only a finite amount of money and time.
Consider this fair game: double-or-nothing guesses on the outcome of a coin flip. Let's try a well-known strategy and see if we should expect winnings. The strategy is simple: start with a bet of $1, double your bet every time you lose, stop as soon as you win, and give up after n losses in a row. The resulting distribution is a likely gain of $1 balanced by an unlikely loss of a huge amount of money. Specifically: +$1 with probability 1 - 1/2^n, and $(1 - 2^n) with probability 1/2^n. Notice that 1·(1 - 1/2^n) + (1 - 2^n)/2^n = 0, as expected. Try to come up with a terminating strategy with a non-zero expected value if you need to convince yourself. Even "gambler's ruin" is balanced by highly unlikely wins of massive amounts of money. edit: A lot of the confusion here comes from assuming you have unlimited time or bankroll. These are *not good assumptions* for modeling casino play. |
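The distribution given above can be verified with exact arithmetic (a sketch assuming the same bet-$1-then-double scheme; the function name is mine):

```python
from fractions import Fraction

def martingale_outcomes(n):
    """Bet $1, double after every loss, stop on the first win or after n
    straight losses. Returns {net result in $: exact probability}."""
    p_bust = Fraction(1, 2) ** n
    return {1: 1 - p_bust, 1 - 2**n: p_bust}

dist = martingale_outcomes(10)
# Losing 10 in a row costs 1 + 2 + ... + 512 = 1023, with probability 1/1024.
ev = sum(outcome * p for outcome, p in dist.items())
print(ev)  # 0: the rare huge loss exactly cancels the likely $1 win
```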
27th November 2008, 12:23 PM | #26 |
Illuminator
Join Date: Nov 2002
Posts: 3,607
|
|
27th November 2008, 12:36 PM | #27 |
Illuminator
Join Date: Nov 2002
Posts: 3,607
|
I agree; they certainly aren't realistic assumptions.
It can be interesting to think about their consequences anyway. I don't think their presence could be said to make the game unfair. Well, I guess the overall game is unfair in that one party, and not the other, chooses when to stop. But each coin flip is still fair. |