expected value calculations

roger
2nd February 2007, 11:27 AM
A finance professor offered this scenario in an explanation of rational market behavior and expected value calculations. I disagree with his analysis. First, the scenario:

Choose between 1) getting \$85,000, or 2) an 85 percent chance of getting \$100,000.

His argument was there is no rational reason to choose between the two, since they both have the same expected value.

Naturally, I agree they both have the same expected value. Run the scenarios enough times, and either choice 1 or choice 2 will give an average payoff of \$85,000/episode.

However, I would still strongly prefer 1. I'll go further: modify 2 to be an 86% chance of getting \$100,000, and I'll still choose 1.

Why? Well, expected value computations rely on the law of large numbers. Do #2 enough times, and you'll average 85K (or 86K) per episode.
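A quick sketch of that arithmetic (illustrative only; the simulation and seed are my own additions, the 85%/\$100K figures are from the scenario above):

```python
import random

def expected_value(prob, payoff):
    """Expected value of a one-shot gamble: win `payoff` with probability `prob`."""
    return prob * payoff

sure_thing = 85_000                          # option 1: guaranteed $85,000
gamble_ev = expected_value(0.85, 100_000)    # option 2: 85% chance of $100,000
assert sure_thing == gamble_ev               # the two expected values really are equal

# The law of large numbers only shows up over many repetitions:
random.seed(1)
trials = 1_000_000
average = sum(100_000 if random.random() < 0.85 else 0
              for _ in range(trials)) / trials
print(f"average payoff over {trials:,} trials: ${average:,.0f}")
```

Run once, of course, the outcome is \$100,000 or nothing; the average only matters if the offer repeats.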

However, I don't know about you, but I am not often offered chances at 85K.
Second, 85K is a big sum of money, but the extra \$15K I might get in scenario two is not much of an incentive. Yes, I'd rather have 100K than 85K, but it's not a big deal in comparison. In short, my life would be improved quite a bit by the 85K, but only marginally more by the extra 15K. So I go for the sure thing, even though the expected value of 1 is lower than that of 2 in my revised scenario.

I'm not interested in bickering about this to death, but I am interested in what other people would choose to do in either scenario. Would you have a preference between 1 and 2 in the original scenario? What about in my revised scenario? Remember, you are being offered this once, not a million times or so.

ETA: I agreed with the bigger argument he was making about expected value calculations in markets - we often do them irrationally. If you are a regular investor, you'd better assume the law of large numbers is in force. I'm just picking a nit.

drkitten
2nd February 2007, 11:35 AM
Choose between 1) getting \$85,000, or 2) an 85 percent chance of getting \$100,000

I'm not interested in bickering about this to death, but I am interested in what other people would choose to do in either scenario. Would you have a preference between 1 and 2 in the original scenario? What about in my revised scenario? Remember, you are being offered this once, not a million times or so.

I'd probably take the \$85K and run, in either situation.

Your finance professor is making the common mistake of confusing the amount of money with the value of the money. Not all dollars are created equal (or interchangeable) -- as Dickens put it, "Annual income twenty pounds, annual expenditure nineteen nineteen six, result happiness. Annual income twenty pounds, annual expenditure twenty pounds ought and six, result misery." The effect of that final shilling in satisfying the consumer's wants is disproportionate to its actual monetary value.

UserGoogol
2nd February 2007, 11:35 AM
Yeah, I think the value of money is non-linear (diminishing returns and all that), and therefore the value of an 85% chance of getting \$100K may not be exactly the same as the value of a 100% chance of getting \$85K. Your argument seems fairly reasonable, although it's probably fairly subjective.

It's probably understandable to gloss over that in a finance class, though, since dragging personal happiness into it just complicates things when it's much simpler to look at purely monetary values such as profit and risk.

Jarom
2nd February 2007, 11:37 AM
The error made by the professor is assuming that utility (the amount of benefit) is proportional to dollar amount. The question of how to assign utility to a dollar amount is a tricky one, because it varies by person. For myself, if I assign 100 utility (the exact figure is irrelevant, only proportions matter) to \$100,000, I estimate that I would assign between 90 and 95 utility to \$85,000. A more risk-averse person might assign utility closer to 100, and Bill Gates would probably assign a utility close to 85.

So the expected utility to me would be 90-95 for the first choice, and 85 for the second, making choice 1 better.
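Jarom's expected-utility comparison can be sketched with a toy concave utility curve (the square root is an illustrative assumption, not anything Jarom specified):

```python
import math

def utility(dollars):
    # Any concave (diminishing-returns) curve works; sqrt is just a simple choice.
    return math.sqrt(dollars)

eu_sure   = utility(85_000)          # expected utility of the certain $85K
eu_gamble = 0.85 * utility(100_000)  # 85% chance of $100K, 15% chance of $0

print(f"sure thing: {eu_sure:.1f}, gamble: {eu_gamble:.1f}")
assert eu_sure > eu_gamble  # concavity alone makes the sure thing preferable
```

With this curve the certain \$85K scores about 291.5 against the gamble's 268.8, the same ordering Jarom reaches with his 90-95 vs. 85 figures.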

drkitten
2nd February 2007, 11:40 AM
ETA: I agreed with the bigger argument he was making about expected value calculations in markets - we often do them irrationally. If you are a regular investor, you'd better assume the law of large numbers is in force. I'm just picking a nit.

Not necessarily. One of the big issues, even for large and regular investors, is balancing expected value against risk tolerance. As a simple example, if you offered to sell me investment #1 for \$50,000, that would be more or less a no-brainer. If you offered to sell me investment #2 for the same price, I might have rational reasons for turning it down, on the grounds that the 15% risk of losing all my capital is unacceptably high. Perhaps that's a substantial amount of my 16-year-old daughter's college fund, and her expected expenses are \$60,000.

roger
2nd February 2007, 11:45 AM
Your finance professor is making the common mistake of confusing the amount of money with the value of the money.

Exactly. The article started out by retelling a Warren Buffett story. He had turned down an \$11 golf bet with a \$10,000 payoff if he got a hole-in-one that weekend. He was ribbed for it (since he was nearly a billionaire at the time and could easily afford \$11), and his response was that he used the same reasoning process for an \$11 deal as for an \$11 million one.

Which makes absolute sense when you have 10 times that 11 million.

In general, it's good investing advice. Use expected value to figure the value of a position, rather than a gut response to the situation. But in some situations it's very contrary to other investment advice. I could easily come up with a scenario where 1) big chance to profit with a small chance of bankruptcy, and 2) good chance at profit with a small chance of loss, where the expected value of 1 is much larger than that of 2. Yet 1 is a fool's bet - you never gamble it all on an unsure thing. Expected value doesn't take into account the consequences of failure, or at least a naive application of it doesn't (a smart analysis would take the long-term cost of bankruptcy into account).

roger
2nd February 2007, 11:46 AM
Not necessarily. One of the big issues, even for large and regular investors, is balancing expected value against risk tolerance. As a simple example, if you offered to sell me investment #1 for \$50,000, that would be more or less a no-brainer. If you offered to sell me investment #2 for the same price, I might have rational reasons for turning it down, on the grounds that the 15% risk of losing all my capital is unacceptably high. Perhaps that's a substantial amount of my 16-year-old daughter's college fund, and her expected expenses are \$60,000.

Ah yes, I just posted a response to your earlier post that takes that issue into account. Quite right.

2nd February 2007, 03:08 PM
His argument was there is no rational reason to choose between the two, since they both have the same expected value.

When long-odds high-payoff progressive-jackpot lotteries go unwon for many cycles, the expected value of a ticket can exceed its purchase price (even when the likelihood of having to share the pot is figured in).

Is there any rational reason not to immediately purchase as many lottery tickets as possible whenever this condition arises?

If so, then the argument is refuted.

Or, to create a tighter but purely hypothetical case, imagine a lottery where tickets cost \$1, a winning ticket pays \$100M (no splitting the pot; all winning tickets win the full amount), and the odds of winning per ticket are 1 in 85M. The lottery will only happen once, tickets are on sale for a week, and it takes about 10 seconds to buy one ticket, so the most tickets one could possibly buy is about 60,000. How many tickets should one buy?
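Under those hypothetical terms the per-ticket numbers work out as follows (a sketch; the figures are just the \$1/\$100M/1-in-85M assumptions above):

```python
ticket_price = 1.0
jackpot = 100_000_000
p_win = 1 / 85_000_000

# Each ticket is worth more than it costs in expectation...
ev_per_ticket = p_win * jackpot - ticket_price
print(f"EV per ticket: ${ev_per_ticket:.3f}")      # about +$0.18

# ...yet even a maximal 60,000-ticket buy almost certainly wins nothing:
n = 60_000
p_any_win = 1 - (1 - p_win) ** n
print(f"P(at least one win) with {n:,} tickets: {p_any_win:.5f}")
```

So the expected profit on 60,000 tickets is about \$10,600, but roughly 999 times in 1,000 the buyer loses the whole \$60,000.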

Respectfully,

Rodney
2nd February 2007, 05:43 PM
When long-odds high-payoff progressive-jackpot lotteries go unwon for many cycles, the expected value of a ticket can exceed its purchase price (even when the likelihood of having to share the pot is figured in).

Is there any rational reason not to immediately purchase as many lottery tickets as possible whenever this condition arises?

If so, then the argument is refuted.

Or, to create a tighter but purely hypothetical case, imagine a lottery where tickets cost \$1, a winning ticket pays \$100M (no splitting the pot; all winning tickets win the full amount), and the odds of winning per ticket are 1 in 85M. The lottery will only happen once, tickets are on sale for a week, and it takes about 10 seconds to buy one ticket, so the most tickets one could possibly buy is about 60,000. How many tickets should one buy?

Respectfully,
To show even more starkly how out-of-touch with reality your professor is, suppose the choice were between a million dollars and a 1% chance of winning a billion dollars. According to your professor's reasoning, you would be much better off choosing the latter because your expected value would be 10 million dollars -- ten times higher than choosing the million dollars.

andyandy
2nd February 2007, 05:45 PM
Personally, the gap between the utility of \$85,000 (relative to zero) and the added utility from \$85,000 to \$100,000 is so large that I wouldn't even take the gamble at 95%. Give me 98%+ and I might consider it.....:)

kjkent1
2nd February 2007, 07:56 PM
It's a stupid hypothetical.

Expected value generally concerns the mean average outcome of a large number of probability trials.

In the proposed hypothetical, there is only one trial. It's like having a bowl filled with 85 white marbles and 15 black and asking someone to draw one for \$100,000 or take \$85,000 without drawing at all.

Only an idiot would turn down the guaranteed winner for that gamble.

However, if you were offered the opportunity to win \$1,000 for every white marble you select from the bowl over 100 trials, vs. losing \$150 for every black marble you pick, you might well take the bet, because you would probably come close to getting \$85K and you might do a bit better or a bit worse.
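That repeated-trial version can be simulated (a sketch; note that under these exact terms the expected total is 85 * \$1,000 - 15 * \$150 = \$82,750, a bit under \$85K):

```python
import random

random.seed(0)

def hundred_draws():
    """100 independent draws from an 85-white/15-black bowl:
    +$1,000 per white marble, -$150 per black marble."""
    total = 0
    for _ in range(100):
        total += 1_000 if random.random() < 0.85 else -150
    return total

runs = [hundred_draws() for _ in range(10_000)]
mean = sum(runs) / len(runs)
print(f"mean: ${mean:,.0f}, worst: ${min(runs):,}, best: ${max(runs):,}")
```

The spread is modest: most runs land within a few thousand dollars of \$82,750, which is why the repeated bet feels safe in a way the one-shot gamble does not.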

Art Vandelay
2nd February 2007, 09:15 PM
A finance professor offered this scenario in an explanation of rational market behavior and expected value calculations. I disagree with his analysis. First, the scenario:

Choose between 1) getting \$85,000, or 2) an 85 percent chance of getting \$100,000.

His argument was there is no rational reason to choose between the two, since they both have the same expected value.

It's weird how people accept unquestioningly that expected value has anything to do with it. Why should it? I don't care what the expected value is, I care what actual value I get. His argument makes as much sense as saying "there's no rational reason to choose a 747 over a Cessna, because they both have the same number of wings". How the hell did this guy get a job as a finance professor? That risk is an important factor is a basic precept of finance. He also doesn't seem to understand what "rational" means. "Rational" doesn't mean "I understand the reasons for it"; it means "no arbitrage". Something can seem completely crazy, but as long as there is no arbitrage, as far as finance theory is concerned, it's rational.

.13.
3rd February 2007, 09:54 AM
I'm not interested in bickering about this to death, but I am interested in what other people would choose to do in either scenario. Would you have a preference between 1 and 2 in the original scenario? What about in my revised scenario? Remember, you are being offered this once, not a million times or so.

It would depend on my situation. Currently, as an unemployed student, I'd take the 85k in both scenarios. But if I were working, scenario 1 might as well be a coin toss. And in scenario 2 I would likely go for the 100k.

Or, to create a tighter but purely hypothetical case, imagine a lottery where tickets cost \$1, a winning ticket pays \$100M (no splitting the pot; all winning tickets win the full amount), and the odds of winning per ticket are 1 in 85M. The lottery will only happen once, tickets are on sale for a week and it takes about 10 seconds to buy one ticket, so the most tickets one could possibly buy are about 60,000. How many tickets should one buy?

As many as one can afford.

To show even more starkly how out-of-touch with reality your professor is, suppose the choice were between a million dollars and a 1% chance of winning a billion dollars. According to your professor's reasoning, you would be much better off choosing the latter because your expected value would be 10 million dollars -- ten times higher than choosing the million dollars.

In your example the expected value that I calculated is 9010000. The decision between these choices would naturally depend on your priorities. If you want to maximise your expectation the rational choice is to go for the billion dollars. On the other hand if you really need money (your kneecaps are in danger kind of a need) then take the million dollars. Anyone with means to support themselves without the million dollars should in my opinion try to get the billion. I would.
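A quick check of those figures (a sketch: the \$9,010,000 counts the foregone million only on the losing branch; counting it on both branches gives a net edge of \$9,000,000):

```python
p, jackpot, sure = 0.01, 1_000_000_000, 1_000_000

ev_gamble = p * jackpot                             # raw EV of the gamble: $10,000,000
net_edge = p * (jackpot - sure) - (1 - p) * sure    # edge over the sure million: $9,000,000
print(f"EV of gamble: ${ev_gamble:,.0f}, edge vs. sure thing: ${net_edge:,.0f}")
```

Either way the gamble dominates on expectation; the whole disagreement is about whether expectation is the right criterion for a one-shot choice.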

In the proposed hypothetical, there is only one trial. It's like having a bowl filled with 85 white marbles and 15 black and asking someone to draw one for \$100,000 or take \$85,000 without drawing at all.

Only an idiot would turn down the guaranteed winner for that gamble.

It is not idiotic. Both options have the same expectation. If you can manage without the \$85,000 you might as well go for the \$100,000.

roger
3rd February 2007, 10:05 AM
However, if you were offered the opportunity to win \$1,000 for every white marble you select from the bowl over 100 trials, vs. losing \$150 for every black marble you pick, you might well take the bet, because you would probably come close to getting \$85K and you might do a bit better or a bit worse.

Well, that's the real point of the article. If you are doing a lot of buying and selling in the market, the dollar values of an individual trade aren't life-changing events. In that case, it makes sense to look only at expected value - it's okay to walk away from the occasional 10K if you do 50 10K trades a year. As I said, I was just picking a nit with the hypothetical.

Or maybe for him, 85K is chump change. :eye-poppi

baron
3rd February 2007, 10:45 AM
It would be better to have values of \$0.85 and \$1. The higher the value the less likely a logical person is to take the chance. If it were \$8.5m or \$10m then only a raving lunatic would take the chance.

kjkent1
3rd February 2007, 11:33 AM
It is not idiotic. Both options have the same expectation. If you can manage without the \$85,000 you might as well go for the \$100,000.

At the instant you add your "If you can manage without the \$85K...", you have changed the value of the choices. Because if \$85K has no value to you, then the choice is between a guaranteed payout of \$0 vs. an 85% chance of \$15K (assuming that \$100,000 has \$15K worth of value to you).

But, as proposed, the hypothetical is nonsensical, and I doubt you could find one person in 100 who would opt for the gamble over the guaranteed payout.

Francesca R
3rd February 2007, 11:39 AM
I cannot imagine that a finance professor would seriously mean to convey the information you have drawn.

Expected return has plenty to do with investing but so does risk. No rational investor will decline an investment in favour of another one with the same expected payoff but higher risk. Not even a risk-tolerant rational investor. The additional risk needs additional compensation (a risk premium). That is all.

.13.
3rd February 2007, 12:33 PM
At the instant you add your "If you can manage without the \$85K...", you have changed the value of the choices. Because if \$85K has no value to you, then the choice is between a guaranteed payout of \$0 vs. an 85% chance of \$15K (assuming that \$100,000 has \$15K worth of value to you).

All right, let's not invent any backgrounds for the person who chooses and focus on the dollar values and the odds. I'm very happy with that. Since both choices have equal expectations, there is absolutely no basis to call the gambler an idiot. If you still think there is, please explain.

But, as proposed, the hypothetical is nonsensical, and I doubt you could find one person in 100 who would opt for the gamble over the guaranteed payout.

Why do you think the hypothetical is nonsensical?

If 99 people out of 100 choose the guaranteed payout it doesn't mean that gambling here is irrational.

.13.
3rd February 2007, 12:40 PM
Well, that's the real point of the article. If you are doing a lot of buying and selling in the market, the dollar values of an individual trade aren't life changing events. In that case, it makes sense to look only at expected value - it's okay to walk away from the occasional 10K if you do 50 10K trades a year. As I said, I was just picking a nit with the hypothetical.

Maybe I misunderstood you, but this doesn't seem to make sense. Why would it be rational to walk away from 10k when you do 50 10k trades a year? I assume you mean that this particular single trade has a positive expectation of 10k. If your goal in trading is to make money, it makes absolutely no sense to pass up this opportunity.

.13.
3rd February 2007, 12:47 PM
It would be better to have values of \$0.85 and \$1. The higher the value the less likely a logical person is to take the chance. If it were \$8.5m or \$10m then only a raving lunatic would take the chance.

Whether you gamble or not you have the same expectation. How do you determine that the gambler is "a raving lunatic"?

baron
3rd February 2007, 01:12 PM
If 99 people out of 100 choose the guaranteed payout it doesn't mean that gambling here is irrational.

Gambling in this instance is irrational. I doubt that any professional gambler on the planet would go for option 2 over option 1.

Personally I would only take the odds if they were 99.5% on the \$100K. Even saying that I'm not sure I wouldn't back out and go for the \$85K when crunch time came around.

For the professor to say I have no "rational reason" for doing this makes me think either he's been misquoted, or he's mad. Can anyone really believe that if a person in the real world does not make decisions based on mathematical probability they are being unreasonable? Ask a blind person if they'd rather have guaranteed sight back in one eye or a 50% chance of getting sight back in both eyes and see how many are willing to be "unreasonable" about it.

baron
3rd February 2007, 01:16 PM
Whether you gamble or not you have the same expectation. How do you determine that the gambler is "a raving lunatic"?

By his nonsensical behaviour. In a similar way that I would brand someone as "raving mad" who quacked like a duck and hit themselves over the head repeatedly with a frying pan.

Let's see...

Guaranteed \$8.5M = a life of luxury for the winner and their family

85% chance of \$10M = an 85% chance of a life of luxury for the winner and their family (with a few more bedrooms) and a 15% chance of getting nothing whatsoever

.13.
3rd February 2007, 03:17 PM
Gambling in this instance is irrational. I doubt that any professional gambler on the planet would go for option 2 over option 1.

Why is it irrational? Both choices have the same expectation.

.13.
3rd February 2007, 03:22 PM
By his nonsensical behaviour.

You just moved from "raving lunatic" to nonsensical behaviour. And described what the options were. Please explain why opting for the gamble is nonsensical behaviour. Once again: both options have the same expectation.

Number Six
3rd February 2007, 03:27 PM
When long-odds high-payoff progressive-jackpot lotteries go unwon for many cycles, the expected value of a ticket can exceed its purchase price (even when the likelihood of having to share the pot is figured in).

Is there any rational reason not to immediately purchase as many lottery tickets as possible whenever this condition arises?

If so, then the argument is refuted.

Or, to create a tighter but purely hypothetical case, imagine a lottery where tickets cost \$1, a winning ticket pays \$100M (no splitting the pot; all winning tickets win the full amount), and the odds of winning per ticket are 1 in 85M. The lottery will only happen once, tickets are on sale for a week and it takes about 10 seconds to buy one ticket, so the most tickets one could possibly buy are about 60,000. How many tickets should one buy?

Respectfully,

But the amount advertised as the jackpot is the annuity paid out over 20-30 years, not the lump sum. And it is also taxable. When you combine that with the fact that even if you do win you may have to share the jackpot, it is rare to have a positive expected value in playing the lottery, or at least the big ones that get all the press. I don't know about the smaller ones.

kjkent1
3rd February 2007, 03:37 PM
All right, let's not invent any backgrounds for the person who chooses and focus on the dollar values and the odds. I'm very happy with that. Since both choices have equal expectations, there is absolutely no basis to call the gambler an idiot. If you still think there is, please explain.

OK, let's change the payout so that it genuinely reflects the fact that there is only one opportunity to play.

There are two identical guns on a table. Gun #1 has 85 chambers, all of them empty. Gun #2 has 100 chambers, 85 of them empty.

You are being forced, on threat of immediate death should you refuse, to play one round of Russian roulette with one of the two guns.

Would it be presumptuous of me to characterize any person who voluntarily chooses to play with gun #2 as an idiot?

Schneibster
3rd February 2007, 05:47 PM
I'd take the \$85K, unless I had an expectation of repeatedly getting the chance to make that choice, and so would any rational individual. I wouldn't expect the break-even point for people choosing the chance to be much below an additional \$40K or so; and that's just a very nebulous opinion, don't ask me to substantiate it.

Mangafranga
3rd February 2007, 10:51 PM
A real life example from Deal_or_No_Deal_(Australian_game_show)

"So far, the top prize of \$200,000 has only been given away once to Dean Cartecchini on June 17, 2004 after saying "No Deal" to \$102,500 with the chance of walking away with only \$5 or the \$200,000. Luckily, Dean had the big one in his briefcase (#12)."

I don't see why introduced assumptions that mean the utility of 85k is greater than an 85% chance of 100k are justified. If one needed 100k for a buy in to a poker tournament, or to pay off a knee-cap debt, then one might go for the 100k. If one doesn't care either way, then taking the 100k bet could be more interesting. One may wish for the chance of the fame of being so ballsy as to take the risky choice. One might be an investor that makes many of these choices, and thus considers them equal.

There are counterexamples, but for most people 85k is going to have greater utility than an 85% chance of 100k. However, in this case I think one could say: OK, let us be realistic about this. Realistically, how many times is this choice going to be made by really rich people, gamblers, and investors, and how many times is it going to be made by the average Joe?

.13.
4th February 2007, 02:58 AM
OK, let's change the payout so that it genuinely reflects the fact that there is only one opportunity to play.

There are two identical guns on a table. Gun #1 has 85 chambers, all of them empty. Gun #2 has 100 chambers, 85 of them empty.

You are being forced, on threat of immediate death should you refuse, to play one round of Russian roulette with one of the two guns.

Would it be presumptuous of me to characterize any person who voluntarily chooses to play with gun #2 as an idiot?

You have completely changed the scenario here. What's the point? This is in no way comparable to choosing between \$85k/\$100k.

.13.
4th February 2007, 03:07 AM
I'd take the \$85K, unless I had an expectation of repeatedly getting the chance to make that choice, and so would any rational individual.

I don't understand why you would say that any rational individual would make the same choice as you. What makes choosing the gamble irrational?

.13.
4th February 2007, 03:13 AM
A real life example from Deal_or_No_Deal_(Australian_game_show)
"So far, the top prize of \$200,000 has only been given away once to Dean Cartecchini on June 17, 2004 after saying "No Deal" to \$102,500 with the chance of walking away with only \$5 or the \$200,000. Luckily, Dean had the big one in his briefcase (#12)."

Dean made the wrong decision there. The expected value of risking 102 500 for a gain of 97 500 at 50/50 odds is -2 500. In this case the rational decision would have been to take the 102 500.
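Including the \$5 case explicitly (a sketch; the exact edge is -\$2,497.50, which rounds to the -\$2,500 quoted):

```python
offer = 102_500
cases = [5, 200_000]                  # the two briefcases left, equally likely

ev_no_deal = sum(cases) / len(cases)  # EV of saying "No Deal": $100,002.50
edge = ev_no_deal - offer             # -$2,497.50: refusing the offer is EV-negative
print(f"EV of refusing: ${ev_no_deal:,.2f} (edge: ${edge:,.2f})")
```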

baron
4th February 2007, 03:41 AM
What makes choosing the gamble irrational?

Hasn't this been explained on several occasions? You keep asking the same question.

For the last time, what you and (apparently) the professor fail to realise is that in a one-off situation, having a chance of obtaining something is not the same as having something, even if an ad-infinitum application of the same odds would return the same result. Therefore, it makes no sense to equate the two.

You also do not grasp that the sizes of the sums of money are significant. In this particular instance, only a fool would choose option 2. However, if the options were

1) Guaranteed \$1
2) 1:1m chance of \$1m

Then I would expect the vast majority of people would go for option 2, as the "loss" of \$1 is irrelevant and the odds of 1 in 1m of winning \$1m are 10x better than any lottery odds.

If the options were

1) Guaranteed \$850m
2) 85% chance of winning \$1bn

The person who went for option 2 would be an idiot of the worst order, as there is no practical difference between the two prizes and the only difference between the options is that choosing option 2 gives an almost 1 in 6 chance of receiving nothing at all.

Surely now you see that one-off real-world decisions cannot be taken solely on the basis of mathematical odds.

.13.
4th February 2007, 04:13 AM
Hasn't this been explained on several occasions?

Actually you haven't explained anything.

For the last time, what you and (apparently) the professor fail to realise is that in a one-off situation, having a chance of obtaining something is not the same as having something, even if an ad-infinitum application of the same odds would return the same result. Therefore, it makes no sense to equate the two.

A simple question: if you have only one chance to make the choice, what are the expected values of the two choices in this instance?

You also do not grasp that the sizes of the sums of money are significant. In this particular instance, only a fool would choose option 2. However, if the options were

1) Guaranteed \$1
2) 1:1m chance of \$1m

Then I would expect the vast majority of people would go for option 2, as the "loss" of \$1 is irrelevant and the odds of 1 in 1m of winning \$1m are 10x better than any lottery odds.

If the options were

1) Guaranteed \$850m
2) 85% chance of winning \$1bn

The person who went for option 2 would be an idiot of the worst order, as there is no practical difference between the two prizes and the only difference between the options is that choosing option 2 gives an almost 1 in 6 chance of receiving nothing at all.

Surely now you now see that one-off real-world decisions cannot be taken solely on the basis of mathematical odds.

But now you are introducing more complexity to the simple question: Why is gambling irrational in this instance? Just examine the options given and answer that question.

There is no point in introducing this complexity. What if Bill Gates were to take the gamble for 1 billion? There is no point in making up more and more scenarios.

baron
4th February 2007, 04:35 AM
But now you are introducing more complexity to the simple question: Why is gambling irrational in this instance? Just examine the options given and answer that question.

I have, twice now. Here it is for the third time ~

...in a one-off situation, having a chance of obtaining something is not the same as having something, even if an ad-infinitum application of the same odds would return the same result. Therefore, it makes no sense to equate the two.

There is no point in introducing this complexity. What if Bill Gates were to make the gamble for 1 billion? There is no point in making up more and more scenarios.

Complexity? How is this scenario more complex?

And I really can't fathom why you're talking about Bill Gates (who, like any right-minded person, would never in a million years gamble for the \$1bn).

Schneibster
4th February 2007, 04:58 AM
What makes choosing the gamble irrational?

The existence of a nearly equal choice that involves zero risk.

ETA: baron is saying just the same thing, with more verbiage.

Why gamble? Because the reward is worth the risk. In the case where the risk is the loss of a guaranteed \$85K, and the reward is only \$100K, the reward is not worth the risk. You'd have to up the ante pretty substantially to make the risk worthwhile.

Mangafranga
4th February 2007, 05:16 AM
Dean made the wrong decision there. The expected value of betting 102 500 for the gain of 97 500 with 50/50 odds is -2500. In this case the rational decision would have been to take the 102 500.

No no no, the gain is 97 500 + bloody legend status. It is even quite possible that this could qualify for absolute bloody legend status.

4th February 2007, 05:28 AM
Or, to create a tighter but purely hypothetical case, imagine a lottery where tickets cost \$1, a winning ticket pays \$100M (no splitting the pot; all winning tickets win the full amount), and the odds of winning per ticket are 1 in 85M. The lottery will only happen once, tickets are on sale for a week and it takes about 10 seconds to buy one ticket, so the most tickets one could possibly buy are about 60,000. How many tickets should one buy?

As many as one can afford.

Exactly! But let's consider what is usually meant by "afford." If I have \$200 in my pocket, but I need that \$200 to buy food for next week and pay the phone bill, should I buy 200 lottery tickets? How about taking out a \$60,000 loan to buy 60,000 tickets? Clearly, those would be foolish things to do, because no matter how many tickets I buy, it's almost certain that I'll end up with nothing. If I don't have the money, or I need the money I have for other things, then I cannot afford the lottery tickets despite their positive expected value.

Compare that with a case where the items I'm buying have a net positive value rather than just a net positive expected value. If I were being offered \$1 bills for \$.85 each, then it would become rational to buy as many as I can pay for, whether I can "afford" them or not. The \$200 in my pocket, next month's rent money, my retirement fund, as much money as I can borrow (as long as loan expenses won't exceed about 17 cents on the dollar), whatever. Why not?

So I must clarify what I meant when I said I cannot afford the lottery tickets. If I could afford to pay the \$200 in my pocket for 235 dollar bills (and a quarter in change), why can I not afford 200 lottery tickets with an equivalent expected value? What I actually cannot afford is the risk incurred by buying the lottery tickets.
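The dollar-bill arithmetic, done in cents to avoid float rounding (a sketch of the \$200 example above):

```python
cash_cents = 200 * 100    # $200 in my pocket
price_cents = 85          # each $1 bill sells for $0.85

bills = cash_cents // price_cents           # 235 one-dollar bills
change = cash_cents - bills * price_cents   # 25 cents left over
print(f"{bills} bills, {change} cents change")
```

The purchase is riskless, so "afford" really does just mean "have the cash"; the lottery tickets with the same expected value carry the risk that makes them unaffordable.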

Similarly, some people in some situations can afford a 15% risk of losing an otherwise guaranteed \$85,000 for the chance to win \$100,000, and some (I'd say, most) cannot.

Not only that, those who have the money to spare, and so can afford the risk, also expect to be compensated for that risk. An expected value payoff equal to the amount risked does not do that so it is a bad investment. That's one difference between investors and gamblers.

All the various "complications" to the scenario that people have described are situations that help to define or understand why some can afford a risk and some cannot, and why even some who can afford it would be unwilling to take it without a risk premium in the offer.

The OP's professor didn't claim that it's never rational to go for the \$100,000, so showing cases where some rational person somewhere would be willing to do it doesn't prove anything. The claim is stronger than that: that there's no rational basis for ever choosing either one or the other, meaning that there's never a rational reason not to go for the \$100,000. That claim has now been refuted based on at least two different rational reasons: being unable to afford the risk, and being unwilling to incur the risk without a risk premium.

Respectfully,

Schneibster
4th February 2007, 05:58 AM
The OP's professor didn't claim that it's never rational to go for the \$100,000, so showing cases where some rational person somewhere would be willing to do it doesn't prove anything. The claim is stronger than that: that there's no rational basis for ever choosing either one or the other, meaning that there's never a rational reason not to go for the \$100,000. That claim has now been refuted based on at least two different rational reasons: being unable to afford the risk, and being unwilling to incur the risk without a risk premium.

Precisely. If one expects to make this wager many times, there is little reason to differentiate between an 85% chance of \$100K and a 100% chance of \$85K. Over a large number of such wagers, the results of the risk every time would approach the results of the sure thing every time. Taken once, however, the 15% risk of getting nothing, without a premium to make up for that risk, is unacceptable.

.13.
4th February 2007, 07:16 AM
Exactly! But let's consider what is usually meant by "afford."

I used the common definition:
1. to be able to do, manage, or bear without serious consequence or adverse effect

http://dictionary.reference.com/browse/afford

.13.
4th February 2007, 07:19 AM
No no no, the gain is 97 500 + bloody legend status. It is even quite possible that this could qualify for absolute bloody legend status.

Of course, if you prefer "bloody legend status" over the mathematically correct decision, that is your choice (or in this case Dean's). I'd make my decision based on expected value.

.13.
4th February 2007, 07:33 AM
I have, twice now. Here it is for the third time ~

In a one-off situation, having a chance of obtaining something is not the same as having something, even if an ad-infinitum application of the same odds would return the same result. Therefore, it makes no sense to equate the two.

And I ask again: What are the expected values in this "one-off situation"?

You originally said: "The higher the value the less likely a logical person is to take the chance. If it were \$8.5m or \$10m then only a raving lunatic would take the chance."

You have failed to argue your case. I brought up Bill Gates because accepting or rejecting an offer of \$850M would have no effect on his standard of living. But I'd rather not talk about whether some people can afford the gamble or not. Let's forget that. Just argue why one option is more rational than the other based on the expectation.

.13.
4th February 2007, 07:35 AM
The existence of a nearly equal choice that involves zero risk.

No, not nearly equal choices. They are equal choices.

.13.
4th February 2007, 07:36 AM
Precisely. If one expects to make this wager many times, there is little reason to differentiate between an 85% chance of \$100K and a 100% chance of \$85K. Over a large number of such wagers, the results of the risk every time would approach the results of the sure thing every time.

Well you tell me then what are the expected values of the choices when you choose only once.

Secondly:
Let's make the wager many times: What is the little reason to differentiate between the choices?

EDIT:

Taken once, however, the 15% risk of getting nothing, without a premium to make up for that risk, is unacceptable.

The whole point here is that there is the \$15k dollars of premium to compensate the 15% risk!

baron
4th February 2007, 09:00 AM
You have failed to argue your case. I brought up Bill Gates because accepting or rejecting an offer of \$850M would have no effect on his standards of living. But I'd rather not talk about whether some people can afford the gamble or not. Let's forget that. Just argue why one option is more rational than the other based on the expectation.

I'm sorry you think I haven't argued my case. I submit that you simply don't understand it, nor the similar cases posed by others.

Let's make the wager many times: What is the little reason to differentiate between the choices?

Because, as has been said a dozen times, if you wager on a one-off you have a 15% chance of ending up with nothing. If you perform several million identical wagers you will end up with an average of precisely \$85K per wager (taken to its logical conclusion). That's how chance works.
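The one-off risk versus long-run average distinction can be illustrated with a small Monte Carlo sketch (a hypothetical simulation of option 2, not anyone's actual method):

```python
import random

random.seed(1)  # fixed seed so the illustration is reproducible

def option_two():
    """One play of option 2: $100,000 with probability 0.85, else $0."""
    return 100_000 if random.random() < 0.85 else 0

# A one-off play leaves a 15% chance of walking away with nothing.
one_off = option_two()
print(f"single play: ${one_off:,}")

# Repeated a million times, the average converges on the $85K expectation.
n = 1_000_000
average = sum(option_two() for _ in range(n)) / n
print(f"average over {n:,} plays: ${average:,.0f}")
```

The single play is all-or-nothing; only the long run smooths out to \$85K per wager, which is the point being made above.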

The whole point here is that there is the \$15k dollars of premium to compensate the 15% risk!

It doesn't compensate! In the real world the minimisation of risk itself is worth money and that's why nobody would take this bet!

4th February 2007, 09:12 AM
I used the common definition:

http://dictionary.reference.com/browse/afford

Hey, thanks for the reply! Because, when I began my post with, "But let's consider what is usually meant by "afford," I of course meant that I had no idea what the definition of the word was and I needed you to look it up for me in an online dictionary.

Now, any chance of your having read and comprehended the rest of that post?

The whole point here is that there is the \$15k dollars of premium to compensate the 15% risk!

Apparently not. Because this is wrong. The expected value of both options is the same. But one option has risk, and therefore greater cost. Same return, higher cost -> worse option. The proof that risk has cost is that some people can afford to take risks that others cannot. If risk were without cost, then anyone would be able to afford it; there would be nothing to "manage" or "bear."

You're not seeing the cost because the example is worded to sound like there's nothing at risk, since only gain or no gain, not loss, is involved. Are you familiar with the concept of opportunity cost?

Suppose a third option were added to the original choice. Choose option 3, and you get a 99.9% chance of getting nothing, a 0.1% chance of getting 85 million dollars. Same expected value, much greater risk. If there are some people who would have rational reasons not to choose to spend their guaranteed 85 grand on a one in a thousand gamble to make millions (including, if they didn't have any immediate need for the money, believing they would have a better than one in 1000 chance to make 85 million by using the 85,000 to start a business), then the claim that there's no rational reason to choose preferentially between options of equal expected value is refuted.
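The difference among these equal-expectation options shows up in their dispersion. A sketch comparing the mean and standard deviation of the three options described above:

```python
from math import sqrt

def mean_sd(payoff, p):
    """Mean and standard deviation of winning `payoff` with
    probability p (and nothing otherwise)."""
    return payoff * p, payoff * sqrt(p * (1 - p))

options = [
    ("option 1: $85,000 guaranteed",  85_000,     1.0),
    ("option 2: $100,000 at 85%",     100_000,    0.85),
    ("option 3: $85,000,000 at 0.1%", 85_000_000, 0.001),
]

for label, payoff, p in options:
    mean, sd = mean_sd(payoff, p)
    print(f"{label}: mean ${mean:,.0f}, standard deviation ${sd:,.0f}")
```

All three means are \$85,000, but the standard deviations are \$0, about \$35,707, and about \$2.69M respectively: same expected value, very different risk.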

Respectfully,

kjkent1
4th February 2007, 09:47 AM
You have completely changed the scenario here. What's the point? This is in no way comparable to choosing between \$85k/\$100k.

You mean that now you would take the \$85K? What does that tell you about the original scenario? Evidently, expected value does not accurately estimate the real risk of the gamble. Apparently what really matters is what happens if you lose the bet, not what happens if you win.

So, change the risk from losing your life to losing \$100K if you take bet number two. Suddenly, the bet becomes a function of how much you value money. This is really the case in every scenario. If you have more money than you know what to do with, then you don't care which bet you take. If you're poor as a church mouse and you need money, then you would be a fool to take the 85% chance of \$100K bet.

Bottom line is that expected value is meaningless until you get into the law of large numbers and a normalized distribution centered on expected value is produced. With only one possible trial, there really is no expected value.

Ben Tilly
4th February 2007, 12:20 PM
Gambling in this instance is irrational. I doubt that any professional gambler on the planet would go for option 2 over option 1.
Why is it irrational? Both choices have the same expectation.
It is irrational because you're taking a risk for no reward.

To make it quantitative, any professional gambler will know the Kelly Criterion. Which says that this bet is not worth undertaking no matter how big your bankroll.

Cheers,
Ben

Ben Tilly
4th February 2007, 12:45 PM
A finance professor offered this scenario in an explanation about rational market behavior and expected value calculations. I disagree with his analysis. First, the scenario:

Choose between 1) getting \$85,000, or 2) an 85 percent chance of getting \$100,000.

His argument was there is no rational reason to choose between the two, since they both have the same expected value.

Your finance professor is an idiot. But his is a common idiocy. Unless your net worth is infinite, the first is clearly a better investment opportunity than the second. And for most of us, it is a far better opportunity. Even though the expected returns are equal.

Rather than tell him that so bluntly, however, present him with the following hypothetical. Suppose that you're offered a repeated bet. Each time you take the bet, you have 50% odds of tripling your money, and 50% odds of losing all of the money you put in the bet. This is obviously a worthwhile bet. So you're going to follow an investment strategy which is to invest a fixed fraction of your money each time in the bet.

What strategy will, with 100% odds, eventually beat any other?

Well based on a naive examination of expected value, you'd choose to put all of your money in the bet. But in time, with 100% odds, you're going to go broke. Obviously maximizing the expected value is not the wisest strategy. What is?

When your professor gives up on trying it, point out to him that the way this is set up you are multiplying your net worth by a random variable each time. Take logarithms of everything and this turns into repeated addition. And, of course, all of the nice probability theorems (the strong law of large numbers is the one you want here) apply to repeated addition. So take logs.

When you take logs your goal becomes to maximize the expected value of the logarithm of your net worth after one bet. Which is a straightforward max/min problem that you can solve with first-year calculus to discover that you want to invest 1/3 of your net worth in this bet.

Now let's analyze this bet with the insight that the logical way to approach investment opportunities is to try to maximize the expected value of the logarithm of your net worth. It turns out that, no matter what your current net worth is, option 1 is better than option 2. But the richer you are, the closer they are to equal, and in the limit, an infinitely wealthy person would find them indistinguishable.

And that is a quantitative reason why your professor was wrong.

This idea was first published in http://www.racing.saratoga.ny.us/kelly.pdf. It became popularized after Edward O. Thorp used it to create the first blackjack strategy that could regularly beat the casinos. Thorp also later used the idea to achieve substantial success in the stock market.

Cheers,
Ben

jsfisher
4th February 2007, 01:21 PM
Rather than tell him that so bluntly, however, present him with the following hypothetical. Suppose that you're offered a repeated bet.
I do not see the relevance of your analogy. There is no bet under consideration. In fact, were the proposition offered repeatedly, the net gain per proposition would be \$85,000 (in the limit) with no chance of "going bust" and without regard to which option is chosen at each step.

The person electing option 1 or option 2 stands to lose nothing from choosing "poorly."

.13.
4th February 2007, 01:52 PM
It doesn't compensate! In the real world the minimisation of risk itself is worth money and that's why nobody would take this bet!

Apparently not. Because this is wrong.

.13.
4th February 2007, 01:54 PM
It is irrational because you're taking a risk for no reward.

I've said it earlier. There is the reward of an extra \$15k.

.13.
4th February 2007, 01:57 PM
You mean that now you would take the \$85K?

No. I mean your example is completely different from the original \$85k/\$100k example.

.13.
4th February 2007, 02:01 PM
Hey, thanks for the reply! Because, when I began my post with, "But let's consider what is usually meant by "afford,"

And I showed the common definition of "afford"...

Now, any chance of your having read and comprehended the rest of that post?

...which makes the rest of your post pretty much irrelevant.

Ben Tilly
4th February 2007, 02:01 PM
I do not see the relevance of your analogy. There is no bet under consideration. In fact, were the proposition offered repeatedly, the net gain per proposition would be \$85,000 (in the limit) with no chance of "going bust" and without regard to which option is chosen at each step.

The person electing option 1 or option 2 stands to lose nothing from choosing "poorly."

The relevance is that the example bet illustrates the correct way to think about investment opportunities. Which is to maximize the expected value of the logarithm of my net worth. (And the reason why it is correct is that in finance we find a series of worthwhile opportunities to make money, and each time the money to be made tends to be proportional to what we put in. Just like in the simplified bet.) And that in turn demonstrates why making \$85,000 guaranteed is worth more than making \$100,000 only 85% of the time.

For instance if my net worth is \$100,000, then an opportunity to spend some time (and no money) to make \$85,000 guaranteed should be worth the same to me as an investment opportunity to make \$106,215 with 85% odds. (I calculated that by calculating what number made the expected value of the logarithm of my net value the same.) If my net worth was \$10,000 then that goes up to \$131,339. And if my net worth was \$1,000,000 then that goes down to \$100,733. So you see that as your net worth increases, the "risk premium" that you need to ask for goes down. But it never quite goes to zero. No matter how rich you are, the first opportunity is a better investment opportunity than the second.

Cheers,
Ben

Ben Tilly
4th February 2007, 02:17 PM
It is irrational because you're taking a risk for no reward.

I've said it earlier. There is the reward of an extra \$15k.

Any successful professional gambler will tell you that you're wrong. And the reason is that the additional money just matches the additional risk. That's called breaking even. You need some sort of premium above and beyond breaking even to compensate you for the risk, and without that the bet can't be worthwhile.

Even when you do get a premium, you have to ask whether you're getting enough of a premium to compensate you for the risk. And the answer to that depends on a number of factors, including how big a bankroll you have to play with. The standard way to calculate that is to use the Kelly formula (which comes from the Kelly criterion). Any professional gambler will know and use this. (Though it is common to only risk half of the value that Kelly gives. This results in a slight reduction of returns (about 15%) for a very large reduction in bankroll volatility and a built-in cushion in case you are over-estimating how well you'll do.)

Cheers,
Ben

PS For the record I do know some professional gamblers, and have discussed this exact topic with them.

Art Vandelay
4th February 2007, 03:45 PM
Gambling in this instance is irrational. I doubt that any professional gambler on the planet would go for option 2 over option 1.

:confused:
What do professional gamblers have to do with it? What about amateur gamblers?

Can anyone really believe that if a person in the real world does not make decisions based on mathematical probability they are being unreasonable?

It's worse than that. Expected value is just one parameter. The professor is expecting us to believe that we should ignore every other parameter.

I'd take the \$85K, unless I had an expectation of repeatedly getting the chance to make that choice, and so would any rational individual.

How is it irrational not to?

"So far, the top prize of \$200,000 has only been given away once"

Wow. The top prize is only \$200k? And I guess that's Aussie dollars, to boot.

You just moved from "raving lunatic" to nonsensical behaviour. And described what the options were. Please explain why opting for the gamble is nonsensical behaviour. Once again: Both options have the same expectation.

Once again: so what?

Of course if you prefer the "bloody legend status" over the mathematically correct decision, that is your (or in this case Dean's) choice. I'd make my decision based on expected value.

But "legend status" is part of the expected value.

No, not nearly equal choices. They are equal choices.

You believe that they are equivalent. But they are most definitely not equal. If they were equal, it wouldn't be possible to choose between them. The very fact that we can distinguish between them shows that they are not equal.

The whole point here is that there is the \$15k dollars of premium to compensate the 15% risk!

You clearly don't understand what "risk" means. The expected value for both is \$85k. It's the expected value that is cancelled out. Risk is something additional that is not compensated for. What is this \$15k that you speak of? You can't count it twice! Once you've included it in the expected value calculation, you can't then turn around and include it in the risk evaluation as well. You seem to think that "risk" refers to the mean. It doesn't. It refers to the standard deviation.

Because, as has been said a dozen times, if you wager on a one-off you have a 15% chance of ending up with nothing. If you perform several million identical wagers you will end up with an average of precisely \$85K per wager (taken to its logical conclusion). That's how chance works.

You have an expected value of \$85k. You will have a standard deviation of several thousand.

You mean that now you would take the \$85K? What does that tell you about the original scenario? Evidently, expected value does not accurately estimate the real risk of the gamble. Apparently what really matters is what happens if you lose the bet, not what happens if you win.

So, change the risk from losing your life to losing \$100K if you take bet number two. Suddenly, the bet becomes a function of how much you value money. This is really the case in every scenario. If you have more money than you know what to do with, then you don't care which bet you take. If you're poor as a church mouse and you need money, then you would be a fool to take the 85% chance of \$100K bet.

You're speaking gibberish.

What strategy will, with 100% odds, eventually beat any other?

None.

It became popularized after Edward O. Thorp used it to create the first blackjack strategy that could regularly beat the casinos. Thorp also later used the idea to achieve substantial success in the stock market.

I don't see how that's really a relevant factor, especially for blackjack. Either you have a positive expected value or you don't.

I do not see the relevance of your analogy. There is no bet under consideration.

But it's effectively the same as if there were a bet.

In fact, were the proposition offered repeatedly, the net gain per proposition would be \$85,000 (in the limit) with no chance of "going bust" and without regard to which option is chosen at each step.

That just means that the range of possible bets is restricted.

The person electing option 1 or option 2 stands to lose nothing from choosing "poorly."

Sure they do. They can lose \$85k.

And I showed the common definition of "afford"...

...which makes the rest of your post pretty much irrelevant.

So you're just going to ignore the post because you don't like the definition of "afford"?

jsfisher
4th February 2007, 03:51 PM
The relevance is that the example bet illustrates the correct way to think about investment opportunities. Which is to maximize the expected value of the logarithm of my net worth. (And the reason why it is correct is that in finance we find a series of worthwhile opportunities to make money, and each time the money to be made tends to be proportional to what we put in. Just like in the simplified bet.) And that in turns demonstrates why making \$85,000 guaranteed is worth more than making \$100,000 only 85% of the time.

For instance if my net worth is \$100,000, then an opportunity to spend some time (and no money) to make \$85,000 guaranteed should be worth the same to me as an investment opportunity to make \$106,215 with 85% odds. (I calculated that by calculating what number made the expected value of the logarithm of my net value the same.) If my net worth was \$10,000 then that goes up to \$131,339. And if my net worth was \$1,000,000 then that goes down to \$100,733. So you see that as your net worth increases, the "risk premium" that you need to ask for goes down. But it never quite goes to zero. No matter how rich you are, the first opportunity is a better investment opportunity than the second.

Cheers,
Ben
Seeing your math might help. The expected value of your net worth would be the same for either option.

jsfisher
4th February 2007, 04:05 PM
I do not see the relevance of your analogy. There is no bet under consideration.
But it's effectively the same as if there were a bet.
I think you have taken my post out of context. Ben Tilly seemed to be applying the so-called Kelly Criterion to justify one option over the other. The Kelly Criterion defines the portion of a bankroll to invest for maximal return in a favorable gambling proposition.

Since there is no bankroll at stake--no wagered amount at risk--I didn't see why Kelly was being invoked.

Art Vandelay
4th February 2007, 05:15 PM
But there is a bankroll at stake. \$85k is being wagered.

jsfisher
4th February 2007, 05:20 PM
But there is a bankroll at stake. \$85k is being wagered.
Sure, you can view option 2 as such, but that still doesn't bring Kelly into play.

Jekyll
4th February 2007, 05:28 PM
But there is a bankroll at stake. \$85k is being wagered.
Art's right. As the \$85k is guaranteed in one case, it makes sense to subtract that from all of the outcomes and move it across to the bankroll. In this case option 1 is equivalent to not betting.

Ben Tilly
4th February 2007, 09:25 PM
Gambling in this instance is irrational. I doubt that any professional gambler on the planet would go for option 2 over option 1.
:confused:
What do professional gamblers have to do with it? What about amateur gamblers?

The assumption is that professional gamblers are both more proficient than amateurs and are in it to make money. Therefore their opinions mean more than the opinions of amateurs.

What strategy will, with 100% odds, eventually beat any other?
None.
That's a strong assertion to make. Particularly when in that post I gave the optimal answer, named the common name of the result that gives this strategy, and linked to the original 1956 math paper by Kelly where it first came up. But since you don't seem to believe anything that I have to say, I'll show my work.

First to clarify this for bystanders, the strategy that I am talking about is what proportion of your net worth you should invest in a repeated worthwhile bet. (The unstated assumption is that each trial is independent. That will matter in a second.) In this specific case, the bet has even odds of tripling the money in the bet.

Now suppose that your strategy is to put a proportion p of your net worth into the bet. Then with probability 1/2 your net worth after one more bet is x(1+2p), and otherwise it is x(1-p), where x was your worth before the bet. We are trying to find the best value of p to use.

The problem, of course, is that our net worth is being multiplied by many independent random factors. We don't have a lot of tools to handle multiplication in probability theory. But we do for addition. So let's take logarithms of everything in sight.

Now after one bet the logarithm of your net worth is, with probability 1/2, log(x)+log(1+2p) and with probability 1/2 it is log(x)+log(1-p). And after many bets we are adding together these independent random factors. By the strong law of large numbers, with 100% probability the log of your net worth after n trials divided by n will tend to log(1+2p)/2+log(1-p)/2. Therefore for any pair of strategies, p and q, p will eventually beat q with 100% probability if log(1+2p)/2+log(1-p)/2 is greater than log(1+2q)/2+log(1-q)/2.

So the optimal strategy, by the definition that I gave, is the one that maximizes log(1+2p)/2+log(1-p)/2. From first year calculus we can take the derivative to get 0.5(2/(1+2p)-1/(1-p)). Set that equal to zero, multiply by 2(1+2p)(1-p) and we find that we want 2(1-p)-(1+2p)=0, which means that p=1/4. Either testing 3 values or checking that the second derivative is negative will tell you that this is indeed a maximum.
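The calculus above can be double-checked numerically; a quick grid search over stake fractions (a sketch of the same 50/50 triple-or-lose bet) lands on the same maximum at p = 1/4:

```python
from math import log

def growth(p):
    """Expected log-growth per bet when fraction p of net worth is staked
    on a 50/50 bet that triples the stake or loses it:
    0.5*log(1+2p) + 0.5*log(1-p)."""
    return 0.5 * log(1 + 2 * p) + 0.5 * log(1 - p)

# Search stake fractions 0.000 .. 0.999 in steps of 0.001.
best = max((i / 1000 for i in range(1000)), key=growth)
print(f"best fraction: {best:.3f}")              # maximum at p = 0.25
print(f"growth at p=0.25: {growth(0.25):+.4f}")  # positive
print(f"growth at p=0.95: {growth(0.95):+.4f}")  # overbetting is ruinous
```

The growth rate is positive at the optimum and sharply negative for large stakes, which is the "bet everything and go broke" point made earlier in the thread.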

Do you still believe that there is no answer to the problem that I posed?

(Oops, this isn't the number I gave before. Rather than work it out from scratch as I usually do, I tried to use the formula in wikipedia and got the wrong answer. Grrrr...)

It became popularized after Edward O. Thorp used it to create the first blackjack strategy that could regularly beat the casinos. Thorp also later used the idea to achieve substantial success in the stock market.

I don't see how that's really a relevant factor, especially for blackjack. Either you have a positive expected value or you don't.

Suppose that I have a 51% chance of doubling my money in the bet, and a 49% chance of losing it. There is a positive expected value. But if I put half my bankroll into the bet every time then the logarithm of my net value will, on average, drop by 0.132854913339209 each time. Which means that in the long term, on average, each bet will multiply my net worth by 0.87 and I'll be steadily losing money despite the fact that the bet is in my favour.

A large part of the ideal blackjack strategy involves adjusting the size of your bets up and down as your expected payout varies. (Which you keep track of through card counting.) But the edges involved are very small and as I just showed if you consistently bet too much of your bankroll when you have a small expected value, you will lose money. If you bet too little, then you won't make money that you could have. To figure out the ideal amount to bet you need the Kelly criterion.
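Both numbers in this post check out. A sketch of the 51/49 even-money example, including the Kelly stake (for even-money odds the Kelly formula reduces to f* = p − q):

```python
from math import exp, log

p, q = 0.51, 0.49  # win/lose probabilities on an even-money bet

def log_growth(f):
    """Expected log-growth per bet when staking fraction f of the bankroll."""
    return p * log(1 + f) + q * log(1 - f)

# Overbetting: staking half the bankroll loses money despite the edge.
g_half = log_growth(0.5)
print(f"f = 0.50: log-growth {g_half:+.6f}, factor {exp(g_half):.3f} per bet")

# The Kelly stake for even-money odds, f* = p - q = 0.02, grows (slowly).
f_star = p - q
print(f"f = {f_star:.2f}: log-growth {log_growth(f_star):+.6f}")
```

The first line reproduces the −0.132854913... figure quoted above; the Kelly-sized stake turns the same favourable bet into a small but positive growth rate.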

Cheers,
Ben

Ben Tilly
4th February 2007, 10:35 PM
Seeing your math might help. The expected value of your net worth would be the same for either option.

As I have been saying, the expected value of your net worth is not what is important. Instead it is the expected value of the logarithm of your net worth. And the result of that is that a logical person needs to receive a premium to be willing to take a risk.

How much more? Well in this case we are considering a person with x dollars comparing an offer to receive y dollars versus an offer to receive z dollars with probability p. How big does z have to be to make this person indifferent as to which offer they get? Well, we want the expected value of the logarithm of their net worth to be the same. So log(x+y) should be the same as p*log(x+z)+(1-p)*log(x). In this case I had p (0.85), y (\$85,000), and I invented several values for x (\$10,000, \$100,000, and \$1,000,000). For each of them equality means that log(x+z)=(log(x+y)-(1-p)log(x))/p. Take exponents and subtract x to get z=exp((log(x+y)-(1-p)log(x))/p)-x. And, shoot, I miscalculated that before. (Then repeated my bad calculation with 3 sets of values. Not my night...) But anyways you're free to plug in the numbers that you want.
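Plugging in the numbers, as suggested, reproduces the figures given earlier in the thread (about 131,339 / 106,215 / 100,733, up to rounding). A sketch:

```python
from math import exp, log

def indifference_amount(x, y=85_000, p=0.85):
    """Risky payout z that makes E[log net worth] equal for both offers:
    solve log(x+y) = p*log(x+z) + (1-p)*log(x) for z."""
    return exp((log(x + y) - (1 - p) * log(x)) / p) - x

for x in (10_000, 100_000, 1_000_000):
    z = indifference_amount(x)
    print(f"net worth ${x:>9,}: indifferent at about ${z:,.0f}")
```

The richer the chooser, the smaller the required premium over \$100,000, but it never reaches zero.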

Incidentally this fact about expected values of logarithms mattering shows up in the real world in lots of ways. One example is that in different categories of investing, there is a positive correlation between returns and risk. Namely the higher the average return, the higher the risk and vice versa. Another example is that it is what drives the insurance industry. Since an insurance company has a lot more money than I do, they need a smaller premium than I do to accept a risk that is significant to me. Because we value that risk differently (I am more risk averse than they are), there is room to negotiate a deal that is worthwhile to both parties. (Large businesses do the same thing, but they cannot find any insurance company big enough to take on their risk. Instead they sell pieces of their risk to lots of re-insurers. The mechanics are different, but the math leading to a worthwhile bargain being able to be struck is the same.)

Cheers,
Ben

Schneibster
4th February 2007, 10:56 PM
No, not nearly equal choices. They are equal choices.

No, one involves the risk of receiving nothing and one does not.

That would be, what, seven or eight times now this has been pointed out to you?

.13.
5th February 2007, 04:22 AM
But "legend status" is part of the expected value.

It is not if my goal is to make money.

You clearly don't understand what "risk" means. The expected value for both is \$85k. It's the expected value that is cancelled out. Risk is something additional that is not compensated for.

If the risk weren't compensated for, the expected values wouldn't be the same.

So you're just going to ignore the post because you don't like the definition of "afford"?

On the contrary. I'm using the common definition of "afford".

.13.
5th February 2007, 04:24 AM
No, one involves the risk of receiving nothing and one does not.

But there is also the additional \$15k.

baron
5th February 2007, 05:50 AM
But there is also the additional \$15k.

You simply don't understand this, do you? Why not say so instead of posting the exact same thing every single time?

jsfisher
5th February 2007, 05:56 AM
As I have been saying, the expected value of your net worth is not what is important. Instead it is the expected value of the logarithm of your net worth. And the result of that is that a logical person needs to receive a premium to be willing to take a risk.
Nope, I'm not buying it. Kelly is flatly not applicable here for two reasons. (1) You do not control the size of the wager, and so there is no exponential growth to be optimized, and (2) the wager never diminishes your bankroll.

Now, I would agree that if you were able to wager on a proposition with odds of 15:85 with probability of success being 0.85, then the Kelly formula would yield 0% of your bankroll as the optimal wager. No edge, so no wager.
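The 15:85 case mentioned here is easy to verify with the standard Kelly formula f* = (b·p − q)/b (a sketch; b is the net odds received on the wager):

```python
def kelly_fraction(b, p):
    """Kelly stake f* = (b*p - q)/b, the bankroll fraction maximizing
    p*log(1+b*f) + q*log(1-f); non-positive means 'do not bet'."""
    q = 1 - p
    return (b * p - q) / b

# Risking $85k to win $15k more at 85% success: odds b = 15/85, p = 0.85.
f = kelly_fraction(15 / 85, 0.85)
print(f"Kelly fraction: {f:.4f}")  # 0.0000 -- no edge, so no wager
```

The edge b·p − q comes out exactly zero, matching the "No edge, so no wager" conclusion above.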

But that is not the case at hand, and to demonstrate this further, suppose that Option 2 was \$100,001 with probability 0.85 instead of \$100,000. By your calculations, that would still not be a suitable premium for the risk involved for most reasonable "net worths", so you would recommend against Option 2. Yet, Option 2 is clearly (and provably) superior in the long run for repeated trials.

ETA: This, of course, does not address the sensibility of choosing Option 2 in a one-time-only proposition, just whether an analysis based on Kelly's findings has any bearing.

Jekyll
5th February 2007, 06:42 AM
Nope, I'm not buying it. Kelly is flatly not applicable here for two reasons. (1) You do not control the size of the wager, and so there is no exponential growth to be optimized, and (2) the wager never diminishes your bankroll.

(1) You choose whether to take the bet or not so there is a binary choice to be optimised.

(2) Bet 2 may diminish your future bankroll. The Kelly criterion is just as applicable to future earnings.

In the case of n trials you suggest, a naive application of the Kelly criterion would lead to you taking the first option until your bankroll reaches a threshold before switching to option 2. A more sophisticated application would have you betting on the second option until you reach n-k trials and your bankroll is below a threshold.

Both of these options reduce risk at the expense of reducing the expectation.

jsfisher
5th February 2007, 07:11 AM
(1) You choose whether to take the bet or not so there is a binary choice to be optimised.
Sure, but the formulae being used were derived under a different scenario. The Kelly criterion starts from

Vn = V0 x (1 + bf)^W x (1 - f)^L

where Vn is the future bankroll after n trials, V0 is the starting bankroll, b is the odds offered, f is the fraction of bankroll to be invested, and W and L are the number of wins and losses in n trials.

If you optimize the exponential growth rate of the above, then, sure enough, you get the probability-times-logarithm formulation Ben Tilly has been using.

But we have a different net-worth formula from the above for the case at hand.

Option 1 is: Vn = V0 + \$85000 x (W+L)
Option 2 is: Vn = V0 + \$100000 x W

So, I do not see how you can apply the Kelly result to a non-Kelly situation.

jsfisher
5th February 2007, 07:17 AM
(2) Bet 2 may diminish your future bankroll. The Kelly criterion is just as applicable to future earnings.
At least with respect to the Kelly criterion, it does not matter. The size of the wager is independent of the bankroll size, a situation expressly contrary to the derivation of the Kelly formula and of its application.

Geckko
5th February 2007, 07:33 AM
A finance professor offered this scenario in an explanation about rational market behavior and expected value calculations. I disagree with his analysis. First, the scenario:

Choose between 1) getting \$85,000, or 2) an 85 percent chance of getting \$100,000.

His argument was there is no rational reason to choose between the two, since they both have the same expected value.

Naturally, I agree they both have the same expected value. Run the scenarios enough times, and either choice 1 or 2 will give an average payoff of \$85,000/episode.

However, I still would strongly prefer 1. I'll go further. Modify 2 to be an 86% chance of getting \$100,000, and I'll still choose 1.

Why? Well, expected value computations rely on the law of large numbers. Do #2 enough times, and you'll average 85K (or 86K) per episode.

However, I don't know about you, but I am not often offered chances at 85K.
Second, 85K is a big sum of money, but the extra \$15K I might get in scenario two is not much of an incentive. Yes, I'd rather have 100K than 85K, but it's not a big deal in comparison. In short, my life would be improved quite a bit by the 85K, but only marginally improved more by the extra 15K. So, I go for the sure thing, even though the expected value of 1 is lower than 2 in my revised scenario.

I'm not interested in bickering about this to death, but I am interested in what other people would choose to do in either scenario. Would you have a preference between 1 and 2 in the original scenario? What about in my revised scenario? Remember, you are being offered this once, not a million times or so.

ETA: I agreed with the bigger argument he was making about expected value calculations in markets - we often do them irrationally. If you are a regular investor you had better assume the law of large numbers is in force. I'm just picking a nit.

Hmm.

Did you misunderstand the point at issue? If not, you may need to reevaluate the worth of your course.

This is USUALLY a classic method of introducing the concept of choice under uncertainty in economics (or in such applied situations as finance). What your Professor SHOULD have said is that this example highlights the role of uncertainty in choice.

As you say, both options have equal expected value, but for any person who is not risk neutral, you would value one over the other. Specifically:

risk averse => prefer \$85k with certainty
risk loving => prefer the possibility of getting \$100k with uncertainty

The way this problem is set up (equal expected value) it has a fairly simple solution. More realistic and more complex problems can be intractable for the present set of theories that try to describe choice under uncertainty.

For example, modify this problem by making the choice \$100k with certainty or a coin toss for \$500k. Which would you choose? The current theoretical models allow us to model this by assuming people maximise expected utility (not expected monetary value!), where expected utility is a function of wealth (total, not just the bet in question).

So in the modified example the model might predict that a very poor person wouldn't gamble, but would take the \$100k because it could have a massive positive impact on total expected utility. Someone relatively wealthy might take the gamble, because a \$500k win would have a sufficiently greater impact on their utility than \$100k "cash in hand" would. The factors that would determine what any particular person might choose depend on inputs such as their level of risk aversion, their present wealth, etc. Look up "von Neumann-Morgenstern utility function" to learn about this.
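A concrete illustration of this, using a logarithmic utility of total wealth - the wealth figures are invented for the example, not taken from the thread:

```python
import math

# Log utility of total wealth: u(w) = ln(w). Compare taking $100k for sure
# against a fair coin toss for $500k, at two (made-up) wealth levels.
def expected_utility_sure(wealth):
    return math.log(wealth + 100_000)

def expected_utility_gamble(wealth):
    return 0.5 * math.log(wealth + 500_000) + 0.5 * math.log(wealth)

poor, rich = 1_000, 10_000_000
print(expected_utility_sure(poor) > expected_utility_gamble(poor))   # True: the poor person takes the sure $100k
print(expected_utility_sure(rich) > expected_utility_gamble(rich))   # False: the wealthy person prefers the gamble
```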

Having said all this the von Neumann-Morgenstern wealth approach fails for some experiments where people don't behave as this theory would predict. It is not necessarily that they are irrational, but that this field of economics (choice under uncertainty) has not yet caught up - as is the case in many more advanced fields of study such as physics.

Jekyll
5th February 2007, 08:13 AM
At least with respect to the Kelly criterion, it does not matter. The size of the wager is independent of the bankroll size, a situation expressly contrary to the derivation of the Kelly formula and of its application.
I'm unsure as to what the problem is; the Kelly formula tells us to act in order to maximise E(log X) so as to maximise our take in the long haul. That we only have two options to choose from, and cannot necessarily choose the ideal solution, should not affect this.

In the gambling scenarios described by Kelly in his paper you would also suffer similar restrictions, being limited to only being able to play with dollars and cents and not smaller fractions. They are also a discrete case, but the convex (or is it concave) -ness of the function makes it unimportant.

roger
5th February 2007, 08:16 AM
Might I suggest that most posters are taking this anecdote a bit too literally?

In context, the argument was - use expected value, and reason, to make your stock buy/sell decisions, not emotion. There were many more points built up from here, culled from the literature. For example, if you ask somebody how much they would pay to avoid a 1/1000 chance of death, the figure is low, but if you ask them how much you would have to give them to do something with a 1/1000 chance of death, the figure is much, much higher, even though the two scenarios are the same. That's irrational market behavior.

Now, as I said, if you take the initial scenario that I presented literally, I think we can quibble with it (and we sure have :)). I started this thread to test my thinking on that quibble. But the broad point - intuition misleads us, so do the math, don't make decisions from the gut - stands.

That's all.

edit: this is not a course I am taking - I was reading course notes from a professor who teaches the Graham style of market analysis.

Jekyll
5th February 2007, 08:21 AM
But we have a different net-worth formula from the above for the case at hand.

Option 1 is: Vn = V0 + \$85000 x (W+L)
Option 2 is: Vn = V0 + \$100000 x W

So, I do not see how you can apply the Kelly result to a non-Kelly situation.
I would rewrite the options as:
V0 = N * \$85000 (ignoring future discounting)

Option 1: V(k+1) := V(k)

Option 2: V(k+1) := V(k) + \$15000*(1-X) - \$85000*X

where X is the appropriate discrete random variable, with P(X=0) = .85 and P(X=1) = .15.

Then direct application of the Kelly criterion, treating V(k) as your bankroll, should give you the second application I suggested.

jsfisher
5th February 2007, 08:38 AM
I'm unsure as to what the problem is; the Kelly formula tells us to act in order to maximise E(log X) so as to maximise our take in the long haul. That we only have two options to choose from, and cannot necessarily choose the ideal solution, should not affect this.

In the gambling scenarios described by Kelly in his paper you would also suffer similar restrictions, being limited to only being able to play with dollars and cents and not smaller fractions. They are also a discrete case, but the convex (or is it concave) -ness of the function makes it unimportant.
It's "convexity."

Be that as it may, though, Kelly tells us how to act in order to maximize the exponential rate of growth of the bankroll:

G = lim (1/n) x log(Vn/V0), as n approaches infinity.

It is only after assuming a particular wagering model (one dependent on bankroll size), etc., do you arrive at the probability-times-log formulae.

The wagering model at hand is different, and you cannot arbitrarily apply Kelly-based formulae to it. (The \$100,001 variation helps demonstrate that point.) The growth rate expected with either option 1 or option 2 is linear, and, therefore, the exponential growth rate (at the heart of the Kelly criterion) would be zero.

If you really can apply the Kelly-based formulae to this situation, could someone please provide a derivation of them from the base formulae for Vn and for G?

Jaggy Bunnet
5th February 2007, 09:01 AM
Might I suggest that most posters are taking this anecdote a bit too literally?

At the risk of taking things a bit too literally, can I disagree with your next para?

For example, if you ask somebody how much they would pay to avoid a 1/1000 chance of death, the figure is low, but if you ask them how much you would have to give them to do something with a 1/1000 chance of death, the figure is much, much higher, even though the two scenarios are the same.

If they are already willing to do something that results in a 1/1000 chance of death, presumably they perceive it as having a benefit - therefore the "cost" to them of not doing it is the loss of that perceived benefit plus whatever they have to pay. Therefore the payment is only part of the cost.

Paying them to undertake an action they do not think is worthwhile without payment suggests that the risk to them is not equal to the reward absent any payment. Therefore the payment will have to make up for that perceived deficit plus sufficient reward to make it worthwhile. There is no reason to assume that should be equal to the amount they are willing to pay.

And of course it ignores utility - if I have £100 that I need to exactly pay my essential living expenses, the utility cost of paying to avoid a risk is very high (e.g. can't make the rent and end up on the street). However if all of my essential living expenses are covered, the utility of additional cash from accepting additional risk may be low (sure it would be nice to be able to order a takeaway pizza now and again, but is that adequate compensation for the extra risk?).

roger
5th February 2007, 09:19 AM
If they are already willing to do something that results in a 1/1000 chance of death, presumably they perceive it as having a benefit - therefore the "cost" to them of not doing it is the loss of that perceived benefit plus whatever they have to pay. Therefore the payment is only part of the cost.

I think the idea is more along the lines of "how much are you willing to pay for a vaccine that ensures you won't die from disease X, which has a 1/1000 chance of killing you". Something of that nature - pay to remove a risk that is already present, not a risk that you have actively taken on because of other rewards.

The idea is that we don't do risk analysis very well. We wring our hands over the possibility of getting struck by lightning in a storm, but happily get into our cars every day, despite the fact that driving is the riskier activity - and it is often undertaken for frivolous purposes (say, to go get ice cream), whereas going out into a storm may have a real point (say, to retrieve something you left outside that is being destroyed by the rain).

I'm the same way. Hiking in Grizzly country spooks me, even though I take on greater risks with no worries whatsoever. Which is why we have to analyze risks and payoffs in the market, not just go with what seems safe and comfortable.

I guess this thread is a good example of the risks of teaching through analogy. Analogies are meant to lead you to a way of thinking about something (hey, we misuse risk analysis in our own lives, so we probably misuse it in the market), not to be nitpicked for exact correspondence and truth.

Ben Tilly
5th February 2007, 12:03 PM
Nope, I'm not buying it. Kelly is flatly not applicable here for two reasons. (1) You do not control the size of the wager, and so there is no exponential growth to be optimized, and (2) the wager never diminishes your bank roll.

Whether or not Kelly applies depends, I agree, on the exact context. Now this is a problem that came up in a finance class. Typically in finance we don't see the exact same bet offered again and again with set odds. However over time we tend to get a series of somewhat similar opportunities, at a range of possible sizes, with a range of returns. How should we decide which ones to take?

So the context which I am assuming is that of a financial opportunity. For instance, suppose we have a deal that will, with 85% odds, make \$100,000 and, with 15% odds, give us nothing. Suppose further that we have a way to insure it for \$15,000 to a guaranteed return of \$85,000 no matter what. Should we buy that insurance?

And the answer is that we should. That is a wise financial decision.
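Under the logarithm-of-net-worth view discussed earlier in the thread, that insurance decision can be sketched numerically (the \$50k of outside wealth is my assumption, not a number from the thread):

```python
import math

# Log utility comparison of the insured vs. uninsured deal.
wealth = 50_000   # assumed net worth outside the deal

insured = math.log(wealth + 85_000)  # pay the $15k premium: $85k guaranteed
uninsured = 0.85 * math.log(wealth + 100_000) + 0.15 * math.log(wealth)

print(insured > uninsured)   # True: buying the insurance wins on log utility
```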

But in a different context you might be right. We should use a different tradeoff.

Now, I would agree that if you were able to wager on a proposition with odds of 15:85 with probability of success being 0.85, then the Kelly formula would yield 0% of your bankroll as the optimal wager. No edge, so no wager.

So we agree that the math works. We just don't necessarily agree on when it is applicable. :-)

If you were going to repeatedly get the exact same offer with no variations, then you're right. If you are going to get a series of different opportunities of different sizes and characteristics, then I'm right. It depends on the context you impose.

In that sense there is no "right" answer to the original problem.

ETA: This, of course, does not address the sensibility of choosing Option 2 in a one-time-only proposition, just whether an analysis based on Kelly's findings has any bearing.

Right.

Cheers,
Ben

jsfisher
5th February 2007, 12:22 PM
Uhoh, if we all start agreeing with each other, does that mean we need to stop the thread?

Art Vandelay
5th February 2007, 02:33 PM
The assumption is that professional gamblers are both more proficient than amateurs and are in it to make money. Therefore their opinions mean more than the opinions of amateurs.

But that's ridiculous! If I'm deciding whether to pay \$10 for a steak, the only thing that matters is how much I enjoy steak. I'm not in it for money, I'm in it for a pleasing culinary experience. So why should I care what someone else would do, especially if that someone else is dealing in steaks solely to make money?

That's a strong assertion to make. Particularly when in that post I gave the optimal answer, named the common name of the result that gives this strategy, and linked to the original 1956 math paper by Kelly where it first came up.

You provided the answer that maximizes the average of the log. It does not guarantee that the log will be maximized always. Sometimes, another strategy will turn out, in hindsight, to have been better. You shouldn't say that it has a 100% chance of being better.

In this specific case, the bet has even odds of tripleing the money in the bet.

Tripling.

But if I put half my bankroll into the bet every time then the logarithm of my net value will, on average, drop by 0.132854913339209 each time.

You are assuming that "bankroll" and "net value" are the same. Surely most professional gamblers do not walk into a casino with their entire net worth in their pocket, and even if they did, betting half of it on one hand wouldn't even be considered. Not because of Kelly, but because their livelihood depends on not arousing suspicion.

But the edges involved are very small and as I just showed if you consistently bet too much of your bankroll when you have a small expected value, you will lose money.

Your expected change in log net worth is negative. That is quite different from losing money.

To figure out the ideal amount to bet you need the Kelly criterion.

But that's not what you said. You said "beat the house". I don't know. Maybe the best strategy would be Kelly. But I doubt it. Even if it were, you would need to show that the strategy that maximizes the probability of making money is Kelly.

It is not if my goal is to make money.

And the fact that people often have motivations other than making money is my point; what's the point of having money if you don't spend it on anything?

If the risk weren't compensated for, the expected values wouldn't be the same.

You clearly don't understand what "risk" means.

On the contrary. I'm using the common definition of "afford".

By "the definition", I meant "his definition".

But there is also the additional \$15k.

No there isn't. There is no additional \$15k. The \$15k isn't additional, it's part of the \$85k expected value.

Nope, I'm not buying it. Kelly is flatly not applicable here for two reasons. (1) You do not control the size of the wager, and so there is no exponential growth to be optimized, and (2) the wager never diminishes your bank roll.

You either wager zero, or a nonzero amount. That's control. If you separate out the free \$85k from the wager, then the wager does diminish your bank roll.

Now, I would agree that if you were able to wager on a proposition with odds of 15:85 with probability of success being 0.85, then the Kelly formula would yield 0% of your bankroll as the optimal wager.

How is that not the case?

Yet, Option 2 is clearly (and provably) superior in the long run for repeated trials.

No, it's not. There are particular parameters that it maximizes, but there is no objective sense in which any strategy is "superior".

At least with respect to the Kelly criterion, it does not matter. The size of the wager is independent of the bankroll size, a situation expressly contrary to the derivation of the Kelly formula and of its application.

What do you mean?

For example, if you ask somebody how much they would pay to avoid a 1/1000 chance of death, the figure is low, but if you ask them how much you would have to give them to do something with a 1/1000 chance of death, the figure is much, much higher, even though the two scenarios are the same. That's irrational market behavior.

Not necessarily. You are ignoring factors such as transaction costs (for example, buyer's remorse).

baron
5th February 2007, 03:07 PM
But that's ridiculous! If I'm deciding whether to pay \$10 for a steak, the only thing that matters is how much I enjoy steak. I'm not in it for money, I'm in it for a pleasing culinary experience. So why should I care what someone else would do, especially if that someone else is dealing in steaks solely to make money?

The question is not one of steak purchase. Choosing whether to pay \$10 for a steak does not normally require an appreciation of odds and risk. Evaluating the OP's scenario, which is entirely money-centred as opposed to steak-centred, does require such an appreciation. That's why the opinion of a professional gambler, in the original instance, is likely to be more valuable than that of an amateur or non-gambler.

Ben Tilly
5th February 2007, 09:20 PM
The assumption is that professional gamblers are both more proficient than amateurs and are in it to make money. Therefore their opinions mean more than the opinions of amateurs.
But that's ridiculous! If I'm deciding whether to pay \$10 for a steak, the only thing that matters is how much I enjoy steak. I'm not in it for money, I'm in it for a pleasing culinary experience. So why should I care what someone else would do, especially if that someone else is dealing in steaks solely to make money?

The question was what the rational choice was, not the one that fit your desires. The point was that a professional gambler is expected to have an informed perspective on when it is rational to make a particular gamble.

That's a strong assertion to make. Particularly when in that post I gave the optimal answer, named the common name of the result that gives this strategy, and linked to the original 1956 math paper by Kelly where it first came up.
You provided the answer that maximizes the average of the log. It does not guarantee that the log will be maximized always. Sometimes, another strategy will turn out, in hindsight, to have been better. You shouldn't say that it has a 100% chance of being better.

If you review a good probability theory book you'll find that "100% chance" is equivalent to "probability 1" is equivalent to "the set of exceptions has measure 0".

None of those statements is equivalent to "there are no exceptions".

In the same book you should also find mention of the strong law of large numbers. Rather than wait for you to find that book, I'll just point you to http://planetmath.org/encyclopedia/StrongLawOfLargeNumbers.html. In its statement, 'a. s.' stands for "almost surely" which is a phrase meaning "with 100% probability".

So you see, I really am justified in saying "with 100% probability".

But if I put half my bankroll into the bet every time then the logarithm of my net value will, on average, drop by 0.132854913339209 each time.
You are assuming that "bankroll" and "net value" are the same. Surely most professional gamblers do not walk into a casino with their entire net worth in their pocket, and even if they did, betting half of it on one hand wouldn't even be considered. Not because of Kelly, but because their livelihood depends on not arousing suspicion.

To professional gamblers, their bankroll is, by definition, the amount of money which they have set aside and decided is potentially available for gambling. It is not exactly the same as net worth (a wise gambler leaves some money aside for necessities of various kinds), but it is often not that different from it.

A gambler almost never would walk into a casino with their entire bankroll on their person. Not because they want to avoid suspicion (frankly there is no amount of money within the means of most individuals that would cause suspicion at a typical casino), but because they do not expect to lose it all in one night. (Plus there is the possibility of an unfortunate robber.) However the size of the gambler's bankroll will dictate what risks that gambler is willing to take. And a known gambler who has a bad night frequently isn't limited to what they brought - odds are that there will be someone there who knows they are good for it and will lend them money to keep going.

However, despite the fact that the bankroll is not all on the gambler, the size of that bankroll plays a significant part in the decisions that that gambler makes.

But the edges involved are very small and as I just showed if you consistently bet too much of your bankroll when you have a small expected value, you will lose money.
Your expected change in log net worth is negative. That is quite different from losing money.

It is easy to program a Monte Carlo simulation of the gambler making that kind of bet with those odds. If you do, you'll see that the unwise gambler consistently loses money. If you wish to be theoretical, you can easily calculate the odds that the gambler is ahead after 100 rounds, or 1000 rounds. You'll find that they are vanishingly small.

If this is not the same as losing money, it is close enough for all practical intents and purposes.
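Such a simulation is only a few lines. The sketch below uses an illustrative favorable bet (even odds, triple-the-stake payout, so the Kelly fraction is f* = (0.5*2 - 0.5)/2 = 0.25) rather than the exact numbers under discussion, and works in log space to avoid numeric underflow:

```python
import math
import random

# Monte Carlo of betting a fixed fraction of bankroll each round on a
# favorable even-odds bet that triples the staked amount. The payout,
# fractions, trial counts, and seed are illustrative choices, not numbers
# from the thread.
def average_log_growth(fraction, rounds=1000, runs=200, seed=1):
    rng = random.Random(seed)
    win = math.log(1 + 2 * fraction)   # stake tripled: bankroll grows by 2*stake
    lose = math.log(1 - fraction)      # stake lost
    total = 0.0
    for _ in range(runs):
        log_bankroll = 0.0             # log of bankroll, starting bankroll = 1.0
        for _ in range(rounds):
            log_bankroll += win if rng.random() < 0.5 else lose
        total += log_bankroll
    return total / runs

kelly_growth = average_log_growth(0.25)  # positive: steady compounding
over_growth = average_log_growth(0.70)   # sharply negative: ruin in practice
print(kelly_growth > 0, over_growth < 0)
```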

To figure out the ideal amount to bet you need the Kelly criterion.
But that's not what you said. You said "beat the house". I don't know. Maybe the best strategy would be Kelly. But I doubt it. Even if it were, you would need to show that the strategy that maximizes the probability of making money is Kelly.

Are you actively trying to be stupid?

I'm serious. Are you actively trying to say things that you know are very silly to see if you can get a rise out of me?

Let's take the statements from last to first.

You say I need to show that the strategy that maximizes the probability of making money is Kelly. Well I've shown that the Kelly strategy will, with 100% odds, eventually result in the log of your net worth being more than it would be under any other strategy. Since the logarithm function is monotone increasing, that necessarily implies that someone playing the Kelly strategy will, with 100% odds, eventually beat someone using any other strategy. (Where the allowable strategies are "bet X% of my bankroll each time".)
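For a single repeated bet, the "best fixed fraction" claim can be sanity-checked numerically; the payout and probabilities below are illustrative choices, not the thread's numbers:

```python
import math

# Exponential growth rate g(f) = p*ln(1 + b*f) + q*ln(1 - f) for betting a
# fixed fraction f of bankroll on a bet paying b-to-1 with win probability p.
# The Kelly fraction f* = (p*b - q)/b should maximize g over all f.
p, q, b = 0.5, 0.5, 2.0   # even odds, triple-the-stake payout

def growth_rate(f):
    return p * math.log(1 + b * f) + q * math.log(1 - f)

grid = [i / 1000 for i in range(0, 990)]   # f in [0, 0.989]
best = max(grid, key=growth_rate)
print(best)   # 0.25, exactly the Kelly fraction (p*b - q)/b
```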

I've said it. I've presented the explanation. I've presented you with Kelly's original paper. Yet you claim to still doubt this.

And finally we have the question of beating the house. What Thorp did was calculate the ideal strategy. He found, to everyone's surprise, that this strategy beat the house by about 1%. People had thought that blackjack had about a 5% edge for the house, so this was shocking. He demonstrated his strategy one weekend by taking \$10,000 to Las Vegas and coming back with \$21,000. Shortly after he wrote a best selling book explaining his strategy, and the casinos have been fighting card counters ever since.

The margins of this strategy are very small. For instance if you play the ideal strategy except that you do not vary bets based on counting cards, you'll steadily lose to the house. If you vary your bets but don't stay fairly close to Kelly, you'll again lose to the house. If you follow it perfectly, you'll make money. (And eventually will get kicked out of the casino - they know what to look for.)

Cheers,
Ben

Art Vandelay
6th February 2007, 12:59 AM
The question is not one of steak purchase. Choosing whether to pay \$10 for a steak does not normally require an appreciation of odds and risk. Evaluating the OP's scenario, which is entirely money-centred as opposed to steak-centred, does require such an appreciation. That's why the opinion of a professional gambler, in the original instance, is likely to be more valuable than that of an amateur or non-gambler.

You just aren't getting it. If I'm deciding whether to eat a steak, the only issue is how much I would enjoy eating a steak; the general valuation of a steak is irrelevant. If I'm trying to decide whether to make a bet, the only thing that matters is how much I would enjoy the bet. The general valuation of the bet is irrelevant. A professional gambler doesn't take into account anything but the money because that's all he cares about. An amateur gambler derives pleasure from gambling beyond merely the monetary value. Do you think that should just be ignored?

The question was what the rational choice was, not the one that fit your desires.

Wow.

Just wow.

How can doing what one wants not be the rational choice? Math is meant to serve us, not us to serve math. What basis do you have for saying that this is the rational choice?

In economics, whatever best gets what you want is what's rational. The only time anything can objectively be declared irrational is when there is arbitrage. Apparently you've arrogated to yourself the right to redefine "rational".

The point was that a professional gambler is expected to have an informed perspective on when it is rational to make a particular gamble.

No, the point is that a professional gambler has different preferences. A preference is not "right" or "wrong" or "rational" or "irrational". It just is. Why is that so hard to grasp?

If you review a good probability theory book you'll find that "100% chance" is equivalent to "probability 1" is equivalent to "the set of exceptions has measure 0".

If I read a bad probability theory book, I'll see that claim. A good probability book, when it means "has measure 0", will say "measure 0".

And even if I ignore this lack of rigor, there's still the issue of "eventually". According to what you said, there is some finite point at which this strategy will beat every other, or, at the very least, for every other strategy, there is a finite point at which it is beaten. The "measure 0" issue arises only "after" an infinite number of trials, which is not physically possible. All you can say is that the limit is zero. A limit and an actual value are completely different concepts, and I object to your so casually interchanging them.

So to make it actually mathematically valid, it would have to be "What strategy is such that, for every other strategy, and for every probability greater than zero, there is a number of trials such that the probability of the second strategy beating the first is less than the previously specified probability?" Or, if you want to be concise, "What strategy beats every other with a probability that goes to 100%?"

In the same book you should also find mention of the strong law of large numbers. Rather than wait for you to find that book, I'll just point you to http://planetmath.org/encyclopedia/StrongLawOfLargeNumbers.html. In its statement, 'a. s.' stands for "almost surely" which is a phrase meaning "with 100% probability".

No, they used "almost surely". Because it's a mathematically precise term. If you click on the term, you'll see an explanation, in which the phrase "with 100% probability" is nowhere to be seen. Does the fact that "100% probability" doesn't appear not tell you anything?

So you see, I really am justified in saying "with 100% probability".

Argument by assertion.

If you're going to be condescending, you should at the very least have an argument to back it up.

If this is not the same as losing money, it is close enough for all practical intents and purposes.

No, it's not. You're saying that the median is negative. But the mean is still positive. There are practical purposes for which the latter is more important.

I'm serious. Are you actively trying to say things that you know are very silly to see if you can get a rise out of me?

Are you? Sheesh, you sure are arrogant. If you know anything about math, you know that precision, rigor, and exhaustive justification are key concepts. Yet when I ask you to fully justify your claims, and point out that you are being imprecise, you jump on me.

Well I've shown that the Kelly strategy will, with 100% odds, eventually result in the log of your net worth being more than it would be under any other strategy.

Eventually? We are discussing the result after a finite number of hands.

Since the logarithm function is monotone increasing, that necessarily implies that someone playing the Kelly strategy will, with 100% odds, eventually beat someone using any other strategy.

You still need quite a bit more for that argument to be rigorous. Besides, with your loose definition of "100%", every strategy with positive expected log will have "100%" chance of beating the house. So if that's all you care about, why go to the trouble of working out the Kelly formula?

Schneibster
6th February 2007, 01:21 AM
I'm serious. Are you actively trying to say things that you know are very silly to see if you can get a rise out of me?

I believe, in fact, that it is - and it is on my ignore list for it. This is about the third time I've watched you interact with it, and IMHO you are wasting your time. You are of course free to do as you choose.

baron
6th February 2007, 05:16 AM
If I'm trying to decide whether to make a bet, the only thing that matters is how much I would enjoy the bet.
Here's a tip for you. Never, ever gamble. Ever.

The general valuation of the bet is irrelevant. A professional gambling doesn't take into account anything but the money because that's all he cares about. An amateur gambler derives pleasure from gambling beyonmd merely the monetary value. Do you think that should just be ignored?

Yes, because it's assumptive claptrap of the worst kind. It isn't even worth the trouble of counter-argument.

RenaissanceBiker
6th February 2007, 06:23 AM
When I was a teen, I once won a \$5 bet from a guy who took 6 months to pay me. He had money, he was just a cheapskate. After 2 weeks I hounded him at every social event until he paid up. Later we were at Kings Island and I bet him \$5 he couldn't hold his hands up in the air for the whole ride on the Beast. He did it and I made a big show of paying him right away. "Do I have 6 months to pay you? Do I? I pay when I lose or I don't bet." I enjoyed that. It really wasn't about the money.

Ben Tilly
6th February 2007, 07:07 AM
I'm serious. Are you actively trying to say things that you know are very silly to see if you can get a rise out of me?
I believe, in fact, that it is- and it is on my ignore list for it. This is about the third time I've watched you interact with it, and IMHO you are wasting your time. You are of course free to do as you choose.

Obviously so. I just hate giving up on people, I want to give them the benefit of the doubt.

But I've gone beyond that with Art Vanderlay. In this thread and others he has consistently gone to lengths to avoid admitting to any error, and throws up excessive verbiage that has no purpose but to suggest to someone who isn't following the argument that he is winning it.

For instance look at the "100%" tangent. He took great delight in pointing out that the linked explanation of the strong law of large numbers linked to an explanation of "almost surely" that mentions nothing about 100%. However it links to an explanation that says that almost surely means "with probability 1". And it is ludicrous to think that he doesn't know that the way to convert from probabilities stated from 0 to 1 to probabilities stated from 0% to 100% is to multiply by 100. So probability 1 is the same as 100% probability. (Linguistic note, "percent" literally means "per 100". In other words, "How many times would I expect to see this outcome after 100 trials?") So the linked explanation exactly supports what I said. He knows that, and argues otherwise.
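For reference, the standard textbook statement both posters are pointing at is:

```latex
% Strong law of large numbers: for i.i.d. random variables
% X_1, X_2, \ldots with E|X_1| < \infty and E[X_1] = \mu,
\Pr\left( \lim_{n \to \infty} \frac{1}{n} \sum_{i=1}^{n} X_i = \mu \right) = 1 .
```

"Almost surely" means precisely that the event inside Pr has probability 1; the disagreement here is only over whether "probability 1" may be restated informally as "100%".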

Too bad it isn't possible to tag the user name with a note saying "known troll" so that newbies won't be mislead into thinking that he is trying to say something useful.

Cheers,
Ben

jsfisher
6th February 2007, 08:40 AM
If I'm trying to decide whether to make a bet, the only thing that matters is how much I would enjoy the bet.

Here's a tip for you. Never, ever gamble. Ever.

The general valuation of the bet is irrelevant. A professional gambling doesn't take into account anything but the money because that's all he cares about. An amateur gambler derives pleasure from gambling beyonmd merely the monetary value. Do you think that should just be ignored?

Yes, because it's assumptive claptrap of the worst kind. It isn't even worth the trouble of counter-argument.
I have to agree with Art Vanderlay on this. For many, many people, gambling in its various forms can be recreation and/or social activities, totally separate from any realistic expectation of financial gain. Bingo, slots, keno all have negative financial expectations no matter how skilled the player. People who "invest" a buck a week on a lottery ticket aren't stupid, perhaps overly optimistic, but not necessarily stupid.

Number Six
6th February 2007, 08:52 AM
I have to agree with Art Vanderlay on this. For many, many people, gambling in its various forms can be recreation and/or social activities, totally separate from any realistic expectation of financial gain. Bingo, slots, keno all have negative financial expectations no matter how skilled the player. People who "invest" a buck a week on a lottery ticket aren't stupid, perhaps overly optimistic, but not necessarily stupid.

It's a slippery slope though. Some people start off with a realistic assessment of their chances but play anyway because they get pleasure commensurate with how much they'll lose, but the state (or the house or whoever is running the game) has a vested interest in the player becoming more and more unrealistic in his/her assessment of his/her chances and I think that that is often exactly what happens over time.

"Overly optimistic" is the same as "deceived" in this case. It doesn't mean they're stupid but it does mean they're misinformed about their chances. And if they're misinformed about their chances then they can't truly make a decision that says "The joy I get from playing is worth the amount of money I'm likely to lose."

baron
6th February 2007, 09:51 AM
I have to agree with Art Vanderlay on this. For many, many people, gambling in its various forms can be recreation and/or social activities, totally separate from any realistic expectation of financial gain. Bingo, slots, keno all have negative financial expectations no matter how skilled the player.

That's as may be, but in terms of the ongoing discussion you need to look at the topic under consideration. The original question is whether or not a person would gamble for \$100K or accept a guaranteed \$85K. For Art Vandelay to make gross - and largely incorrect - generalisations about how amateur gamblers don't care about the money (like they'd pass up on \$85K just for a laugh) whereas professional gamblers only care about the money is just plain silly. This simply stemmed from a comment that professional gamblers can judge the odds better than amateur ones, which is clearly the case.

People who "invest" a buck a week on a lottery ticket aren't stupid, perhaps overly optimistic, but not necessarily stupid.

Some are. Some are exceedingly stupid; remember we're talking about a cross-section of a typical population. Notwithstanding that, how many people do you think would continue to "invest" in the lottery if the top prize dropped to £10? Very few, I imagine.

Your argument may possibly hold out in some forms of low-level gambling, e.g. Bingo, some slots, but overall people will gamble because of a) the potential reward - the dream, if you like, and b) it gives them a buzz.

Certainly in this instance the idea of social or recreational gambling doesn't come into it.

And neither does eating steaks.

jsfisher
6th February 2007, 11:09 AM
That's as may be, but in terms of the ongoing discussion you need to look at the topic under consideration.
I was. There has been, in this discussion, a rigid view by some as to what is a valid consideration in deciding between the options. I do not share that view that there is a single, universal reason.
The original question is whether or not a person would gamble for \$100K or accept a guaranteed \$85K. For Art Vandelay to make gross - and largely incorrect - generalisations about how amateur gamblers don't care about the money (like they'd pass up on \$85K just for a laugh) whereas professional gamblers only care about the money is just plain silly. This simply stemmed from a comment that professional gamblers can judge the odds better than amateur ones, which is clearly the case.
For the case at hand, the odds were not at issue. The odds offered for option 2 make it equivalent to option 1 (in terms of expected value).

Were we to incrementally increase the amount offered by option 2, at what point would this hypothetical professional gambler select option 2? (I don't mean to reopen the whole Kelly Criterion discussion with this. Accept the question as rhetorical.) Even at \$100,001, option 2 offers a (slight) edge over option 1.

Some are. Some are exceedingly stupid; remember we're talking about a cross-section of a typical population. Notwithstanding that, how many people do you think would continue to "invest" in the lottery if the top prize dropped to £10? Very few, I imagine.

Your argument may possibly hold out in some forms of low-level gambling, e.g. Bingo, some slots, but overall people will gamble because of a) the potential reward - the dream, if you like, and b) it gives them a buzz.

Certainly in this instance the idea of social or recreational gambling doesn't come into it.
The point was to challenge the notion that only a "professional gambler's opinion" was worth considering. The opening post posed the two options and asked whether there was a "rational reason to choose between the two". Nothing restricted rational reasoning to the perspective of a professional gambler nor was there any explicit requirement to put financial risk at the core of the reasoning.

You had tried to trivialize the lottery example by dropping the payout to £10. Ok, let's make the same sort of adjustment to the option 1 versus option 2 scenario. Option 1 gets you \$0.85 with certainty while option 2 gets you \$1.00 with probability 0.85 or zero with probability 0.15. Is the financial risk argument to choose option 1 all that convincing, now? Many rational people would decide to take option 2 with the rational reason of wanting to avoid the fiddly change.

.13.
6th February 2007, 11:41 AM
You simply don't understand this, do you? Why not say so instead of posting the exact same thing every single time?

.13.
6th February 2007, 11:47 AM
By "the definition", I meant "his definition".

You should read more carefully what he wrote and what I wrote.

No there isn't. There is no additional \$15k. The \$15k isn't additional, it's part of the \$85k expected value.

The options are 85k and 100k. 100k-85k=15k. The expected values are equal because there is the additional 15k.

baron
6th February 2007, 12:17 PM
I was. There has been, in this discussion, a rigid view by some as to what is a valid consideration in deciding between the options.

What rigid view is this, and who stated it? I said anyone who can judge risk (I mistakenly said "odds" in my last post, I concede, but the error should have been obvious after reading my previous posts) will take option 1. Do you disagree? I also said that in general, professional gamblers can judge risk better than amateur ones. Do you disagree with that?

Were we to incrementally increase the amount offered by option 2, at what point would this hypothetical professional gambler select option 2?

It would depend on the gambler, obviously. The fact that I can safely say a gambler wouldn't take a certain bet doesn't mean that a) I can predict what bets a gambler would take and b) all gamblers would have the same gambling threshold.

The point was to challenge the notion that only a "professional gambler's opinion" was worth considering.

And who made this bizarre statement (apart from you)? What I said - multiple times - is that a professional gambler's opinion is likely to be worth more than an amateur's in this case.

The opening post posed the two options and asked whether there was a "rational reason to choose between the two". Nothing restricted rational reasoning to the perspective of a professional gambler nor was there any explicit requirement to put financial risk at the core of the reasoning.

What would you put at the core of the reasoning, in this instance? What might cause a reasonable person to risk getting nothing over getting a guaranteed \$85K?

You had tried to trivialize the lottery example by dropping the payout to £10.

Yes, to show that gambling enjoyment, in the case of the lottery, is largely based on the amount of money on offer.

Ok, let's make the same sort of adjustment to the option 1 versus option 2 scenario. Option 1 gets you \$0.85 with certainty while option 2 gets you \$1.00 with probability 0.85 or zero with probability 0.15. Is the financial risk argument to choose option 1 all that convincing, now? Many rational people would decide to take option 2 with the rational reason of wanting to avoid the fiddly change.

In what way does this differ from my first post, which even uses exactly the same figures, or my later posts when I explain at length how the sum of money would affect the decision?

Have you read one single word of what I've posted? You're like a stuck record.

Can anyone really believe that if a person in the real world does not make decisions based on mathematical probability they are being unreasonable?

Schneibster
6th February 2007, 03:44 PM
Look, here's the deal: you're using the concept "expected value" in a situation where there is only one trial. "Expected value" SPECIFICALLY refers to the probability OVER A LARGE NUMBER OF TRIALS. Get over "expected value" when we're talking about ONE trial.

The first option involves a 0% chance that you will get nothing.
The second option involves a 15% chance that you will get nothing.
Choose accordingly.

This is the same fallacy that gets someone to bet you at 2:1 that you will not flip another head AFTER you've already flipped one. It's a great way to get free beers, but it doesn't work with anyone who's alert.
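The distinction between an expected value and a single trial is easy to check numerically. A minimal simulation sketch (the seed and trial count are arbitrary choices, not from the thread): the long-run average of option 2 matches option 1's sure \$85,000, yet every individual trial pays either everything or nothing.

```python
import random

def option2_payoff(rng, prize=100_000, p=0.85):
    """One draw of the risky option: the prize with probability p, else 0."""
    return prize if rng.random() < p else 0

rng = random.Random(42)
draws = [option2_payoff(rng) for _ in range(100_000)]

average = sum(draws) / len(draws)   # close to the sure option's 85,000
outcomes = set(draws)               # but each single draw is 0 or 100,000
```

The expected values coincide, but the distributions do not, which is all the "one trial" argument requires.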

Art Vandelay
6th February 2007, 05:51 PM
Yes, because it's assumptive claptrap of the worst kind. It isn't even worth the trouble of counter-argument.
I'm making assumptions? You're the one claiming that there is no nonmonetary benefit.

But I've gone beyond that with Art Vanderlay. In this thread and others he has consistently gone to lengths to avoid admitting to any error, and throws up excessive verbiage that has no purpose but to suggest to someone who isn't following the argument that he is winning it.
If you want me to admit an error, first you have to name one. Apparently, I'm not allowed to present a counter argument without you dismissing it as "excessive verbiage".

For instance look at the "100%" tangent. He took great delight in pointing out that the linked explanation of the strong law of large numbers linked to an explanation of "almost surely" that mentions nothing about 100%.
It was your cite! You are somehow trying to criticize me for pointing out that YOUR OWN CITE doesn't support you? You're being incredibly rude.

However it links to an explanation that says that almost surely means "with probability 1".
I skimmed through the article looking for "100%". I didn't see " 'with probability 1' ". Notice the scare quotes in the original. Do they indicate anything to you?

And it is ludicrous to think that he doesn't know that the way to convert from probabilities stated from 0 to 1 to probabilities stated from 0% to 100% is to multiply by 100. So probability 1 is the same as 100% probability.
No, they indicate different contexts. "100%" indicates no exceptions. "Probability 1" indicates that we are operating under a specific definition of probability.
And you didn't say "100% probability", anyway. You said "100% odds", further implying that you were not referring to a specific mathematical concept.

There is a distinction between math use of terms and lay use. If you want to refer to the former, you should make that clear. In some contexts, "1000 grams" means the same as "1 kg". In others, they don't, because the former indicates more significant digits.

You think that "100%" is appropriately clear. I don't. That's a matter of opinion. Not error.

So the linked explanation exactly supports what I said. He knows that, and argues otherwise.
I did not see that. So I didn't know, your arrogant mind reading aside. So you are in error. Will you admit it? Not bloody likely.

Furthermore, you're harping on a small part, trying to distract from the central point. I guess you want people to forget about how I've already proven you to be wrong. There is no finite point at which the Kelly strategy has 100% probability of beating every other strategy.

You are wrong. You are wrong. You are wrong.

Not only are you wrong, you express your views with condescension, personal attacks, and other acts that show you to be lacking in civility.

Too bad it isn't possible to tag the user name with a note saying "known troll" so that newbies won't be mislead into thinking that he is trying to say something useful.
Misled.

Here's another example of your "debating" skills: http://forums.randi.org/showthread.php?t=70462&page=3&highlight=tilly

I previously tried to get some points through to you, but gave up after you repeatedly failed to provide reasonable responses (you even went so far as to accuse me of calling you a liar simply because I asked you to support one of your claims). You referred to "Just because you don't understand the reason doesn't mean they are absurd." as a "putdown". Because, apparently, if you claim to understand something, and I disagree, that's a "putdown". How can I have a discussion with someone who considers it an insult when someone disagrees with them?

You claimed that "My claims are a matter of easily established fact." yet your proof depended on your opinion.

That's as may be, but in terms of the ongoing discussion you need to look at the topic under consideration. The original question is whether or not a person would gamble for \$100K or accept a guaranteed \$85K. For Art Vandelay to make gross - and largely incorrect - generalisations about how amateur gamblers don't care about the money (like they'd pass up on \$85K just for a laugh) whereas professional gamblers only care about the money is just plain silly.
This is just offensive. You are the one who said "Evaluating the OP's scenario, which is entirely money-centred as opposed to steak-centred, does require such an appreciation." And now you're trying to pretend that I'm the one who introduced the generalization of "care only about the money". I never said that amateur gamblers don't care about money. I guess you are completely unable to address the actual point, so need to divert the discussion to your ridiculous personal attacks.

Certainly in this instance the idea of social or recreational gambling doesn't come into it.
And why not?

You should read more carefully what he wrote and what I wrote.Or you could actually defend your position.

The options are 85k and 100k. 100k-85k=15k. The expected values are equal because there is the additional 15k.
Since the expected values are equal, the 15k IS NOT ADDITIONAL. How many times do I have to say this?

jsfisher
6th February 2007, 06:38 PM
What rigid view is this, and who stated it?
The "rigid view" was in reference to "financial risk" as a lone consideration. Do I really need to enumerate all the posts that seem to be based on such an assumption?

I said anyone who can judge risk (I mistakenly said "odds" in my last post, I concede, but the error should have been obvious after reading my previous posts) will take option 1. Do you disagree?

I also said that in general, professional gamblers can judge risk better than amateur ones. Do you disagree with that?
Yes, I disagree. You seem to be assuming that professional gamblers are axiomatically skilled evaluators of risk. Let's take sports gambling, for example. There, to be successful, identifying overlays is important; many successful gamblers--skilled at finding the premium betting propositions--rely on canned money management schemes simply because they aren't very good at evaluating risk.

It would depend on the gambler, obviously. The fact that I can safely say a gambler wouldn't take a certain bet doesn't mean that a) I can predict what bets a gambler would take and b) all gamblers would have the same gambling threshold.
Well, your assumption that a professional gambler wouldn't take option 2 is just that, an assumption. As for the threshold, I did not expect anyone to define the precise point where this hypothetical gambler switches options; it was a thought experiment meant to point out that the risk concepts being floated in this thread are mushy.

And who made this bizarre statement (apart from you)? What I said - multiple times - is that a professional gambler's opinion is likely to be worth more than an amateur's in this case.
First off, there was nothing bizarre with the statement, "The point was to challenge the notion that only a 'professional gambler's opinion' was worth considering." Second, it was not a quotation being attributed to anyone in particular. However, I do submit that your statement of believing a professional gambler's opinion likely to have greater worth does, in fact, connote the notion I suggested.

What would you put at the core of the reasoning, in this instance? What might cause a reasonable person to risk getting nothing over getting a guaranteed \$85K?
Several posts have offered reasonable responses to this question. Utility of the \$85,000 versus \$100,000 was one of the themes. What would constitute a "rational reason" to favor one option over the other could well vary by individual.
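The utility theme can be made concrete. Under one common textbook assumption, logarithmic utility of total wealth (the \$10,000 starting wealth here is purely illustrative), the gamble is worth well under \$85,000 to the chooser:

```python
import math

def certainty_equivalent(wealth, prize=100_000, p=0.85):
    """Certainty equivalent of the gamble for an agent with u(x) = ln(x)
    over total wealth. Illustrative assumption; requires wealth > 0."""
    expected_utility = p * math.log(wealth + prize) + (1 - p) * math.log(wealth)
    return math.exp(expected_utility) - wealth

ce = certainty_equivalent(10_000)   # roughly $67,000: well below the sure $85,000
```

So a log-utility agent with modest wealth rationally takes option 1 even though the expected dollar values are equal, which is exactly the "utility of \$85,000 versus \$100,000" argument.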

baron
7th February 2007, 06:36 AM
This is just offensive.

Only if you don't like being proved wrong.

You are the one who said "Evaluating the OP's scenario, which is entirely money-centred as opposed to steak-centred, does require such an appreciation." And now you're trying to pretend that I'm the one who introduced the generalization of "care only about the money".

Try and keep up. My point was in relation to the OP, referencing specific odds and specific sums of money. Your point was a sweeping generalisation about what you imagine professional gamblers do and what amateur gamblers do in their day-to-day dealings.

I never said that amateur gamblers don't care about money.

You inferred it strongly, in between rampant generalisation and irrelevant talk of steak consumption. Let's examine what you said ~

A professional gambling [SIC] doesn't take into account anything but the money because that's all he cares about. An amateur gambler derives pleasure from gambling beyonmd merely the monetary value.

Here you state clearly that a professional gambler only cares about the money, and that the amateur gambler is the only one who derives pleasure from gambling. Presumably the pros lead a really miserable life.

You go on to say ~

If I'm trying to decide whether to make a bet, the only thing that matters is how much I would enjoy the bet.

So you personally admit to not caring about the money. The clear inference is that any normal person would do as you do, unless you are highlighting yourself to be abnormal in this respect.

So that is why I summarised you in saying you believe pro gamblers only care about the money (verbatim) and amateur gamblers don't (strongly inferred).

I guess you are completely unable to address the actual point

I'll take my steak medium rare please.

so need to do divert the discussion to your ridiculous personal attacks.

Go and find out what a personal attack is, then report back.

From where do you draw your conclusions (that people who appreciate risk would go for option 2)? Everybody I have spoken to would go for option 1. So would I. So would everyone who's posted so far expressing a definite preference.

Yes, I disagree. You seem to be assuming that professional gamblers are axiomatically skilled evaluators of risk.

Yes I am. I am no more inclined to argue this point than I would debate that marathon runners have above-average endurance or nuclear scientists above-average intelligence.

Well, your assumption that a professional gambler wouldn't take option 2 is just that, an assumption.

No it isn't. I asked two (both ex-, but that's irrelevant). They both said option 1, without a shadow of a doubt. When I say that most pro gamblers would take option 1 then that's a presumption based on evidence and common sense.

Utility of the \$85,000 versus \$100,000 was one of the themes. What would constitute a "rational reason" to favor one option over the other could well vary by individual.

We're talking about reasonable people in a reasonable situation. We're not talking about outlandish scenarios such as someone being shot if they don't obtain a full \$100K instantly, or people who are so wealthy they'd choose to gamble just to show off.

MortFurd
7th February 2007, 07:28 AM
A finance professor offered this scenerio in an explanation about rational market behavior and expected value calculations. I disagree with his analysis. First, the scenerio:

Choose between 1) getting \$85,000, or 2) an 85 percent change of gettting \$100,000.

His argument was there is no rational reason to choose between the two, since they both have the same expected value.

Naturally, I agree they both have the same expected value. Run the scenerios enough times, and either choice 1 or 2 will give an average payoff of \$85,000/episode.

However, I still would strongly prefer 1. I'll go further. Modify 2 to be an 86% chance of getting 100,000, I'll still choose 1.

Why? Well, expected value computations rely on the law of large numbers. Do #2 enough times, and you'll average 85K (or 86K) per episode.

However, I don't know about you, but I am not often offered chances at 85K.
Second, 85K is a big sum of money, but the extra \$15K I might get in scenerio two is not much of an incentive. Yes, I'd rather have 100K than 85K, but it's not a big deal in comparision. In short, my life would be improved quite a bit with the 85K, but only marginally improved more with the extra 15K. So, I go for the sure thing, even though the expected value of 1 is lower than 2 in my revised scenerio.

I'm not interested in bickering about this to death, but I am interested in what other people would choose to do in either scenerio. Would you have a preference between 1 and 2 in the original scenerio? What about in my revised scenerio? Remember, you are being offered this once, not a million times or so.

ETA: I agreed with the bigger argument he was making about expected value calculations in markets - we often do it irrationally. If you are a regular investor you better assume the law of large numbers is in force. I'm just picking a nit.
I've been following this since it started, and I really don't see where the argument lies.

As the professor is quoted, the rational decision is to take the \$85,000 and run. The other option is rather dim-witted. Option 2 assumes that a phantom \$15,000 is worth more than \$85,000 in your pocket: stupidity in the extreme.

If there is a repeated opportunity to get the \$100,000 then the situation changes. Only if I can repeatedly try for the \$100,000 and get to keep everything that I win does the \$100,000 become attractive.

Look at it another way:
If I do nothing I get the \$85,000. Count that then as money in my pocket. Now I can bet that \$85,000 on a 15% chance of getting either \$100,000 back or losing it all. That's 15% chance of losing \$85,000 against an 85% chance of getting an extra \$15,000.

Who in his right mind is going to throw away \$85,000 in order to get \$15,000?

boooeee
7th February 2007, 12:59 PM
Who in his right mind is going to throw away \$85,000 in order to get \$15,000?

You need a kidney transplant or you'll die. A kidney transplant costs \$95,000. In that case, you take your chance with the 85%.

I realize that it's an unrealistic hypothetical, but the question of whether to take the bet is not black and white.

It would also be rational to take the bet if it could be used as a hedge against some other correlated outcome.
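The kidney example amounts to a step-shaped utility function: any amount below the \$95,000 threshold is worth nothing. A small sketch of that comparison (the threshold utility is the post's hypothetical, not a general claim):

```python
def survival_chance(payoffs, need=95_000):
    """Probability of reaching the threshold, given (probability, payoff) pairs."""
    return sum(p for p, amount in payoffs if amount >= need)

option1 = [(1.0, 85_000)]                 # the sure $85,000
option2 = [(0.85, 100_000), (0.15, 0)]    # 85% chance of $100,000

# Under "reach $95,000 or die", the gamble becomes the rational choice:
# option 1 gives a 0% chance of surviving, option 2 gives 85%.
```

This is the cleanest illustration that "take the sure thing" is not a universal law; it depends on the utility attached to the dollar amounts.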

jsfisher
7th February 2007, 03:57 PM
Who in his right mind is going to throw away \$85,000 in order to get \$15,000?
My concern with this thread is not so much the conclusion many are advocating, but with the logic--or more correctly, lack of it--people have used to reach it. You can almost make a game of it, "Name that fallacy!"

The most reasonable approach appears to be one based on "financial risk": Option 1 is the rational choice because you get \$85,000 with certainty while with option 2 you would end up with either \$0 or \$100,000, but the risk of getting \$0 is too great to compensate for the possibility of getting \$15,000 more than option 1.

Ok, fine, but how was "risk" assessed?

If we alter the probabilities in option 2, then at what point could option 2 become more favorable? That is not a shallow question. If there were a probability of 99% of getting the \$100,000, is it then the rational choice? What about 86%?

What makes the risk at 85% / \$100,000 too great?

My point isn't to demand a precise formula for assessing risk. It is simply to point out that it may not be as cut-and-dried as some seem to allege.
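The "at what point" question has a crisp answer once a utility function is fixed. Under an illustrative log-utility assumption (u = ln of total wealth; the \$10,000 wealth figure is hypothetical), one can solve directly for the indifference probability:

```python
import math

def breakeven_probability(wealth, sure=85_000, prize=100_000):
    """Win probability at which a log-utility agent is indifferent between
    the sure amount and the gamble. Illustrative; requires wealth > 0."""
    u = math.log   # assumed utility: ln of total wealth
    return (u(wealth + sure) - u(wealth)) / (u(wealth + prize) - u(wealth))

p_star = breakeven_probability(10_000)   # about 0.94: well above 85%
```

For this agent the risk at 85%/\$100,000 is "too great" simply because 85% sits below the roughly 94% indifference point; wealthier agents have lower indifference points, approaching 85% as wealth grows.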

baron
8th February 2007, 05:30 AM
What makes the risk at 85% / \$100,000 too great?

The same thing that makes any other absurd risk too great, as assessed by the application of common sense and a basic risk analysis.

MortFurd
8th February 2007, 05:42 AM
You need a kidney transplant or you'll die. A kidney transplant costs \$95,000. In that case, you take your chance with the 85%.

I realize that it's an unrealistic hypothetical, but the question of whether to take the bet is not black and white.

It would also be rational to take the bet if it could be used as a hedge against some other correlated outcome.

Absolutely correct about the kidney thing. In that situation, I might do the same thing. On the other hand, \$85,000 towards \$95,000 is a good start and I expect you could get a hospital to go along with the \$85,000 as a sort of "down payment."

Cuddles
8th February 2007, 06:39 AM
What makes the risk at 85% / \$100,000 too great?

It's as others have already said. In order to take a risk you have to have a payoff to compensate for it. At the 85% mark the expectations are exactly equal and therefore there is no extra payoff, so taking the risk is a bad choice. At any higher probability (or payment, of course) there is a payoff for the risk, and it becomes a matter of judging whether the risk is worth it. However, that is an entirely different question. As it was first stated, the \$85,000 is the only logical choice. If you ask a different question it is no longer cut and dried and becomes a question about the specific conditions in which the choice is made, but as it stands we can say with certainty that the risk is too great because there is no benefit to taking the risk.

MortFurd
8th February 2007, 07:41 AM
My concern with this thread is not so much the conclusion many are advocating, but with the logic--or more correctly, lack of it--people have used to reach it. You can almost make a game of it, "Name that fallacy!"

The most reasonable approach appears to be one based on "financial risk": Option 1 is the rational choice because you get \$85,000 with certainty while with option 2 you would end up with either \$0 or \$100,000, but the risk of getting \$0 is too great to compensate for the possibility of getting \$15,000 more than option 1.

Ok, fine, but how was "risk" assessed?
If we alter the probabilities in option 2, then at what point could option 2 become more favorable? That is not a shallow question. If there were a probability of 99% of getting the \$100,000, is it then the rational choice? What about 86%?

What makes the risk at 85% / \$100,000 too great?

My point isn't to demand a precise formula for assessing risk. It is simply to point out that it may not be as cut-and-dried as some seem to allege.
For the given conditions, it is cut and dried. For a one-shot deal, I get \$85,000. The extra \$15,000 is phantom, there only to distract the sucker from taking the sure thing.

If the OP is repeated, then on the average I will gain \$85,000 for each repeat. Repeat 10 times, and I'll have on the order of \$850,000 no matter which way I go. The more repeats I can make, the closer the two sums come to matching.

Only if you can repeat does a risk analysis make sense with the given conditions - ignoring for now the "loan shark busting your knees if you don't cough up \$100,000 by tomorrow" and other scenarios that may crop up in the real world.

A risk analysis would be something along the lines of "how many chances must I get to repeat the scenario in order to be guaranteed \$X,000?" The closer X is to \$85,000, the more times you will have to repeat.

In the end, though, the repeat scenario sucks. It assumes that some chump is standing around, handing out stacks of cash to whoever asks. In that case, the rational thing to do is to just keep asking him and taking the \$85,000. Over a short period you might fill your pockets a little faster if you go the "85% chance of \$100,000" route, but in the long run you will still average \$85,000 for each time you hit him up. No gain but for the hassle of flipping coins or playing a few rounds of "paper, scissors, rock" or whatever.

So, in the one-time scenario as presented, with no other conditions allowed, the \$85,000 is the rational thing to do. If the scenario were to be played out repeatedly, then taking the \$85,000 is still the rational thing to do.
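The repeat argument is the law of large numbers in action, and a short simulation sketch makes it visible: the spread of the gamble's total, relative to its expected total, shrinks as the number of repeats grows (sample sizes and seed here are arbitrary choices):

```python
import random
import statistics

def option2_total(repeats, rng, prize=100_000, p=0.85):
    """Total winnings after taking the 85%/$100,000 gamble `repeats` times."""
    return sum(prize if rng.random() < p else 0 for _ in range(repeats))

rng = random.Random(0)

def relative_spread(repeats, samples=500):
    """Std. dev. of the total, divided by the expected total (85,000 * repeats)."""
    totals = [option2_total(repeats, rng) for _ in range(samples)]
    return statistics.stdev(totals) / (85_000 * repeats)

spread_10 = relative_spread(10)      # roughly 13% of the expected total
spread_1000 = relative_spread(1000)  # roughly 1.3%: the gamble now resembles the sure thing
```

At 10 repeats the gamble's total still swings by tens of thousands of dollars either way, so "on the order of \$850,000 no matter which way I go" only becomes literally true as the number of repeats grows large.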