
the kelly principle



when you have a chance of winning that is 30%, as long as you don't spend more than 30% of your bankroll you will never go broke. what is that theory called? is that called the kelly (sp?) principle? I saw/read dan harrington talk about it a while ago but forgot what it is called, or at least cannot google it. please help

Link to post
Share on other sites

the kelly criterion... i copied and pasted this from somewhere a long time ago, so i don't remember where it came from. this is really long, but i'll paste it here for anyone interested.

Here's the text:

Suppose you are playing a game in which you have some sort of edge. This edge can consist of any combination of probability and odds such that each bet has a positive expectation. Now you ask yourself, "What fraction of my bankroll should I wager on this game?" If your answer is, "The amount such that if I keep repeating this same strategy over a long period of time, my bankroll increases at its maximum rate," then your answer is to bet "Kelly". To see how we can find such an answer, we should look at it from the beginning.

Suppose you bet a fraction f (0 < f < 1) of your current bankroll B, and you are getting odds of v-to-1. If you win the bet, you get back your original bet, plus v times the amount of your bet (which was f*B). If you lose, you lose the bet amount:

Bet size = f*B, odds = v-to-1, amount won = v*f*B, amount lost = f*B

If you win, your new bankroll B' is going to equal your old bankroll plus your winnings:

Win: B' = B + v*f*B = (1+v*f)*B

If you lose, your new bankroll will equal your old bankroll minus your losses:

Lose: B' = B - f*B = (1-f)*B

Let's take a short timeout to make sure these funny-looking equations make sense by looking at an example (I'm a huge fan of concrete examples). Say you have $100, and you bet $20, getting 2-to-1. For the sake of future considerations, we'll assume the game is a fair coin flip (but the probabilities do not come into play for a while yet). If you win this bet, you win $40, raising your bankroll to $140. The equation above shows this correctly:

B = $100, f = 1/5 = 0.2, v = 2: B' = (1+v*f)*B = (1+2*0.2)*($100) = $140

The "Lose" equation works in a similar manner.

Now, suppose you play the game multiple times. Assuming no "pushes" (which I will be assuming throughout), the "new" bankroll (which we have called B') after a game is played becomes the "old" bankroll for the next game to be played. We are assuming the game and the odds offered don't change, and that we do not choose to change our strategy (the fraction of bankroll to be wagered), so the equations given remain the same for each game, with the previous game's B' becoming the next game's B. Your result after winning twice in a row would be:

B after two wins = (1+v*f)*[B after one win] = (1+v*f)*[(1+v*f)*(starting B)]
B after two wins = [(1+v*f)^2]*(starting B)

A check: Suppose in the example above you won again, employing the same strategy. Your new bankroll was $140, so you wagered 1/5 of it ($28) and won at 2-to-1, for a total win of $56. Now your new bankroll is $140+$56 = $196. Plugging into the equation gives the same result:

B after two wins = [(1+v*f)^2]*(starting B) = [(1+2*0.2)^2]*($100)
B after two wins = [(1.4)^2]*($100) = [1.96]*($100) = $196

The two-loss result works out the same way. What about a win and a loss, you ask? You just multiply the starting bankroll by the "lose factor" (1-f) and the "win factor" (1+v*f), and the result is the new bankroll. Note that it doesn't even matter whether you lost first or won first. Back to our example: You won the first game and lost the second: $100 + $40 = $140 after the first game, then you lost $28 (you wagered 1/5 of it) in the second game, for a total of $140-$28 = $112.
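To check the arithmetic above, here's a minimal Python sketch (my own illustration, not from the quoted text; the function name is assumed) that applies the win and lose factors and reproduces the $196 and $112 results, including the fact that a win and a loss give the same bankroll in either order:

```python
def play_sequence(bankroll, f, v, results):
    """Apply the per-game bankroll factors from the text above.

    bankroll -- starting bankroll B
    f        -- fraction of bankroll wagered each game (0 < f < 1)
    v        -- the 'to-1' odds received on each bet
    results  -- sequence of booleans: True for a win, False for a loss
    """
    for won in results:
        if won:
            bankroll *= 1 + v * f   # Win:  B' = (1 + v*f) * B
        else:
            bankroll *= 1 - f       # Lose: B' = (1 - f) * B
    return bankroll

# $100 bankroll, betting 1/5 of it each game at 2-to-1, as in the example:
print(round(play_sequence(100, 0.2, 2, [True, True]), 2))   # 196.0  (two wins)
print(round(play_sequence(100, 0.2, 2, [True, False]), 2))  # 112.0  (win, then loss)
print(round(play_sequence(100, 0.2, 2, [False, True]), 2))  # 112.0  (loss, then win -- order doesn't matter)
```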
If you lost first, you have $100 - $20 = $80 after the first game, and then you wager 1/5 of $80 (= $16) at 2-to-1 in the second game and win, for a total of $80 + 2*($16) = $112. Same amount as if you win first. [BTW, this less-than-obvious fact may affect retirement planning for those of you thinking about Roth IRAs vs. traditional ones. I won't digress any further on this topic.] Using the equation gives the same result:

B after 1 win and 1 loss = (1+v*f)*(1-f)*(starting B)
B after 1 win and 1 loss = (1+2*0.2)*(1-0.2)*($100) = (1.4)*(0.8)*($100) = $112

Okay, so let's say we have won w times out of n total games. This means we have lost n-w times (since we assumed no pushes). To find the new bankroll, we need to multiply the starting bankroll by (1+v*f) a total of w times, and by (1-f) a total of n-w times. In other words, our "new bankroll equation" has gotten much more complicated:

B' = [(1+v*f)^w]*[(1-f)^(n-w)]*B

The first factor in brackets is just the "win factor" multiplied by itself w times, and the second is the "lose factor" multiplied by itself n-w times. The two B's in this equation are not important, so we will instead look at just the factor multiplying B, as this is what determines the bankroll's growth (or lack thereof):

B'/B = [(1+v*f)^w]*[(1-f)^(n-w)]

This gives us the factor by which the bankroll has changed from the beginning (after n games). We want to look at the bankroll's growth rate PER GAME, so if we call the average-per-game factor y, then after n games, the bankroll has grown by a factor of y^n. The average-per-game factor is then found to be:

B'/B = y^n

y = (B'/B)^(1/n) = {[(1+v*f)^w]*[(1-f)^(n-w)]}^(1/n)

Again we return to our example. After 2 games where we won 1 and lost 1, we have gone from $100 to $112. This is an increase by a factor of 1.12 over 2 games. On average, this is an increase PER GAME by a factor of sqrt(1.12) = 1.058. In other words, if in the game described we win the same number that we lose (i.e., we assumed way back when that the probability of winning is 1/2), on average we will increase our bankroll each game by a factor of 1.058. Now here's the important part:

*** If we employ a different strategy (risk a different fraction f of the bankroll), then this factor will also change. We seek to find the f for which this factor is a MAXIMUM. ***

Now finding the value of f at which this function peaks is no small matter. It involves calculus. If this intimidates you, I invite you to jump down to below the second set of *'s to see the answer. I include the calculus for the math weenies who may find it interesting...

********

The value of f for which y(f) is a maximum is the same value for which ln[y(f)] is a maximum, so we can equivalently seek to maximize:

z(f) = ln[y(f)] = (1/n)*[w*ln(1+v*f) + (n-w)*ln(1-f)]

The derivative is:

dz/df = (w/n)*v/(1+v*f) - [1-w/n]/(1-f)

Setting this equal to zero and solving for f gives:

********

f = [p*(v+1)-1]/v, where p = w/n.

Note that in the long run, the fraction of games you win (w/n) equals the probability of winning a single game, so p = probability of winning. Okay, this is our answer. Given that your chance of winning is p, and that you are receiving v-to-1 odds on your bet, the fraction of your bankroll that you should wager to maximize your rate of bankroll growth is the f given above. This value can also be plugged back in above to find out what the maximum growth rate actually comes out to be.
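If the calculus isn't your thing, the result can also be checked numerically. Here's a short sketch (my own, not from the quoted text; the helper names are assumptions) that scans candidate fractions f and confirms the per-game growth factor peaks right at the closed-form answer:

```python
def growth_factor(f, p, v):
    """Per-game growth factor y(f) = [(1+v*f)^p] * [(1-f)^(1-p)]."""
    return (1 + v * f) ** p * (1 - f) ** (1 - p)

def kelly_fraction(p, v):
    """Closed-form answer from the calculus: f = [p*(v+1) - 1] / v."""
    return (p * (v + 1) - 1) / v

p, v = 0.5, 2  # the fair-coin-flip-at-2-to-1 game from the example

# Brute-force scan of f over [0, 1) to find where y(f) peaks.
best_f = max((i / 100000 for i in range(100000)),
             key=lambda f: growth_factor(f, p, v))

print(kelly_fraction(p, v))  # 0.25, the closed-form result
print(best_f)                # 0.25, the numerical maximum agrees
```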
Let's try it for the game we've been using as an example. We have a probability p = 0.5 of winning and odds of v = 2, so the fraction of our bankroll we should risk each time is:

f = [p*(v+1)-1]/v = [0.5*(2+1)-1]/2 = 1/4

y = [(1+v*f)^p]*[(1-f)^(1-p)] = 1.061 (recall p = w/n)

The bankroll increases an average of 6.1% over its preceding amount every game. [Using the "rule of 72" familiar to bankers and investors, this means the bankroll will double roughly every 12 games.]

Most people like to remember the Kelly criterion using a mnemonic that goes something like: "Bet the fraction of your bankroll that equals your percentage advantage." It should be understood that this ONLY applies to bets with even-money odds (v=1). Note that with v=1, f comes out equal to 2p-1, which is exactly your percentage advantage. The origin of this mnemonic is probably blackjack, which would explain why the even-money assumption is made. A better, more general mnemonic would be: "Bet the fraction of your bankroll equal to your percentage advantage divided by the 'to-1' odds." For example, if you have a 10% advantage and you are getting 5-to-2 odds, then the fraction of your bankroll to bet is (0.10)/2.5 = 0.04 = 4%.

If you plowed through this whole post, my compliments. It's possible that the only people willing (able?) to follow it all the way through are the people who already understand all this, which means I was drawing dead as I wrote it. I hate when that happens.

Tom Weideman
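And to verify the numbers in the example, one more sketch (again my own, with the same assumed helpers): it plugs the Kelly fraction back into the growth factor, recovers the 6.1% per-game growth and the roughly-12-game doubling time, and checks the general mnemonic against the closed form for the 10%-advantage, 5-to-2 case:

```python
import math

def kelly_fraction(p, v):
    """f = [p*(v+1) - 1] / v"""
    return (p * (v + 1) - 1) / v

def growth_factor(f, p, v):
    """y(f) = [(1+v*f)^p] * [(1-f)^(1-p)]"""
    return (1 + v * f) ** p * (1 - f) ** (1 - p)

# The coin flip at 2-to-1: f = 1/4 and y ~ 1.061, i.e. 6.1% growth per game.
f = kelly_fraction(0.5, 2)
y = growth_factor(f, 0.5, 2)
print(f, round(y, 3))                        # 0.25 1.061

# Doubling time from the growth rate: log(2)/log(y) ~ 11.8 games,
# consistent with the rule-of-72 estimate of roughly 12.
print(round(math.log(2) / math.log(y), 1))   # 11.8

# The general mnemonic: advantage divided by the 'to-1' odds.
# A 10% advantage means p*(v+1) - 1 = 0.10; at 5-to-2 odds, v = 2.5.
v = 2.5
p = 1.10 / (v + 1)
print(round(kelly_fraction(p, v), 4), 0.10 / v)   # 0.04 0.04
```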

Link to post
Share on other sites
Am I the only one who is confused here? As long as you spend less than 100% of your stack chasing a draw you won't go broke... Someone elaborate for me?
wow. i meant broke over time, not that hand. anyway, it's not in his books, just an interview on espn from the wsop i think.
Link to post
Share on other sites
i've never gotten chub from a reply before. thank you very much. i'll mention you in my article.
Link to post
Share on other sites
mmm, sorry, I guess I should have known that by stack you meant bankroll, even though no one else uses those interchangeably... Either way what you said still doesn't make sense. I guess I'll read chiggleslap's post when I muster up the mental energy :-)
Link to post
Share on other sites
check out: "A New Interpretation of Information Rate" (Bell System Technical Journal, 35, 1956, pp. 917-926). it's the article i was looking for. i knew it as the kelly principle. thanks very much.
Link to post
Share on other sites

There is a new book out about the Kelly criterion and its application to both gambling and investments called "Fortune's Formula". A very interesting read.

Basically, the formula defines the percentage of your bankroll that you can invest over repeated iterations of a positive-EV game such that you maximize your bankroll's long-run growth rate while having zero chance of going broke. It's not quite as simple as knowing you have a 30% chance to hit a draw... you have to know what your expected value is versus the amount you wagered. So if you invest $1 for a 30% chance of winning $4, your EV is $1.20, and your "edge" would be 20%. Given this edge, you can use the Kelly criterion to determine how much you can safely wager in proportion to your bankroll. Wager too little and you aren't maximizing your potential, but wager too much and your bankroll will be decimated by an unlucky string to the point that it will take a long time to recover.

It's basically the same as bankroll management, but a more precise calculation for games where you can exactly calculate your edge... card-counting in blackjack, for instance. Because the edge in poker is so fluid, the formula is much more difficult to apply in that context, but if you know your BB/100 at a game, you can probably use it to determine what limits you should be playing at. But I've never seen it specifically applied to poker.
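For the $1 example above, the formula from earlier in the thread gives a concrete number. A small sketch (my own; it reads "winning $4" as a $4 total payout on the $1 stake, i.e. 3-to-1 net odds):

```python
def kelly_fraction(p, v):
    """Kelly fraction from earlier in the thread: f = [p*(v+1) - 1] / v."""
    return (p * (v + 1) - 1) / v

# Risk $1 for a 30% chance at a $4 payout, i.e. v = 3 (3-to-1 net odds):
# EV per $1 wagered is 0.3 * 4 = $1.20, a 20% edge.
p, v = 0.30, 3
print(round(p * (v + 1) - 1, 2))       # 0.2    -- the 20% edge
print(round(kelly_fraction(p, v), 4))  # 0.0667 -- bet about 6.7% of bankroll
```

This also matches the general mnemonic: a 20% edge divided by the 3-to-1 odds gives 0.20/3, about 6.7%.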

Link to post
Share on other sites
that is what my next gutshot.com article is about. you'll like it.
Link to post
Share on other sites
