
When luck has something to do with it


 

In games of pure chance, such as craps, you can compute the best strategy using probability theory. That's also true in mixed skill and luck games, if the other players' actions are fixed. For example, probability theory works if you are playing blackjack one-on-one against a casino dealer. But in games like poker and bridge, where all players can make choices, probability theory is not enough.

You:     A A
Board:   A Q J 7 3 (no three cards of the same suit)
Player:  two unknown cards
Suppose, for example, you are dealt pocket aces in hold 'em and the board comes down with ace, queen, jack, seven, three with no three cards of the same suit. The only hand that can beat you is king/ten, which gives player A a straight. Looking at it mathematically, there are 45 unknown cards, which can be arranged in 45 x 44/2 = 990 ways. There are 4 x 4 = 16 ways to get king/ten, so the chance of two random cards beating you is 16/990 = 1.62 percent.

Number of two-card combinations by rank (the higher card gives the row; pairs sit on the diagonal):

       A    K    Q    J    T    9    8    7    6    5    4    3    2
  A    0    4    3    3    4    4    4    3    4    4    4    3    4
  K         6   12   12   16   16   16   12   16   16   16   12   16
  Q              3    9   12   12   12    9   12   12   12    9   12
  J                   3   12   12   12    9   12   12   12    9   12
  T                        6   16   16   12   16   16   16   12   16
  9                             6   16   12   16   16   16   12   16
  8                                  6   12   16   16   16   12   16
  7                                       3   12   12   12    9   12
  6                                            6   16   16   12   16
  5                                                 6   16   12   16
  4                                                      6   12   16
  3                                                           3   12
  2                                                                6

This calculation is illustrated in the preceding table. It lists all the card ranks in both the rows and the columns, with the number of that rank available (that is, not in your hand or on the board). There are four of most cards, but only one ace and three each of queen, jack, seven, and three. Each cell shows the number of ways that combination of cards can be made. For example, to see the number of queen/eight combinations, look in the queen row (since queen is the higher card) and the eight column. The 12 you see there is the product of the number of queens (3) and the number of eights (4).

Pairs are slightly different. Although there are four kings available, they cannot be combined in 4 x 4 = 16 ways. Once you pick one king, there are only three more available, so it's 4 x 3 = 12. That's easy enough, but what can be confusing is that you have to divide that number by 2 to get the 6 you see in the table. The reason you divide by 2 is that the kings are interchangeable: king of hearts/king of spades is the same hand as king of spades/king of hearts (and it's not possible to get king of hearts/king of hearts). But king of hearts/ten of spades is not the same hand as king of spades/ten of hearts. This is the same reason it is twice as hard to roll four-four with two dice as five-three; any specific double combination is half as likely as any specific nondouble. Craps players call getting four-four "making eight the hard way." With four-four, the first die has to be a four and the second die has to be a four. Since each has probability 1/6, the probability of both is 1/6 x 1/6 = 1/36. With five-three, the first die can be either a five or a three, which is 2/6. The second die must be the other number, which is 1/6. And 2/6 x 1/6 = 2/36, twice the chance of four-four.
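The hard-way arithmetic can be verified by listing all 36 ordered rolls of two dice. This is a quick sketch of my own, not from the text:

```python
from itertools import product

# All 36 ordered outcomes of rolling two distinguishable dice.
rolls = list(product(range(1, 7), repeat=2))

# "Eight the hard way": both dice must show four.
hard_eight = sum(1 for a, b in rolls if (a, b) == (4, 4))

# Five-three: either die can be the five, so two ordered outcomes qualify.
easy_eight = sum(1 for a, b in rolls if {a, b} == {5, 3})

print(hard_eight, "/ 36")   # 1 / 36
print(easy_eight, "/ 36")   # 2 / 36, twice as likely
```

The enumeration confirms that any specific double is half as likely as any specific nondouble.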

If you add up all the numbers in the table, you get 990. That's 45 (the number of unknown cards) times 44 (the number of unknown cards once you pick the first one) divided by 2 (because the cards are interchangeable). Looking up king/ten shows a 16, so a random two-card hand has a 16/990 probability of being king/ten. If you were simply betting on that outcome instead of playing poker, that's all you would need to know.
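The whole table can also be checked by brute force. Here is a minimal sketch (the enumeration and variable names are mine, not the book's) that builds the 45 unknown cards and counts every two-card combination:

```python
from itertools import combinations

# Ranks still unseen: you hold two aces and the board shows A, Q, J, 7, 3,
# so one ace and three each of Q, J, 7, 3 remain; four of every other rank.
available = {'A': 1, 'K': 4, 'Q': 3, 'J': 3, 'T': 4, '9': 4, '8': 4,
             '7': 3, '6': 4, '5': 4, '4': 4, '3': 3, '2': 4}

# Build the 45 unknown cards; pair up indices so duplicate ranks stay distinct.
cards = [rank for rank, n in available.items() for _ in range(n)]
hands = list(combinations(range(len(cards)), 2))
print(len(hands))   # 990 = 45 x 44 / 2

# Count the king/ten combinations among them.
king_ten = sum(1 for i, j in hands if {cards[i], cards[j]} == {'K', 'T'})
print(king_ten, round(100 * king_ten / len(hands), 2))   # 16 ways, 1.62 percent
```

Summing any individual cell of the table works the same way: filter the 990 hands down to the rank pair in question and count.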

 

But you are playing poker; you are not facing two random cards. In a table of 10 players, there's a 14.94 percent chance that one of them was dealt this hand. You have to ask yourself the probability that anyone holding king/ten would have stayed in the hand to this point and bet the way the other player did. You also have to ask what other hands the other player might have to justify his betting to this point. Finally, you have to predict what he will do with this hand, or with other likely hands, after any action you might take. You would feel more confident about winning if another player needed seven/four to beat you instead of king/ten, because that hand would almost certainly have been folded preflop. You would also feel better if the queen and jack were on the turn and river, because king/ten might have folded after a flop of ace/seven/three, but certainly not after ace/queen/jack. The probability of seven/four is the same as the probability of king/ten, and for any given board any order of the cards is equally likely. Yet your betting strategy will be different depending on whether king/ten or seven/four beats you and on what order the board came down. Probability theory is not enough.

One approach is to assume a strategy for your opponents, then compute the best counterstrategy. That's mathematically appealing, because it allows us to view poker like blackjack, where other players' actions are predetermined. A common mistake among people who are good at math is to take an approach that is mathematically convenient, then insist that solution is the only rational one. We'll see how to exploit this mistake later in this chapter.

The key principle of game theory takes the question of strategy one step further. You assume a specific strategy for your opponents. It's the strategy that's best for them, under the assumptions that you know their strategy and will use the best counterstrategy for you. It might seem that this approach cannot be beaten. If your opponents pick their best strategy, you'll have the best counterstrategy. If they pick anything else, you'll do at least as well, and maybe better.

For example, in a baseball game, bottom of the ninth inning, score tied, bases loaded, and full count on the batter, if the pitcher walks the batter or gives up a hit, his team loses. If he can get the batter out, the game goes to extra innings and his team has about an even chance of winning. He has three pitches: fastball, curveball, and slider. He can throw his fastball for a strike 90 percent of the time and his curveball 70 percent, but he's been having control problems with his slider: there's only a 50 percent chance it will end up in the strike zone. The batter can choose to swing or not swing. If he swings at a pitch outside the strike zone, we assume he strikes out and the inning is over. If he swings at a fastball or curveball in the strike zone, assume he has a 50 percent chance of delivering a game-winning hit. But the slider is harder to hit, so even if it's in the strike zone, he'll get a hit only 20 percent of the time.

Probability of getting the batter out:

  Pitch        Batter swings   Batter lays off
  Fastball          55%              90%
  Curveball         65%              70%
  Slider            90%              50%

This table shows the probability of getting the batter out for each combination of choices based on these assumptions. If the batter lays off, he will be out if the pitch is a strike, so these percentages are just the probabilities. If he swings, he will be out if the pitch is a ball, and half the time if it is a strike, except for the slider, which gets him out four-fifths of the time if it is a strike.

The pitcher might be tempted to select the pitch with the highest average probability. The fastball has (55 percent + 90 percent)/2 = 72.5 percent, the curveball (65 percent + 70 percent)/2 = 67.5 percent, and the slider (90 percent + 50 percent)/2 = 70 percent. So the fastball is the best pitch, followed by the slider, and the curveball is the worst. This would be correct if the batter chose whether to swing by coin flip. On the other hand, if the pitcher knew what the batter was going to do, he would throw a slider if the batter planned to swing and a fastball if he didn't.

But we'll assume the batter is good enough to spot the pitch type and then choose whether to swing. We're not going to make him so good that he can tell whether it will be a ball or a strike. (He's not Ted Williams or Barry Bonds.) The batter wants to minimize the probability of getting out, so for whatever pitch is thrown he'll always pick the column with the smaller number. That means he'll swing at a fastball or curveball and lay off a slider. Knowing that, the pitcher will ignore the higher number in each row and choose the pitch with the highest value of the lower number; this is known as the minimax strategy. Under minimax, the best pitch is the curveball, with a 65 percent worst case. That was the worst pitch in the expected value calculation and the one that would never be thrown if the batter's intentions were known. Only game theory identifies the curveball as the best pitch. Of course, in a simple situation like this one, you might see the advantage of the curveball without formal mathematics. But combinations multiply rapidly in real games, and even more rapidly in real life. Before game theory was invented, no one had identified minimax as a general strategic principle. Without the machinery of game theory, it's almost impossible to solve games with more than a handful of outcomes.
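The minimax logic above can be written out directly. This is a sketch under the example's assumptions (the dictionary layout is mine):

```python
# Out probabilities by pitch (rows) and batter choice (columns),
# taken from the example's assumptions.
table = {
    'fastball':  {'swing': 0.55, 'lay off': 0.90},
    'curveball': {'swing': 0.65, 'lay off': 0.70},
    'slider':    {'swing': 0.90, 'lay off': 0.50},
}

# The batter sees the pitch type and picks whichever choice hurts the
# pitcher most, so each pitch is only worth its smaller number to the pitcher.
worst_case = {pitch: min(cols.values()) for pitch, cols in table.items()}

# Minimax: the pitcher maximizes his worst case.
best_pitch = max(worst_case, key=worst_case.get)
print(best_pitch, worst_case[best_pitch])   # curveball 0.65
```

The same two lines (row minimum, then column maximum) scale to payoff tables far too large to inspect by eye, which is the practical point of the machinery.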

 

What if the batter has to make up his mind whether to swing before the pitch? That makes it a different game. The computation is slightly more complicated, but there is a trick that often works. The best game theory strategy often equalizes your opponent's options. Intuitively, if your opponent can benefit from making one decision versus another, you've left something on the table for him to exploit. In game theory poker, you often bet the amount that puts your opponent on the edge, with equal expected value from folding, calling, or raising. That isn't always true, but it works in this example.

Given that the pitcher wants to make it equally attractive for the batter to swing or lay off, he needs only a choice of two pitches to do it. One choice is not enough, since the batter will decide to swing or not based on the probable outcome for that pitch. But if the pitcher mixes two pitches in the right proportions, the batter can swing or not, and the pitcher's team's chance of winning the game is identical. Obviously, it makes sense to choose between the two pitches with the highest expected values regardless of what the batter will do: the fastball and the slider. If the pitcher puts the numbers 1 through 15 in his hat, draws one out, and pitches a fastball if the number is 1 to 8 and a slider if it is 9 to 15, he's got about a 71 percent chance of getting the out whether the batter swings or not.
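The 8-in-15 proportion comes from solving the equalizing equation. As a sketch (exact rational arithmetic, my own setup): choose p = P(fastball) so the batter's expected out-rate is the same whether he swings or lays off.

```python
from fractions import Fraction

# Out probabilities from the example: fastball and slider, vs. swing / lay off.
fs, fl = Fraction(55, 100), Fraction(90, 100)   # fastball: swing, lay off
ss, sl = Fraction(90, 100), Fraction(50, 100)   # slider:   swing, lay off

# Equalize the batter's options:  p*fs + (1-p)*ss == p*fl + (1-p)*sl
# Solving for p gives:
p = (sl - ss) / (fs - ss - fl + sl)

value_swing = p * fs + (1 - p) * ss
value_lay   = p * fl + (1 - p) * sl
print(p)                        # 8/15
print(value_swing, value_lay)   # 107/150 each, about 71.3 percent
```

Because both of the batter's options yield the same 107/150, nothing he does can change the pitcher's chance of getting the out.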

 

This is another important insight from game theory: It often makes sense to deliberately randomize your strategy, creating artificial risk using gambling devices. People who try to minimize risk, who say gambling is irrational because it creates artificial risk, can miss opportunities. The best nonrandom strategy for the pitcher is to always pitch the curveball, which gives a 65 percent chance of getting the out. The best randomized strategy gives 71 percent, which is better than the curveball, whatever decision the batter makes about swinging. Randomized strategies also have an important place in finance.

 

