Sunday, January 06, 2008

Poker: The All-in Flush Draw

Pretend I am TransFish. Now this has got to be one of my favourite poker moments. I hold 2 suited cards (as an example from the picture, AT--short for AceTen) and am heads up against a paired opponent.

Flop comes out 2 suits. The air is thick with anticipation. All is quiet, even as the chatters and shouts and excitement intensify across the table. My face is flushed as I await my flush.

First, the technicalities: What are the odds of me making my flush?

The intuitive answer I gave myself was: 50%.
4 suits. Each card to come has a 1/4 chance of being my suit.
2 cards to come. Add them up.
Unfortunately, the actual odds of making that flush are much lower than the intuitive 50%.

We have 3 mutually exclusive scenarios:
2 more cards of my suit, or 1 card of my suit then 1 non-suit, or 1 non-suit then 1 card of my suit:
P(flush) = (9/47)(8/46) + (9/47)(38/46) + (38/47)(9/46) = 756/2162 ≈ 35%*
*Total of 52 cards. 2 are dealt to you, and 3 are dealt on the flop, making a total of 5 known cards for you and 47 unknown cards. Of the 5 known cards, 4 are of the suit of interest. That leaves 9 more cards of that suit among the 47 unknown cards.

A better, and mathematically equivalent, approach is to take the complement--the chance of missing on both the turn and the river:

P(flush) = 1 - (38/47)(37/46) = 1 - 1406/2162 ≈ 35%
There is a trick called the Rule of 4 (analogous to the Rule of 72):
We have 9 outs. To estimate the odds of hitting one of your outs by the river, simply multiply the number of outs by 4, hence 9*4 = 36% (a marvellous approximation!)
(An out is an undealt card that is favourable to your cause. The more outs the better!)
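Both numbers are easy to check by brute force. A sketch in Python, where the first 9 of the 47 unseen card indices stand in for the 9 outs:

```python
from itertools import combinations
from fractions import Fraction

# 47 unseen cards remain after the flop; 9 of them (the outs) complete the
# flush. Enumerate every turn/river pair and count those containing an out.
UNSEEN, OUTS = 47, 9

total = hits = 0
for turn, river in combinations(range(UNSEEN), 2):
    total += 1
    if turn < OUTS or river < OUTS:  # indices 0-8 stand in for the suited cards
        hits += 1

exact = Fraction(hits, total)
print(f"exact: {float(exact):.1%}")   # ~35.0%
print(f"Rule of 4: {OUTS * 4}%")      # 36% -- a marvellous approximation
```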

Now comes the game theory aspect of poker (with absolutely zero knowledge of game theory, I substitute it with a rudimentary knowledge of calculating expectations).

Of course, the concept of "giving a free card" is also very crucial here--not for me, but for my opponents. I have yet to make my hand; I am waiting for a "free card" on either the turn or the river. Those who have me beat at the flop should make me pay to see those cards (by betting heavily before the turn). In case they don't, I will check quietly; but in case they do, I may just go all-in.
(A case of damned if they don't, and damned if they do, why else would I elect this hand to be my favourite poker moment.)

Why so?

Considering all possibilities when you go all-in, there are 3 and only 3 scenarios:
1) you win the pot outright: everyone else folds (the probability here varies; say each person has a 10% chance of calling)
P(0 people call) = 0.9^6 = 0.53 (assuming 6 opponents in the game)
P(at least 1 person calls) = 0.47.

We can assume the probability of at least one person calling your all-in to be 50%.
Conversely, the probability of winning the hand outright is also 50% (everybody folds to your all-in)

2) you get called, and win. (0.5*35%=0.175)
3) you get called, and lose. (0.5*65%=0.325)

Calculate expectation.
Let PotMoney = P.
Let your current stack = x.
Let n = total number of opponents who call your all-in
We want the expectation E[all-in] to be as high as possible, and the 2 variables helping us are P and n.

Let's calculate how high P should be for a favourable all-in.
For expectation > 0 (for n=1), counting only the caller's matched stack x in the called scenarios (a simplification):
E[all-in] = 0.5*P + 0.175*x - 0.325*x = 0.5*P - 0.15*x > 0, i.e. P > 0.3*x

This means that as long as P is at least 30% of your current stack, it is not too foolish to go all-in (note my cautious choice of words). When n > 1, P becomes immaterial: simply put, the expectation just gets higher as more people call your all-in. And the best part is, you are taking on the same risk for a higher expectation.

Mostly, the people who are going to call you are the pairs and the trips. Best if they are holding a flush draw like you, but with an inferior high card. The poker people call these suckers drawing dead. (Yes, you keep waiting, keep drawing, and thank god when you finally make your flush--but you still get beat!)
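The break-even arithmetic can be sketched in Python, using the post's numbers (a 50% chance everyone folds, 35% flush odds when called) and the simplification that only the caller's matched stack x counts in the called cases:

```python
def allin_ev(pot: float, stack: float) -> float:
    """Expected value of shoving: win the pot outright half the time,
    otherwise get called and win or lose the matched stack."""
    p_fold = 0.5          # everyone folds
    p_win_called = 0.35   # the flush comes in
    return p_fold * pot + (1 - p_fold) * (p_win_called - (1 - p_win_called)) * stack

# EV > 0 exactly when pot > 0.3 * stack:
print(allin_ev(pot=30, stack=100))   # ~0 -- the break-even point
print(allin_ev(pot=50, stack=100))   # positive expectation
```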

Factors to consider before making the quantum leap:
1) The number of people playing
2) The pot
3) How much money I have
4) Chances of anyone making a full house
5) I had better be holding the Ace or King of that suit, giving me the nut flush.
6) Stage of the game (opening? endgame?)

I favour amassing chips in the beginning stages of the game. With a large stack, I can proceed to bully the table for the rest of the game, which is something I highly recommend doing at least once, just for the experience. Pointing at my short-stacked opponents, I can either:
1) laugh at them
2) ask, "How much have you?", and throw in a bet equal to what they have, forcing them to go all-in on every hand they play. And then repeat Step 1.

The gist is: for an all-in, your upside is unlimited (you never know how many "by-catch" are caught on the trailing hook), but your downside is limited to your all-in. When the gambling blood in me goes on full boil, calling all-in is something I relish!

Things I learnt from William Feller

1. The trial of 1 coin thrown 1000 times is different from the trial of 1000 coins thrown in one instant. That is, given the results (statistical characteristics) of both experiments, one can deduce whether it's the former or the latter.

Reason:
The laws governing a prolonged series of individual observations (a random walk) are entirely different from the laws derived for a whole population (the law of large numbers).
The time average does not have bounded expectation and hence DOES NOT obey the Law of Large Numbers (LLN); instead it follows its own set of "arc sine" laws for waiting times. The ensemble average has bounded expectation and hence obeys the LLN.

It sounds absolutely preposterous. I must admit the math on LLN and CLT (Central Limit Theorem) get a bit too dense for me here, but please read Feller thoroughly before dismissing the idea. Perhaps ensemble average lacks the dimension of time--that's what makes it different.
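A simulation sketch of that claim (the walk length, number of walks, and thresholds are my own choices): for a fair +/-1 random walk, the fraction of time spent on the positive side does not cluster around 1/2 but piles up near 0 and 1, as the arc sine law says.

```python
import random

# For each of many fair random walks, measure the fraction of its lifetime
# spent strictly above zero. The arc sine law predicts the extremes (mostly
# positive or mostly negative walks) far outnumber the "fair-looking" middle.
random.seed(0)
WALKS, STEPS = 1000, 500

def positive_fraction(steps: int) -> float:
    """Fraction of steps a +/-1 random walk spends strictly above zero."""
    s, pos = 0, 0
    for _ in range(steps):
        s += random.choice((1, -1))
        if s > 0:
            pos += 1
    return pos / steps

fracs = [positive_fraction(STEPS) for _ in range(WALKS)]
extremes = sum(1 for f in fracs if f < 0.1 or f > 0.9)
middle = sum(1 for f in fracs if 0.45 <= f <= 0.55)
print(extremes, middle)  # the extremes far outnumber the middle
```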

2. Given a fixed average probability (which is reasonable if we apply the Law of Large Numbers), increased uniformity across independent Bernoulli trials increases the variance. For example, given n machines of average quality p, the output will be the least uniform if all machines are exactly equal.

Given n independent Bernoulli trials with variable probabilities p1, ..., pn and fixed average p̄ = (p1 + ... + pn)/n, the variance of the number of successes is

Var = Σ pk(1 - pk) = n·p̄ - Σ pk²

and for a fixed p̄ this is largest precisely when all the pk are equal (Σ pk² is then at its smallest).

To summarise: the wildly fluctuating set {0.1, 0.4, 0.7} is good. The more uniform set {0.4, 0.4, 0.4} is no good.
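Checking the two sets with a few lines of Python (both share the same average p = 0.4):

```python
# Variance of the number of successes in independent Bernoulli trials
# is the sum of p*(1-p) over the individual trial probabilities.

def bernoulli_sum_variance(probs):
    return sum(p * (1 - p) for p in probs)

fluctuating = [0.1, 0.4, 0.7]   # wildly fluctuating, mean 0.4
uniform = [0.4, 0.4, 0.4]       # perfectly uniform, mean 0.4

print(bernoulli_sum_variance(fluctuating))  # ~0.54 -- the smaller variance
print(bernoulli_sum_variance(uniform))      # ~0.72 -- the uniform set fluctuates most
```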

Another shocking revelation. I hope the Six Sigma folks are aware of this fact. I would like to venture forth an analogy from thermodynamics: Entropy of a system is at its maximum when the system is isothermally uniform. Similarly, variance of a system is at its maximum when every point within the system is uniform. Makes one think about our intuitive understanding of the word "variance".

What is Entropy? by Erwin Schrödinger

3. All it takes is 23 people to make it more likely than not that at least 2 of them share the same birthday.
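The standard calculation, sketched via the complement (assuming 365 equally likely birthdays):

```python
# Probability that at least two of n people share a birthday, computed as
# 1 minus the probability that all n birthdays are distinct.

def shared_birthday_prob(n: int) -> float:
    p_all_distinct = 1.0
    for k in range(n):
        p_all_distinct *= (365 - k) / 365
    return 1 - p_all_distinct

print(shared_birthday_prob(22))  # just under 1/2
print(shared_birthday_prob(23))  # just over 1/2 (~0.507)
```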

I suspect people who share the same birthdays have an instant affinity to each other because of the perceived rarity of such events. But I for sure won't flinch again if another May girl comes along.

4. The German bombs that fell on London were found to be perfectly random and homogeneous, despite apparent evidence of some areas being more heavily bombed than others.

Reason: to the untrained eye, randomness appears as regularity or a tendency to cluster.

Anyway, that's what chi-square tests are for--to determine whether the pattern we are seeing is a genuine anomaly or a good fit to the Poisson or normal distributions, which is what pure randomness looks like.
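The fit can be checked by hand. Below is a sketch using the flying-bomb figures as Feller reproduces them (Clarke's 1946 data: 537 hits on 576 small squares of south London; I am quoting the table from memory, so treat the exact counts as an assumption):

```python
import math

# observed[k] = number of squares receiving exactly k hits;
# the last cell lumps together "5 or more".
observed = {0: 229, 1: 211, 2: 93, 3: 35, 4: 7, 5: 1}
squares, hits = 576, 537
lam = hits / squares  # average hits per square

# Expected counts under a Poisson model with the same mean.
expected = {k: squares * math.exp(-lam) * lam**k / math.factorial(k) for k in range(5)}
expected[5] = squares - sum(expected.values())  # Poisson tail

chi2 = sum((observed[k] - expected[k]) ** 2 / expected[k] for k in observed)
print(round(chi2, 2))  # comfortably below the 5% critical value of 9.49 (4 df)
```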

5. Given a sample of captured German planes and their serial numbers, statisticians guesstimated total enemy plane production in World War 2.

Assumption: the serial numbers were assigned sequentially.
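A sketch of the idea with hypothetical serial numbers (the estimator m(1 + 1/k) - 1 is the textbook "German tank" formula, not anything quoted from Feller):

```python
# If k captured planes show a maximum serial number m, the classic
# minimum-variance unbiased estimate of the production total is m(1 + 1/k) - 1.

def estimate_total(serials):
    k, m = len(serials), max(serials)
    return m * (1 + 1 / k) - 1

captured = [47, 112, 219, 81, 201]  # hypothetical serial numbers
print(estimate_total(captured))     # ~262 planes produced in total
```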

6. Even if a game is fair, where the expected winnings per trial equal the entrance fee per trial, there is nothing in the law of large numbers to prevent you from losing money with probability 1 (= a sure loss).
(The LLN only says your loss grows more slowly than n, the number of trials.)

Assumption: we are dealing with random variables with divergent expectations (e.g. waiting times in random walks)

7. We live in a world of no advance knowledge. For example, even the prospect of the sun rising tomorrow is subject to the conditional probability below:

P(sun will rise tomorrow | sun has risen for the past observed 1826213 days)
= (n+1) / (n+2) (Laplace's Law of Succession)
= 1826214 / 1826215
= 0.9999995

Assume that we have no prior knowledge of the motions of the celestial bodies that cause the phenomenon we call "sunrise".

Wiki on Sunrise Problem

Proof. Bear with me on this hypothetical situation. Imagine there are 20 parallel universes, each created by a supremely bored Creator who, for pure amusement, determines the lifespan of each universe he creates by drawing a ball (with replacement) each year from an urn of red and white balls assigned to that universe. A red ball drawn denotes the survival of the sun for that year; a white ball drawn means the sun is extinguished that year. The 1st universe has an urn of 0 red and 19 white balls. The 2nd universe has an urn of 1 red and 18 white balls, and so on. Hence each universe has a different likelihood of being extinguished, with the 1st universe most likely to be dead, and the 20th universe least likely (in fact, it can never die, since its urn contains 19 red balls and 0 white balls).

Say, human beings live on one of the universes, but they have no idea which universe they belong to. They have thus far survived for 10 years (i.e. 10 red balls have been drawn).

Let the total number of balls in each urn be N (here N = 19).
Let the total number of universes be N+1 (here 20).
Let n be the number of years the universe has survived so far (here n = 10).

The probability that 10 red balls are drawn = P(A)
= P(10 reds | Universe 1)·P(Universe 1) + P(10 reds | Universe 2)·P(Universe 2) + ... + P(10 reds | Universe 20)·P(Universe 20)
= [ (0/19)^10 + (1/19)^10 + ... + (19/19)^10 ] / 20
≈ 1/11

Similarly, the probability that the first 11 balls drawn are all red = P(B) ≈ 1/12.
Hence, the probability that the 11th ball drawn is red, given that the first 10 balls drawn were red
= P(B|A) = P(AB) / P(A)
= P(B) / P(A) (as P(AB) = P(B): B, eleven reds, already implies A, ten reds)
≈ (1/12) / (1/11)
= 11/12
= (n+1) / (n+2), where n = number of successful observations
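The urn argument can be checked exactly with fractions (N = 19 balls per urn, n = 10 reds observed, draws with replacement):

```python
from fractions import Fraction

# Exact posterior-predictive probability for the 20-universe model:
# P(next red | n reds so far) = sum_k (k/N)^(n+1) / sum_k (k/N)^n,
# which Laplace's rule approximates by (n+1)/(n+2).
N, n = 19, 10
p_n_reds = sum(Fraction(k, N) ** n for k in range(N + 1)) / (N + 1)
p_n1_reds = sum(Fraction(k, N) ** (n + 1) for k in range(N + 1)) / (N + 1)

exact = p_n1_reds / p_n_reds
laplace = Fraction(n + 1, n + 2)
print(float(exact), float(laplace))  # close, and closer still for larger N
```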

Note the interesting and immensely useful approximation:

[ (0/N)^n + (1/N)^n + ... + (N/N)^n ] / (N+1) ≈ ∫ x^n dx (from 0 to 1) = 1/(n+1)

which works because the Riemann rectangles are approximated (though underestimated) by the continuous curve x^n, much like how the binomial distribution is approximated by the normal curve. The error becomes smaller as N goes to infinity.

Of course Feller warned that such ideas were already discredited, and could very well be labelled pseudo-science. But the mathematical developments of hyperspace have thrown up the distinct possibility of parallel universes co-existing with our own. Maybe ours might just be the one whose sun goes out tomorrow. Anyway, Pink Martini seems to agree:

If tomorrow's sun doesn't shine,
And no creatures stir in the morning time,
If the clouds go still in the sky,
at least I'll have my Clementine.

If tomorrow's moon doesn't show,
And our dreams go lost in the winter snow,
If the flowers wither and die,
at least I'll have my Clementine.

Download clip sung by China Forbes here