Somehow I am sure that gambling is one of the most primal traits that distinguish human beings from other, more innocent creatures, even though we are not very good at it. Consider the following scenario:
A coin is tossed 10 times before you enter the game, and all 10 tosses result in heads. What is your ‘reasonable’ bet on the 11th toss?
A typical gambler might say, ‘it cannot keep coming up heads forever, let’s bet on a tail,’ while most students of statistics will respond that the tosses are independent, and therefore one should be indifferent between heads and tails.
But my friend, ‘for unto every one that hath shall be given, and he shall have abundance: but from him that hath not shall be taken away even that which he hath’ (Matthew 25:29). We don’t really know whether the coin is fair, do we?
Given the 10 previous heads, one can hardly insist that the coin is fair. But to what extent can we be sure of its bias? To simplify the situation, we will assume that the prior probability that the coin in question has two heads is $p$.
Denote the following events:

$$A = \{\text{the coin is two-headed}\}, \qquad B = \{\text{all 10 tosses show heads}\}.$$
We have

$$P(B \mid A) = 1, \qquad P(B \mid A^c) = \left(\tfrac{1}{2}\right)^{10} = \tfrac{1}{1024}.$$
Bayes' theorem updates the belief that the coin is biased as

$$P(A \mid B) = \frac{P(B \mid A)\,p}{P(B \mid A)\,p + P(B \mid A^c)(1-p)} = \frac{1024\,p}{1023\,p + 1}.$$
If the initial belief $p$ is greater than 9%, by now we are more than 99% sure that the coin is biased. The probability that the 11th toss shows a head is

$$P(H_{11} \mid B) = 1 \cdot P(A \mid B) + \tfrac{1}{2}\bigl(1 - P(A \mid B)\bigr) = \tfrac{1}{2}\bigl(1 + P(A \mid B)\bigr).$$
For $p = 1\%$, this probability is about 95.6%.
What is more important is that this probability is always greater than one half, no matter how small the initial belief $p$ is (as long as $p > 0$). Thus we have the strategy: always bet according to the history; if there are more heads in the history, bet on heads, and vice versa.
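To make the numbers concrete, here is a minimal Python sketch (my own, not from the original post; the function names are invented for illustration) that reproduces the update above:

```python
def posterior_two_headed(prior_p, n_heads=10):
    """P(coin is two-headed | n_heads consecutive heads observed)."""
    likelihood_biased = 1.0           # a two-headed coin always shows heads
    likelihood_fair = 0.5 ** n_heads  # a fair coin shows n heads with prob 2^-n
    return prior_p * likelihood_biased / (
        prior_p * likelihood_biased + (1 - prior_p) * likelihood_fair
    )

def prob_next_head(prior_p, n_heads=10):
    """P(next toss is a head | n_heads consecutive heads observed)."""
    post = posterior_two_headed(prior_p, n_heads)
    return post * 1.0 + (1 - post) * 0.5

print(posterior_two_headed(0.09))  # about 0.99  -> the "99% sure" claim
print(prob_next_head(0.01))        # about 0.956 -> the "95.6%" claim
```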
Now let's generalise. Assume $k$ heads are observed in the $n$ previous tosses, and that the initial belief that the coin is biased towards heads, showing a head with probability $q$, is $p$.
Similarly, denote

$$A = \{\text{the coin is biased, with } P(\text{head}) = q\}, \qquad B = \{k \text{ heads in } n \text{ tosses}\}.$$
In general,

$$P(B \mid A) = \binom{n}{k} q^k (1-q)^{n-k}, \qquad P(B \mid A^c) = \binom{n}{k} \left(\tfrac{1}{2}\right)^n.$$
The updated probability of $A$ would be

$$P(A \mid B) = \frac{p\, q^k (1-q)^{n-k}}{p\, q^k (1-q)^{n-k} + (1-p)\, 2^{-n}} = \frac{p\,x}{p\,x + 1 - p},$$
where $x = 2^n q^k (1-q)^{n-k}$ (the binomial coefficients cancel). Thus the probability that the next toss shows a head is

$$P(H \mid B) = q\, P(A \mid B) + \tfrac{1}{2}\bigl(1 - P(A \mid B)\bigr) = \tfrac{1}{2} + \left(q - \tfrac{1}{2}\right)\frac{p\,x}{p\,x + 1 - p} > \tfrac{1}{2}$$

whenever $q > \tfrac{1}{2}$ and $p > 0$.
This proves our intuition; by symmetry, if the history contains more tails, the same argument with a tail-biased coin tells us to bet on tails.
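Again, a small Python sketch of the generalised update (my own illustration; the parameter names are invented), showing that the predictive probability stays above one half for any head-bias $q > 1/2$ and any prior $p > 0$:

```python
def posterior_biased(prior_p, q, n, k):
    """P(coin is biased towards heads with P(head) = q | k heads in n tosses).
    The binomial coefficients cancel, leaving the ratio used in the text."""
    x = 2 ** n * q ** k * (1 - q) ** (n - k)
    return prior_p * x / (prior_p * x + 1 - prior_p)

def prob_next_head_general(prior_p, q, n, k):
    """P(next toss is a head | k heads in n tosses)."""
    post = posterior_biased(prior_p, q, n, k)
    return q * post + 0.5 * (1 - post)

# Even a tiny prior belief in a mildly head-biased coin pushes the bet above 1/2:
print(prob_next_head_general(prior_p=0.01, q=0.9, n=10, k=10))
```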
An objection that is often pressed is that ‘given that the tosses are independent, 10 consecutive heads are of course possible,’ however minute the possibility is.
But just how unlikely is it?
What is the probability of a run of $r$ heads in $n$ tosses, if the probability of a head is $p$?
If we denote by $S_n$ the probability that at least one $r$-run of heads occurs within $n$ tosses, then by definition $S_n = \sum_{k=r}^{n} s_k$, where $s_k$ is the probability that the first $r$-run is completed at the $k$th toss. Obviously, $s_r = p^r$.
A recursive relationship can be found for $s_n$: for $n > r$,

$$s_n = \sum_{j=0}^{r-1} p^j q\, s_{n-j-1}, \tag{1}$$

where $q = 1 - p$. This relationship basically states that, for the $r$-run to be achieved for the first time at toss $n$, the opening run of heads must have length $j < r$; it is then broken by a tail at toss $j+1$, after which the game starts afresh.
Equation (1) is a difference equation, solvable by a generating function. Define $G(t) = \sum_{k=0}^{\infty} s_k t^k$ and notice that $s_k = 0$ for $k < r$. Multiplying (1) by $t^n$ and summing over $n > r$ gives

$$G(t) - p^r t^r = q\,t \sum_{j=0}^{r-1} (pt)^j\, G(t).$$
The sum on the right is a geometric progression, so

$$G(t) = \frac{p^r t^r (1 - pt)}{1 - t + q\, p^r t^{r+1}}.$$
Since $S_n = \sum_{k=1}^{n} s_k$, the generating function of $S_n$ is

$$H(t) = \sum_{n=0}^{\infty} S_n t^n = \frac{G(t)}{1 - t} = \frac{p^r t^r (1 - pt)}{(1 - t)\bigl(1 - t + q\, p^r t^{r+1}\bigr)}.$$
Finding the polynomial expansion of $H(t)$ is admittedly tedious, so I quote the result from Simpson¹.
While it is true that $\lim_{n\to\infty} S_n = 1$ for $p = \tfrac{1}{2}$, the fact is that it takes roughly 1,500 tosses before a 10-run of heads appears with 50% probability. Surely it is not healthy to toss a coin that many times.
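As a rough sanity check on that figure, here is a short Python sketch (my own; the function names are invented) that computes $s_n$ directly from recursion (1) and accumulates $S_n$ for a fair coin and $r = 10$:

```python
def first_run_probs(r, p, n_max):
    """s[n] = P(first r-run of heads is completed at toss n), via recursion (1)."""
    q = 1 - p
    s = [0.0] * (n_max + 1)
    s[r] = p ** r
    for n in range(r + 1, n_max + 1):
        # opening head-run of length j < r, then a tail, then the game restarts
        s[n] = sum(p ** j * q * s[n - j - 1] for j in range(r))
    return s

def run_within(r, p, n_max):
    """S[n] = P(at least one r-run of heads within the first n tosses)."""
    total, S = 0.0, []
    for prob in first_run_probs(r, p, n_max):
        total += prob
        S.append(total)
    return S

S = run_within(r=10, p=0.5, n_max=2000)
# smallest number of tosses giving at least a 50% chance of a 10-run of heads
print(next(n for n, prob in enumerate(S) if prob >= 0.5))
```

The threshold it reports should land in the same ballpark as the figure quoted above.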
It might be counter-intuitive to many, but in a supposedly fair gamble, riding the tide seems to be a reasonable strategy.