A Probabilistic Look at the Geometric Series

So talking about the St. Petersburg paradox has reminded me of my undergraduate years, including a certain quick (but kind of handwavy) proof of the formula for the geometric series shown to me by Dr. Vitaly Bergelson.

One takes a coin which has probability $p$ of landing heads and probability $q=1-p$ of landing tails (with $0 < p < 1$), and plays a familiar game: flip the coin until it lands on tails. The possibilities are exhausted as follows: tails first; heads then tails; two heads then tails; three heads then tails; etc.; and, finally, heads forever. The last option has probability zero (not to be confused with impossible): for every $n$, its probability is at most the probability of getting $n$ heads in the first $n$ tosses, which is $p^n$, and $p^n \to 0$ as $n\to\infty$; since the probability of always getting heads is below all of these, it must be 0.

So exactly one of these mutually exclusive options must occur, and their probabilities must therefore add up to 1. The probability of $n$ heads followed by a tails is $p^n\cdot q$. So we have that $\sum_{n=0}^{\infty} p^nq=1$, or, dividing by $q$: $\sum_{n=0}^{\infty} p^n= 1/q=1/(1-p)$.
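As a sanity check, here is a short Python sketch (the choice $p = 0.6$ and the trial count are arbitrary) that simulates the game, compares the empirical frequencies against $p^n q$, and checks the partial sums of the series:

```python
import random

def heads_before_first_tails(p, rng):
    """Play the game once: flip until tails, return the number of heads seen."""
    n = 0
    while rng.random() < p:  # heads with probability p
        n += 1
    return n

p = 0.6
q = 1 - p
rng = random.Random(0)
trials = 100_000
counts = [heads_before_first_tails(p, rng) for _ in range(trials)]

# Empirical P(n heads, then tails) should be close to p^n * q.
for n in range(4):
    est = counts.count(n) / trials
    print(f"n={n}: empirical {est:.4f} vs exact {p**n * q:.4f}")

# Partial sums: sum p^n q converges to 1, and sum p^n to 1/(1-p).
partial = sum(p**n * q for n in range(200))
geom = sum(p**n for n in range(200))
print(partial, geom, 1 / (1 - p))
```

The two printed partial sums agree with $1$ and $1/(1-p)$ to many decimal places, which is exactly the identity the coin game produces.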

So the geometric series comes out of the wash from playing a coin game. The question Dr. Bergelson asked next was what happens when, instead of stopping at 1 tails, you stop at 2? What new formula do you get? I admit that I never got the answer to that question despite trying for some time, though I haven't thought about it since my undergraduate days.

Dr. Bergelson’s question is answered by noting that the probability of the second tails arriving on the $(n+1)$st throw is $n p^{n-1} q^2$: two of the throws are tails, the rest are heads, and there are $n$ places where the first tails can go. These probabilities must again sum to 1, so $\sum_{n=1}^\infty n p^{n-1} q^2 = 1$; dividing by $q^2$ yields the expansion $\sum_{n = 1}^\infty n p^{n-1} = \frac{1}{(1-p)^2}$, which, interestingly, you can also derive by formally differentiating the geometric series formula with respect to $p$.
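The same kind of numerical check works for the two-tails game; a short sketch (again with an arbitrary choice of $p$) verifying that the probabilities $n p^{n-1} q^2$ sum to 1 and that the derivative identity holds:

```python
p = 0.3
q = 1 - p

# Two-tails game: the second tails lands on throw n+1 with
# probability n * p^(n-1) * q^2; these should sum to 1.
total_prob = sum(n * p**(n - 1) * q**2 for n in range(1, 500))

# Dividing by q^2 gives the derivative-of-the-geometric-series identity.
deriv_sum = sum(n * p**(n - 1) for n in range(1, 500))
print(total_prob, deriv_sum, 1 / (1 - p)**2)
```

With $p = 0.3$ the partial sums converge quickly, and `deriv_sum` matches $1/(1-p)^2 \approx 2.0408$.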