\documentclass[10pt]{article}
\usepackage{amsmath,amssymb,amsthm}
\title{Casanova's Martingale}
\author{Michael Kozdron\\
Duke University}
%\date{}
\renewcommand{\P}{\ensuremath{\textup{\textbf{P}}}}
\newcommand{\E}{\ensuremath{\textup{\textbf{E}}}}
\theoremstyle{plain}
\newtheorem{defn}{Definition}
\newtheorem*{claim}{Claim}
\begin{document}
\maketitle
\setlength{\parindent}{0pt}
\begin{abstract}
\noindent
In this short exposition, the notion of a martingale is introduced through the example of the martingale betting strategy as played by Girolamo Casanova. The majority of this paper is from \cite{gs}. For a more detailed introduction to martingales, consult \cite{williams}.
\end{abstract}
\section{Preliminaries}
In this section we recall some of the basic definitions from measure theory and note their probabilistic analogues. Let $\Omega$ be a set and let $\mathcal{F}$ be a $\sigma$-algebra of subsets of $\Omega$. Let $\P : \mathcal{F} \rightarrow [0, 1]$ be a measure on $(\Omega, \mathcal{F})$ such that $\P(\Omega) = 1$.
\begin{defn}
$(\Omega, \mathcal{F}, \P)$ is called a \textbf{probability space}.
\end{defn}
Let $X : \Omega \rightarrow \mathbb{R}$ be an $\mathcal{F}$-measurable function; that is, $X^{-1}(A) \in \mathcal{F}$ for all $A \in \mathcal{B}$, where $\mathcal{B}$ denotes the collection of Borel subsets of $\mathbb{R}$. Such an $X$ is called a \textit{random variable}.
\begin{defn}
If $\int |X| d\P < \infty$ (i.e.\ $X$ is integrable), then define
$$\E(X) = \int X d\P.$$
\end{defn}
An element of $\mathcal{F}$ is called an \textit{event}. Two events $A$ and $B$ are said to be \textit{independent} if $\P(A \cap B) = \P(A)\P(B)$.
\section{A Gambler's Dilemma}
Consider tossing a fair coin and observing whether heads or tails appears. We can construct the following probability space to model this experiment:\\
Let $\Omega = \{H,T\}$.\\
Let ${\mathcal F} = \{\emptyset, \{H\}, \{T\}, \Omega\}$.\\
Define a probability on $\mathcal{F}$ by $\P(\emptyset)=0$, $\P(\{H\}) = \frac{1}{2} = \P(\{T\})$, $\P(\Omega)=1$.\\
Define the random variable $X$ to be the outcome of the toss (identifying $H$ and $T$ with, say, $1$ and $0$), so that $X = x$ where $x \in \{H,T\}$.\\
Now repeatedly toss this coin. Let $\{X_{j}\}$, $j = 1,2,\dots$, be a collection of independent random variables such that $\P(X_{j}=H) = \frac{1}{2} = \P(X_{j}=T)$ for each $j$.
\begin{claim}
Heads is bound to turn up sooner or later.
\end{claim}
\begin{proof}
Consider the event that no heads have been observed in the first $n$ tosses. Then,
$$\P(X_{1}=T, X_{2}=T, \dots , X_{n}=T) = \P(X_{1}=T)\P(X_{2}=T) \dots \P(X_{n}=T) = \frac{1}{2^n}$$
by independence and since $\P(X_{j}=T)=\frac{1}{2}$ for all $j$.\\
Thus, the probability that no heads ever turns up is the probability that every toss shows tails.\\
So,
\begin{align*}
\P(\text{no heads ever}) &= \P(X_{1}=T, X_{2}=T, \dots )\\
&= \prod _{j=1}^{\infty} \P(X_{j}=T)\\
&= \lim_{n \rightarrow \infty} \prod _{j=1}^{n} \P(X_{j}=T)\\
&= \lim_{n \rightarrow \infty} \frac{1}{2^n}\\
&= 0.
\end{align*}
Hence, $\P(\text{heads eventually turns up}) = 1 - \P(\text{no heads ever}) =1$.
\end{proof}
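The claim is easy to check by simulation. The following is a minimal Python sketch (the function name \texttt{first\_head} is illustrative, not from the text): it tosses a fair coin until heads appears and records how many tosses were needed. Empirically, the first head occurs on toss $n$ with frequency close to $\frac{1}{2^n}$, and a head always turns up.

```python
import random

def first_head(rng: random.Random) -> int:
    """Toss a fair coin until heads appears; return the number of tosses used."""
    tosses = 1
    while rng.random() < 0.5:  # tails with probability 1/2
        tosses += 1
    return tosses

rng = random.Random(0)
trials = 100_000
counts = [first_head(rng) for _ in range(trials)]

# The first head should occur on toss n with frequency close to 1/2^n.
freq = {n: sum(1 for c in counts if c == n) / trials for n in (1, 2, 3)}
print(freq)
```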
Suppose a gambler wagers \$1 on an ``evens'' bet. That is, the gambler plays a game in which $\P(\text{win}) = \frac{1}{2} = \P(\text{loss})$ and which pays at 1:1 odds.\\
e.g.\ If you wager \$1 and win, then you receive your original \$1 plus \$1.\\
The gambler adopts the strategy of doubling his stake after every loss: if he loses the first play, he wagers \$2 on the second play, and so on until he wins. In general, if he loses on the $n^{th}$ play, then he wagers $\$2^{n}$ on the $(n+1)^{st}$ play.\\
Each wager is calculated so that his inevitable win will cover his lost stakes and profit him \$1. For if he wins on the $(n+1)^{st}$ play, then he will have lost $\$(1 + 2 + \dots + 2^{n-1}) = \$(2^{n} - 1)$, and his net gain will be
$$\$2^{n} - \$(1 + 2 + \dots + 2^{n-1}) = \$1.$$
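This bookkeeping can be verified with a short simulation (a Python sketch; the names here are mine, not the paper's): every completed run of the doubling strategy nets exactly \$1, however long the losing streak, while the stakes along the way grow geometrically.

```python
import random

def play_martingale(rng: random.Random) -> tuple[int, int]:
    """Double the stake after every loss until the first win.
    Returns (net gain, stake on the winning play)."""
    stake, lost = 1, 0
    while rng.random() < 0.5:   # this play is a loss
        lost += stake           # lost stakes: 1 + 2 + ... + 2^(n-1)
        stake *= 2              # wager 2^n on the next play
    return stake - lost, stake  # the winning play pays the current stake at 1:1

rng = random.Random(1)
runs = [play_martingale(rng) for _ in range(10_000)]
print(all(gain == 1 for gain, _ in runs))  # True: every run nets exactly $1
```

Note that the guaranteed \$1 is deterministic (it is the geometric sum $2^{n} - (2^{n}-1)$), while the stake required to achieve it is unbounded.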
Suppose the gambler wins for the first time on the $N^{th}$ play. Then $N$ is a random variable with $\P(N=n) = \frac{1}{2^n}$, since winning first on the $n^{th}$ play requires $n-1$ losses followed by a win.\\
By the claim, we have $\P(N < \infty) = 1$. In other words, the gambler eventually wins with probability one.\\
However, by this time he will have lost an amount $\$L$, with mean
$$\E(L) = \sum_{n=1}^{\infty} \left(\frac{1}{2}\right)^{n} (1 + 2 + \dots + 2^{n-2}) = \infty.$$
And so the moral is that the gambler must be prepared to lose a lot of money, but so too must the casino. Thus, do not follow this strategy unless your initial capital is considerably greater than the casino's!\\
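The divergence of $\E(L)$ can also be seen numerically. The sketch below (plain Python, no simulation; the function name is illustrative) computes the partial sums of the series: since $1 + 2 + \dots + 2^{n-2} = 2^{n-1} - 1$, the $n^{th}$ term is $\frac{1}{2} - \frac{1}{2^{n}}$, so the partial sums grow roughly like $N/2$.

```python
def partial_mean_loss(N: int) -> float:
    """Partial sum of E(L): P(win first on play n) times the stakes lost beforehand."""
    return sum((0.5 ** n) * (2 ** (n - 1) - 1) for n in range(1, N + 1))

for N in (10, 100, 1000):
    print(N, partial_mean_loss(N))
# Each term tends to 1/2, so the partial sums grow without bound.
```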
\section{The Martingale}
Although casinos now outlaw such strategies, they have not always forbidden them. In his memoirs, Casanova recalls his stay in Venice in 1754:
\begin{quote}
\small Playing the martingale, continually doubling my stake, I won every day during the carnival. I was fortunate enough never to lose the sixth card, and if I had lost it, I should have been without money to play, for I had 2000 gold coins on that card. I congratulated myself on having increased the fortune of my dear mistress.
\end{quote}
Unfortunately, for dear Casanova:
\begin{quote}
\small Some days later, I still played the martingale but with such bad luck that I was soon left without a coin. As I shared my property with my mistress, I was obliged to tell her of my losses, and at her request sold all her diamonds, losing what I got for them; she now had only 500 gold coins. There was no more talk of her escaping from the convent, for we had nothing to live on. \cite{gs}
\end{quote}
As noted by Casanova, the gambling strategy outlined in the preceding section is called a \textit{martingale betting strategy}.\\
Now if we write $Y_{n}$ for the accumulated gain of the gambler after the $n^{th}$ play with losses counting negative, then
\begin{itemize}
\item $Y_{0}=0$,
\item $|Y_{n}| \le 1 + 2 + \dots + 2^{n-1} = 2^{n} - 1$.
\end{itemize}
Furthermore, $Y_{n+1}=Y_{n}$ if the gambler has stopped by time $n+1$, while if he is still playing,
\begin{equation*}
Y_{n+1}=
\begin{cases}
Y_{n} - 2^{n} &\text{with probability } \frac{1}{2},\\
Y_{n} + 2^{n} &\text{with probability } \frac{1}{2}.
\end{cases}
\end{equation*}
This implies,
\begin{align*}
\E(Y_{n+1} | Y_{1}, Y_{2}, \dots, Y_{n}) &= \frac{1}{2} (Y_{n} - 2^{n}) + \frac{1}{2} (Y_{n} + 2^{n})\\
&= Y_{n},
\end{align*}
and trivially $\E(Y_{n+1} | Y_{1}, \dots, Y_{n}) = Y_{n}$ if the gambler has already stopped.
In this fair game, conditional upon the past information, the gambler will expect no change in his present capital on average. This is an example of what is called a \textit{martingale}.\\
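This computation can be checked empirically. The Python sketch below (names are illustrative; the stopping behaviour follows the text) simulates many paths of $\{Y_n\}$, frozen after the first win. Since the conditional expectation of $Y_{n+1}$ given the past is $Y_n$, the unconditional mean is constant: $\E(Y_n) = Y_0 = 0$ for every $n$.

```python
import random

def gain_path(rng: random.Random, steps: int) -> list[int]:
    """Accumulated gain Y_0, Y_1, ..., Y_steps, frozen after the first win."""
    Y, stake, stopped = 0, 1, False
    path = [0]
    for _ in range(steps):
        if not stopped:
            if rng.random() < 0.5:   # win: collect the stake and stop
                Y += stake
                stopped = True
            else:                    # lose: pay the stake, then double it
                Y -= stake
                stake *= 2
        path.append(Y)
    return path

rng = random.Random(2)
paths = [gain_path(rng, 8) for _ in range(200_000)]
# The sample mean of Y_n should be near 0 for every n (exact for n = 0).
means = [sum(p[n] for p in paths) / len(paths) for n in range(9)]
print([round(m, 3) for m in means])
```

The sample means stay near zero even though most paths sit at $+1$: the rare, deeply negative paths (long losing streaks) exactly balance them, which is the martingale property in action.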
\begin{defn}
A sequence $Y=\{Y_{n}\}$, $n = 0,1,2,\dots$ of random variables is a \textbf{martingale} with respect to the sequence $X=\{X_{n}\}$, $n = 0,1,2,\dots$ if for all $n \ge 0$,
\begin{itemize}
\item $\E(|Y_{n}|) < \infty$,
\item $\E(Y_{n+1} | X_{0}, X_{1}, \dots, X_{n}) = Y_{n}$.
\end{itemize}
\end{defn}
In the gambling example above, $\{Y_{n}\}$ is a martingale with respect to itself.
\nocite{gs}
\nocite{williams}
\nocite{royden}
\bibliographystyle{plain}
\bibliography{martingales}
\end{document}