# 2.8 Miscellaneous methods

## 2.8.1 Martingale methods

Modern probabilists regard the martingale optional stopping theorem as one of the most important results in their subject. As propaganda for martingales we give below four quick applications of that theorem, and a few more will appear later. All of these results could be proved in alternative, elementary ways. For the reader unfamiliar with martingales, Durrett [133] Chapter 4 contains much more than you need to know; Karlin and Taylor [208] Chapter 6 is a gentler introduction.

###### Lemma 2.27

Given a non-empty subset $A\subset I$ and a function $f(i)$ defined for $i\in A$, there exists a unique extension of $f$ to all of $I$ satisfying

 $f(i)=\sum_{j}p_{ij}f(j),\ i\not\in A.$

Proof. If $f$ satisfies the equations above then, for any initial distribution, the process $M_{t}:=f(X_{\min(t,T_{A})})$ is a martingale. Since the chain is finite and $T_{A}<\infty$ a.s., the optional stopping theorem gives

 $f(i)=E_{i}f(X_{T_{A}})\mbox{ for all }i.$ (2.28)

This establishes uniqueness. Conversely, if we define $f$ by (2.28) then the desired equations hold by conditioning on the first step.
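
To make (2.28) concrete, here is a small numerical sketch (not from the text: the 4-state transition matrix, the set $A$ and the boundary values below are arbitrary illustrative choices, and NumPy is assumed available). Writing $Q$ and $R$ for the blocks of $P$ indexed by $A^{c}\times A^{c}$ and $A^{c}\times A$, the equations of the lemma become $f_{A^{c}}=(I-Q)^{-1}Rf_{A}$, and $(I-Q)^{-1}R$ is exactly the matrix of hitting distributions $P_{i}(X_{T_{A}}=a)$.

```python
import numpy as np

# Arbitrary illustrative 4-state chain (rows sum to 1).
P = np.array([[0.1, 0.4, 0.3, 0.2],
              [0.3, 0.1, 0.4, 0.2],
              [0.2, 0.3, 0.1, 0.4],
              [0.4, 0.2, 0.3, 0.1]])

A = [0, 3]                 # boundary set A (illustrative)
fA = {0: 1.0, 3: 5.0}      # prescribed values of f on A (illustrative)
Ac = [i for i in range(4) if i not in A]

# Solve f(i) = sum_j p_ij f(j) for i not in A:
# f_Ac = Q f_Ac + R f_A,  so  f_Ac = (I - Q)^{-1} R f_A.
Q = P[np.ix_(Ac, Ac)]
R = P[np.ix_(Ac, A)]
fA_vec = np.array([fA[a] for a in A])
f_Ac = np.linalg.solve(np.eye(len(Ac)) - Q, R @ fA_vec)

# Assemble the full extension and check the defining equations.
f = np.empty(4)
for a in A:
    f[a] = fA[a]
f[Ac] = f_Ac
for i in Ac:
    assert abs(f[i] - P[i] @ f) < 1e-9

# (I - Q)^{-1} R is the matrix of hitting distributions P_i(X_{T_A} = a),
# so f(i) = (B f_A)(i) = E_i f(X_{T_A}), which is equation (2.28).
B = np.linalg.solve(np.eye(len(Ac)) - Q, R)
assert np.allclose(B.sum(axis=1), 1.0)   # each row is a probability vector
```

The same solve produces the unique extension for any boundary data $f_{A}$, since only the right-hand side $Rf_{A}$ changes.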

###### Corollary 2.28

If $h$ is harmonic, i.e. if

 $h(i)=\sum_{j}p_{ij}h(j)\mbox{ for all }i$

then $h$ is constant.

Proof. Clearly a constant function is harmonic. So the Corollary follows from the uniqueness assertion of Lemma 2.27, taking $A$ to be some singleton.

###### Lemma 2.29 (The random target lemma)

The sum $\sum_{j}E_{i}T_{j}\pi_{j}$ does not depend on $i$.

Proof. This repeats Corollary 2.14 with a different argument. The first-step recurrence for $g_{j}(i):=E_{i}T_{j}$ is

 $g_{j}(i)=1_{(i\neq j)}+1_{(i\neq j)}\sum_{k}p_{ik}g_{j}(k).$

By Corollary 2.28 it is enough to show that $h(i):=\sum_{j}\pi_{j}g_{j}(i)$ is a harmonic function. We calculate

 $\displaystyle h(i)$ $\displaystyle=$ $\displaystyle 1-\pi_{i}+\sum_{j,k}\pi_{j}p_{ik}g_{j}(k)1_{(i\neq j)}$ $\displaystyle=$ $\displaystyle 1-\pi_{i}+\sum_{k}p_{ik}(h(k)-\pi_{i}g_{i}(k))\mbox{ by definition of }h(k)$ $\displaystyle=$ $\displaystyle\sum_{k}p_{ik}h(k)+1-\pi_{i}\left(1+\sum_{k}p_{ik}g_{i}(k)\right).$

But $1/\pi_{i}=E_{i}T^{+}_{i}=1+\sum_{k}p_{ik}g_{i}(k)$, so $h$ is indeed harmonic.
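
The random target lemma is easy to verify numerically. In the sketch below (an arbitrary illustrative 4-state chain; NumPy assumed available) we solve for $\pi$ and for each mean hitting time $E_{i}T_{j}$, then check that $h(i)=\sum_{j}\pi_{j}E_{i}T_{j}$ takes the same value at every $i$:

```python
import numpy as np

# Arbitrary illustrative 4-state chain.
P = np.array([[0.1, 0.4, 0.3, 0.2],
              [0.3, 0.1, 0.4, 0.2],
              [0.2, 0.3, 0.1, 0.4],
              [0.4, 0.2, 0.3, 0.1]])
n = P.shape[0]

# Stationary distribution: solve pi P = pi together with sum(pi) = 1.
A_mat = np.vstack([P.T - np.eye(n), np.ones(n)])
b = np.append(np.zeros(n), 1.0)
pi, *_ = np.linalg.lstsq(A_mat, b, rcond=None)

# Mean hitting times g[j, i] = E_i T_j: for each target j solve
# g_j(i) = 1 + sum_{k != j} p_ik g_j(k) for i != j, with g_j(j) = 0.
g = np.zeros((n, n))
for j in range(n):
    others = [i for i in range(n) if i != j]
    Q = P[np.ix_(others, others)]
    g[j, others] = np.linalg.solve(np.eye(n - 1) - Q, np.ones(n - 1))

# Random target lemma: h(i) = sum_j pi_j E_i T_j is constant in i.
h = pi @ g
assert np.allclose(h, h[0])
```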

###### Lemma 2.30

For any stopping time $S$ with $E_{i}S<\infty$ and any states $i,j,k$,

 $E_{i}(\mbox{number of transitions }j\rightarrow k\mbox{ before }S)=p_{jk}\,E_{i}(\mbox{number of visits to }j\mbox{ before }S).$

Proof. Recall that “before” means strictly before. The assertion of the lemma is intuitively obvious, because each time the chain visits $j$ it has chance $p_{jk}$ to make a transition $j\rightarrow k$, and one can formalize this as in the proof of Proposition 2.4. A more sophisticated proof is to observe that $M(t)$ is a martingale, where

 $M(t):=N_{jk}(t)-p_{jk}N_{j}(t)$

where $N_{j}(t)$ is the number of visits to $j$ before time $t$ and $N_{jk}(t)$ is the number of transitions $j\rightarrow k$ before time $t$.

And the assertion of the lemma is just the optional stopping theorem applied to the martingale $M$ and the stopping time $S$.
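
Here is a Monte Carlo sanity check of the identity (a sketch: the chain, the triple $(i,j,k)$ and the stopping time $S=T_{0}$ below are all arbitrary illustrative choices). The ratio of the empirical counts should converge to $p_{jk}$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative 4-state chain; start state i0, the pair (j, k) and the
# stopping time S = T_0 (first hit of state 0) are arbitrary choices.
P = np.array([[0.1, 0.4, 0.3, 0.2],
              [0.3, 0.1, 0.4, 0.2],
              [0.2, 0.3, 0.1, 0.4],
              [0.4, 0.2, 0.3, 0.1]])
i0, j, k = 1, 2, 3

n_paths = 40_000
sum_Njk = 0   # total transitions j -> k strictly before S, over all paths
sum_Nj = 0    # total visits to j strictly before S, over all paths
for _ in range(n_paths):
    x = i0
    while x != 0:                       # run until S = T_0
        nxt = rng.choice(4, p=P[x])
        if x == j:
            sum_Nj += 1
            if nxt == k:
                sum_Njk += 1
        x = nxt

# Lemma 2.30: E_i N_jk(S) = p_jk E_i N_j(S), so the ratio estimates p_jk.
ratio = sum_Njk / sum_Nj
assert abs(ratio - P[j, k]) < 0.02      # here p_jk = 0.4
```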

###### Lemma 2.31

Let $A$ be a non-empty subset of $I$ and let $h:I\rightarrow R$ satisfy

(i) $h(i)\geq 0,\ i\in A$

(ii) $h(i)\geq 1+\sum_{j}p_{ij}h(j),\ i\in A^{c}$.

Then $E_{i}T_{A}\leq h(i),\ i\in I$.

Proof. For arbitrary $h$, define $g$ by

 $h(i)=1+\sum_{j}p_{ij}h(j)+g(i)$

and then define

 $M_{t}=t+h(X_{t})+\sum_{s=0}^{t-1}g(X_{s}).$

Then $M_{\min(t,T_{A})}$ is a martingale, so the optional sampling theorem says

 $E_{i}M_{T_{A}}=E_{i}M_{0}=h(i).$

But the hypotheses on $h$ imply $M_{T_{A}}\geq T_{A}$: for $s<T_{A}$ we have $X_{s}\in A^{c}$ and hence $g(X_{s})\geq 0$ by (ii), while $h(X_{T_{A}})\geq 0$ by (i). So $E_{i}T_{A}\leq E_{i}M_{T_{A}}=h(i)$.
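
A numerical illustration of Lemma 2.31 (a sketch; the chain and the candidate function $h$ below are hand-picked for the example, and NumPy is assumed available): we check hypotheses (i) and (ii) directly, compute $E_{i}T_{A}$ exactly from the usual linear system, and confirm the bound.

```python
import numpy as np

# Arbitrary illustrative 4-state chain; A = {0}.
P = np.array([[0.1, 0.4, 0.3, 0.2],
              [0.3, 0.1, 0.4, 0.2],
              [0.2, 0.3, 0.1, 0.4],
              [0.4, 0.2, 0.3, 0.1]])
A = [0]
Ac = [1, 2, 3]

# A hand-picked candidate h: h >= 0 on A, and
# h(i) >= 1 + sum_j p_ij h(j) on A^c.
h = np.array([0.0, 6.0, 6.0, 6.0])
for i in Ac:
    assert h[i] >= 1.0 + P[i] @ h     # hypothesis (ii)
assert all(h[a] >= 0.0 for a in A)    # hypothesis (i)

# Exact mean hitting times E_i T_A from the hitting-time linear system.
Q = P[np.ix_(Ac, Ac)]
ET = np.zeros(4)
ET[Ac] = np.linalg.solve(np.eye(len(Ac)) - Q, np.ones(len(Ac)))

assert np.all(ET <= h + 1e-9)         # conclusion of Lemma 2.31
```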

## 2.8.2 A comparison argument

A theme running throughout the book is the idea of getting inequalities for a “hard” chain by making a comparison with some “easier” chain for which we can do exact calculations. Here is a simple example.

###### Lemma 2.32

Let $X$ be a discrete-time chain on states $\{0,1,2,\ldots,n\}$ such that $p_{ij}=0$ whenever $j>i$. Write $m(i)=i-E_{i}X_{1}$, and suppose $0<m(1)\leq m(2)\leq\cdots\leq m(n)$. Then $E_{n}T_{0}\leq\sum_{j=1}^{n}\frac{1}{m(j)}$.

Proof. The proof implicitly compares the given chain to the continuous-time chain with $q_{i,i-1}=m(i)$. Write $h(i)=\sum_{j=1}^{i}1/m(j)$, and extend $h$ by linear interpolation to real $0\leq x\leq n$. Then $h$ is concave and for $i\geq 1$

 $\displaystyle E_{i}h(X_{1})$ $\displaystyle\leq$ $\displaystyle h(E_{i}X_{1})\mbox{ by concavity (Jensen's inequality)}$ $\displaystyle=$ $\displaystyle h(i-m(i))$ $\displaystyle\leq$ $\displaystyle h(i)-m(i)h^{\prime}(i)\mbox{ by concavity }$ $\displaystyle=$ $\displaystyle h(i)-1$

where $h^{\prime}$ is the left derivative, so that $h^{\prime}(i)=1/m(i)$. Thus $h$ satisfies the hypotheses of Lemma 2.31 with $A=\{0\}$, and the result follows from that lemma.
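
The comparison bound can be checked numerically. In this sketch (an arbitrary downward-moving chain with positive nondecreasing drift $m$; NumPy assumed available), the exact $E_{i}T_{0}$, obtained from the hitting-time linear system, is compared with $\sum_{j\leq i}1/m(j)$:

```python
import numpy as np

n = 5   # states 0..5; the chain only moves down or stays put
P = np.zeros((n + 1, n + 1))
P[0, 0] = 1.0                      # state 0 is the target
P[1, 0], P[1, 1] = 0.7, 0.3        # m(1) = 0.7
for i in range(2, n + 1):          # m(i) = 0.3*2 + 0.4*1 = 1.0 for i >= 2
    P[i, i - 2], P[i, i - 1], P[i, i] = 0.3, 0.4, 0.3

# Drift m(i) = i - E_i X_1; check 0 < m(1) <= ... <= m(n).
m = np.array([i - P[i] @ np.arange(n + 1) for i in range(n + 1)])
assert m[1] > 0 and np.all(np.diff(m[1:]) >= -1e-12)

# Exact E_i T_0 via the hitting-time linear system.
Q = P[1:, 1:]
ET = np.linalg.solve(np.eye(n) - Q, np.ones(n))   # ET[i-1] = E_i T_0

# Lemma 2.32 bound: E_i T_0 <= sum_{j <= i} 1/m(j).
bound = np.cumsum(1.0 / m[1:])
assert np.all(ET <= bound + 1e-9)
```

For this example the bound is tight at $i=1$ (a single geometric step down) and strict for $i\geq 2$, where the chain can jump down two levels at once.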

## 2.8.3 Wald equations

As mentioned previously, the results above don’t really require martingales. Next we record a genuine martingale result, not directly involving Markov chains but ultimately useful in their analysis. Part (c) is Wald’s equation and part (b) is Wald’s equation for martingales. The result is a standard consequence of the optional sampling theorem: see [133] (3.1.6) for (c) and [133] Theorem 4.7.5 for (a).

###### Lemma 2.33

(a) Let $0=Y_{0}\leq Y_{1}\leq Y_{2}\leq\cdots$ be such that

 $E(Y_{i+1}-Y_{i}|Y_{j},j\leq i)\leq c,\ i\geq 0$

for a constant $c$. Then for any stopping time $T$,

 $EY_{T}\leq cET.$

(b) If in the hypothesis we replace “$\leq c$” by “$=c$”, then $EY_{T}=cET$.

(c) In particular, if $Y_{n}=\sum_{i=1}^{n}\xi_{i}$ for i.i.d. nonnegative $(\xi_{i})$ then $EY_{T}=(E\xi_{1})(ET)$.
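
Wald's equation (c) is easy to test by simulation. In the sketch below (illustrative choices: Exponential increments with mean $2$ and the stopping time $T=\min\{n:Y_{n}>10\}$; NumPy assumed available), both sides of $EY_{T}=(E\xi_{1})(ET)$ are estimated from the same sample paths:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative setup: xi_i ~ Exponential with mean 2 (i.i.d., nonnegative),
# T = first n with Y_n > 10, a stopping time for the xi sequence.
mean_xi, level = 2.0, 10.0

n_paths = 20_000
YT_sum, T_sum = 0.0, 0
for _ in range(n_paths):
    y, t = 0.0, 0
    while y <= level:
        y += rng.exponential(mean_xi)
        t += 1
    YT_sum += y
    T_sum += t

lhs = YT_sum / n_paths            # Monte Carlo estimate of E Y_T
rhs = mean_xi * T_sum / n_paths   # (E xi_1) times the estimate of E T
assert abs(lhs - rhs) < 0.25      # Wald: the two sides should agree
```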