# 11.8 Belongs in other chapters

yyy: add to what’s currently sec. 10.2 of Chapter 2, version 9/10/99, but which may get moved to the new Chapter 8.

When $\pi$ does not vary with the parameter $\alpha$, we get a simple expression for $\frac{d}{d\alpha}{\bf Z}$.

###### Lemma 11.4

In the setting of (yyy Chapter 2 Lemma 37), suppose $\pi$ does not depend on $\alpha$. Then

 ${\textstyle\frac{d}{d\alpha}}{\bf Z}={\bf Z}{\bf R}{\bf Z}.$

xxx JF: I see this from the series expansion for ${\bf Z}$ – what to do about a proof, I delegate to you!
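As a numerical sanity check of the lemma (a sketch, assuming NumPy; the matrices below are illustrative), take $P$ and $Q$ symmetric doubly stochastic, so that $\pi$ is uniform and does not depend on $\alpha$ along ${\bf P}^{\alpha}=(1-\alpha){\bf P}+\alpha{\bf Q}$, and compare a central difference of ${\bf Z}$ against ${\bf Z}{\bf R}{\bf Z}$ with ${\bf R}=\frac{d}{d\alpha}{\bf P}^{\alpha}={\bf Q}-{\bf P}$:

```python
import numpy as np

# Illustrative symmetric doubly stochastic P and Q: pi is uniform for
# every P^alpha = (1 - alpha) P + alpha Q, as Lemma 11.4 requires.
P = np.array([[0.50, 0.25, 0.25],
              [0.25, 0.50, 0.25],
              [0.25, 0.25, 0.50]])
Q = np.array([[0.20, 0.40, 0.40],
              [0.40, 0.20, 0.40],
              [0.40, 0.40, 0.20]])
Pi = np.full((3, 3), 1.0 / 3.0)   # each row is pi

def Z(alpha):
    """Fundamental matrix Z = (I - P^alpha + Pi)^{-1} - Pi."""
    P_alpha = (1.0 - alpha) * P + alpha * Q
    return np.linalg.inv(np.eye(3) - P_alpha + Pi) - Pi

alpha, h = 0.3, 1e-5
R = Q - P                                            # dP^alpha / d(alpha)
numeric = (Z(alpha + h) - Z(alpha - h)) / (2.0 * h)  # central difference
analytic = Z(alpha) @ R @ Z(alpha)                   # the lemma's Z R Z
print(np.max(np.abs(numeric - analytic)))            # should be tiny
```

The printed discrepancy should be at finite-difference error level.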

## 11.8.1 Pointwise ordered transition matrices

yyy: belongs somewhere in Chapter 3.

Recall from Chapter 2 section 3 (yyy 9/10/99 version) that for a function $f:S\to R$ with $\sum_{i}\pi_{i}f_{i}=0$, the asymptotic variance rate is

 $\sigma^{2}({\bf P},f):=\lim_{t}t^{-1}{\rm var}\ \sum_{s=1}^{t}f(X_{s})=f\Gamma f$ (11.19)

where $\Gamma_{ij}=\pi_{i}Z_{ij}+\pi_{j}Z_{ji}+\pi_{i}\pi_{j}-\pi_{i}\delta_{ij}$. These individual-function variance rates can be compared between chains with the same stationary distribution, under a very strong “coordinatewise ordering” of transition matrices.
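The formula (11.19) can be cross-checked against the stationary autocovariance expansion $t^{-1}{\rm var}\sum_{s}f(X_{s})\to\sum_{i}\pi_{i}f_{i}^{2}+2\sum_{k\geq 1}\sum_{i,j}\pi_{i}f_{i}(p^{(k)}_{ij}-\pi_{j})f_{j}$ from which it arises. A sketch (assuming NumPy; the weighted graph and $f$ are illustrative):

```python
import numpy as np

# Random walk on a weighted 3-vertex graph (weights illustrative):
# reversible, with pi_i proportional to the vertex weight w_i.
W = np.array([[0.0, 1.0, 2.0],
              [1.0, 0.0, 3.0],
              [2.0, 3.0, 0.0]])
w = W.sum(axis=1)
P = W / w[:, None]
pi = w / w.sum()
Pi = np.outer(np.ones(3), pi)        # each row is pi

# sigma^2 via the Gamma formula (11.19).
Z = np.linalg.inv(np.eye(3) - P + Pi) - Pi
piZ = pi[:, None] * Z                # entries pi_i Z_ij
Gamma = piZ + piZ.T + np.outer(pi, pi) - np.diag(pi)
f = np.array([1.0, 0.0, -1.0])
f -= pi @ f                          # center: sum_i pi_i f_i = 0
sigma2 = f @ Gamma @ f

# Truncated autocovariance series the formula is the limit of.
series = pi @ (f * f)
Pk = np.eye(3)
for _ in range(500):
    Pk = Pk @ P
    series += 2.0 * (pi * f) @ (Pk - Pi) @ f
print(sigma2, series)
```

The two values agree up to the geometrically small truncation error of the series.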

###### Lemma 11.5 (Peskun’s Lemma [280])

Let ${\bf P}$ and ${\bf Q}$ be reversible with the same stationary distribution $\pi$. Suppose $p_{ij}\leq q_{ij}\ \forall j\neq i$. Then $\sigma^{2}({\bf P},f)\geq\sigma^{2}({\bf Q},f)$ for all $f$ with $\sum_{i}\pi_{i}f_{i}=0$.
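A concrete illustration before the proof (a sketch, assuming NumPy; the matrices and $f$ are illustrative): take ${\bf Q}$ symmetric and ${\bf P}=({\bf I}+{\bf Q})/2$ its lazy version, so $p_{ij}=q_{ij}/2\leq q_{ij}$ for $j\neq i$ and both chains are reversible with uniform $\pi$; the lemma then says the lazier chain has the larger variance rate.

```python
import numpy as np

# Q symmetric (hence reversible with uniform pi); P its lazy version.
Q = np.array([[0.2, 0.4, 0.4],
              [0.4, 0.2, 0.4],
              [0.4, 0.4, 0.2]])
P = (np.eye(3) + Q) / 2.0            # p_ij = q_ij / 2 <= q_ij, j != i
pi = np.full(3, 1.0 / 3.0)
Pi = np.outer(np.ones(3), pi)

def sigma2(T, f):
    """Asymptotic variance rate f Gamma f for transition matrix T."""
    Z = np.linalg.inv(np.eye(3) - T + Pi) - Pi
    piZ = pi[:, None] * Z
    Gamma = piZ + piZ.T + np.outer(pi, pi) - np.diag(pi)
    return f @ Gamma @ f

f = np.array([1.0, -2.0, 1.0])
f -= pi @ f                          # center: sum_i pi_i f_i = 0
print(sigma2(P, f), sigma2(Q, f))    # sigma^2 is larger for the lazy P
```

Here $f$ is an eigenvector of ${\bf Q}$, so both rates can also be read off the spectral formula $\pi(f^{2})(1+\lambda)/(1-\lambda)$.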

Proof. Introduce a parameter $0\leq\alpha\leq 1$ and write ${\bf P}^{\alpha}=(1-\alpha){\bf P}+\alpha{\bf Q}$. Write $(\cdot)^{\prime}$ for $\frac{d}{d\alpha}(\cdot)$ at $\alpha=0$; since ${\bf P}^{\alpha}$ also satisfies the hypotheses of the lemma relative to ${\bf Q}$, the computation below applies verbatim at any $\alpha$, so it is enough to show

 $(\sigma^{2}({\bf P},f))^{\prime}\leq 0.$

By (11.19)

 $(\sigma^{2}({\bf P},f))^{\prime}=f\Gamma^{\prime}f=2\sum_{i}\sum_{j}f_{i}\pi_{i}z^{\prime}_{ij}f_{j}.$

By (yyy Lemma 11.4 above), applied with ${\bf R}={\bf P}^{\prime}={\bf Q}-{\bf P}$, we have ${\bf Z}^{\prime}={\bf Z}{\bf P}^{\prime}{\bf Z}$. By setting

 $g_{i}=\pi_{i}f_{i};\quad a_{ij}=z_{ij}/\pi_{j};\quad w_{ij}=\pi_{i}p_{ij}$

we can rewrite the equality above as

 $(\sigma^{2}({\bf P},f))^{\prime}=2\ g{\bf A}{\bf W}^{\prime}{\bf A}g.$
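This rewriting can be checked numerically (a sketch, assuming NumPy; the two weighted graphs below share vertex weights, hence the same $\pi$, and are illustrative): compare a central difference of $\sigma^{2}({\bf P}^{\alpha},f)$ with $2\,g{\bf A}{\bf W}^{\prime}{\bf A}g$.

```python
import numpy as np

# Two random walks on weighted graphs with the same vertex weights
# (3, 4, 5), hence the same stationary pi, both reversible.
WP = np.array([[0.0, 1.0, 2.0],
               [1.0, 0.0, 3.0],
               [2.0, 3.0, 0.0]])
WQ = np.array([[0.0, 3.0, 0.0],
               [3.0, 0.0, 1.0],
               [0.0, 1.0, 4.0]])    # same row sums as WP
w = WP.sum(axis=1)
P, Q = WP / w[:, None], WQ / w[:, None]
pi = w / w.sum()
Pi = np.outer(np.ones(3), pi)

def sigma2(alpha, f):
    """f Gamma f along P^alpha = (1 - alpha) P + alpha Q."""
    Pa = (1 - alpha) * P + alpha * Q
    Z = np.linalg.inv(np.eye(3) - Pa + Pi) - Pi
    piZ = pi[:, None] * Z
    Gamma = piZ + piZ.T + np.outer(pi, pi) - np.diag(pi)
    return f @ Gamma @ f

f = np.array([1.0, 0.0, -1.0])
f -= pi @ f                                       # center against pi
alpha, h = 0.5, 1e-5

Za = np.linalg.inv(np.eye(3) - ((1 - alpha) * P + alpha * Q) + Pi) - Pi
A = Za / pi[None, :]                              # a_ij = z_ij / pi_j
Wprime = pi[:, None] * (Q - P)                    # w'_ij = pi_i (q_ij - p_ij)
g = pi * f                                        # g_i = pi_i f_i

numeric = (sigma2(alpha + h, f) - sigma2(alpha - h, f)) / (2 * h)
analytic = 2 * g @ A @ Wprime @ A @ g
print(numeric, analytic)
```

The two numbers agree up to finite-difference error, confirming the algebra.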

Since ${\bf A}$ is symmetric with row-sums equal to zero, it is enough to show that ${\bf W}^{\prime}$ is non-positive definite. By reversibility ${\bf W}^{\prime}$ is symmetric, by hypothesis $w^{\prime}_{ij}\geq 0$ for $j\neq i$, and because each row of ${\bf P}^{\alpha}$ sums to $1$ the row-sums of ${\bf W}^{\prime}$ are zero. These properties imply that, ordering states arbitrarily, we may write

 ${\bf W}^{\prime}=\sum\sum_{i<j}w^{\prime}_{ij}{\bf M}^{ij}$

where ${\bf M}^{ij}$ is the matrix whose only non-zero entries are $m(i,i)=m(j,j)=-1;\ m(i,j)=m(j,i)=1$. Plainly ${\bf M}^{ij}$ is non-positive definite, because $x{\bf M}^{ij}x=-(x_{i}-x_{j})^{2}\leq 0$; hence so is ${\bf W}^{\prime}$, completing the proof.
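The decomposition and the definiteness claim can be verified directly (a sketch, assuming NumPy; the symmetric, zero-row-sum matrix standing in for ${\bf W}^{\prime}$ is illustrative): reconstruct the matrix from the ${\bf M}^{ij}$ and check that its eigenvalues are all $\leq 0$.

```python
import numpy as np

# An illustrative stand-in for W': symmetric, zero row sums,
# nonnegative off-diagonal entries.
n = 3
Wp = np.array([[-3.0,  1.0,  2.0],
               [ 1.0, -1.0,  0.0],
               [ 2.0,  0.0, -2.0]])

def M(i, j, n):
    """M^{ij}: m(i,i) = m(j,j) = -1, m(i,j) = m(j,i) = 1."""
    m = np.zeros((n, n))
    m[i, i] = m[j, j] = -1.0
    m[i, j] = m[j, i] = 1.0
    return m

# W' = sum_{i<j} w'_ij M^{ij}, coefficients the off-diagonal entries.
recon = sum(Wp[i, j] * M(i, j, n)
            for i in range(n) for j in range(i + 1, n))
print(np.allclose(recon, Wp))                  # decomposition reproduces W'
print(np.linalg.eigvalsh(Wp).max() <= 1e-12)   # all eigenvalues <= 0
```

Each ${\bf M}^{ij}$ contributes the quadratic form $-(x_{i}-x_{j})^{2}$, so any nonnegative combination is non-positive definite, as the eigenvalue check confirms.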