# 6.4 Short-time bounds

It turns out that the bound “$\tau^{*}\leq 3n^{2}$ on a regular graph” given by Corollary 6.9 can be used to obtain bounds concerning the short-time behavior of random walks. Such bounds, and their applications, are the focus of this section. We haven’t attempted to optimize numerical constants (e.g. Theorem 6.15 implies that $\tau^{*}\leq 13n^{2}/6$ on regular graphs). More elaborate arguments (see Notes) can be used to improve constants and to deal with the irregular case, but we’ll restrict attention to the regular case for simplicity.

Write $N_{i}(t)$ for the number of visits to $i$ before time $t$, i.e. during $[0,t-1]$.

###### Proposition 6.16

Consider random walk on an $n$-vertex regular graph $G$. Let $A$ be a proper subset of vertices and let $i\in A$.

(i) $E_{i}T_{A^{c}}\leq 4|A|^{2}$.

(ii) $E_{i}N_{i}(T_{A^{c}})\leq 5|A|$.

(iii) $E_{i}N_{i}(t)\leq 5t^{1/2},\ 0\leq t<5n^{2}$.

(iv) $P_{\pi}(T_{i}<t)\geq\frac{t^{1/2}}{5n},\ 0\leq t<5n^{2}$.

Remarks. For part (i) we give a slightly fussy argument repeating ingredients of the proof of Corollary 6.9, since these are needed for (ii). The point of (iv) is to get a bound for $t\ll E_{\pi}T_{i}$. On the $n$-cycle, it can be shown that the probability in question really is $\Theta(\min(t^{1/2}/n,1))$, uniformly in $n$ and $t$.
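Though not needed for the proofs, part (i) is easy to sanity-check numerically. The following sketch (purely illustrative; the $20$-cycle and the arc $A$ are arbitrary choices, not from the text) solves the hitting-time equations $h(v)=1+\frac{1}{2}(h(v-1)+h(v+1))$ exactly in rational arithmetic and compares with the $4|A|^{2}$ bound.

```python
from fractions import Fraction

def hitting_times(n, A):
    """Exact E_v T_{A^c} for simple random walk on the n-cycle:
    solve h(v) = 1 + (h(v-1) + h(v+1))/2 for v in A, with h = 0 off A."""
    A = sorted(A)
    idx = {v: k for k, v in enumerate(A)}
    m = len(A)
    M = [[Fraction(0)] * m for _ in range(m)]
    b = [Fraction(1)] * m
    for v in A:
        r = idx[v]
        M[r][r] = Fraction(1)
        for w in ((v - 1) % n, (v + 1) % n):
            if w in idx:
                M[r][idx[w]] -= Fraction(1, 2)
    # Gauss-Jordan elimination in exact rational arithmetic.
    for c in range(m):
        p = next(r for r in range(c, m) if M[r][c] != 0)
        M[c], M[p] = M[p], M[c]
        b[c], b[p] = b[p], b[c]
        piv = M[c][c]
        M[c] = [x / piv for x in M[c]]
        b[c] = b[c] / piv
        for r in range(m):
            if r != c and M[r][c] != 0:
                f = M[r][c]
                M[r] = [x - f * y for x, y in zip(M[r], M[c])]
                b[r] = b[r] - f * b[c]
    return {v: b[idx[v]] for v in A}

n, A = 20, list(range(8))             # arc of 8 vertices on the 20-cycle
h = hitting_times(n, A)
assert h[0] == 8 and h[3] == 20       # gambler's ruin: h(k) = (k+1)(m-k)
assert max(h.values()) <= 4 * len(A) ** 2    # Proposition 6.16(i)
```

Here the exact answer $(k+1)(m-k)$ for an arc of length $m$ is far below $4|A|^{2}$, consistent with the constants not being optimized.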

Proof of Proposition 6.16. Choose a vertex $b\in A^{c}$ at minimum distance from $i$, and let $i=i_{0},i_{1},\ldots,i_{j},i_{j+1}=b$ be a minimum-length path. Let $G^{*}$ be the subgraph on vertex-set $A$, and let $G^{**}$ be the subgraph on vertex-set $A$ together with all the neighbors of $i_{j}$. Write superscripts $\ {}^{*}$ and $\ {}^{**}$ for the random walks on $G^{*}$ and $G^{**}$. Then

 $E_{i}T_{A^{c}}\leq E_{i}T^{**}_{A^{c}}=E_{i}T^{*}_{i_{j}}+E_{i_{j}}T^{**}_{A^{c}}$

The inequality holds because we can specify the walk on $G$ in terms of the walk on $G^{**}$ with possibly extra chances of jumping to $A^{c}$ at each step (this is a routine stochastic comparison argument, written out as an example in Chapter 14 yyy). The equality holds because the only routes in $G^{**}$ from $i$ to $A^{c}$ are via $i_{j}$, by the minimum-length assumption. Now write $\mbox{{\cal E}},\mbox{{\cal E}}^{*},\mbox{{\cal E}}^{**}$ for the edge-sets. Using the commute interpretation of resistance,

 $E_{i}T^{*}_{i_{j}}\leq 2|\mbox{{\cal E}}^{*}|j.$ (6.20)

Writing $q\geq 1$ for the number of neighbors of $i_{j}$ in $A^{c}$, the effective resistance in $G^{**}$ between $i_{j}$ and $A^{c}$ is $1/q$, so the commute interpretation of resistance gives the first equality in

 $E_{i_{j}}T^{**}_{A^{c}}=2|\mbox{{\cal E}}^{**}|\frac{1}{q}\ -1=2\frac{|\mbox{{\cal E}}^{*}|}{q}\ +1\leq 2|\mbox{{\cal E}}^{*}|+1\leq|A|^{2}.$

The neighbors of $i_{0},i_{1},\ldots,i_{j-1}$ are all in $A$, so the proof of Lemma 6.10 implies

 $j\leq 3|A|/d$ (6.21)

where $d$ is the degree of $G$. Since $2|\mbox{{\cal E}}^{*}|\leq d|A|$, the bound in (6.20) is at most $3|A|^{2}$, and part (i) follows.

For part (ii), by the electrical network analogy (Chapter 3 yyy) the quantity in question equals

 $\frac{1}{P_{i}(T_{A^{c}}<T^{+}_{i})}=d\ r(i,A^{c})$ (6.22)

where $r(i,A^{c})$ is the effective resistance in $G$ between $i$ and $A^{c}$. Clearly this effective resistance is at most the distance ($j+1$, in the argument above) from $i$ to $A^{c}$, which by (6.21) is at most $3|A|/d\ +1$. Thus the quantity (6.22) is at most $3|A|+d$, establishing the desired result in the case $d\leq 2|A|$. If $d>2|A|$ then there are at least $d-|A|$ edges from $i$ to $A^{c}$, so $r(i,A^{c})\leq\frac{1}{d-|A|}$ and the quantity (6.22) is at most $\frac{d}{d-|A|}\leq 2\leq 5|A|$.
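On the cycle both sides of (6.22) can be written in closed form, which makes for a quick illustrative check (the instance, an arc of length $9$ with $d=2$, is an arbitrary choice, not from the text): escape probabilities come from gambler's ruin, and the effective resistance is two boundary paths in parallel.

```python
from fractions import Fraction

def visits_via_escape(m, k):
    # P_i(T_{A^c} < T_i^+) on the cycle, A an arc of length m, i at offset k:
    # escaping left before returning has chance 1/(k+1), right 1/(m-k),
    # by gambler's ruin along each boundary path.
    p_escape = Fraction(1, 2) / (k + 1) + Fraction(1, 2) / (m - k)
    return 1 / p_escape              # = E_i N_i(T_{A^c}), the quantity in (6.22)

def visits_via_resistance(m, k):
    # d * r(i, A^c): boundary paths of resistances k+1 and m-k in parallel; d = 2.
    return 2 * Fraction((k + 1) * (m - k), m + 1)

m = 9
for k in range(m):
    assert visits_via_escape(m, k) == visits_via_resistance(m, k)  # identity (6.22)
    assert visits_via_escape(m, k) <= 5 * m                        # part (ii)
```

Both formulas give $\frac{2(k+1)(m-k)}{m+1}\leq\frac{m+1}{2}$, again well inside the bound $5|A|$.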

For part (iii), fix a state $i$ and an integer time $t$. Then

 $\frac{t}{n}=E_{\pi}N_{i}(t)\leq P_{\pi}(T_{i}<t)\ E_{i}N_{i}(t)$ (6.23)

the inequality by conditioning on $T_{i}$. Now choose real $s$ such that $ns\geq t$. Since $\sum_{j}E_{i}N_{j}(t)=t$, the set

 $A\equiv\{j:E_{i}N_{j}(t)>s\}$

has $|A|<t/s\leq n$, so part (ii) implies

 $E_{i}N_{i}(T_{A^{c}})\leq 5t/s.$ (6.24)

Now by regularity we can rewrite $A$ as $\{j:E_{j}N_{i}(t)>s\}$, and so by conditioning on $T_{A^{c}}$

 $E_{i}N_{i}(t)\leq E_{i}N_{i}(T_{A^{c}})+s.$

Setting $s=\sqrt{5t}$ and combining with (6.24) gives (iii). The bound in (iv) now follows from (iii) and (6.23).
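Parts (iii) and (iv) can be computed exactly on the $n$-cycle by evolving distributions, with no simulation error. The sketch below (an illustration only; $n=30$ is arbitrary) tracks $E_{i}N_{i}(t)$ from the point start and $P_{\pi}(T_{i}<t)$ by making $i$ absorbing under the stationary start.

```python
n, i = 30, 0
p = [0.0] * n; p[i] = 1.0        # law of the walk started at i
q = [1.0 / n] * n                # stationary start, with i made absorbing
visits = absorbed = 0.0
for t in range(1, n * n + 1):
    visits += p[i]               # now visits = E_i N_i(t), times 0,...,t-1
    absorbed += q[i]; q[i] = 0.0  # now absorbed = P_pi(T_i < t)
    assert visits <= 5 * t ** 0.5            # part (iii)
    assert absorbed >= t ** 0.5 / (5 * n)    # part (iv)
    p = [(p[(j - 1) % n] + p[(j + 1) % n]) / 2 for j in range(n)]
    q = [(q[(j - 1) % n] + q[(j + 1) % n]) / 2 for j in range(n)]
```

On the cycle $E_{i}N_{i}(t)$ grows like $\sqrt{2t/\pi}$ in the diffusive regime, so both bounds hold with substantial slack, as the remarks above suggest.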

## 6.4.1 Covering by multiple walks

The first application is a variant of work of Broder et al [64] discussed further in section 6.8.2.

###### Proposition 6.17

On a regular $n$-vertex graph, consider $K$ independent random walks, each started at a uniform random vertex. Let $C^{[K]}$ be the time until every vertex has been hit by some walk. Then

 $EC^{[K]}\leq\frac{(25+o(1))n^{2}\log^{2}n}{K^{2}}\mbox{ as }n\rightarrow\infty\mbox{ with }K\geq 6\log n.$

Remarks. The point is the $\frac{1}{K^{2}}$ dependence on $K$. On the $n$-cycle, for $K\sim\varepsilon n$ it can be shown that initially the largest gap between adjacent walkers is $\Theta(\log n)$ and that $EC^{[K]}=\Theta(\log^{2}n)$, so in this respect the bound is sharp. Of course, for $K\leq\log n$ the bound would be no improvement over Theorem 6.4.

Proof. As usual write $T_{i}$ for the hitting time on $i$ for a single walk, and write $T^{[K]}_{i}$ for the first time $i$ is visited by some walk. Then

 $\displaystyle P_{\pi}(T^{[K]}_{i}\geq t)$ $\displaystyle=$ $\displaystyle(P_{\pi}(T_{i}\geq t))^{K}$ $\displaystyle=$ $\displaystyle(1-P_{\pi}(T_{i}<t))^{K}$ $\displaystyle\leq$ $\displaystyle\exp(-KP_{\pi}(T_{i}<t))$ $\displaystyle\leq$ $\displaystyle\exp\left(-\frac{Kt^{1/2}}{5n}\right)$

by Proposition 6.16 (iv), provided $t\leq n^{2}$. So

 $P(C^{[K]}\geq t)\leq\sum_{i}P(T^{[K]}_{i}\geq t)\leq n\exp\left(-\frac{Kt^{1/2% }}{5n}\right),\ t\leq n^{2}.$ (6.25)

The bound becomes $1$ for $t_{0}=\frac{25n^{2}}{K^{2}}\log^{2}n$. So
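The two ingredients of (6.25), the exact survival probability $P_{\pi}(T_{i}\geq t)$ and its exponential majorant, can be compared numerically on the cycle (an illustration only; $n=30$, so $K=\lceil 6\log n\rceil=21$):

```python
import math

n = 30
K = math.ceil(6 * math.log(n))
# Exact P_pi(T_i >= t) on the n-cycle: stationary start, absorb at i = 0.
q = [1.0 / n] * n
absorbed = q[0]; q[0] = 0.0
for t in range(1, n * n + 1):
    survival = 1.0 - absorbed        # = P_pi(T_i >= t)
    # per-vertex factor of (6.25): n * survival^K <= n * exp(-K t^{1/2} / (5n))
    assert n * survival ** K <= n * math.exp(-K * math.sqrt(t) / (5 * n)) + 1e-12
    q = [(q[(j - 1) % n] + q[(j + 1) % n]) / 2 for j in range(n)]
    absorbed += q[0]; q[0] = 0.0
```

By the vertex-transitivity of the cycle all $i$ give the same factor, so the display above is exactly the union bound (6.25) for this graph.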

 $\displaystyle EC^{[K]}$ $\displaystyle=$ $\displaystyle\sum_{t=1}^{\infty}P(C^{[K]}\geq t)$ $\displaystyle\leq$ $\displaystyle\lceil t_{0}\rceil+\sum_{t=\lceil t_{0}\rceil+1}^{n^{2}-1}n\exp\left(-\frac{Kt^{1/2}}{5n}\right)+\sum_{t=n^{2}}^{\infty}P(C^{[K]}\geq t)$ $\displaystyle=$ $\displaystyle\lceil t_{0}\rceil+S_{1}+S_{2},\mbox{ say,}$

and the issue is to show that $S_{1}$ and $S_{2}$ are $o(t_{0})$. To handle $S_{2}$, split the set of $K$ walks into subsets of sizes $K-1$ and $1$. By independence, for $t\geq n^{2}$ we have $P(C^{[K]}\geq t)\leq P(C^{[K-1]}\geq n^{2})P(C^{[1]}\geq t)$. Then

 $\displaystyle S_{2}$ $\displaystyle\leq$ $\displaystyle P(C^{[K-1]}\geq n^{2})\,EC^{[1]}\mbox{ by summing over }t$ $\displaystyle\leq$ $\displaystyle n\exp(-(K-1)/5)\cdot 6n^{2}\mbox{ by (6.25) and Theorem 6.4}$ $\displaystyle=$ $\displaystyle o(t_{0})\mbox{ using the hypothesis }K\geq 6\log n.$

To bound $S_{1}$ we start with a calculus exercise: for $u>1$

 $\displaystyle\int_{u^{2}}^{\infty}\exp(-x^{1/2})\ dx$ $\displaystyle=$ $\displaystyle\int_{u}^{\infty}2y\ \exp(-y)\ dy\mbox{ by putting }x=y^{2}$ $\displaystyle\leq$ $\displaystyle 2e^{-1}u\int_{u}^{\infty}\exp\left(-\frac{u-1}{u}\,y\right)dy\mbox{, using }\frac{y}{u}\leq\exp\left(\frac{y}{u}-1\right)$ $\displaystyle=$ $\displaystyle\frac{2u^{2}\exp(-u)}{u-1}.$

The sum $S_{1}$ is bounded by the corresponding integral over $[t_{0},\infty)$ and the obvious calculation, whose details we omit, bounds this integral by $2t_{0}/(\log n-1)$.
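Incidentally, the substitution $x=y^{2}$ also evaluates the integral in closed form as $2\int_{u}^{\infty}ye^{-y}\,dy=2(u+1)e^{-u}$, so the bound above amounts to the obvious inequality $(u+1)(u-1)\leq u^{2}$. A quick check, purely for illustration:

```python
import math

# Exact value 2(u + 1)e^{-u} versus the claimed bound 2 u^2 e^{-u} / (u - 1);
# the comparison reduces to (u + 1)(u - 1) <= u^2.
for u in [1.5, 2.0, 5.0, 10.0, 30.0]:
    exact = 2 * (u + 1) * math.exp(-u)
    bound = 2 * u * u * math.exp(-u) / (u - 1)
    assert exact <= bound
```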

## 6.4.2 Bounding point probabilities

Our second application is to universal bounds on point probabilities. A quite different universal bound will be given in Chapter yyy.

###### Proposition 6.18

For continuous-time random walk on a regular $n$-vertex graph,

 $\displaystyle P_{i}(X_{t}=j)$ $\displaystyle\leq$ $\displaystyle 5t^{-1/2},\ t\leq n^{2}$ $\displaystyle\leq$ $\displaystyle\frac{1}{n}+\frac{K_{1}}{n}\exp\left(\frac{-t}{K_{2}n^{2}}\right),\ t\geq n^{2}$

where $K_{1}$ and $K_{2}$ are absolute constants.

In discrete time one can prove essentially the same result with the bounds multiplied by $2$; we shall not give details (see Notes).

Proof. $P_{i}(X_{t}=i)$ is decreasing in $t$, so

 $P_{i}(X_{t}=i)\leq t^{-1}\int_{0}^{t}P_{i}(X_{s}=i)\,ds=t^{-1}E_{i}N_{i}(t)\leq 5t^{-1/2}$

where the last inequality is Proposition 6.16 (iii), whose proof is unchanged in continuous time, and which holds for $t\leq n^{2}$. This gives the first inequality when $i=j$, and the general case follows from Chapter 3 yyy.
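The first inequality can be checked numerically via uniformization: the continuous-time chain at time $t$ is the discrete chain run for a Poisson($t$) number of steps. The sketch below (an illustration only; the $20$-cycle and the truncation point of the Poisson sum are arbitrary choices) computes $P_{0}(X_{t}=0)$ this way.

```python
import math

def ct_point_prob(n, t):
    """P_0(X_t = 0) for continuous-time walk on the n-cycle via uniformization:
    weight the k-step discrete distributions by Poisson(t) probabilities."""
    p = [0.0] * n; p[0] = 1.0
    w = math.exp(-t)                 # Poisson(t) weight at k = 0
    total = w * p[0]
    kmax = int(t + 12 * math.sqrt(t) + 25)   # truncation: tail is negligible
    for k in range(1, kmax + 1):
        p = [(p[(j - 1) % n] + p[(j + 1) % n]) / 2 for j in range(n)]
        w *= t / k
        total += w * p[0]
    return total

n = 20
for t in [1.0, 4.0, 25.0, 100.0, 400.0]:
    assert ct_point_prob(n, t) <= 5 / math.sqrt(t)   # first bound, t <= n^2
```

For small $t$ the exact value behaves like the walk on ${\bf Z}$, roughly $(2\pi t)^{-1/2}$, so the constant $5$ is generous here too.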

For the second inequality, recall the definition of separation $s(t)$ from Chapter 4 yyy. Given a vertex $i$ and a time $t$, there exists a probability distribution $\theta$ such that

 $P_{i}(X_{t}\in\cdot)=(1-s(t))\pi+s(t)\theta.$

Then for $u\geq 0$,

 $P_{i}(X_{t+u}=j)-\frac{1}{n}=s(t)\left(P_{\theta}(X_{u}=j)-\frac{1}{n}\right).$

Thus, defining $q(t)=\max_{i,j}\left(P_{i}(X_{t}=j)-\frac{1}{n}\right)$, we have proved

 $q(t+u)\leq s(t)q(u);\ t,u\geq 0.$ (6.26)

Now $q(n^{2})\leq 4/n$ by the first inequality of the Proposition, and $s(\tau_{1}^{(1)})=e^{-1}$ by definition of $\tau_{1}^{(1)}$ in Chapter 4 yyy, so by iterating (6.26) we have

 $q(n^{2}+m\tau_{1}^{(1)})\leq\frac{4}{n}\ e^{-m},\ m\geq 1.$ (6.27)

But by Chapter 4 yyy we have $\tau_{1}^{(1)}\leq K\tau^{*}$ for an absolute constant $K$, and then by Corollary 6.9 we have $\tau_{1}^{(1)}\leq 3Kn^{2}$. The desired inequality now follows from (6.27).

## 6.4.3 A cat and mouse game

Here we reconsider the cat and mouse game discussed in Chapter 4 section yyy. Recall that the cat performs continuous-time random walk on an $n$-vertex graph, and the mouse moves according to some arbitrary deterministic strategy. Let $M$ be the first meeting time, and let $m^{*}$ be the maximum of $EM$ over all pairs of initial vertices and all strategies for the mouse.

###### Proposition 6.19

On a regular graph, $m^{*}\leq Kn^{2}$ for some absolute constant $K$.

Proof. The proof relies on Proposition 6.18, whose conclusion implies there exists a constant $K$ such that

 $p^{*}(t)\equiv\max_{x,v}p_{vx}(t)\leq\frac{1}{n}+Kt^{-1/2};\ 0\leq t<\infty.$

Consider running the process forever. The point is that, regardless of the initial positions, the chance that the cat and mouse are “together” (i.e. at the same vertex) at time $u$ is at most $p^{*}(u)$. So in the case where the cat starts with the (uniform) stationary distribution,

 $\displaystyle\frac{1}{n}$ $\displaystyle=$ $\displaystyle P(\mbox{together at time }s)=\int_{0}^{s}f(u)\,P(\mbox{together at time }s\mid M=u)\,du,\mbox{ where }f\mbox{ is the density function of }M$ $\displaystyle\leq$ $\displaystyle\int_{0}^{s}f(u)p^{*}(s-u)du$ $\displaystyle\leq$ $\displaystyle\frac{1}{n}P(M\leq s)\ +K\int_{0}^{s}f(u)(s-u)^{-1/2}du.$

So

 $\displaystyle\frac{t}{n}$ $\displaystyle=$ $\displaystyle\int_{0}^{t}\frac{1}{n}\,ds$ $\displaystyle\leq$ $\displaystyle\frac{1}{n}\int_{0}^{t}P(M\leq s)ds\ +K\int_{0}^{t}f(u)du\int_{u}^{t}(s-u)^{-1/2}ds$ $\displaystyle=$ $\displaystyle\frac{t}{n}-\frac{1}{n}\int_{0}^{t}P(M>s)ds+2K\int_{0}^{t}f(u)(t-u)^{1/2}du$ $\displaystyle\leq$ $\displaystyle\frac{t}{n}-\frac{1}{n}E\min(M,t)+2Kt^{1/2}.$

Rearranging, $E\min(M,t)\leq 2Knt^{1/2}$. Writing $t_{0}=(4Kn)^{2}$, Markov’s inequality gives $P(M\leq t_{0})\geq 1/2$. This inequality assumes the cat starts with the stationary distribution. When it starts at some arbitrary vertex, we may use the definition of separation $s(u)$ (recall Chapter 4 yyy) to see $P(M\leq u+t_{0})\geq(1-s(u))/2$. Then by iteration, $EM\leq\frac{2(u+t_{0})}{1-s(u)}$. So appealing to the definition of $\tau_{1}^{(1)}$,

 $m^{*}\leq\frac{2}{1-e^{-1}}(t_{0}+\tau_{1}^{(1)}).$

But results from Chapter 4 and this chapter show $\tau_{1}^{(1)}=O(\tau^{*})=O(n^{2})$, establishing the Proposition.
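As a crude illustration of the order of magnitude (not of the proof), one can simulate the game with a discrete-time lazy cat standing in for the continuous-time walk, against one arbitrary mouse strategy; all choices below (the $20$-cycle, the "always step clockwise" mouse, the seed) are ours, not from the text.

```python
import random

def meeting_time(n, cat, mouse, rng):
    """Steps until a lazy random-walk cat (stay w.p. 1/2, else +-1) meets a
    mouse that deterministically steps +1 each unit of time, on the n-cycle."""
    t = 0
    while cat != mouse:
        r = rng.random()
        if r < 0.25:
            cat = (cat + 1) % n
        elif r < 0.5:
            cat = (cat - 1) % n
        mouse = (mouse + 1) % n
        t += 1
    return t

rng = random.Random(0)
n = 20
times = [meeting_time(n, rng.randrange(n), rng.randrange(n), rng)
         for _ in range(300)]
avg = sum(times) / len(times)
assert avg <= 20 * n * n     # far inside any bound of the form K n^2
```

Against this particular mouse the relative position drifts toward $0$, so meetings are far quicker than $n^{2}$; the content of Proposition 6.19 is that no strategy can do better than $Kn^{2}$.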