There are a few (and only a few) examples where one can study $EC$ by bare-hands exact calculations. Write $h_{n}$ for the harmonic sum

$h_{n}=\sum_{i=1}^{n}i^{-1}\sim\log n.\qquad(6.7)$

(a) The coupon collector’s problem. Many textbooks discuss this classical problem, which involves $C$ for the chain $(X_{t};t\geq 0)$ whose values are independent and uniform on an $n$-element set, i.e. random walk on the complete graph with self-loops. Write (cf. the proof of Matthews’ method, Chapter 2 yyy) $C^{m}$ for the first time at which $m$ distinct vertices have been visited. Then each step following time $C^{m}$ has chance $(n-m)/n$ to hit a new vertex, so $E(C^{m+1}-C^{m})=n/(n-m)$, and so

$EC=\sum_{m=1}^{n-1}E(C^{m+1}-C^{m})=nh_{n-1}.\qquad(6.8)$
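As a quick numerical sanity check (not part of the text; the function names are ours), one can simulate this chain and compare the empirical mean of $C$ with the exact value $nh_{n-1}$ from (6.8):

```python
import random

def harmonic(m):
    """h_m = sum_{i=1}^m 1/i."""
    return sum(1.0 / i for i in range(1, m + 1))

def cover_time_iid_uniform(n, rng):
    """Cover time C of the chain whose values are i.i.d. uniform on
    {0, ..., n-1}; the time-0 state counts as already visited."""
    seen = {rng.randrange(n)}
    steps = 0
    while len(seen) < n:
        steps += 1
        seen.add(rng.randrange(n))
    return steps

rng = random.Random(0)
n, trials = 50, 10000
mean_c = sum(cover_time_iid_uniform(n, rng) for _ in range(trials)) / trials
print(mean_c, n * harmonic(n - 1))  # the two numbers should be close
```

For $n=50$ the exact value is $50\,h_{49}\approx 224$, and $10^4$ trials put the empirical mean well within a few percent of it.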

(By symmetry, $E_{v}C$ is the same for each initial vertex, so we just write $EC$.) It is also a textbook exercise (e.g. [133] p. 124) to obtain the limit distribution

$n^{-1}(C-n\log n)\ \stackrel{d}{\rightarrow}\ \xi\qquad(6.9)$

where $\xi$ has the extreme value distribution

$P(\xi\leq x)=\exp(-e^{-x}),\ \ -\infty<x<\infty.\qquad(6.10)$

We won’t go into the elementary derivations of results like (6.9) here, because in Chapter 7 yyy we give more general results.
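The convergence in (6.9)–(6.10) can also be seen numerically. The sketch below (ours, not from the text) compares the empirical distribution function of $n^{-1}(C-n\log n)$ with the extreme value distribution at a couple of points:

```python
import math
import random

def cover_time_iid_uniform(n, rng):
    """Cover time of the i.i.d.-uniform chain on an n-element set."""
    seen = {rng.randrange(n)}
    steps = 0
    while len(seen) < n:
        steps += 1
        seen.add(rng.randrange(n))
    return steps

rng = random.Random(1)
n, trials = 200, 4000
samples = [(cover_time_iid_uniform(n, rng) - n * math.log(n)) / n
           for _ in range(trials)]
for x in (0.0, 1.0):
    empirical = sum(s <= x for s in samples) / trials
    # limit (6.10) predicts P(xi <= x) = exp(-e^{-x})
    print(x, empirical, math.exp(-math.exp(-x)))
```

Already at $n=200$ the empirical fractions agree with $\exp(-e^{-x})$ to within a few hundredths.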

(b) The complete graph. The analysis of $C$ for random walk on the complete graph (i.e. without self-loops) is just a trivial variation of the analysis above. Each step following time $C^{m}$ has chance $(n-m)/(n-1)$ to hit a new vertex, so

$EC=(n-1)h_{n-1}\sim n\log n.\qquad(6.11)$

And the distribution limit (6.9) still holds. Because $E_{v}T_{w}=n-1$ for $w\neq v$, we also have

$EC^{+}=EC+(n-1)=(n-1)(1+h_{n-1})\sim n\log n.\qquad(6.12)$
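Here too a short simulation (ours; the helper names are illustrative) confirms the exact value $EC=(n-1)h_{n-1}$ for the walk without self-loops:

```python
import random

def harmonic(m):
    return sum(1.0 / i for i in range(1, m + 1))

def cover_time_complete_graph(n, rng):
    """Cover time of random walk on the complete graph K_n (no self-loops),
    started at vertex 0."""
    v, seen, steps = 0, {0}, 0
    while len(seen) < n:
        w = rng.randrange(n - 1)   # uniform over the other n-1 vertices
        if w >= v:
            w += 1
        v = w
        seen.add(v)
        steps += 1
    return steps

rng = random.Random(2)
n, trials = 40, 10000
mean_c = sum(cover_time_complete_graph(n, rng) for _ in range(trials)) / trials
print(mean_c, (n - 1) * harmonic(n - 1))  # empirical mean vs. (6.11)
```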

(c) The $n$-star (Chapter 5 Example yyy). Here the visits to the leaves (every second step) are exactly i.i.d., so we can directly apply the coupon collector’s problem. For instance, writing $v$ for the central vertex and $l$ for a leaf,

$E_{l}C=2(n-1)h_{n-2}\sim 2n\log n$

$E_{v}C^{+}=1+E_{l}C+1=2(n-1)h_{n-1}\sim 2n\log n$

and $C/2$ satisfies (6.9). Though we won’t give the details, it turns out that a clever inductive argument shows these are the minima over all trees.

On an $n$-vertex tree,

$\min_{v}E_{v}C\geq 2(n-1)h_{n-2},$

$\min_{v}E_{v}C^{+}\geq 2(n-1)h_{n-1}.$
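The reduction of the $n$-star to the coupon collector's problem is easy to check by simulation (our sketch, not from the text): since every second step is a uniform draw from the $n-1$ leaves, the cover time from a leaf is twice the number of draws needed to see all leaves.

```python
import random

def harmonic(m):
    return sum(1.0 / i for i in range(1, m + 1))

def star_cover_time_from_leaf(n, rng):
    """Cover time of random walk on the n-star started at a leaf.
    From a leaf the walk must step to the centre; from the centre it
    steps to a uniform leaf. The centre is hit at step 1, so covering
    finishes when the last leaf is first drawn, two steps per draw."""
    seen, steps = {0}, 0           # leaf 0 is the starting leaf
    while len(seen) < n - 1:
        steps += 2                 # leaf -> centre -> uniform leaf
        seen.add(rng.randrange(n - 1))
    return steps

rng = random.Random(3)
n, trials = 40, 10000
mean_c = sum(star_cover_time_from_leaf(n, rng) for _ in range(trials)) / trials
print(mean_c, 2 * (n - 1) * harmonic(n - 2))  # vs. E_l C = 2(n-1)h_{n-2}
```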

(d) The $n$-cycle. Random walk on the $n$-cycle is also easy to study. At time $C^{m}$ the walk has visited $m$ distinct vertices, and the set of visited vertices must form an interval $[j,j+m-1]$, say, where addition is modulo $n$. At time $C^{m}$ the walk is at one of the endpoints of that interval, and $C^{m+1}-C^{m}$ is the time until the first of $\{j-1,j+m\}$ is visited, which by Chapter 5 yyy has expectation $1\times m$ (the product of the distances to the two targets). So

$EC=\sum_{m=1}^{n-1}E(C^{m+1}-C^{m})=\sum_{m=1}^{n-1}m=\frac{1}{2}n(n-1).$

There is also an expression for the limit distribution (see Notes).
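A direct simulation of the walk (ours, as a sanity check) reproduces the exact mean $\frac{1}{2}n(n-1)$:

```python
import random

def cycle_cover_time(n, rng):
    """Cover time of simple random walk on the n-cycle, started at 0."""
    pos, seen, steps = 0, {0}, 0
    while len(seen) < n:
        pos = (pos + rng.choice((-1, 1))) % n
        seen.add(pos)
        steps += 1
    return steps

rng = random.Random(4)
n, trials = 20, 20000
mean_c = sum(cycle_cover_time(n, rng) for _ in range(trials)) / trials
print(mean_c, n * (n - 1) / 2)  # empirical mean vs. n(n-1)/2
```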

The $n$-cycle also has an unexpected property. Let $V$ denote the last vertex to be hit. Then

$P_{0}(V=v)=P_{0}(T_{v-1}<T_{v+1})P_{v-1}(T_{v+1}<T_{v})+P_{0}(T_{v+1}<T_{v-1})P_{v+1}(T_{v-1}<T_{v})$

$=\frac{n-(v+1)}{n-2}\cdot\frac{1}{n-1}+\frac{v-1}{n-2}\cdot\frac{1}{n-1}=\frac{1}{n-1}.$

In other words, the $n$-cycle has the property

For any initial vertex $v_{0}$, the last-visited vertex $V$ is uniform on the states excluding $v_{0}$.

Obviously the complete graph has the same property, by symmetry. Lovász and Winkler [219] gave a short but ingenious proof that these are the only graphs with this property, a result rediscovered in [179].
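The uniformity of the last-visited vertex on the cycle is striking enough to be worth seeing numerically; the following sketch (ours, not from the text) tallies the last new vertex over many runs:

```python
import random
from collections import Counter

def last_vertex_covered(n, rng, start=0):
    """Run simple random walk on the n-cycle from `start` until it has
    covered all vertices; return the last new vertex hit."""
    pos, seen, last = start, {start}, start
    while len(seen) < n:
        pos = (pos + rng.choice((-1, 1))) % n
        if pos not in seen:
            seen.add(pos)
            last = pos
    return last

rng = random.Random(5)
n, trials = 10, 30000
counts = Counter(last_vertex_covered(n, rng) for _ in range(trials))
for v in range(1, n):
    print(v, counts[v] / trials)  # each fraction should be near 1/(n-1)
```

Each of the $n-1$ non-starting vertices comes up as last with empirical frequency close to $1/(n-1)$, as the calculation above predicts.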
