2.6 Matthews’ method for cover times
Theorem 2.26 below is the only non-classical result in this chapter. We make extensive use of Matthews' method in Chapter 6 to analyze cover times for random walks on graphs.
Consider the cover time $C \equiv \max_j T_j$ of the chain, i.e. the time required to visit every state. How can we bound $E_iC$ in terms of the mean hitting times $E_jT_k$?
To appreciate the cleverness of Theorem 2.26
let us first consider a more routine argument.
Write $t^* \equiv \max_{i,j} E_iT_j$. Since $C$ is unaffected by continuization, we may work in continuous time. By (2.20)
\[ P_i(T_j > t) \le e \exp(-t/(et^*)), \quad t \ge 0. \]
By Boole's inequality, for an $n$-state chain
\[ P_i(C > t) \le \sum_j P_i(T_j > t) \le ne \exp(-t/(et^*)), \quad t \ge 0. \]
One can rewrite this successively as
\[ P_i(C > t) \le \min\left(1,\, n e^{1 - t/(et^*)}\right), \quad t \ge 0 \]
\[ P_i\left(C - et^*(1 + \log n) > s\right) \le \min\left(1,\, e^{-s/(et^*)}\right), \quad s \ge 0 \]
\[ P_i\left( \frac{\left(C - et^*(1 + \log n)\right)^+}{et^*} > u \right) \le \min\left(1,\, e^{-u}\right), \quad u \ge 0. \]
In words, this says that the distribution of $(C - et^*(1 + \log n))^+/(et^*)$ is stochastically smaller than the exponential(1) distribution, implying $E\left(C - et^*(1 + \log n)\right)^+ \le et^*$ and hence
\[ E_iC \le (2 + \log n)\, et^*. \]
This argument does lead to a bound, but one suspects the factors $e$ and $2$ are artifacts of the proof; also, it seems hard to obtain a lower bound this way.
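The crude bound can be checked numerically. Below is a minimal sketch (my own illustration, not from the text) that computes the mean hitting times, and hence $t^* = \max_{i,j} E_iT_j$, for symmetric random walk on the 4-cycle by solving the standard linear equations $E_iT_j = 1 + \sum_k p_{ik}\,E_kT_j$ for $i \ne j$, then evaluates the crude bound $et^*(2 + \log n)$:

```python
from fractions import Fraction
from math import e, log

# Symmetric random walk on the 4-cycle (hypothetical example chain).
n = 4
P = [[Fraction(0)] * n for _ in range(n)]
for i in range(n):
    P[i][(i + 1) % n] = Fraction(1, 2)
    P[i][(i - 1) % n] = Fraction(1, 2)

def hitting_times(P, j):
    """Mean hitting times E_i T_j: solve (I - Q) h = 1, where Q is P
    restricted to the states other than j, by Gauss-Jordan elimination."""
    states = [i for i in range(len(P)) if i != j]
    m = len(states)
    A = [[(1 if a == b else 0) - P[a][b] for b in states] for a in states]
    rhs = [Fraction(1)] * m
    for col in range(m):
        piv = next(r for r in range(col, m) if A[r][col] != 0)
        A[col], A[piv] = A[piv], A[col]
        rhs[col], rhs[piv] = rhs[piv], rhs[col]
        for r in range(m):
            if r != col and A[r][col] != 0:
                f = A[r][col] / A[col][col]
                A[r] = [x - f * y for x, y in zip(A[r], A[col])]
                rhs[r] -= f * rhs[col]
    h = {states[r]: rhs[r] / A[r][r] for r in range(m)}
    h[j] = Fraction(0)
    return h

t_star = max(hitting_times(P, j)[i] for j in range(n) for i in range(n))
crude_bound = e * float(t_star) * (2 + log(n))
print(t_star, round(crude_bound, 2))  # t* = 4 here
```

For the 4-cycle the exact mean cover time is 6, so the crude bound (about 36.8) is loose by a factor of six; Theorem 2.26 does considerably better.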
The following result both “cleans up” the upper bound and gives
a lower bound.
Theorem 2.26 (Matthews [257])
For any $n$-state Markov chain and any initial state $i$,
\[ E_iC \le h_{n-1} \max_{j \ne k} E_jT_k \]
\[ E_iC \ge h_{n-1} \min_{j \ne k} E_jT_k \]
where $h_{n-1} \equiv \sum_{m=1}^{n-1} \frac{1}{m}$.
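As a quick sanity check (an illustration of mine, not from the text): for random walk on the complete graph $K_n$, every mean hitting time $E_jT_k$ with $j \ne k$ equals $n-1$, so the two bounds coincide at $(n-1)h_{n-1}$, which is exactly the coupon-collector value of $E_iC$:

```python
from fractions import Fraction

def matthews_bounds(n, min_hit, max_hit):
    """h_{n-1} * min_hit <= E_i C <= h_{n-1} * max_hit (Theorem 2.26)."""
    h = sum(Fraction(1, m) for m in range(1, n))
    return h * min_hit, h * max_hit

n = 10
# On K_n each step goes to a uniform other vertex, so T_k is geometric
# with success probability 1/(n-1) and E_j T_k = n - 1 for all j != k.
lower, upper = matthews_bounds(n, n - 1, n - 1)
# Coupon collector: (n-1)/k expected steps while k vertices are unvisited.
exact = sum(Fraction(n - 1, k) for k in range(1, n))
assert lower == exact == upper
print(exact)  # 7129/280
```

Equality here reflects the fact that the two Matthews bounds are tight precisely when all the mean hitting times coincide.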
Proof.
We’ll prove the lower bound — the upper bound proof is identical.
Let $(J_1, J_2, \ldots, J_n)$ be a uniform random ordering of the states, independent of the chain. Define
\[ C_m \equiv \max_{1 \le l \le m} T_{J_l} \]
to be the time until all of $J_1, \ldots, J_m$ have been visited, in some order. The key identity is
\[ E\left(C_m - C_{m-1} \mid \mathcal{G}_m\right) = 1_{B_m}\, E_{X(C_{m-1})} T_{J_m} \tag{2.26} \]
where $B_m \equiv \{C_m > C_{m-1}\}$ is the event that $J_m$ is the last of $J_1, \ldots, J_m$ to be visited, and
\[ \mathcal{G}_m \equiv \sigma\left(J_1, \ldots, J_m;\ X_t,\ 0 \le t \le C_{m-1}\right). \]
To understand what this says, suppose we are told which are the states $J_1, \ldots, J_m$ and told the path of the chain up through time $C_{m-1}$. Then we know whether or not $B_m$ occurred: if not, then $C_m = C_{m-1}$, and if so, then the conditional distribution of $C_m - C_{m-1}$ is the distribution of the time to hit $J_m$ from the state at time $C_{m-1}$, which we are told is state $X(C_{m-1})$.
Writing $t_* \equiv \min_{j \ne k} E_jT_k$, the right side of (2.26) is at least $t_*\, 1_{B_m}$, and so taking expectations
\[ EC_m - EC_{m-1} \ge t_*\, P(B_m). \]
But obviously $P(B_m) = 1/m$ by symmetry. So
\[ EC_n - EC_1 \ge t_* \sum_{m=2}^n \frac{1}{m}. \]
Allowing for the possibility $J_1 = i$ we see $E_iC_1 \ge \left(1 - \frac{1}{n}\right) t_*$, and since $\frac{n-1}{n} + \sum_{m=2}^n \frac{1}{m} = h_{n-1}$, the lower bound follows.
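To see the two bounds bracketing a known answer (again my own illustration, not from the text), take symmetric random walk on the $n$-cycle: classically $E_jT_k = d(n-d)$ for vertices at distance $d$, and the exact mean cover time is $n(n-1)/2$ from any start:

```python
from fractions import Fraction

def harmonic(m):
    """h_m = sum_{k=1}^m 1/k, as an exact rational."""
    return sum(Fraction(1, k) for k in range(1, m + 1))

for n in range(3, 12):
    # Classical facts for the n-cycle (assumed, not derived here):
    # E_j T_k = d(n-d) at distance d; exact mean cover time n(n-1)/2.
    hits = [d * (n - d) for d in range(1, n)]
    lower = harmonic(n - 1) * min(hits)   # h_{n-1} * (n-1)
    upper = harmonic(n - 1) * max(hits)   # h_{n-1} * floor(n/2)*ceil(n/2)
    exact = Fraction(n * (n - 1), 2)
    assert lower <= exact <= upper
```

Here $\max_{j \ne k} E_jT_k / \min_{j \ne k} E_jT_k$ grows like $n/4$, so the two bounds spread apart, yet the true value stays between them for every $n$ tested.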