5.1.1 Simple symmetric random walk on the integers
It is useful to record some elementary facts about
simple symmetric random walk on the (infinite) set of all
integers.
As we shall observe, these may be derived in several different ways.
A fundamental formula gives exit probabilities for the walk $(X_t)$ started at $0$: for integers $a, b > 0$, writing $T_j$ for the hitting time of state $j$,
$$P_0(T_{-a} < T_b) = \frac{b}{a+b}. \tag{5.5}$$
An elementary argument is that $h(i) := P_i(T_{-a} < T_b)$ satisfies the one-step recurrence
$$h(i) = \tfrac{1}{2}h(i-1) + \tfrac{1}{2}h(i+1), \quad -a < i < b; \qquad h(-a) = 1, \ h(b) = 0,$$
whose solution is $h(i) = (b-i)/(a+b)$.
At a more sophisticated level, (5.5) is a martingale
result.
The quantity $T := \min(T_{-a}, T_b)$ must satisfy
$$0 = E_0 X_T = (-a)\,P_0(T_{-a} < T_b) + b\,\big(1 - P_0(T_{-a} < T_b)\big),$$
where the first equality is the optional sampling theorem for the martingale
$(X_t)$, and solving this equation gives (5.5).
For the walk started at $0$, note that $T = \min(T_{-a}, T_b)$ is the "exit time" from the open
interval $(-a, b)$.
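For readers who want a numerical sanity check of (5.5), here is a minimal Python sketch (not from the text) that solves the one-step recurrence by repeated averaging and compares the result with $b/(a+b)$; the sweep count is simply chosen large enough for convergence:

```python
def exit_prob_left(a, b, sweeps=20000):
    """P_0(T_{-a} < T_b): iterate h(i) = (h(i-1) + h(i+1)) / 2 with
    boundary values h(-a) = 1, h(b) = 0 until numerical convergence."""
    h = {i: 0.0 for i in range(-a, b + 1)}
    h[-a] = 1.0
    for _ in range(sweeps):
        for i in range(-a + 1, b):
            h[i] = 0.5 * (h[i - 1] + h[i + 1])
    return h[0]

a, b = 3, 5
# formula (5.5): P_0(T_{-a} < T_b) = b / (a + b)
print(abs(exit_prob_left(a, b) - b / (a + b)) < 1e-9)
```

The in-place (Gauss–Seidel) sweeps converge geometrically, so for small intervals the answer agrees with the closed form to machine precision.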
We can use (5.5) to calculate the “exit before return”
probability
$$P_0\big(\min(T_{-a}, T_b) < T_0^+\big) = \tfrac12\,P_1(T_b < T_0) + \tfrac12\,P_{-1}(T_{-a} < T_0) = \tfrac12\cdot\frac{1}{b} + \tfrac12\cdot\frac{1}{a} = \frac{a+b}{2ab}, \tag{5.6}$$
where $T_0^+$ denotes the first return time to $0$.
For the walk started at $0$, let $N_i$ be the mean number of
visits to $i$ before the exit time $T = \min(T_{-a}, T_b)$.
(Recall from Chapter 2 our convention that "before time $T$" includes
time $0$ but excludes time $T$.)
The number of returns to $0$ clearly has a Geometric distribution,
so by (5.6)
$$N_0 = \frac{2ab}{a+b}. \tag{5.7}$$
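Formula (5.7) can be checked the same way, by solving the linear system for mean visit counts before exit; a sketch (the sweep count is an arbitrary choice, large enough for convergence):

```python
def mean_visits_to_zero(a, b, sweeps=20000):
    """N_0: solve g(i) = 1{i=0} + (g(i-1) + g(i+1)) / 2 on (-a, b),
    with g(-a) = g(b) = 0, by repeated sweeping; N_0 = g(0)."""
    g = {i: 0.0 for i in range(-a, b + 1)}
    for _ in range(sweeps):
        for i in range(-a + 1, b):
            g[i] = (1.0 if i == 0 else 0.0) + 0.5 * (g[i - 1] + g[i + 1])
    return g[0]

a, b = 4, 7
# formula (5.7): N_0 = 2ab / (a + b)
print(abs(mean_visits_to_zero(a, b) - 2 * a * b / (a + b)) < 1e-8)
```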
To get the analog for visits to $i \neq 0$ we consider whether or not $i$ is hit
at all before exiting; this gives
$$N_i = P_0(T_i < T)\cdot E_i(\text{number of visits to } i \text{ before } T).$$
Appealing to (5.5) and (5.7) gives
the famous mean occupation time formula
$$N_i = \begin{cases} \dfrac{2a(b-i)}{a+b}, & 0 \le i \le b \\[2ex] \dfrac{2b(a+i)}{a+b}, & -a \le i \le 0. \end{cases} \tag{5.8}$$
Now the (random) time to exit must equal the sum of the (random) times
spent at each state.
So, taking expectations,
$$E_0 T = \sum_{i=-a}^{b} N_i,$$
and after a little algebra we obtain

Lemma 5.2
$E_0 \min(T_{-a}, T_b) = ab$.
This derivation of Lemma 5.2 from (5.8) has the advantage
of giving the mean occupation time formula (5.8) on the way.
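The "little algebra" is easy to verify exactly: summing the occupation formula (5.8) over $-a \le i \le b$ with rational arithmetic recovers Lemma 5.2. A sketch:

```python
from fractions import Fraction

def exit_time_via_occupation(a, b):
    """Sums the mean occupation times (5.8) over all states of (-a, b)."""
    total = sum(Fraction(2 * a * (b - i), a + b) for i in range(0, b + 1))
    total += sum(Fraction(2 * b * (a + i), a + b) for i in range(-a, 0))
    return total

# Lemma 5.2: E_0 min(T_{-a}, T_b) = ab, exactly, for every a, b
print(all(exit_time_via_occupation(a, b) == a * b
          for a in range(1, 8) for b in range(1, 8)))
```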
There are two alternative ways to prove Lemma 5.2.
An elementary proof is to set up and solve the one-step recurrence for
$t(i) := E_i \min(T_{-a}, T_b)$:
$$t(i) = 1 + \tfrac12 t(i-1) + \tfrac12 t(i+1), \quad -a < i < b; \qquad t(-a) = t(b) = 0,$$
whose solution is $t(i) = (a+i)(b-i)$.
The more elegant proof uses a martingale argument.
Taking the initial state to be $0$ without loss of generality, the first equality below
is the optional sampling theorem for the martingale
$(X_t^2 - t)$:
$$E_0 T = E_0 X_T^2 = a^2\,P_0(T_{-a} < T_b) + b^2\,P_0(T_b < T_{-a}) = a^2\,\frac{b}{a+b} + b^2\,\frac{a}{a+b} = ab.$$
The preceding discussion works in discrete or continuous time.
Exact distributions at time $t$ will of course differ in the two
cases.
In discrete time we appeal to the Binomial distribution for the number
of $+1$ steps, to get
$$P_0(X_{2t} = 2i) = \binom{2t}{t+i}\,2^{-2t} \tag{5.9}$$
and a similar expression for odd times $2t+1$.
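As a check on (5.9), one can build the exact distribution of $X_{2t}$ by convolving the step distribution and compare with the binomial expression; a small sketch:

```python
from fractions import Fraction
from math import comb

def walk_distribution(steps):
    """Exact distribution of X_steps for the walk started at 0."""
    dist = {0: Fraction(1)}
    for _ in range(steps):
        new = {}
        for x, pr in dist.items():
            for y in (x - 1, x + 1):
                new[y] = new.get(y, Fraction(0)) + pr / 2
        dist = new
    return dist

t, i = 6, 2
# formula (5.9): P_0(X_{2t} = 2i) = C(2t, t+i) * 2^{-2t}
print(walk_distribution(2 * t)[2 * i] == Fraction(comb(2 * t, t + i), 4 ** t))
```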
In continuous time, the numbers of $+1$ and of $-1$ steps in time $t$
are independent Poisson($t/2$) variables, so
$$P_0(X_t = i) = \sum_{j \ge \max(0,-i)} e^{-t/2}\frac{(t/2)^{j+i}}{(j+i)!}\; e^{-t/2}\frac{(t/2)^{j}}{j!} = e^{-t} I_i(t), \tag{5.10}$$
where $I_i$ is the modified Bessel function.
5.1.2 Weighted linear graphs
Consider the $n$-vertex linear graph
$0$ – $1$ – $2$ – $\cdots$ – $(n-1)$
with arbitrary edge-weights $(w_i)$,
where $w_i > 0$ is the weight on edge $(i-1, i)$, $1 \le i \le n-1$.
Set $w_0 = w_n = 0$ to make some later formulas cleaner.
The corresponding discrete-time random walk has transition probabilities
$$p_{i,i+1} = \frac{w_{i+1}}{w_i + w_{i+1}}, \qquad p_{i,i-1} = \frac{w_i}{w_i + w_{i+1}}, \qquad 0 \le i \le n-1,$$
and stationary distribution
$$\pi_i = \frac{w_i + w_{i+1}}{w}, \qquad 0 \le i \le n-1,$$
where $w := \sum_i (w_i + w_{i+1}) = 2\sum_{i=1}^{n-1} w_i$.
In probabilistic terminology, this is a birth-and-death process,
meaning that a transition cannot alter the state by more than $1$.
It is elementary that such processes are automatically reversible
(xxx spells out the more general result for trees),
so as discussed in Chapter 3 yyy the set-up above with weighted
graphs gives the general discrete-time birth-and-death process
with $p_i + q_i = 1$, where $p_i := p_{i,i+1}$ and $q_i := p_{i,i-1}$.
But note that the continuization does not give the general
continuous-time birth-and-death process, which has $2(n-1)$ parameters
(the birth rates and the death rates) instead of just the $n-1$ parameters $(w_i)$.
The formulas below could all be extended to this general case
(the analog of Proposition 5.3 can be found in undergraduate
textbooks,
e.g., Karlin and Taylor [208] Chapter 4)
but our focus is on the simplifications which occur in the “weighted
graphs” case.
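The transition probabilities and stationary distribution above are easy to tabulate; the following sketch (the weight vector is an arbitrary illustration) builds both and verifies $\pi P = \pi$ exactly:

```python
from fractions import Fraction

def linear_graph_chain(w):
    """Transition matrix P and stationary distribution pi for the walk
    on the weighted linear graph; w[i] is the weight of edge (i-1, i)
    for 1 <= i <= n-1, with the conventions w[0] = 0 and w_n = 0."""
    n = len(w)                                   # vertices 0..n-1
    wt = [Fraction(x) for x in w] + [Fraction(0)]
    P = [[Fraction(0)] * n for _ in range(n)]
    for i in range(n):
        tot = wt[i] + wt[i + 1]
        if i + 1 < n:
            P[i][i + 1] = wt[i + 1] / tot
        if i > 0:
            P[i][i - 1] = wt[i] / tot
    wsum = 2 * sum(wt[1:n])
    pi = [(wt[i] + wt[i + 1]) / wsum for i in range(n)]
    return P, pi

P, pi = linear_graph_chain([0, 3, 1, 4, 2])      # illustrative weights w_1..w_4
n = len(pi)
print(sum(pi) == 1 and
      all(sum(pi[i] * P[i][j] for i in range(n)) == pi[j] for j in range(n)))
```

Exact `Fraction` arithmetic makes the stationarity check an identity rather than a floating-point approximation.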
Proposition 5.3
(a) For $i < v < j$,
$$P_v(T_j < T_i) = \frac{\sum_{u=i+1}^{v} w_u^{-1}}{\sum_{u=i+1}^{j} w_u^{-1}}.$$
(b) For $i < j$,
$$E_i T_j = (j-i) + 2\sum_{v=i+1}^{j}\ \sum_{u=1}^{v-1}\frac{w_u}{w_v}.$$
(c) For $i < j$,
$$E_i T_j + E_j T_i = w\sum_{v=i+1}^{j} w_v^{-1}.$$
Note that we can obtain an expression for $E_i T_j$, $i > j$, by reflecting
the weighted graph about its center.
Proof.
These are extensions of (5.5), (5.1), and (5.2), and recycle
some of the previous arguments.
Writing $c_v := \sum_{u=1}^{v} w_u^{-1}$, we have that $c_{X_t}$ is
a martingale, so
$$c_v = E_v\, c_{X_T} = c_i\,P_v(T_i < T_j) + c_j\,P_v(T_j < T_i)$$
for $T := \min(T_i, T_j)$ and $i < v < j$.
Solving this equation gives $P_v(T_j < T_i) = (c_v - c_i)/(c_j - c_i)$, which is (a).
The mean hitting time formula (b) has four different proofs!
Two that we will not give are as described below Lemma 5.2:
Set up and solve a recurrence equation, or use a well-chosen martingale.
The slick argument is to use the essential edge lemma (Lemma 5.1)
to show
$$E_{v-1} T_v = \frac{1}{w_v}\Big(w_v + 2\sum_{u=1}^{v-1} w_u\Big).$$
Then
$$E_i T_j = \sum_{v=i+1}^{j} E_{v-1} T_v,$$
establishing (b).
Let us also write out the non-slick argument, using mean occupation times.
By considering mean time spent at $v$,
$$E_{i-1} T_i = \sum_{v=0}^{i-1} N_v, \tag{5.11}$$
where $N_v$ is the expectation, starting at $i-1$, of the
number of visits to $v$ before $T_i$. But
$$N_v = P_{i-1}(T_v < T_i)\cdot E_v(\text{number of visits to } v \text{ before } T_i),$$
and the number of visits to $v$, starting from $v$, is Geometric, with mean
$$E_v(\text{number of visits to } v \text{ before } T_i) = (w_v + w_{v+1})\sum_{u=v+1}^{i} w_u^{-1},$$
since by (a) the chance of stepping from $v$ to $v+1$ and then hitting $i$ before returning to $v$ is
$$\frac{w_{v+1}}{w_v + w_{v+1}}\cdot\frac{w_{v+1}^{-1}}{\sum_{u=v+1}^{i} w_u^{-1}} = \frac{1}{(w_v + w_{v+1})\sum_{u=v+1}^{i} w_u^{-1}}.$$
Substituting this and (a) into (5.11) leads to the formula
stated in (b).
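Proposition 5.3(b) can also be tested numerically against a direct solve of the hitting-time equations $t(k) = 1 + \sum_l p_{kl}\,t(l)$; a sketch (weights and iteration count are arbitrary illustrative choices):

```python
def hit_time_direct(w, i, j, sweeps=40000):
    """E_i T_j (i < j) for the weighted-linear-graph walk, by Gauss-Seidel
    iteration on t(k) = 1 + p_{k,k-1} t(k-1) + p_{k,k+1} t(k+1) over
    k = 0..j-1 with t(j) = 0; starting below j, the walk cannot jump over j."""
    wt = [float(x) for x in w] + [0.0]
    t = [0.0] * (j + 1)                     # t[j] stays 0
    for _ in range(sweeps):
        for k in range(j - 1, -1, -1):
            tot = wt[k] + wt[k + 1]
            left = t[k - 1] if k > 0 else 0.0   # w_0 = 0 kills this term at k = 0
            t[k] = 1.0 + (wt[k] / tot) * left + (wt[k + 1] / tot) * t[k + 1]
    return t[i]

def hit_time_formula(w, i, j):
    """Proposition 5.3(b): E_i T_j = (j-i) + 2 sum_{v=i+1}^j sum_{u<v} w_u/w_v."""
    return (j - i) + 2 * sum(w[u] / w[v]
                             for v in range(i + 1, j + 1)
                             for u in range(1, v))

w = [0, 3, 1, 4, 2, 5]                      # w[v] = weight of edge (v-1, v)
print(abs(hit_time_direct(w, 1, 4) - hit_time_formula(w, 1, 4)) < 1e-6)
```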
Finally, (c) can be deduced from (b), but it is more elegant to use
the essential edge lemma to get
$$E_{v-1} T_v + E_v T_{v-1} = \frac{w}{w_v} \tag{5.12}$$
and then use
$$E_i T_j + E_j T_i = \sum_{v=i+1}^{j}\big(E_{v-1} T_v + E_v T_{v-1}\big).$$
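Similarly, the commute-time identity in (c) can be checked by solving the two hitting-time systems directly; a sketch with arbitrary illustrative weights:

```python
def commute_direct(w, i, j, sweeps=40000):
    """E_i T_j + E_j T_i for the weighted-linear-graph walk on vertices
    0..n-1, by iteratively solving each hitting-time system."""
    n = len(w)
    wt = [float(x) for x in w] + [0.0]      # append w_n = 0

    def hit(start, target):
        t = [0.0] * n                       # t[target] stays 0
        for _ in range(sweeps):
            for k in range(n):
                if k == target:
                    continue
                tot = wt[k] + wt[k + 1]
                left = t[k - 1] if k > 0 else 0.0
                right = t[k + 1] if k + 1 < n else 0.0
                t[k] = 1.0 + (wt[k] / tot) * left + (wt[k + 1] / tot) * right
        return t[start]

    return hit(i, j) + hit(j, i)

w = [0, 2, 1, 3, 5]                         # edge weights w_1..w_4; n = 5
i, j = 0, 3
formula = 2 * sum(w[1:]) * sum(1.0 / w[v] for v in range(i + 1, j + 1))
print(abs(commute_direct(w, i, j) - formula) < 1e-5)
```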
We now start some little calculations relating to the parameters
discussed in Chapter 4.
Plainly, from Proposition 5.3(c),
$$\max_{i,j}\,(E_i T_j + E_j T_i) = E_0 T_{n-1} + E_{n-1} T_0 = w\sum_{v=1}^{n-1} w_v^{-1}. \tag{5.13}$$
Next, consider calculating $E_\pi T_i$.
We could use Proposition 5.3(b),
but instead let us apply Theorem yyy of Chapter 3,
giving $E_\pi T_i$ in terms of unit flows from $\pi$ to $i$.
In a linear graph there is only one such flow, which for $v \le i$
has flow $\sum_{u=0}^{v-1}\pi_u$ through edge $(v-1, v)$,
and for $v > i$ has flow $\sum_{u=v}^{n-1}\pi_u$ through edge $(v, v-1)$,
and so the Theorem implies
$$E_\pi T_i = w\sum_{v=1}^{i}\frac{1}{w_v}\Big(\sum_{u=0}^{v-1}\pi_u\Big)^{2} + w\sum_{v=i+1}^{n-1}\frac{1}{w_v}\Big(\sum_{u=v}^{n-1}\pi_u\Big)^{2}. \tag{5.14}$$
There are several ways to use the preceding results to compute the
average hitting time parameter $\tau_0 := \sum_i \pi_i E_\pi T_i$. Perhaps the most elegant is
$$\tau_0 = w\sum_{v=1}^{n-1}\frac{1}{w_v}\Big(\Pi_v^2(1-\Pi_v) + (1-\Pi_v)^2\,\Pi_v\Big) = w\sum_{v=1}^{n-1}\frac{\Pi_v(1-\Pi_v)}{w_v}, \tag{5.15}$$
where $\Pi_v := \sum_{u=0}^{v-1}\pi_u$ and the first equality comes from substituting (5.14) and interchanging the order of summation.
There are sophisticated methods (see Notes) of studying $\tau_2$,
but let us just point out that Proposition 5.23 later
(proved in the more general context of trees) holds in the present setting,
giving a bound (5.16) on the relaxation time $\tau_2$.
We do not know an explicit formula for $\tau_2$, but
we can get an upper bound easily from
the "distinguished paths" result Chapter 4 yyy.
For $i < j$ the path $\gamma_{ij}$ uses exactly the edges $i+1, \ldots, j$,
and hence the bound is
$$\tau_2 \le \max_{1\le v\le n-1}\ \sum_{i < v}\ \sum_{j \ge v}\pi_i\pi_j\sum_{u=i+1}^{j}\frac{w}{w_u}. \tag{5.17}$$
jjj This uses the Diaconis–Stroock version. The Sinclair version is
$$\tau_2 \le \max_{1\le v\le n-1}\ \frac{w}{w_v}\sum_{i < v}\ \sum_{j \ge v}\pi_i\pi_j\,(j-i).$$
xxx literature on (van Doorn, etc.)
jjj Also relevant is work of N. Kahale (and others) on how optimal choice of
weights in use of Cauchy–Schwarz inequality for
Diaconis–Stroock–Sinclair leads to equality in case of birth-and-death
chains.
jjj See also Diaconis and Saloff-Coste Metropolis paper, which mentions
work of Diaconis students on Metropolizing birth-and-death chains.
xxx examples of particular w. jjj might just bring up as
needed?
xxx contraction principle and lower bounds on $\tau_2$ (relating to current
Section 6 of Chapter 4)
By Chapter 4 Lemma yyy,
$$\tau_c = \max_{1\le v\le n-1}\frac{w\,\Pi_v(1-\Pi_v)}{w_v}, \qquad \Pi_v := \sum_{u=0}^{v-1}\pi_u. \tag{5.18}$$
5.1.3 Useful examples of one-dimensional chains
Example 5.4
The two-state chain.
This is the birth-and-death chain on $\{0, 1\}$ with $p_{01} = p$ and $p_{10} = q$, where $p$ and $q$ are
arbitrarily specified (with $0 < p, q < 1$). Since the holding probabilities $1-p$ and $1-q$ are positive, this does
not quite fit into the framework of Section 5.1.2, but everything is
nonetheless easy to calculate. The stationary distribution is given by
$$\pi_0 = \frac{q}{p+q}, \qquad \pi_1 = \frac{p}{p+q}.$$
In discrete time, the eigenvalues are $\lambda_1 = 1$ and $\lambda_2 = 1-p-q$, and in the notation of Chapter 3, Section yyy
for the spectral representation, the normalized right eigenvectors are
$$u_1 = (1, 1), \qquad u_2 = \Big(\sqrt{p/q},\ -\sqrt{q/p}\Big),$$
normalized so that $\sum_i \pi_i u_m^2(i) = 1$.
The transition probabilities are given by
$$p_{00}^{(t)} = \frac{q}{p+q} + \frac{p}{p+q}(1-p-q)^t, \qquad p_{01}^{(t)} = \frac{p}{p+q}\big(1 - (1-p-q)^t\big),$$
$$p_{10}^{(t)} = \frac{q}{p+q}\big(1 - (1-p-q)^t\big), \qquad p_{11}^{(t)} = \frac{p}{p+q} + \frac{q}{p+q}(1-p-q)^t$$
in discrete time and by
$$p_{00}(t) = \frac{q}{p+q} + \frac{p}{p+q}\,e^{-(p+q)t}, \qquad p_{01}(t) = \frac{p}{p+q}\big(1 - e^{-(p+q)t}\big),$$
$$p_{10}(t) = \frac{q}{p+q}\big(1 - e^{-(p+q)t}\big), \qquad p_{11}(t) = \frac{p}{p+q} + \frac{q}{p+q}\,e^{-(p+q)t}$$
in continuous time.
It is routine to calculate $E_0 T_1 = 1/p$, $E_1 T_0 = 1/q$, and
$$E_\pi T_0 = \frac{p}{q(p+q)}, \qquad E_\pi T_1 = \frac{q}{p(p+q)},$$
and then
$$\tau_0 = \pi_0\,E_\pi T_0 + \pi_1\,E_\pi T_1 = \frac{1}{p+q}, \qquad \tau_2 = \frac{1}{1-\lambda_2} = \frac{1}{p+q},$$
and
$$\tau^* = E_0 T_1 + E_1 T_0 = \frac{1}{p} + \frac{1}{q}.$$
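The spectral formula for $p_{00}^{(t)}$ is easy to confirm against brute-force matrix powers; a minimal sketch:

```python
def two_state_p00(p, q, t):
    """p_{00}^{(t)} for the two-state chain, by repeated matrix multiplication."""
    P = [[1 - p, p], [q, 1 - q]]
    M = [[1.0, 0.0], [0.0, 1.0]]            # identity; M becomes P^t
    for _ in range(t):
        M = [[sum(M[i][k] * P[k][j] for k in range(2)) for j in range(2)]
             for i in range(2)]
    return M[0][0]

p, q, t = 0.3, 0.2, 7
# spectral formula: p_00^(t) = q/(p+q) + (p/(p+q)) (1-p-q)^t
spectral = q / (p + q) + (p / (p + q)) * (1 - p - q) ** t
print(abs(two_state_p00(p, q, t) - spectral) < 1e-12)
```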
Example 5.5
Biased random walk with reflecting barriers.
We consider the chain on $\{0, 1, \ldots, n-1\}$ with reflecting barriers at $0$
and $n-1$ that at each unit of time moves distance $1$ rightward with
probability $p$ and distance $1$ leftward with probability $1-p$.
Formally, the setting is that of Section 5.1.2 with
$$w_i = \rho^{i}, \qquad \rho := \frac{p}{1-p},$$
where we assume $\tfrac12 < p < 1$, and all asymptotics developed for this
example are for fixed $p$ and large $n$. If $p \neq \tfrac12$, there is by
symmetry no loss of generality in assuming $p > \tfrac12$, and the case $p = \tfrac12$ will be treated later in Example 5.8.
Specializing the results
of Section 5.1.2 to the present example, one can easily derive the
asymptotic results
$$E_0 T_{n-1} \sim \frac{n}{2p-1}, \qquad E_{n-1} T_0 \sim \frac{2\rho^{n}}{(\rho-1)^2} \tag{5.19}$$
and, by use of (5.15),
$$\tau_0 \sim \frac{n}{2p-1}. \tag{5.20}$$
For this example, carrying out the maximization over $v$ in (5.18)
leads to
$$\tau_c \to \frac{1}{2p-1} \quad \text{as } n \to \infty. \tag{5.21}$$
The spectral representation can be obtained using the orthogonal polynomial
techniques described in Karlin and Taylor [209] Chapter 10; see
especially Section 5(b) there. The reader may verify that the eigenvalues
of $P$ in discrete time are $\lambda_1 = 1$, $\lambda_n = -1$, and, for $2 \le j \le n-1$,
$$\lambda_j = 2\sqrt{p(1-p)}\,\cos\theta_j, \qquad \theta_j := \frac{\pi(j-1)}{n-1},$$
with (unnormalized) right eigenvector of the form
$$u_j(i) = \rho^{-i/2}\big(A_j\cos(i\theta_j) + B_j\sin(i\theta_j)\big),$$
where the constants $A_j$, $B_j$ are determined by the reflecting boundary conditions.
In particular,
$$\tau_2 = \Big(1 - 2\sqrt{p(1-p)}\,\cos\frac{\pi}{n-1}\Big)^{-1} \to \Big(1 - 2\sqrt{p(1-p)}\Big)^{-1}. \tag{5.22}$$
The random walk has drift $2p-1$.
It is not hard to show, for fixed $p > \tfrac12$, that the distances
$d(cn)$ and $\bar{d}(cn)$ of Chapter 4 yyy
converge to $1$ if $c < (2p-1)^{-1}$
and to $0$ if $c > (2p-1)^{-1}$.
jjj include details? In fact, the cutoff occurs at $(2p-1)^{-1}n$: cf. (e.g.) Example 4.46 in [113]. Continue same paragraph:
In particular,
$$\tau_1 \sim \frac{n}{2p-1}. \tag{5.23}$$
Example 5.6
The M/M/1/($n-1$) queue.
We consider the M/M/1/($n-1$) queue. Customers queue up at a
facility to wait for a single server (hence the "1") and are handled
according to a "first come, first served" queuing discipline. The first
"M" specifies that the arrival point process is Markovian, i.e., a Poisson
process with intensity parameter $\lambda$ (say); likewise, the second "M"
reflects our assumption that the service times are exponential with
parameter $\mu$ (say). The parameter $n-1$ is the queue size limit;
customers arriving when the queue is full are turned away.
We have described a continuous-time birth-and-death process on $\{0, 1, \ldots, n-1\}$ with constant
birth and death rates $\lambda$ and $\mu$, respectively. If $\lambda + \mu = 1$, this is nearly the continuized biased random walk of
Example 5.5, the only difference being in the boundary
behavior. In particular, one can check that the asymptotics
in (5.19)–(5.23) remain unchanged, where $\rho := \lambda/\mu$, called the traffic intensity, remains fixed and $n$
becomes large. For the M/M/1/($n-1$) queue, the stationary
distribution is the conditional distribution of $Z$ given $Z \le n-1$,
where $Z$ has the Geometric($\rho$) distribution. The eigenvalues are
$0$ and, for $1 \le j \le n-1$,
$$\lambda + \mu - 2\sqrt{\lambda\mu}\,\cos\frac{\pi j}{n},$$
with (unnormalized) right eigenvector
$$u_j(i) = \rho^{-i/2}\big(A_j\cos(i\theta_j) + B_j\sin(i\theta_j)\big), \qquad \theta_j := \frac{\pi j}{n},$$
for constants $A_j$, $B_j$ determined by the boundary conditions.
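The claimed eigenvalues can be checked by evaluating the characteristic polynomial of the tridiagonal generator at each of them, using the standard three-term determinant recurrence; a sketch with illustrative rates (the specific values of $\lambda$, $\mu$, $n$ are assumptions for the test):

```python
import math

def charpoly_minus_Q(lam, mu, n, x):
    """det(-Q - x I) for the M/M/1 generator Q on n states, via the
    three-term recurrence for tridiagonal determinants."""
    d = [lam] + [lam + mu] * (n - 2) + [mu]   # diagonal of -Q
    f_prev, f = 1.0, d[0] - x                 # off-diagonal product is lam*mu
    for k in range(1, n):
        f_prev, f = f, (d[k] - x) * f - lam * mu * f_prev
    return f

lam, mu, n = 2.0, 3.0, 6
claimed = [lam + mu - 2 * math.sqrt(lam * mu) * math.cos(math.pi * j / n)
           for j in range(1, n)]
# each claimed eigenvalue should be a root of the characteristic polynomial
print(all(abs(charpoly_minus_Q(lam, mu, n, x)) < 1e-6 for x in claimed))
```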