As mentioned in Chapter 9 section 5.1 (yyy version 9/1/99), there has been intense theoretical study of the problem of sampling uniformly from a convex set in $R^{d}$, in the $d\to\infty$ limit. This problem turns out to be essentially equivalent to the problem of sampling from a log-concave density $f$, that is, a density of the form $f(x)\propto\exp(-H(x))$ for convex $H$. The results are not easy to state; see Bubley et al [78] for discussion.
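A generic way to sample from such a log-concave density is random-walk Metropolis; the sketch below is only an illustration of the setting (the function names and step-size choice are ours), not one of the specialized algorithms analyzed in the literature cited above. Note that uniform sampling from a convex set is the degenerate case where $H$ is $0$ on the set and $+\infty$ outside it.

```python
import math
import random

def metropolis_log_concave(H, x0, step=0.3, n_steps=1000):
    """Random-walk Metropolis chain targeting f(x) proportional to exp(-H(x)).

    H : convex function on R^d (so f is log-concave); x0 : starting point.
    Returns the state after n_steps (an approximate sample from f).
    """
    x = list(x0)
    for _ in range(n_steps):
        # Propose a uniform perturbation of each coordinate.
        y = [xi + random.uniform(-step, step) for xi in x]
        # Accept with probability min(1, f(y)/f(x)) = min(1, exp(H(x) - H(y))).
        if math.log(random.random()) < H(x) - H(y):
            x = y
    return x
```

For example, taking $H(x)=|x|^{2}/2$ makes the target the standard Gaussian on $R^{d}$.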

Here is a special setting in which one can make rigorous inferences from MCMC without rigorous bounds on mixing times. Suppose we have a guess $\hat{\tau}$ at the relaxation time of a Markov sampler from a target distribution $\pi$, and suppose we have some separate method of sampling exactly from $\pi$, where the cost of one exact sample is larger than the cost of $\hat{\tau}$ steps of the Markov sampler. In this setting it is natural to take $m$ exact samples and use them as the initial states of $m$ multiple runs of the Markov sampler. It turns out (see [5] for a precise statement) that one can obtain confidence intervals for a mean $\bar{g}$ which are always rigorously correct (without assumptions on $\tau_{2}$) and which, if $\hat{\tau}$ is indeed approximately $\tau_{2}$, have optimal length, that is, the length which would be implied by this value of $\tau_{2}$.
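The multiple-runs scheme above can be sketched as follows. This is only an illustration of the idea, not the procedure of [5]: the target $\pi$ is taken to be a standard Gaussian (so exact sampling is trivial), the Markov sampler is a generic Metropolis step, and the interval shown is the ordinary normal-theory interval across run means rather than the rigorously correct construction of [5]. All names and tuning constants are ours.

```python
import math
import random
import statistics

def markov_step(x, step=0.5):
    """One Metropolis step targeting pi = N(0,1) (stand-in Markov sampler)."""
    y = x + random.uniform(-step, step)
    # Accept with probability min(1, pi(y)/pi(x)) = min(1, exp((x^2 - y^2)/2)).
    if math.log(random.random()) < (x * x - y * y) / 2:
        return y
    return x

def exact_sample():
    """Separate method of sampling exactly from pi (here pi = N(0,1))."""
    return random.gauss(0.0, 1.0)

def estimate_mean(g, m=50, tau_hat=20, run_length_factor=10):
    """m exact samples serve as initial states of m independent runs.

    Each run lasts a multiple of the guessed relaxation time tau_hat;
    the spread of the m run means yields a confidence interval for
    gbar = E_pi[g].
    """
    run_means = []
    for _ in range(m):
        x = exact_sample()          # run starts exactly at stationarity
        n = run_length_factor * tau_hat
        total = 0.0
        for _ in range(n):
            x = markov_step(x)
            total += g(x)
        run_means.append(total / n)
    gbar = statistics.mean(run_means)
    se = statistics.stdev(run_means) / math.sqrt(m)
    return gbar, (gbar - 2 * se, gbar + 2 * se)  # approx. 95% interval
```

Because each run is started with an exact draw from $\pi$, the run means are i.i.d. unbiased estimates of $\bar{g}$ regardless of whether $\hat{\tau}$ is correct; the guess $\hat{\tau}$ only affects the interval's length, via the run length.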

