STAT 135, SPRING '12
A. Adhikari
PROBABILITY PREREQUISITES
Please make sure that you have a solid grasp of the following ideas and results from your probability class. I have included references from Probability by Jim Pitman and from the early chapters of our own class text (Mathematical Statistics and Data Analysis, by John Rice). Many of you will have used Pitman's book as the text in your Stat 134 class.
I have also added references from Probability Theory by Hoel, Port, and Stone (I will call it HPS), as a few of you have used that text. If you have used yet another text, please consult its index to find the topics below.
Expectation: properties (especially linearity), and the method of indicators. Pitman pages 180-181 are excellent, as is the treatment of the method of indicators on page 168; Rice Section 4.1 is long, but it contains all you need. In HPS see Theorem 2 of Section 4.2; I don't see the method of indicators in that text.
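To test yourself on the method of indicators, here is a minimal Python sketch (my own illustration, not taken from any of the texts). It checks, by brute force, the classic indicator calculation: a random permutation of n items has, on average, exactly 1 fixed point, because the count of fixed points is a sum of n indicators each with expectation 1/n.

```python
# Method of indicators: write a count X as a sum of indicators, so that
# E[X] = sum of P(indicator = 1) by linearity of expectation.
from itertools import permutations

# Example: X = number of fixed points of a random permutation of n items.
# X = I_1 + ... + I_n, where I_j = 1 if item j lands in position j.
# E[I_j] = 1/n, so E[X] = n * (1/n) = 1 for every n.

def expected_fixed_points(n):
    """Exact E[X], by enumerating all n! equally likely permutations."""
    perms = list(permutations(range(n)))
    total = sum(sum(1 for j, v in enumerate(p) if j == v) for p in perms)
    return total / len(perms)

for n in (2, 3, 4, 5):
    # Matches the indicator calculation: E[X] = 1 regardless of n.
    assert abs(expected_fixed_points(n) - 1.0) < 1e-12
```

Note that the dependence among the indicators is irrelevant here: linearity of expectation needs no independence.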
Variance: properties,
especially variances of linear transformations, variances of sums and
the relation
with independence, and the square root law. Pitman Section 3.3 is
great and has diagrams that indicate why many of the results are true.
Rice Section 4.2 also has everything but is less visual.
HPS has
the definition and a short discussion of variance on pages 93-94,
variances of sums in Section 4.4 on pages 96-99, and Chebychev's
inequality in Section 4.6 on pages 100-101. See Example 12 on
page 98, on the hypergeometric distribution. See how they write the
finite population correction.
If you have access to Pitman's text, then to test your understanding
please go over the calculation of the mean
and variance of the hypergeometric distribution on pages
241-243. This will remind you of the finite population correction.
Also please study the Summary of
Chapter 3, pages 248-249 of Pitman.
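The finite population correction can also be checked numerically. The sketch below (my own, with a made-up small population) enumerates every simple random sample of a fixed size and confirms the hypergeometric mean and variance formulas, including the correction factor (N-n)/(N-1) relative to sampling with replacement.

```python
# Variance of a sum under dependence: sampling without replacement.
# With replacement (binomial): Var(S_n) = n*p*(1-p).
# Without replacement (hypergeometric): Var(S_n) = n*p*(1-p)*(N-n)/(N-1).
from itertools import combinations

N, G, n = 10, 4, 3          # population size, "good" items, sample size (arbitrary)
p = G / N

# Exact mean and variance of the number of good items in a simple random
# sample, by enumerating all C(N, n) equally likely samples.
samples = list(combinations(range(N), n))
counts = [sum(1 for i in s if i < G) for s in samples]  # items 0..G-1 are "good"
mean = sum(counts) / len(counts)
var = sum((c - mean) ** 2 for c in counts) / len(counts)

fpc = (N - n) / (N - 1)     # finite population correction
assert abs(mean - n * p) < 1e-9
assert abs(var - n * p * (1 - p) * fpc) < 1e-9
```

Because the correction factor is less than 1, sampling without replacement gives a smaller variance than the corresponding binomial.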
Covariance: this
appears in the calculation of the variance of a sum of dependent
random variables. Pitman 6.4 (pages 430-431), Rice Section 4.3
(pages 138-139), HPS Section 4.5, pages 99-100.
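As a quick check of the identity Var(X+Y) = Var(X) + Var(Y) + 2Cov(X,Y), here is a sketch over a small joint distribution; the pmf values are made up purely for illustration.

```python
# Variance of a sum of dependent variables:
# Var(X + Y) = Var(X) + Var(Y) + 2*Cov(X, Y).
# Verify it exactly on a small (hypothetical) joint pmf over (x, y) pairs.
dist = {(0, 0): 0.2, (0, 1): 0.3, (1, 0): 0.1, (1, 1): 0.4}

def E(f):
    """Expectation of f(X, Y) under the joint pmf."""
    return sum(p * f(x, y) for (x, y), p in dist.items())

ex, ey = E(lambda x, y: x), E(lambda x, y: y)
vx = E(lambda x, y: (x - ex) ** 2)
vy = E(lambda x, y: (y - ey) ** 2)
cov = E(lambda x, y: (x - ex) * (y - ey))
vsum = E(lambda x, y: (x + y - ex - ey) ** 2)
assert abs(vsum - (vx + vy + 2 * cov)) < 1e-12
```

When X and Y are independent the covariance term vanishes, which recovers the addition rule for variances of independent sums.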
The Law of Large Numbers (a.k.a. the law of averages): As n increases, the average of n i.i.d. random variables converges (in a particular sense, and under certain conditions) to the common expectation of the variables. Pitman page 195, Rice page 178, HPS page 102. Going over this is especially useful because it forces you to recall the expectation and variance of the mean of an i.i.d. sample.
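A short simulation makes the statement concrete (a sketch of mine, assuming Uniform(0,1) draws; the seed and sample sizes are arbitrary):

```python
# Law of Large Numbers: the sample mean of n i.i.d. draws settles near
# the common expectation as n grows.
import random

random.seed(0)

def sample_mean(n):
    """Mean of n i.i.d. Uniform(0, 1) draws; E = 1/2, Var = (1/12)/n."""
    return sum(random.random() for _ in range(n)) / n

# The variance of the mean shrinks like 1/n, so the typical deviation
# from 1/2 shrinks like 1/sqrt(n) -- the square root law again.
for n in (10, 1000, 100000):
    print(n, sample_mean(n))
```

Running this, the printed means hug 0.5 more and more tightly as n grows, exactly as the 1/sqrt(n) rate predicts.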
THE CENTRAL LIMIT THEOREM (yes, it's in bold upper case enlarged font).
The normal approximation to the probability distribution of the
sum of a large number of i.i.d. random variables with finite variance.
Pitman's treatment is, in my view, more accessible than Rice's.
Read Pitman page 196 and the example on page 197. Go over
the pictures on pages 199-201. Then read page 268. Go back and
thoroughly absorb the special case of the normal approximation to the
binomial, on pages 98-102.
HPS has a clear treatment in Section 7.5, pages
183-187.
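To see the approximation in action, the sketch below (my own, not from the texts) compares the exact Binomial(100, 1/2) cdf with the normal approximation, continuity correction included; the parameter choices are arbitrary.

```python
# Normal approximation to the binomial: for S ~ Binomial(n, p),
# P(S <= k) is close to Phi((k + 0.5 - n*p) / sqrt(n*p*(1-p))),
# where Phi is the standard normal cdf and 0.5 is the continuity correction.
import math

def binom_cdf(n, p, k):
    """Exact P(S <= k) for S ~ Binomial(n, p)."""
    return sum(math.comb(n, j) * p**j * (1 - p)**(n - j) for j in range(k + 1))

def normal_approx(n, p, k):
    mu, sigma = n * p, math.sqrt(n * p * (1 - p))
    z = (k + 0.5 - mu) / sigma                      # continuity correction
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))   # Phi(z) via erf

n, p, k = 100, 0.5, 55
exact = binom_cdf(n, p, k)
approx = normal_approx(n, p, k)
assert abs(exact - approx) < 0.005  # the two cdf values agree closely
```

The continuity correction matters: dropping the 0.5 noticeably worsens the agreement for moderate n.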
Change of variable for densities:
The density of a function of a random variable whose density you
already know. The most commonly used case in Stat 135 is the linear
function. Read the para above the box at the bottom of page 302 of
Pitman, and then the formula in the box will be easy to rederive from
first principles should you forget it. One-to-one non-linear functions
are covered on page 304; see pages 306-307 for what to do when the
function is not one-to-one. The diagrams are very helpful for
understanding the derivations, and if you remember the picture clearly
then you can rederive the formula yourself if necessary.
Rice has the general one-to-one case in Proposition B on page 62. HPS does it on page 119. The special case of the linear function is in Example 8 on page 121. Neither of these texts provides a diagram.
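As a reminder of the first-principles derivation in the linear case Y = aX + b (my own summary of the standard cdf argument, not quoted from any of the texts):

```latex
% Linear change of variable, Y = aX + b with a > 0:
% compute the cdf of Y, then differentiate.
F_Y(y) = P(aX + b \le y) = P\!\left(X \le \tfrac{y-b}{a}\right)
       = F_X\!\left(\tfrac{y-b}{a}\right),
\qquad
f_Y(y) = \frac{d}{dy}\,F_Y(y) = \frac{1}{a}\, f_X\!\left(\tfrac{y-b}{a}\right).
% For a < 0 the inequality flips, and the two cases combine into
f_Y(y) = \frac{1}{|a|}\, f_X\!\left(\frac{y-b}{a}\right), \qquad a \neq 0.
```

The 1/|a| factor is exactly the stretch (or compression) of the density that keeps total probability equal to 1.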
Special distributions: I expect
you to be thoroughly familiar with properties of the following
distributions: Bernoulli (that is, the distribution of an indicator),
binomial, hypergeometric, geometric, negative binomial, uniform (both
discrete and continuous), exponential and gamma, beta, normal (in
one and two dimensions).