STAT 135  FALL 06
A. Adhikari

PROBABILITY PREREQUISITES

Please make sure that you thoroughly recall the following ideas and results from your probability class.  I have included references from Probability by Jim Pitman, and from the early chapters of our own text by Rice.  Many of you will have used Pitman's book as the text in your Stat 134 class.  I have added references from Probability Theory by Hoel, Port, and Stone (I will call it HPS), as a few of you have used that text.  If you have used yet another text, please use that book's index to find the topics below.

Expectation: properties (especially linearity), and the method of indicators.  Pitman pages 180-181 are great, as is the method of indicators on page 168; Rice Section 4.1 is long but contains all you need.  In HPS see Theorem 2 of Section 4.2.  I don't see the method of indicators in that text.
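
For quick reference (writing E for expectation): for any random variables X1, ..., Xn, whether independent or not,

    E(X1 + ... + Xn) = E(X1) + ... + E(Xn).

The method of indicators uses this with indicator variables: if X counts how many of the events A1, ..., An occur, then X = I1 + ... + In where Ij is the indicator of Aj, and so E(X) = P(A1) + ... + P(An).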

Variance: properties, especially linear transformations, variances of sums and the relation with independence, and the square root law.  Pitman Section 3.3 is great and has diagrams that indicate why many of the results are true.  Rice Section 4.2 has everything but is less visual.  HPS has the definition and a short discussion of variance on pages 93-94, variances of sums in Section 4.4 on pages 96-99, and Chebychev's inequality in Section 4.6 on pages 100-101.  See Example 12 on page 98, on the hypergeometric distribution, and note how they write the finite population correction.
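
For quick reference (writing Var for variance and SD for standard deviation): for constants a and b,

    Var(aX + b) = a^2 Var(X),    so    SD(aX + b) = |a| SD(X).

If X and Y are independent then Var(X + Y) = Var(X) + Var(Y); in particular, for the sum Sn of n i.i.d. random variables each with standard deviation sigma,

    SD(Sn) = sqrt(n) sigma,

which is the square root law.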

If you have access to Pitman's text, test your understanding by going over the calculation of the mean and variance of the hypergeometric distribution on pages 241-243.  This will remind you of the finite population correction.
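
In one common parametrization (your text's notation may differ): draw a simple random sample of size n, without replacement, from a population of N individuals in which a proportion p are good.  If X is the number of good individuals in the sample, then

    E(X) = np    and    Var(X) = np(1-p) (N-n)/(N-1).

The factor (N-n)/(N-1) is the finite population correction; without it you would have the binomial variance np(1-p), which corresponds to sampling with replacement.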

Also please read the Summary of Chapter 3, pages 248-249 of Pitman.

Covariance:  the term that appears in the calculation of the variance of the sum of dependent random variables.  Pitman 6.4 (pages 430-431), Rice Section 4.3 (pages 138-139), HPS Section 4.5, pages 99-100.
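
For quick reference: Cov(X,Y) = E[(X - E(X))(Y - E(Y))] = E(XY) - E(X)E(Y), and

    Var(X + Y) = Var(X) + Var(Y) + 2Cov(X,Y).

When X and Y are independent, Cov(X,Y) = 0, and the formula reduces to the addition rule for variances.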

The Law of Large Numbers (a.k.a. the law of averages):  As n increases, the average of n i.i.d. random variables converges (in a particular sense, and under certain conditions) to the common expectation of the variables.  Pitman page 195, Rice page 178, HPS page 102.  Going over this is especially useful because it forces you to recall the expectation and variance of the mean of an i.i.d. sample.
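
In particular, if X1, ..., Xn are i.i.d. with mean mu and standard deviation sigma, and Xbar denotes their average, then

    E(Xbar) = mu,    Var(Xbar) = sigma^2/n,    SD(Xbar) = sigma/sqrt(n).

The SD of the average shrinks like 1/sqrt(n), which is why the average settles down near mu as n grows.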

THE CENTRAL LIMIT THEOREM (yes, it's in bold upper case enlarged font).  The normal approximation to the probability distribution of the sum of a large number of i.i.d. random variables with finite variance.  Pitman's treatment is, in my view, much more accessible than Rice's.  Read Pitman page 196 and the example on page 197.  Go over the pictures on pages 199-201.  Then read page 268.  Go back and thoroughly absorb the special case of the normal approximation to the binomial, on pages 98-102.  You will find many calculations that are similar to what we've done in the first couple of lectures.  HPS has a clear treatment in Section 7.5, pages 183-187.
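
For quick reference: if Sn is the sum of n i.i.d. random variables, each with mean mu and finite standard deviation sigma, then for large n

    (Sn - n mu) / (sigma sqrt(n))    has approximately the standard normal distribution.

In the binomial special case, with Sn the number of successes in n independent trials each with success probability p, this says that (Sn - np)/sqrt(np(1-p)) is approximately standard normal.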