STATISTICS 135, FALL 06
Ani Adhikari
WHAT I EXPECT YOU TO KNOW "COLD" (that is, without having to look it up)
Probability prerequisites. Basics from Stat 134, with particular emphasis on:
1. E and Var of aX+bY+c, whatever the nature of the dependence between X and Y; Cov(aX+bY, cW+dZ). (These identities are written out at the end of this list.)
2. Cdfs and densities; the change-of-variable formula for densities, especially the linear case.
3. Law of Large Numbers and the Central Limit Theorem.
4. For the binomial, Poisson, normal, and exponential: all basic facts.
a) Probability/density function
b) Cdf if there's a nice form
c) E and Var
d) Relations and functions: e.g. sum of independent Poissons is Poisson; properties of binomial proportion
e) Limits
5. For the gamma:
a) Density function and how to work with it, e.g. how to find moments.
b) Relations
c) Limits
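
For reference, the identities from item 1 written out; they hold whatever the dependence between the variables:

    E(aX + bY + c) = aE(X) + bE(Y) + c
    Var(aX + bY + c) = a²Var(X) + b²Var(Y) + 2ab·Cov(X, Y)
    Cov(aX + bY, cW + dZ) = ac·Cov(X, W) + ad·Cov(X, Z) + bc·Cov(Y, W) + bd·Cov(Y, Z)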
Material covered in this class
1. All terminology: every defined term, e.g. "unbiased", "significance level", "Fisher information", "composite hypothesis", etc.
2. Delta method for approximating E and Var of functions of a familiar random variable. I suggest that rather than memorizing the approximations, you just learn how to derive them by Taylor expansion. The only thing to remember is that for E you go out to the square term in the expansion, whereas for Var the linear term is enough.
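
As a sketch of that derivation: if X has mean μ and variance σ², and g is smooth near μ, Taylor expansion about μ gives

    g(X) ≈ g(μ) + g′(μ)(X − μ) + (1/2)g″(μ)(X − μ)²

Take expectations; the linear term has expectation zero, so

    E[g(X)] ≈ g(μ) + (1/2)g″(μ)σ²

For the variance the linear term is enough:

    Var[g(X)] ≈ [g′(μ)]²σ²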
In what follows, "estimation" means: the estimate itself, its distribution (or asymptotic distribution) including expectation and variance, and confidence intervals and their interpretation. I have listed other issues in each particular case.
3. Estimation of the population mean and variance when sampling with replacement from a finite population. You should have a good intuitive idea of what changes when the sampling is without replacement, but you don't have to remember all the additional factors that enter into the standard errors.
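
To fix notation for the with-replacement case: draws X₁, ..., Xₙ made with replacement are i.i.d., so if the population has mean μ and variance σ²,

    X̄ = (1/n)(X₁ + ... + Xₙ),   E(X̄) = μ,   Var(X̄) = σ²/n

Hence SE(X̄) = σ/√n, estimated by plugging in the sample standard deviation, and by the Central Limit Theorem X̄ ± 1.96·SE is an approximate 95% confidence interval for μ.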
4. Method of moments estimation: the procedure and convergence properties.
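
A quick illustration of the procedure, using the gamma from the prerequisites: equate population moments to sample moments and solve for the parameters. For the gamma(α, λ), E(X) = α/λ and Var(X) = α/λ²; setting α/λ = X̄ and α/λ² = σ̂², where σ̂² = (1/n)Σ(Xᵢ − X̄)², gives

    λ̂ = X̄/σ̂²,   α̂ = X̄²/σ̂²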
5. Maximum likelihood estimation: how to find the likelihood; maximizing it; Fisher information; asymptotic normality (including asymptotic mean and variance); relation to the Cramér-Rao lower bound on variance.
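
In symbols, for an i.i.d. sample X₁, ..., Xₙ from f(x | θ), all under the usual regularity conditions:

    log likelihood:     ℓ(θ) = Σᵢ log f(Xᵢ | θ);  the MLE θ̂ maximizes ℓ(θ)
    Fisher information: I(θ) = −E[∂²/∂θ² log f(X | θ)] = E[(∂/∂θ log f(X | θ))²]
    asymptotics:        θ̂ is approximately N(θ₀, 1/(nI(θ₀))) for large n, where θ₀ is the true value
    Cramér-Rao:         any unbiased estimator T satisfies Var(T) ≥ 1/(nI(θ))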
6. Testing hypotheses: all terminology. The likelihood ratio test, including the Neyman-Pearson lemma on power; the generalized likelihood ratio test. All parametric tests for population means. Calculation of power. Important: the assumptions! In a given situation, should you do a test at all? If so, which one?
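
As one standard example of a power calculation: to test H₀: μ = μ₀ against H₁: μ > μ₀ for a normal mean with known σ, the level-α z-test rejects when (X̄ − μ₀)/(σ/√n) > z(α), where z(α) is the upper-α point of the standard normal. At an alternative μ₁ > μ₀ the power is

    P(reject | μ = μ₁) = Φ( (μ₁ − μ₀)√n/σ − z(α) )

writing Φ for the standard normal cdf; the power increases to 1 as n grows or as μ₁ moves away from μ₀.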
7. Facts about the easy standard distributions (binomial, Poisson, normal, exponential): In how many ways can you estimate the parameters? Do the estimates differ based on the method of estimation? What are the properties of the estimates? In the case of the normal and the binomial, what are the standard tests of hypotheses about the parameters?
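
As an example of the kind of comparison intended here: for the binomial, the method of moments and maximum likelihood both give p̂ = X/n, which is unbiased with Var(p̂) = p(1 − p)/n, and the standard large-sample test of H₀: p = p₀ is based on

    z = (p̂ − p₀) / √(p₀(1 − p₀)/n)

which is approximately standard normal under H₀ when n is large.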