 Let X be the number rolled with a fair die.
 Question: What is the expected value of X?
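A quick numerical check of the definition E(X) = Σ_x x·P(X=x) (a Python sketch; the only assumption is six equally likely faces):

```python
# Expected value of a fair die: E[X] = sum of x * P(X = x) over the faces.
faces = [1, 2, 3, 4, 5, 6]
E_X = sum(x * (1 / 6) for x in faces)
print(E_X)  # 3.5
```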

 Question: Let Z be a variable with a binomial(10, ½) distribution.
 What is E[Z]?

 For any two random variables X and Y defined over the same sample space,
 E(X+Y) = E(X) + E(Y).
 Consequently, for a sequence of random variables X_1, X_2, …, X_n,
 E(X_1 + X_2 + … + X_n) = E(X_1) + E(X_2) + … + E(X_n).
 Therefore the mean of Bin(10, 1/2), which is a sum of 10 independent Bernoulli(1/2) variables each with mean 1/2, is 10 · (1/2) = 5.
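The same mean can be checked directly from the binomial pmf, without the addition rule (a Python sketch):

```python
from math import comb

# Mean of Bin(10, 1/2) computed directly: E[Z] = sum of k * P(Z = k).
n, p = 10, 0.5
mean = sum(k * comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1))
print(mean)  # 5.0
```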

 Multiplication rule: E[aX] = a E[X].
 E[aX] = Σ_x a x P[X = x] = a Σ_x x P[X = x] = a E[X].
 If X ≥ Y then E[X] ≥ E[Y].
 This follows since X − Y is non-negative and
 E[X] − E[Y] = E[X − Y] ≥ 0.
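Both rules are easy to verify on a small example (a Python sketch; the fair die and the factor a = 3 are illustrative choices):

```python
# Check E[aX] = a * E[X] on a fair die with a = 3.
faces = [1, 2, 3, 4, 5, 6]
p = 1 / 6
a = 3
E_X = sum(x * p for x in faces)
E_aX = sum(a * x * p for x in faces)
print(E_aX, a * E_X)  # both 10.5
```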

 Let T be the sum of two dice. What’s E(T)?
 The “easy” way:
 E(T) = Σ_t t P(T=t).
 This sum will have 11 terms.
 We could also find the center of mass of the histogram (easy to do by symmetry).
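The “easy” way can be carried out by enumerating all 36 equally likely outcomes (a Python sketch):

```python
from collections import Counter

# Build P(T = t) for the sum T of two fair dice, then E[T] = sum of t * P(T = t).
counts = Counter(d1 + d2 for d1 in range(1, 7) for d2 in range(1, 7))
E_T = sum(t * c / 36 for t, c in counts.items())
print(len(counts), E_T)  # 11 terms, E[T] = 7.0
```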

 Or we can use the addition rule:
 T = X_1 + X_2, where X_1 = 1st roll, X_2 = 2nd roll:
 E(T) = E(X_1) + E(X_2) = 3.5 + 3.5 = 7.

 Let X_i = indicator of success on the i-th coin toss (X_i = 1 if the i-th toss is heads, and X_i = 0 otherwise).
 The sequence X_1, X_2, …, X_n is a sequence of n independent variables with Bernoulli(p) distribution over {0,1}.
 The number of heads in n coin tosses is given by S_n = X_1 + X_2 + … + X_n.
 E(S_n) = n E(X_i) = np.
 Thus the mean of a Bin(n,p) RV is np.
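The indicator argument can be checked by brute force for small n: enumerate every toss sequence, weight it by its probability, and average the head counts (a Python sketch; n = 10 and p = 0.3 are illustrative choices):

```python
from itertools import product

# Enumerate all 2^n toss sequences; weight each by its probability and
# average the number of heads. The result should equal n * p.
n, p = 10, 0.3
E_Sn = 0.0
for tosses in product([0, 1], repeat=n):  # 0 = tails, 1 = heads
    heads = sum(tosses)
    E_Sn += heads * p**heads * (1 - p)**(n - heads)
print(E_Sn, n * p)  # both 3.0, up to floating-point rounding
```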

 Let Y be the number of aces in a poker hand.
 Then:
 Y = I_1st ace + I_2nd ace + I_3rd ace + I_4th ace + I_5th ace.
 And: E(Y) = 5 · P(ace) = 5 · 4/52 ≈ 0.385.
 Alternatively, since Y has the hypergeometric distribution we can calculate:
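The hypergeometric calculation spelled out numerically (a Python sketch): Y counts aces among 5 cards drawn from 52, 4 of which are aces, so P(Y=k) = C(4,k)·C(48,5−k)/C(52,5).

```python
from math import comb

# Hypergeometric mean: expected number of aces in a 5-card hand.
E_Y = sum(k * comb(4, k) * comb(48, 5 - k) / comb(52, 5) for k in range(5))
print(round(E_Y, 3))  # 0.385, matching 5 * 4/52
```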

 The equality
 X(outcome) = Σ_i I_i(outcome)
 follows since if X(outcome) = i, then
 outcome ∈ A_1 ∩ A_2 ∩ … ∩ A_i but not in A_j for j > i.
 So (I_1 + I_2 + … + I_i + I_{i+1} + …)(outcome) = 1+1+…+1+0+0+… = i.
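This tail-sum identity is easy to verify directly, assuming (consistent with the argument above) that I_i is the indicator of A_i = {X ≥ i} (a Python sketch):

```python
# Tail-sum identity: a non-negative integer x equals the number of levels i
# with x >= i, i.e. x = I_1 + I_2 + ... where I_i = 1 exactly when x >= i.
def tail_sum(x, levels=100):
    return sum(1 for i in range(1, levels + 1) if x >= i)

for x in range(10):
    assert tail_sum(x) == x
print("identity holds for x = 0..9")
```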

 Monomials:
 g(X) = X^k ⇒ E[g(X)] = Σ_x x^k P(X=x).
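For instance, the second moment (k = 2) of a fair die (a Python sketch):

```python
# Second moment of a fair die: E[X^2] = sum of x^2 * P(X = x).
faces = [1, 2, 3, 4, 5, 6]
E_X2 = sum(x**2 * (1 / 6) for x in faces)
print(E_X2)  # 91/6, about 15.167 -- note this is not E[X]^2 = 12.25
```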

 If X and Y are two random variables we obtain:
 E(g(X,Y)) = Σ_{all (x,y)} g(x,y) P(X=x, Y=y).
 This allows us to prove that E[X+Y] = E[X] + E[Y]:
 E(X) = Σ_{all (x,y)} x P(X=x, Y=y);
 E(Y) = Σ_{all (x,y)} y P(X=x, Y=y);
 E(X+Y) = Σ_{all (x,y)} (x+y) P(X=x, Y=y);
 so E(X+Y) = E(X) + E(Y).

 E(g(X,Y)) = Σ_{all (x,y)} g(x,y) P(X=x, Y=y).
 Product:
 E(XY) = Σ_{all (x,y)} xy P(X=x, Y=y);
 E(XY) = Σ_x Σ_y xy P(X=x, Y=y);
 Is E(XY) = E(X)E(Y)?
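Not in general. A small counterexample with a fully dependent pair, taking Y = X for a fair die (a Python sketch):

```python
# Dependent case: Y = X for a fair die, so E[XY] = E[X^2].
faces = [1, 2, 3, 4, 5, 6]
p = 1 / 6
E_X = sum(x * p for x in faces)        # 3.5
E_XY = sum(x * x * p for x in faces)   # E[X^2] = 91/6
print(E_XY, E_X * E_X)  # about 15.167 vs 12.25 -- not equal
```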

 However, if X and Y are independent, i.e.
 P(X=x, Y=y) = P(X=x) P(Y=y),
 then the product formula simplifies:
 E(XY) = Σ_x Σ_y xy P(X=x) P(Y=y)
 = (Σ_x x P(X=x)) (Σ_y y P(Y=y)) =
 E(X) E(Y).
 If X and Y are independent then:
 E(XY) = E(X) E(Y).
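The independent case checks out on two fair dice (a Python sketch):

```python
# Two independent fair dice: E[XY] should equal E[X] * E[Y].
p = 1 / 36
E_XY = sum(x * y * p for x in range(1, 7) for y in range(1, 7))
E_X = sum(x / 6 for x in range(1, 7))
print(E_XY, E_X * E_X)  # both 12.25
```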

 If we repeatedly sample from the distribution of X then P(X=x) will be close to the observed frequency of x in the sample.
 E(X) will be approximately the long-run average of the sample.
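A simulation makes the long-run average concrete (a Python sketch; the sample size and seed are arbitrary choices):

```python
import random

# The average of many fair-die rolls should be close to E[X] = 3.5.
random.seed(0)
rolls = [random.randint(1, 6) for _ in range(100_000)]
print(sum(rolls) / len(rolls))  # close to 3.5
```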

 The Mode of X is the most likely possible value of X.
 The mode need not be unique.
 The Median of X is a number m such that both P(X ≤ m) ≥ ½ and P(X ≥ m) ≥ ½.
 The median may also not be unique.
 The mean and median are not necessarily possible values of X (the mode is).
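Both definitions translate directly into code; the pmf below is a hypothetical skewed example chosen so the median is not unique (a Python sketch using exact arithmetic to avoid floating-point trouble at the ½ boundary):

```python
from fractions import Fraction as F

# A hypothetical skewed pmf over {1, 2, 3, 4}.
pmf = {1: F(1, 2), 2: F(1, 4), 3: F(3, 20), 4: F(1, 10)}

mode = max(pmf, key=pmf.get)  # most likely value

def is_median(m):
    # m is a median iff P(X <= m) >= 1/2 and P(X >= m) >= 1/2.
    left = sum(p for x, p in pmf.items() if x <= m)
    right = sum(p for x, p in pmf.items() if x >= m)
    return left >= F(1, 2) and right >= F(1, 2)

medians = [m for m in pmf if is_median(m)]
print(mode, medians)  # 1 [1, 2] -- the median need not be unique
```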

 For a symmetrical distribution with a unique mode, all three (mean, mode, and median) are the same.

 For a distribution with a long right tail, the mean is greater than the median.

 Suppose we want to bet $1 on Red. Our chance of winning is 18/38.
 Question:
 What should the payoff be to make it a fair bet?

 This question really only makes sense if we repeatedly bet $1 on Red.
 Suppose that we would win $x if Red comes up and lose $1 otherwise. If X denotes our return, then P(X=x) = 18/38; P(X=−1) = 20/38.
 In a fair game, we expect to break even on average.

 Our expected return is:
 x · 18/38 − 1 · 20/38.
 Setting this to zero gives us
 x = 20/18 = 1.1111111… .
 This is greater than the payoff of 1:1 that is offered by the casino.
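The arithmetic, for the record (a Python sketch):

```python
# Fair-bet payoff for Red in roulette: solve x * 18/38 - 1 * 20/38 = 0.
p_win, p_lose = 18 / 38, 20 / 38
x = p_lose / p_win
print(x)  # about 1.111 -- more than the 1:1 the casino pays
```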
