STAT8310 Topic5 1 2021
Topic 5.1
Generating functions
$$\frac{d^k}{dz^k} \sum_{j=0}^{\infty} a_j z^j = k(k-1)\cdots 1\, a_k + \sum_{j=k+1}^{\infty} j(j-1)\cdots(j-k+1)\, a_j z^{j-k}.$$
• Thus, setting $z = 0$ kills every term in the second sum, leaving
$$\left.\frac{d^k}{dz^k} \sum_{j=0}^{\infty} a_j z^j \right|_{z=0} = k!\, a_k. \qquad \Box$$
• We shall assume from now on that X is non-negative.
• From above, we can find $f_X(x)$ from the formula
$$f_X(k) = \frac{1}{k!} \left. \frac{d^k}{dz^k} G_X(z) \right|_{z=0}.$$
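As a concrete check of this formula, a pgf that is a polynomial can be expanded into its power series, whose coefficient of $z^k$ is exactly $G_X^{(k)}(0)/k!$ and should therefore reproduce the pmf. A minimal sketch, using the binomial pgf $(1-p+pz)^n$ (the $n$th power of the Bernoulli pgf below) and exact rational arithmetic:

```python
from fractions import Fraction
from math import comb

def poly_mul(a, b):
    """Multiply two polynomials given as coefficient lists (index = power of z)."""
    out = [Fraction(0)] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] += ai * bj
    return out

def binomial_pgf_coeffs(n, p):
    """Expand G_X(z) = (1 - p + p z)^n into power-series coefficients.
    Coefficient k equals G^(k)(0)/k!, which should be f_X(k)."""
    coeffs = [Fraction(1)]
    factor = [1 - p, p]          # the Bernoulli pgf 1 - p + p z
    for _ in range(n):
        coeffs = poly_mul(coeffs, factor)
    return coeffs

n, p = 5, Fraction(1, 3)
coeffs = binomial_pgf_coeffs(n, p)
for k in range(n + 1):
    pmf = comb(n, k) * p**k * (1 - p)**(n - k)
    assert coeffs[k] == pmf      # f_X(k) = G^(k)(0) / k!
```

Exact fractions avoid any floating-point doubt about whether the coefficients really equal the pmf.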
$$G_X(z) = (1-p) z^0 + p z^1 = 1 - p + pz.$$
where
$$\binom{r}{i} = \frac{r(r-1)\cdots(r-i+1)}{i!} \quad \text{for } r \text{ any real number.}$$
Gives
$$p^{-k} = \sum_{y=0}^{\infty} \binom{-k}{y} (-1)^y (1-p)^y.$$
$$= p^r z^r \{1 - (1-p)z\}^{-r} = \left\{ \frac{pz}{1 - (1-p)z} \right\}^r.$$
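The series expansion of this pgf can be checked numerically: expanding $\{pz/(1-(1-p)z)\}^r$ with the binomial series should give the negative binomial pmf as the coefficient of $z^x$. A sketch, assuming the "number of trials to the $r$th success" parameterisation:

```python
from fractions import Fraction
from math import comb

def negbin_pgf_coeff(r, p, x):
    """Coefficient of z^x in G_X(z) = {pz / (1 - (1-p)z)}^r, via the
    binomial series (1 - qz)^{-r} = sum_y C(r+y-1, y) q^y z^y."""
    q = 1 - p
    if x < r:
        return Fraction(0)
    y = x - r
    return comb(r + y - 1, y) * p**r * q**y

r, p = 3, Fraction(1, 2)
# coefficients should match the "trials to the r-th success" pmf
for x in range(r, 12):
    pmf = comb(x - 1, r - 1) * p**r * (1 - p)**(x - r)
    assert negbin_pgf_coeff(r, p, x) == pmf

# coefficients sum to G_X(1) = 1 (up to a tiny truncated tail)
total = sum(negbin_pgf_coeff(r, p, x) for x in range(0, 200))
assert abs(float(total) - 1) < 1e-12
```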
• Note that the pgf in 5 is the rth power of the pgf in 4, and that
the pgf in 2 is the nth power of the pgf in 1. This is not by
accident!
• Exercise: Now find P (X = x) from the above pgfs.
• The function $G(z)$ is differentiable for $|z| < 1$ and its derivative is
$$G'(z) = \sum_x x f(x) z^{x-1} < \infty.$$
• Thus
$$G'_X(1) = \sum_x x f_X(x) = E(X).$$
• Similarly,
$$G''_X(z) = \sum_x x(x-1) f_X(x) z^{x-2}.$$
• Hence
\begin{align*}
\operatorname{var} X &= E(X^2) - \{E(X)\}^2 \\
&= G''_X(1) + E(X) - \{E(X)\}^2 \\
&= G''_X(1) + G'_X(1) - \{G'_X(1)\}^2.
\end{align*}
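This variance identity can be verified by computing the factorial-moment sums $G'_X(1) = \sum_x x f_X(x)$ and $G''_X(1) = \sum_x x(x-1) f_X(x)$ directly from a pmf. A sketch using a $\text{Bin}(n,p)$ example (its mean $np$ and variance $np(1-p)$ are standard results assumed here, not derived in this section):

```python
from math import comb, isclose

# Binomial(n, p) pmf over its whole support
n, p = 10, 0.3
pmf = [comb(n, x) * p**x * (1 - p)**(n - x) for x in range(n + 1)]

g1 = sum(x * f for x, f in enumerate(pmf))            # G'(1)  = E X
g2 = sum(x * (x - 1) * f for x, f in enumerate(pmf))  # G''(1) = E X(X-1)

mean = g1
var = g2 + g1 - g1**2                                 # the identity above
assert isclose(mean, n * p)
assert isclose(var, n * p * (1 - p))
```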
and
\begin{align*}
E(X - \mu_X)^3 &= G'''_X(1) + 3G''_X(1) + G'_X(1) - 3G'_X(1) \left\{ G''_X(1) + G'_X(1) \right\} + 3 \{G'_X(1)\}^3 - \{G'_X(1)\}^3 \\
&= G'''_X(1) + 3G''_X(1) + G'_X(1) - 3G'_X(1) G''_X(1) - 3 \{G'_X(1)\}^2 + 2 \{G'_X(1)\}^3.
\end{align*}
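The third-central-moment identity can be checked the same way, by adding the sum for $G'''_X(1) = \sum_x x(x-1)(x-2) f_X(x)$. A sketch, again using a $\text{Bin}(n,p)$ example (its third central moment $np(1-p)(1-2p)$ is a standard result assumed here):

```python
from math import comb, isclose

n, p = 8, 0.25
pmf = [comb(n, x) * p**x * (1 - p)**(n - x) for x in range(n + 1)]

g1 = sum(x * f for x, f in enumerate(pmf))                      # G'(1)
g2 = sum(x * (x - 1) * f for x, f in enumerate(pmf))            # G''(1)
g3 = sum(x * (x - 1) * (x - 2) * f for x, f in enumerate(pmf))  # G'''(1)

# first line of the identity above
mu3 = g3 + 3 * g2 + g1 - 3 * g1 * (g2 + g1) + 2 * g1**3
# direct computation of E(X - mu)^3 from the pmf
direct = sum((x - g1)**3 * f for x, f in enumerate(pmf))
assert isclose(mu3, direct)
assert isclose(mu3, n * p * (1 - p) * (1 - 2 * p))
```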
Let ξ and η be independent rvs such that ξ takes the values 0, 1 and
2 with probability 1/3 each, and η takes the values 0 and 1 with
probabilities 1/3 and 2/3 respectively. Define X = ξ and
Y = (ξ + η)(mod 3).
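Enumerating the joint distribution shows that $Y$ is also uniform on $\{0,1,2\}$, so $X$ and $Y$ have the same distribution (and hence the same pgf) even though they are not independent. A quick exact check:

```python
from fractions import Fraction
from collections import Counter

# the two marginal distributions
p_xi = {0: Fraction(1, 3), 1: Fraction(1, 3), 2: Fraction(1, 3)}
p_eta = {0: Fraction(1, 3), 1: Fraction(2, 3)}

# tabulate Y = (xi + eta) mod 3; by independence the joint is the product
p_Y = Counter()
for a, pa in p_xi.items():
    for b, pb in p_eta.items():
        p_Y[(a + b) % 3] += pa * pb

# Y is uniform on {0, 1, 2}, the same distribution as X = xi
assert all(p_Y[y] == Fraction(1, 3) for y in (0, 1, 2))
```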
Therefore the moments $\mu'_k = E(X^k)$ can be obtained either by
1. finding the coefficient of $t^k/k!$ in the series expansion of $M_X(t)$ about $t = 0$, or
2. differentiating $M_X(t)$ $k$ times and setting $t = 0$.
The mgf is related to the pgf, since for a discrete random variable $X$,
$$M_X(t) = G_X(e^t).$$
Thus
$$\mu'_k = \frac{1}{k+1} \quad \text{for } k = 0, 1, 2, 3, \ldots$$
and
$$\sigma^2 = \mu'_2 - \mu^2 = \frac{1}{3} - \frac{1}{4} = \frac{1}{12},$$
etc.
• Computation of the moments is much easier directly!
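The first route (reading off coefficients) can still be illustrated here. Assuming this is the $\text{Uniform}(0,1)$ case, whose mgf is $M_X(t) = (e^t - 1)/t$, the coefficient of $t^k$ in the series is $1/(k+1)!$, so $\mu'_k = k!/(k+1)! = 1/(k+1)$:

```python
from fractions import Fraction
from math import factorial

# (e^t - 1)/t = sum_k t^k / (k+1)!; multiplying the coefficient by k! gives mu'_k
for k in range(6):
    coeff = Fraction(1, factorial(k + 1))   # series coefficient of t^k
    mu_k = factorial(k) * coeff             # mu'_k = k! * coefficient
    assert mu_k == Fraction(1, k + 1)

# variance: mu'_2 - (mu'_1)^2 = 1/3 - 1/4 = 1/12
assert Fraction(1, 3) - Fraction(1, 4) == Fraction(1, 12)
```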
The mgf of $X$ is
\begin{align*}
M_X(t) &= E\left(e^{tX}\right) \\
&= \int_{-\infty}^{\infty} e^{tx} \frac{1}{\sqrt{2\pi\sigma^2}} \exp\left\{ -\frac{1}{2} \left( \frac{x-\mu}{\sigma} \right)^2 \right\} dx \\
&= \frac{1}{\sqrt{2\pi\sigma^2}} \int_{-\infty}^{\infty} \exp\left\{ -\frac{1}{2\sigma^2} \left( x^2 - 2\mu x + \mu^2 - 2\sigma^2 t x \right) \right\} dx
\end{align*}
and
\begin{align*}
E\left(X^2\right) &= M''(0) \\
&= e^{0+0} \sigma^2 + (\mu + 0)^2 e^{0+0} \\
&= \sigma^2 + \mu^2.
\end{align*}
\begin{align*}
\operatorname{var} X &= \mu_2 \\
&= \sigma^2 + \mu^2 - \mu^2 \\
&= \sigma^2.
\end{align*}
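A numerical sanity check of $M''(0) = \sigma^2 + \mu^2$, using the closed-form normal mgf $M_X(t) = \exp(\mu t + \sigma^2 t^2/2)$ and a central second difference:

```python
from math import exp, isclose

def M(t, mu, sigma):
    """mgf of N(mu, sigma^2): exp(mu t + sigma^2 t^2 / 2)"""
    return exp(mu * t + sigma**2 * t**2 / 2)

mu, sigma, h = 1.5, 2.0, 1e-4
# central second difference approximates M''(0)
m2 = (M(h, mu, sigma) - 2 * M(0, mu, sigma) + M(-h, mu, sigma)) / h**2
assert isclose(m2, sigma**2 + mu**2, rel_tol=1e-6)   # E X^2 = sigma^2 + mu^2
```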
3. $X \sim \text{Poisson}(\lambda)$.
$$f_X(x) = \begin{cases} \dfrac{e^{-\lambda} \lambda^x}{x!} & x = 0, 1, 2, \ldots \\ 0 & \text{otherwise} \end{cases}$$
• Hence
$$E(X) = M'_X(0) = e^{\lambda(1-1)} \lambda e^0 = \lambda$$
and
$$E\left(X^2\right) = M''_X(0) = \lambda + \lambda^2,$$
so
$$\operatorname{var}(X) = \lambda + \lambda^2 - \lambda^2 = \lambda.$$
In summary,
$$E(X) = \lambda, \qquad E\left(X^2\right) = \lambda + \lambda^2, \qquad E\left(X^3\right) = \lambda^3 + 3\lambda^2 + \lambda.$$
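These three moments can be confirmed by direct summation against the Poisson pmf, truncated far into the tail:

```python
from math import exp, factorial, isclose

lam = 2.5
# Poisson pmf truncated at x = 59; the remaining tail mass is negligible
pmf = [exp(-lam) * lam**x / factorial(x) for x in range(60)]

m1 = sum(x * f for x, f in enumerate(pmf))       # E X
m2 = sum(x**2 * f for x, f in enumerate(pmf))    # E X^2
m3 = sum(x**3 * f for x, f in enumerate(pmf))    # E X^3

assert isclose(m1, lam)
assert isclose(m2, lam + lam**2)
assert isclose(m3, lam**3 + 3 * lam**2 + lam)
```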
Extension
Example
$$M_Y(t) = \prod_{i=1}^{n} M_{X_i}(t) = \prod_{i=1}^{n} M_X(t) = \{M_X(t)\}^n.$$
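As an illustration of $\{M_X(t)\}^n$, the product of $n$ identical $\text{Bernoulli}(p)$ mgfs should equal the $\text{Bin}(n,p)$ mgf $(1-p+pe^t)^n$, the mgf of the sum of $n$ iid Bernoulli trials:

```python
from math import exp, isclose

def mgf_bernoulli(t, p):
    """mgf of a single Bernoulli(p) trial: 1 - p + p e^t"""
    return 1 - p + p * exp(t)

def mgf_binomial(t, n, p):
    """mgf of Bin(n, p): (1 - p + p e^t)^n"""
    return (1 - p + p * exp(t))**n

n, p, t = 7, 0.4, 0.3
prod = 1.0
for _ in range(n):                 # product of n identical mgfs
    prod *= mgf_bernoulli(t, p)
assert isclose(prod, mgf_binomial(t, n, p))
```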
\begin{align*}
&= E\left( \prod_{i=1}^{n} e^{t a_i X_i} \right) \\
&= \prod_{i=1}^{n} E\left( e^{t a_i X_i} \right) \quad \text{(by independence of the } X_i\text{'s)} \\
&= \prod_{i=1}^{n} M_{X_i}(a_i t) \\
&= \prod_{i=1}^{n} M_X(a_i t) \quad \text{if the } X_i\text{'s are iid.}
\end{align*}
and so
$$M_{\bar{X}}(t) = \prod_{i=1}^{n} M_X\!\left(\frac{t}{n}\right) = \left\{ M_X\!\left(\frac{t}{n}\right) \right\}^n.$$
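As a check of this sample-mean mgf, take $X \sim N(\mu, \sigma^2)$ (an assumed illustrative case): $\{M_X(t/n)\}^n$ should equal the mgf of $N(\mu, \sigma^2/n)$, the known distribution of the mean of $n$ iid normals:

```python
from math import exp, isclose

def M_normal(t, mu, sigma2):
    """mgf of N(mu, sigma2): exp(mu t + sigma2 t^2 / 2)"""
    return exp(mu * t + sigma2 * t**2 / 2)

mu, sigma2, n, t = 1.0, 4.0, 10, 0.7
lhs = M_normal(t / n, mu, sigma2)**n     # {M_X(t/n)}^n
rhs = M_normal(t, mu, sigma2 / n)        # mgf of N(mu, sigma2/n)
assert isclose(lhs, rhs)
```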
Further examples
1. The mgf of a rv $X \sim \text{Bin}(n, p)$ is $(1 - p + pe^t)^n$ (obtained by putting $z = e^t$ in its pgf).
• Suppose X1 ∼ Bin(n1 , p) , independently of X2 ∼ Bin (n2 , p) .
$$= e^{\lambda(e^t - 1)},$$
where $\lambda = \sum_{i=1}^{n} \lambda_i$.
• Thus Y ∼ Poisson(λ) by application of the uniqueness
theorem.
Question: Is X Poisson?
Answer: No, since X can take non-integer values.
• The cgf of the sum of two independent rvs is thus the sum of their cgfs.
• Let $Y = \sum_{i=1}^{n} X_i$, where the $X_i$'s are i.i.d.
• Then
$$\kappa_0 = 0, \qquad \kappa_1 = \mu'_1 = \mu, \qquad \kappa_2 = \mu'_2 - (\mu'_1)^2 = \sigma^2.$$
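The relations $\kappa_1 = \mu$ and $\kappa_2 = \sigma^2$ can be checked numerically by differentiating the cgf $K(t) = \log M_X(t)$ at $t = 0$; a sketch using a $\text{Poisson}(\lambda)$ example, for which every cumulant equals $\lambda$:

```python
from math import log, exp, isclose

lam = 3.0

def M(t):
    """mgf of Poisson(lam): exp(lam (e^t - 1))"""
    return exp(lam * (exp(t) - 1))

def K(t):
    """cgf = log of the mgf"""
    return log(M(t))

h = 1e-5
k1 = (K(h) - K(-h)) / (2 * h)           # K'(0)  = kappa_1 = E X
k2 = (K(h) - 2 * K(0) + K(-h)) / h**2   # K''(0) = kappa_2 = var X
assert isclose(k1, lam, rel_tol=1e-6)
assert isclose(k2, lam, rel_tol=1e-4)
```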
• Thus the only cumulants of Y and X which are different are the
first cumulants.