Chapter 6
By definition, $\Gamma(\alpha) = \int_0^\infty y^{\alpha-1} e^{-y}\,dy$, for $\alpha > 0$. Also, $\Gamma(1) = \int_0^\infty e^{-y}\,dy = 1$.
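As a quick numerical illustration (added here, not part of the notes), the defining integral can be checked against scipy's gamma function for a few assumed values of $\alpha$:

# Illustrative check with assumed values of alpha: the defining integral
# Gamma(alpha) = int_0^inf y^(alpha-1) e^(-y) dy agrees with scipy.special.gamma.
import numpy as np
from scipy.integrate import quad
from scipy.special import gamma

for alpha in (1.0, 2.5, 5.0):
    integral, _ = quad(lambda y: y**(alpha - 1) * np.exp(-y), 0, np.inf)
    print(alpha, integral, gamma(alpha))   # the two values match closely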
If $X$, the number of successes, has a Poisson distribution with parameter $\lambda = \alpha t$, then $Y$, the waiting time until the first success, has an exponential density function $f(y) = \alpha e^{-\alpha y}$, for $y > 0$.
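A minimal sketch of this connection, with an assumed rate $\alpha$ and time $t$: the probability that the waiting time exceeds $t$ equals the Poisson probability of zero successes in $[0, t]$.

# Sketch with assumed alpha and t: P(Y > t) under the exponential density
# equals the Poisson probability of zero successes in [0, t].
from scipy.stats import expon, poisson

alpha, t = 2.0, 1.5                      # assumed values for illustration
p_wait = expon(scale=1/alpha).sf(t)      # P(Y > t) = e^(-alpha*t)
p_zero = poisson(alpha * t).pmf(0)       # P(X = 0) with lambda = alpha*t
print(p_wait, p_zero)                    # both equal e^(-3), about 0.0498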
For the gamma distribution, the $r$th moment about the origin is $\mu'_r = \beta^r\,\Gamma(\alpha + r)/\Gamma(\alpha)$.
Proof:
$\mu'_r = E(X^r) = \int_0^\infty x^r f(x)\,dx = \int_0^\infty \dfrac{x^{\alpha+r-1} e^{-x/\beta}}{\beta^\alpha\,\Gamma(\alpha)}\,dx = \dfrac{\beta^{\alpha+r}\,\Gamma(\alpha+r)}{\beta^\alpha\,\Gamma(\alpha)} = \beta^r\,\Gamma(\alpha+r)/\Gamma(\alpha)$
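A hedged numerical check of this formula, for an assumed $\alpha$ and $\beta$, using scipy's gamma distribution (parameterized with shape $a = \alpha$ and scale $= \beta$):

# Check of mu'_r = beta^r * Gamma(alpha + r) / Gamma(alpha) for assumed alpha, beta, r.
from scipy.stats import gamma as gamma_dist
from scipy.special import gamma as gamma_fn

alpha, beta, r = 3.0, 2.0, 2             # assumed parameters for illustration
rv = gamma_dist(a=alpha, scale=beta)
print(rv.moment(r))                                     # E(X^r) from scipy: 48.0
print(beta**r * gamma_fn(alpha + r) / gamma_fn(alpha))  # formula: 2^2 * 4!/2! = 48.0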
Proof: $M_X(t) = E(e^{tX}) = \int_0^\infty \dfrac{x^{\alpha-1} e^{-x/\beta + tx}}{\beta^\alpha\,\Gamma(\alpha)}\,dx = \int_0^\infty \dfrac{x^{\alpha-1} e^{-x(1-\beta t)/\beta}}{\beta^\alpha\,\Gamma(\alpha)}\,dx = \dfrac{\Gamma(\alpha)\,[\beta/(1-\beta t)]^\alpha}{\beta^\alpha\,\Gamma(\alpha)} = (1-\beta t)^{-\alpha}$, for $t < 1/\beta$.
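A small sketch, with assumed $\alpha$, $\beta$, and $t < 1/\beta$, comparing the closed form $(1-\beta t)^{-\alpha}$ against a direct numerical evaluation of $E(e^{tX})$:

# Numerical check of M_X(t) = (1 - beta*t)^(-alpha) for assumed alpha, beta, t.
import numpy as np
from scipy.integrate import quad
from scipy.stats import gamma as gamma_dist

alpha, beta, t = 2.0, 1.5, 0.3           # assumed values; requires t < 1/beta
pdf = gamma_dist(a=alpha, scale=beta).pdf
mgf_numeric, _ = quad(lambda x: np.exp(t * x) * pdf(x), 0, np.inf)
mgf_formula = (1 - beta * t) ** (-alpha)
print(mgf_numeric, mgf_formula)          # both approximately 3.3058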
A random variable $X$ has a beta distribution with parameters $\alpha > 0$ and $\beta > 0$ if its density is $f(x) = \dfrac{\Gamma(\alpha+\beta)}{\Gamma(\alpha)\,\Gamma(\beta)}\, x^{\alpha-1}(1-x)^{\beta-1}$, for $0 < x < 1$.
Since $\int_0^1 f(x)\,dx = 1$, we have that $\int_0^1 x^{\alpha-1}(1-x)^{\beta-1}\,dx = \dfrac{\Gamma(\alpha)\,\Gamma(\beta)}{\Gamma(\alpha+\beta)} = B(\alpha, \beta)$, the beta integral.
The mean and variance of the beta distribution are given by $\mu = \dfrac{\alpha}{\alpha+\beta}$ and $\sigma^2 = \dfrac{\alpha\beta}{(\alpha+\beta)^2(\alpha+\beta+1)}$.
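An illustrative check, for assumed $\alpha$ and $\beta$, of the beta integral and of the mean and variance formulas against scipy:

# Check of B(alpha, beta), the mean, and the variance for assumed alpha, beta.
from scipy.special import beta as beta_fn, gamma as gamma_fn
from scipy.stats import beta as beta_dist

alpha, beta = 2.0, 3.0                   # assumed parameters for illustration
print(beta_fn(alpha, beta),
      gamma_fn(alpha) * gamma_fn(beta) / gamma_fn(alpha + beta))         # both 1/12
rv = beta_dist(alpha, beta)
print(rv.mean(), alpha / (alpha + beta))                                  # both 0.4
print(rv.var(), alpha * beta / ((alpha + beta)**2 * (alpha + beta + 1)))  # both 0.04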
By using the substitution $y = 1 - x$ with $dy = -dx$ [for $x = 0$, $y = 1$ and for $x = 0.1$, $y = 0.9$]. Answer = 0.2639.
Proof for mgf: Note that $1 = \int_{-\infty}^{\infty} f(x)\,dx = \int_{-\infty}^{\infty} \dfrac{1}{\sqrt{2\pi}\,\sigma}\, e^{-\frac{1}{2}[(x-\mu)/\sigma]^2}\,dx$.
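The proof then proceeds (in a sketch added here, not verbatim from the notes) by completing the square in the exponent of $e^{tx} f(x)$:

$M_X(t) = \int_{-\infty}^{\infty} e^{tx}\,\dfrac{1}{\sqrt{2\pi}\,\sigma}\, e^{-\frac{1}{2}[(x-\mu)/\sigma]^2}\,dx = e^{\mu t + \sigma^2 t^2/2} \int_{-\infty}^{\infty} \dfrac{1}{\sqrt{2\pi}\,\sigma}\, e^{-\frac{1}{2}[(x-\mu-\sigma^2 t)/\sigma]^2}\,dx = e^{\mu t + \sigma^2 t^2/2}$,

since the remaining integrand is a normal density with mean $\mu + \sigma^2 t$ and standard deviation $\sigma$, so its integral equals 1 (this is where the identity above is used). Setting $\mu = 0$ and $\sigma = 1$ gives the standard normal mgf stated below.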
Definition: The normal distribution with $\mu = 0$ and $\sigma = 1$ is called the standard normal distribution. Its density is given by $f(z) = \dfrac{1}{\sqrt{2\pi}}\, e^{-\frac{1}{2}z^2}$, for $-\infty < z < \infty$, where $Z = (X - \mu)/\sigma$.
The moment generating function of the standard normal distribution is $M_Z(t) = e^{t^2/2}$.
To find $P(0 < Z < 1)$: $P(0 \le Z \le 1) = \int_0^1 f(z)\,dz = \int_0^1 \dfrac{1}{\sqrt{2\pi}}\, e^{-\frac{1}{2}z^2}\,dz$. This integral cannot be obtained directly. We use the normal probability table. [See Table III on page 500.] The tabulated area is that of $\int_0^z \dfrac{1}{\sqrt{2\pi}}\, e^{-\frac{1}{2}x^2}\,dx$.
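As an aside (not part of the table exercise), the same area can be obtained numerically, for example with scipy's standard normal cdf:

# P(0 < Z < 1) for the standard normal, computed instead of read from Table III.
from scipy.stats import norm

print(norm.cdf(1) - norm.cdf(0))   # about 0.3413, matching the tabulated area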
[Figure: the standard normal density $f(z)$ plotted for $z$ from $-3$ to $3$.]
$P(a \le X \le b) = P(a < X \le b) = P(a \le X < b) = P(a < X < b)$ because $X$ is continuous.
Because of symmetry, $P(Z > 0) = P(Z < 0) = 0.5$ and $P(Z < -a) = P(Z > a)$.
Suppose we want to find a probability for any normal random variable $X$ with mean $\mu$ and standard deviation $\sigma$:
• Standardize the random variable $X$ to obtain $Z = \dfrac{X - \mu}{\sigma}$.
• Follow the previous method to find the required probability (see the sketch below).
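A minimal sketch of these two steps, with an assumed mean, standard deviation, and interval:

# Standardize X ~ N(mu, sigma), then use the standard normal cdf; assumed values.
from scipy.stats import norm

mu, sigma = 100.0, 15.0                  # assumed mean and standard deviation
a, b = 90.0, 120.0                       # assumed interval
z_a, z_b = (a - mu) / sigma, (b - mu) / sigma
print(norm.cdf(z_b) - norm.cdf(z_a))     # P(a < X < b), about 0.6563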
$f(x, y) = \dfrac{1}{2\pi\,\sigma_1\sigma_2\sqrt{1-\rho^2}}\, \exp\!\left\{ -\dfrac{1}{2(1-\rho^2)}\left[ \left(\dfrac{x-\mu_1}{\sigma_1}\right)^2 - 2\rho\left(\dfrac{x-\mu_1}{\sigma_1}\right)\!\left(\dfrac{y-\mu_2}{\sigma_2}\right) + \left(\dfrac{y-\mu_2}{\sigma_2}\right)^2 \right] \right\}$, for $-\infty < x, y < \infty$, where $\sigma_1, \sigma_2 > 0$, $-\infty < \mu_1, \mu_2 < \infty$, and $-1 < \rho < 1$.
The parameters $\mu_1$, $\mu_2$, $\sigma_1$, and $\sigma_2$ are, respectively, the means and the standard deviations of the two random variables $X$ and $Y$. The marginal density of $X$ is the univariate normal density with parameters $\mu_1$ and $\sigma_1$; this can be obtained by integrating out $y$. The parameter $\rho$ is the correlation coefficient between $X$ and $Y$.
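A hedged check of this density, with assumed parameter values, against scipy's multivariate normal (whose covariance matrix has entries $\sigma_1^2$, $\rho\sigma_1\sigma_2$, and $\sigma_2^2$):

# Evaluate the bivariate normal density formula and compare with scipy; assumed parameters.
import numpy as np
from scipy.stats import multivariate_normal

mu1, mu2, s1, s2, rho = 1.0, -2.0, 2.0, 3.0, 0.5    # assumed parameter values
x, y = 2.0, 0.0                                     # an assumed evaluation point
u, v = (x - mu1) / s1, (y - mu2) / s2
formula = np.exp(-(u**2 - 2*rho*u*v + v**2) / (2*(1 - rho**2))) \
          / (2*np.pi*s1*s2*np.sqrt(1 - rho**2))
cov = [[s1**2, rho*s1*s2], [rho*s1*s2, s2**2]]
print(formula, multivariate_normal(mean=[mu1, mu2], cov=cov).pdf([x, y]))  # equal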
Theorem 6.9: If $X$ and $Y$ have a bivariate normal distribution, the conditional density of $Y$ given that $X = x$ is normal with mean $\mu_{Y|x} = \mu_2 + \rho\dfrac{\sigma_2}{\sigma_1}(x - \mu_1)$ and variance $\sigma^2_{Y|x} = \sigma_2^2(1 - \rho^2)$. Similarly, the conditional density of $X$ given $Y = y$ is normal with mean $\mu_{X|y} = \mu_1 + \rho\dfrac{\sigma_1}{\sigma_2}(y - \mu_2)$ and variance $\sigma^2_{X|y} = \sigma_1^2(1 - \rho^2)$.
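A Monte Carlo illustration of Theorem 6.9, under assumed parameters: sample a large bivariate normal, keep the pairs whose $x$-value falls in a narrow window around an assumed $x_0$, and compare the sample mean and variance of the retained $y$-values with the theorem's formulas.

# Monte Carlo check of the conditional mean and variance; all parameters assumed.
import numpy as np

rng = np.random.default_rng(0)
mu1, mu2, s1, s2, rho = 0.0, 5.0, 1.0, 2.0, 0.7     # assumed parameter values
x0 = 1.0                                            # assumed conditioning value
cov = [[s1**2, rho*s1*s2], [rho*s1*s2, s2**2]]
xy = rng.multivariate_normal([mu1, mu2], cov, size=2_000_000)
y_slice = xy[np.abs(xy[:, 0] - x0) < 0.01, 1]       # Y values with X near x0
print(y_slice.mean(), mu2 + rho*(s2/s1)*(x0 - mu1)) # both near 6.4
print(y_slice.var(),  s2**2*(1 - rho**2))           # both near 2.04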
Theorem 6.10: If two random variables have a bivariate normal distribution, they are independent if and only if $\rho = 0$.