Chapter 3

Mathematical Expectation

3.1 Mathematical Expectation


The mathematical expectation or mean of a random variable X gives
a single value which acts as a representative or average value of X. It is a
measure of central tendency.

Definition 1. Let X be a random variable with probability distribution f (x).


The expectation value or mean of X is
\mu = E(X) = \sum_{x} x f(x) = \sum_{k} x_k P(X = x_k)

if X is discrete, and
\mu = E(X) = \int_{-\infty}^{+\infty} x f(x)\,dx

if X is continuous.

Example 3.1: Find the expected number of chemical engineers on a committee


of size 3 selected at random from 4 chemical and 3 electrical engineers.
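The answer can be checked by brute force. The Python sketch below (names are illustrative, standard library only) enumerates all C(7, 3) = 35 equally likely committees and averages the number of chemical engineers, confirming E(X) = 3 · (4/7) = 12/7:

```python
# Enumerate every size-3 committee from 4 chemical (C) and 3 electrical (E)
# engineers and average the count of chemical engineers per committee.
from fractions import Fraction
from itertools import combinations

engineers = ["C"] * 4 + ["E"] * 3
committees = list(combinations(range(7), 3))   # index people so they stay distinct

counts = [sum(1 for i in c if engineers[i] == "C") for c in committees]
mu = Fraction(sum(counts), len(committees))
print(mu)   # 12/7
```

This is the hypergeometric mean n·k/N with n = 3, k = 4, N = 7.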

Example 3.2: Let X be the random variable that denotes the life in hours of a
certain electronic device. The probability density function is given by

f(x) = \begin{cases} \dfrac{20{,}000}{x^3}, & x > 100 \\ 0, & \text{elsewhere} \end{cases}

Find the expected life of this type of device.
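Analytically, E(X) = \int_{100}^{\infty} x \cdot 20{,}000/x^3\,dx = 20{,}000/100 = 200 hours. A numeric midpoint-rule check (a sketch, standard library only; the truncation point 10^6 is an arbitrary choice, beyond which the tail contributes only 0.02) agrees:

```python
# Midpoint-rule approximation of E(X) = integral of x * f(x) dx over (100, 1e6),
# where f(x) = 20000 / x**3 for x > 100 and 0 elsewhere.
def f(x):
    return 20000 / x**3 if x > 100 else 0.0

a, b, n = 100.0, 1e6, 500_000
h = (b - a) / n
expectation = sum((a + (k + 0.5) * h) * f(a + (k + 0.5) * h) * h
                  for k in range(n))
print(round(expectation, 2))   # ≈ 199.98; the exact expected life is 200 hours
```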

Class Notes on
3.1. MATHEMATICAL EXPECTATION Applied Probability and Statistics ECEG-342

Theorem 1. Let X be a random variable with probability function f (x). The


mean or expected value of the random variable g(X) is

\mu_{g(X)} = E[g(X)] = \sum_{x} g(x) f(x)

if X is discrete, and
\mu_{g(X)} = E[g(X)] = \int_{-\infty}^{+\infty} g(x) f(x)\,dx

if X is continuous.

Example 3.3: Suppose that the number of cars, X, that park in a particular area
between 10:00 A.M. and 12:00 A.M. on any Friday has the following probability
distribution:
x            4     5     6     7     8     9
P(X = x)    1/12  1/12  1/4   1/4   1/6   1/6

Let g(X) = \frac{1}{4}X - 0.1 represent the amount of money in Birr paid to the attendant
by the park manager. Find the attendant’s expected earnings for this particular
time period.
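The expected earnings can be computed exactly with rational arithmetic (a sketch; the dictionary layout is illustrative). With E(X) = 41/6, Theorem 2 below would give the same result directly as (1/4)(41/6) − 0.1:

```python
# E[g(X)] = sum over x of g(x) * P(X = x), with g(X) = X/4 - 0.1 Birr.
from fractions import Fraction

dist = {4: Fraction(1, 12), 5: Fraction(1, 12),
        6: Fraction(1, 4),  7: Fraction(1, 4),
        8: Fraction(1, 6),  9: Fraction(1, 6)}

g = lambda x: Fraction(x, 4) - Fraction(1, 10)
earnings = sum(g(x) * p for x, p in dist.items())
print(earnings, float(earnings))   # 193/120, about 1.61 Birr
```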

Definition 2. Let X and Y be random variables with joint probability density


f (x, y). The mean or expected value of the random variable g(X, Y ) is
\mu_{g(X,Y)} = E[g(X,Y)] = \sum_{x} \sum_{y} g(x,y) f(x,y)

if X and Y are discrete, and


\mu_{g(X,Y)} = E[g(X,Y)] = \int_{-\infty}^{+\infty} \int_{-\infty}^{+\infty} g(x,y) f(x,y)\,dx\,dy

if X and Y are continuous.

Some Theorems on Expectation


Theorem 2. If a and b are constants, then

E(aX + b) = aE(X) + b

Setting a = 0, we see that E(b) = b; setting b = 0, we have E(aX) = aE(X).

Murad Ridwan, 2 of 6
Dep. of Electrical & Computer Engineering
AiOT, Addis Ababa University.
Jul 2010.

Theorem 3. E(X ± Y ) = E(X) ± E(Y )

Theorem 4. E[g(X) ± h(X)] = E[g(X)] ± E[h(X)]

Theorem 5. E[g(X, Y ) ± h(X, Y )] = E[g(X, Y )] ± E[h(X, Y )]

Theorem 6. If X and Y are two independent random variables, then

E(XY ) = E(X)E(Y )
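Theorems 2 and 6 can be verified on a small concrete case (a sketch; two independent fair dice are assumed purely for illustration):

```python
# With X, Y independent fair dice: E(X) = E(Y) = 7/2, so E(XY) should be 49/4.
from fractions import Fraction
from itertools import product

p = Fraction(1, 6)                      # probability of each face
EX = sum(x * p for x in range(1, 7))    # 7/2
EXY = sum(x * y * p * p for x, y in product(range(1, 7), repeat=2))

assert EXY == EX * EX                                            # Theorem 6
assert sum((2 * x + 3) * p for x in range(1, 7)) == 2 * EX + 3   # Theorem 2
print(EX, EXY)   # 7/2 49/4
```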

3.2 Variance and Standard Deviation


Definition 3. Let X be a random variable with probability distribution f (x)
and mean µ. The variance of X is
\sigma^2 = E[(X - \mu)^2] = \sum_{x} (x - \mu)^2 f(x)

if X is discrete, and
\sigma^2 = E[(X - \mu)^2] = \int_{-\infty}^{+\infty} (x - \mu)^2 f(x)\,dx

if X is continuous. The positive square root of the variance, σ, is called the


standard deviation of X.

The variance is sometimes represented as Var(X) or \sigma^2_X.

The variance (or the standard deviation) is a measure of the dispersion or


scatter of the values of the random variable about the mean \mu. If the values
tend to be concentrated near the mean, the variance is small; if they tend to
be spread far from the mean, the variance is large.

Example 3.4: The distribution with the density


f(x) = \begin{cases} \dfrac{1}{b-a}, & a < x < b \\ 0, & \text{elsewhere} \end{cases}

is called the uniform distribution on the interval a < x < b. Find σ 2 .
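The answer is \sigma^2 = (b-a)^2/12, which a numeric midpoint-rule check confirms (a sketch; a = 2, b = 5 are arbitrary illustrative endpoints):

```python
# Uniform density on (a, b): mean should be (a + b)/2, variance (b - a)**2/12.
a, b, n = 2.0, 5.0, 100_000
h = (b - a) / n
xs = [a + (k + 0.5) * h for k in range(n)]
pdf = 1.0 / (b - a)

mu = sum(x * pdf * h for x in xs)                # expect (2 + 5)/2 = 3.5
var = sum((x - mu) ** 2 * pdf * h for x in xs)   # expect 9/12 = 0.75
print(round(mu, 6), round(var, 6))
```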


Some Theorems on Variance


Theorem 7. σ 2 = E(X 2 ) − µ2
Theorem 8. If a and b are constants, then

\operatorname{Var}(aX \pm b) = a^2 \operatorname{Var}(X) \quad \text{or} \quad \sigma^2_{aX \pm b} = a^2 \sigma^2_X

If a = 1, \sigma^2_{X+b} = \sigma^2_X, and if b = 0, \sigma^2_{aX} = a^2 \sigma^2_X.
Theorem 9. If X and Y are independent random variables, then

\operatorname{Var}(aX \pm bY) = a^2 \operatorname{Var}(X) + b^2 \operatorname{Var}(Y) \quad \text{or} \quad \sigma^2_{aX \pm bY} = a^2 \sigma^2_X + b^2 \sigma^2_Y
Theorem 10. If a random variable X has mean µ and variance σ 2 , then the
corresponding random variable
Z = \frac{X - \mu}{\sigma}
has mean 0 and variance 1.

Z is called the standardized variable corresponding to X.
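Theorem 10 is easy to check on any distribution (a sketch; a fair die is assumed only as a convenient example):

```python
# Standardize a fair die: Z = (X - mu)/sigma should have mean 0 and variance 1.
from fractions import Fraction
from math import sqrt

probs = {x: Fraction(1, 6) for x in range(1, 7)}
mu = sum(x * p for x, p in probs.items())                 # 7/2
var = sum((x - mu) ** 2 * p for x, p in probs.items())    # 35/12
sigma = sqrt(var)

z_mean = sum((x - mu) / sigma * float(p) for x, p in probs.items())
z_var = sum(((x - mu) / sigma) ** 2 * float(p) for x, p in probs.items())
# Both should be 0 and 1 up to floating-point rounding.
```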


Theorem 11. \operatorname{Var}[g(X)] = E\{[g(X) - \mu_{g(X)}]^2\} = \sum_{x} [g(x) - \mu_{g(X)}]^2 f(x) if
X is discrete, and

\operatorname{Var}[g(X)] = \int_{-\infty}^{+\infty} [g(x) - \mu_{g(X)}]^2 f(x)\,dx

if X is continuous.

3.3 Covariance
The concept of variance given for one variable can be extended to two or
more variables. Thus, for example, if X and Y are two continuous random
variables having joint density f (x, y) then, the mean and variances are
\mu_X = E(X) = \int_{-\infty}^{+\infty}\int_{-\infty}^{+\infty} x f(x,y)\,dx\,dy

\mu_Y = E(Y) = \int_{-\infty}^{+\infty}\int_{-\infty}^{+\infty} y f(x,y)\,dx\,dy

\sigma^2_X = E[(X - \mu_X)^2] = \int_{-\infty}^{+\infty}\int_{-\infty}^{+\infty} (x - \mu_X)^2 f(x,y)\,dx\,dy

\sigma^2_Y = E[(Y - \mu_Y)^2] = \int_{-\infty}^{+\infty}\int_{-\infty}^{+\infty} (y - \mu_Y)^2 f(x,y)\,dx\,dy


Another quantity which arises in the case of two variables X and Y is the
covariance.
Definition 4. Let X and Y be random variables with probability distribution
f (x, y). The covariance of X and Y is
\sigma_{XY} \equiv \operatorname{Cov}(X,Y) = E[(X - \mu_X)(Y - \mu_Y)] = \sum_{x} \sum_{y} (x - \mu_X)(y - \mu_Y) f(x,y)

if X and Y are discrete, and


\sigma_{XY} \equiv \operatorname{Cov}(X,Y) = E[(X - \mu_X)(Y - \mu_Y)] = \int_{-\infty}^{+\infty}\int_{-\infty}^{+\infty} (x - \mu_X)(y - \mu_Y) f(x,y)\,dx\,dy

if X and Y are continuous.

The covariance between two random variables is a measure of the nature of


the association between the two.

Some Theorems on Covariance


Theorem 12. \sigma_{XY} = E(XY) - \mu_X \mu_Y

Theorem 13. \sigma^2_{aX+bY} = a^2 \sigma^2_X + b^2 \sigma^2_Y + 2ab\,\sigma_{XY}

Theorem 14. If X and Y are independent random variables, then \sigma_{XY} = 0.

Example 3.5: The fraction X of male runners and the fraction Y of female runners
who complete a marathon race are described by the joint density function

f(x,y) = \begin{cases} 8xy, & 0 \le x \le 1,\; 0 \le y \le x \\ 0, & \text{elsewhere} \end{cases}

Find the covariance of X and Y .
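The exact covariance works out to 4/225 ≈ 0.0178 (E(X) = 4/5, E(Y) = 8/15, E(XY) = 4/9). A midpoint-grid integration over the triangle 0 ≤ y ≤ x ≤ 1 reproduces it (a sketch; the half-weighting of cells on the diagonal is an illustrative way to handle the region boundary):

```python
# Numeric double integral of moments of f(x, y) = 8xy over 0 <= y <= x <= 1.
n = 500
h = 1.0 / n
EX = EY = EXY = 0.0
for i in range(n):
    x = (i + 0.5) * h
    for j in range(i + 1):
        y = (j + 0.5) * h
        # Cells straddling the line y = x are only half inside the region.
        w = (0.5 if j == i else 1.0) * 8 * x * y * h * h
        EX += w * x
        EY += w * y
        EXY += w * x * y

print(round(EXY - EX * EY, 4))   # ≈ 0.0178, i.e. 4/225
```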

Exercise 3.1:

1. Let X (cm) be the diameter of bolts in a production. Assume that X has


the probability density
f(x) = \begin{cases} k(x - 0.9)(1.1 - x), & 0.9 < x < 1.1 \\ 0, & \text{elsewhere} \end{cases}

Determine k, graph f(x), and find \mu and \sigma^2.


Answer: k = 750; \mu = 1, \sigma^2 = 0.002.
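The stated answer can be reproduced numerically (a sketch, standard library only):

```python
# Normalize f(x) = k(x - 0.9)(1.1 - x) on (0.9, 1.1), then find mean and variance.
n = 100_000
a, b = 0.9, 1.1
h = (b - a) / n
xs = [a + (i + 0.5) * h for i in range(n)]
shape = [(x - 0.9) * (1.1 - x) for x in xs]

area = sum(s * h for s in shape)
k = 1 / area                      # normalizing constant; expect 750
mu = sum(x * s * k * h for x, s in zip(xs, shape))          # expect 1 (symmetry)
var = sum((x - mu) ** 2 * s * k * h for x, s in zip(xs, shape))  # expect 0.002
print(round(k), round(mu, 6), round(var, 6))
```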


2. Let X and Y be random variables with joint density function


f(x,y) = \begin{cases} 4xy, & 0 < x < 1,\; 0 < y < 1 \\ 0, & \text{elsewhere} \end{cases}

Find the expected value of Z = \sqrt{X^2 + Y^2}.

Answer: 0.9752.

3. Let X represent the number that occurs when a green die is tossed and Y
the number that occurs when a red die is tossed. Find the variance of the
random variable

(a) 2X − Y
(b) X + 3Y − 5
Answer: (a) \frac{175}{12} (b) \frac{175}{6}

4. The joint probability function of two discrete random variables X and Y is


given as

f(x,y) = \begin{cases} c(2x + y), & 0 \le x \le 2,\; 0 \le y \le 3 \\ 0, & \text{elsewhere} \end{cases}

Find c, \mu_X, \mu_Y, \mu_{XY}, \mu_{X^2}, \mu_{Y^2}, \sigma^2_X, \sigma^2_Y, \sigma_{XY}.

Answer: \frac{1}{42}, \frac{29}{21}, \frac{13}{7}, \frac{17}{7}, \frac{17}{7}, \frac{32}{7}, \frac{230}{441}, \frac{55}{49}, -\frac{20}{147}.
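All nine quantities in problem 4 can be verified with exact rational arithmetic (a sketch; the lambda-based helper is illustrative):

```python
# Discrete joint distribution f(x, y) = (2x + y)/42, x in {0,1,2}, y in {0,1,2,3}.
from fractions import Fraction

f = {(x, y): Fraction(2 * x + y, 42) for x in range(3) for y in range(4)}
assert sum(f.values()) == 1          # confirms c = 1/42

E = lambda g: sum(g(x, y) * p for (x, y), p in f.items())
muX, muY = E(lambda x, y: x), E(lambda x, y: y)
muXY = E(lambda x, y: x * y)
muX2, muY2 = E(lambda x, y: x * x), E(lambda x, y: y * y)

print(muX, muY, muXY, muX2, muY2)    # 29/21 13/7 17/7 17/7 32/7
print(muX2 - muX ** 2, muY2 - muY ** 2, muXY - muX * muY)
# sigma_X^2 = 230/441, sigma_Y^2 = 55/49, sigma_XY = -20/147
```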
