
MATH3075/3975 Financial Derivatives

Tutorial 1: Solutions

Exercise 1 (a) The conditional distributions of X given Y = j are:

for j = 1: (1/5, 3/5, 1/5),
for j = 2: (2/3, 0, 1/3),
for j = 3: (0, 3/5, 2/5),

and thus the conditional expectations $\mathbb{E}_P(X \mid Y = j)$ are computed as follows:

$$\mathbb{E}_P(X \mid Y = 1) = 1 \cdot \tfrac{1}{5} + 2 \cdot \tfrac{3}{5} + 3 \cdot \tfrac{1}{5} = 2,$$
$$\mathbb{E}_P(X \mid Y = 2) = 1 \cdot \tfrac{2}{3} + 2 \cdot 0 + 3 \cdot \tfrac{1}{3} = \tfrac{5}{3},$$
$$\mathbb{E}_P(X \mid Y = 3) = 1 \cdot 0 + 2 \cdot \tfrac{3}{5} + 3 \cdot \tfrac{2}{5} = \tfrac{12}{5}.$$

(b) The marginal distributions of X and Y are:

for X: (4/18, 9/18, 5/18),
for Y: (10/18, 3/18, 5/18).

On the one hand, we obtain

$$\mathbb{E}_P(X) = \sum_{i=1}^{3} i\, P(X = i) = \frac{4}{18} + 2 \cdot \frac{9}{18} + 3 \cdot \frac{5}{18} = \frac{37}{18}.$$

On the other hand, we get

$$\mathbb{E}_P\big(\mathbb{E}_P(X \mid Y)\big) = \sum_{j=1}^{3} \mathbb{E}_P(X \mid Y = j)\, P(Y = j) = 2 \cdot \frac{10}{18} + \frac{5}{3} \cdot \frac{3}{18} + \frac{12}{5} \cdot \frac{5}{18} = \frac{37}{18}.$$

Hence the equality $\mathbb{E}_P(X) = \mathbb{E}_P\big(\mathbb{E}_P(X \mid Y)\big)$ holds, as was expected.
(c) The random variables X and Y are not independent, since the condition

$$P(X = i, Y = j) = P(X = i)\, P(Y = j), \quad \forall\, i, j = 1, 2, 3,$$

is not satisfied. For instance, if we take i = 1 and j = 3, then

$$P(X = 1, Y = 3) = 0 \neq \frac{4}{18} \cdot \frac{5}{18} = P(X = 1)\, P(Y = 3).$$
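All three parts of Exercise 1 can be verified exactly by rebuilding the joint distribution of (X, Y) from the conditional distributions of part (a) and the marginal of Y from part (b). A minimal sketch in exact rational arithmetic (the dictionary layout is an implementation choice, not part of the exercise):

```python
from fractions import Fraction as F

# Conditional distributions of X given Y = j (from part (a))
# and the marginal of Y (from part (b)).
cond_X_given_Y = {
    1: [F(1, 5), F(3, 5), F(1, 5)],
    2: [F(2, 3), F(0), F(1, 3)],
    3: [F(0), F(3, 5), F(2, 5)],
}
p_Y = {1: F(10, 18), 2: F(3, 18), 3: F(5, 18)}

# Joint distribution: P(X=i, Y=j) = P(X=i | Y=j) P(Y=j).
joint = {(i, j): cond_X_given_Y[j][i - 1] * p_Y[j]
         for j in (1, 2, 3) for i in (1, 2, 3)}

# Marginal of X and E_P(X) directly.
p_X = {i: sum(joint[(i, j)] for j in (1, 2, 3)) for i in (1, 2, 3)}
E_X = sum(i * p_X[i] for i in (1, 2, 3))

# Tower property: E_P(X) = E_P(E_P(X | Y)).
cond_exp = {j: sum(i * cond_X_given_Y[j][i - 1] for i in (1, 2, 3))
            for j in (1, 2, 3)}
E_EXY = sum(cond_exp[j] * p_Y[j] for j in (1, 2, 3))

print(E_X, E_EXY)                      # both equal 37/18
print(joint[(1, 3)], p_X[1] * p_Y[3])  # 0 vs 5/81: not independent
```

Because every probability is a `Fraction`, the equality 37/18 = 37/18 is exact rather than a floating-point coincidence.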
Exercise 2 (a) We need to show that

$$\int_{-\infty}^{\infty}\!\int_{-\infty}^{\infty} f_{(X,Y)}(x, y)\, dx\, dy = \int_{0}^{\infty}\!\int_{0}^{\infty} \frac{1}{y}\, e^{-\frac{x}{y}}\, e^{-y}\, dx\, dy \stackrel{?}{=} 1.$$

We first compute the marginal density of Y. For y ≥ 0, we obtain

$$f_Y(y) = \int_{0}^{\infty} \frac{1}{y}\, e^{-\frac{x}{y}}\, e^{-y}\, dx = e^{-y}\, \frac{1}{y} \int_{0}^{\infty} e^{-\frac{x}{y}}\, dx = e^{-y}\, \frac{1}{y} \Big[ -y\, e^{-\frac{x}{y}} \Big]_{0}^{\infty} = e^{-y}.$$

Of course, we have $f_Y(y) = 0$ for all y < 0. Therefore,

$$\int_{0}^{\infty}\!\int_{0}^{\infty} \frac{1}{y}\, e^{-\frac{x}{y}}\, e^{-y}\, dx\, dy = \int_{0}^{\infty} f_Y(y)\, dy = \int_{0}^{\infty} e^{-y}\, dy = 1.$$

(b) For any fixed y ≥ 0, the conditional density of X given Y = y equals

$$f_{X|Y}(x \mid y) = \frac{f_{(X,Y)}(x, y)}{f_Y(y)} = \frac{1}{y}\, e^{-\frac{x}{y}}, \quad \forall\, x \geq 0,$$

and $f_{X|Y}(x \mid y) = 0$ for x < 0. Consequently, for every y ≥ 0,

$$\mathbb{E}_P(X \mid Y = y) = \int_{0}^{\infty} x f_{X|Y}(x \mid y)\, dx = \int_{0}^{\infty} \frac{x}{y}\, e^{-\frac{x}{y}}\, dx = y \int_{0}^{\infty} z e^{-z}\, dz = y.$$
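Both closed-form results above can be sanity-checked numerically. A minimal sketch using a plain trapezoidal rule on a truncated domain (the truncation point 50, the step count, and the test value y = 2 are arbitrary choices, not part of the exercise):

```python
import math

def trapezoid(f, a, b, n=200000):
    """Trapezoidal rule for the integral of f over [a, b]."""
    h = (b - a) / n
    s = 0.5 * (f(a) + f(b)) + sum(f(a + k * h) for k in range(1, n))
    return s * h

# Total mass of the marginal f_Y(y) = e^{-y} on (0, infinity),
# truncated at 50 where the tail is negligible.
total = trapezoid(lambda y: math.exp(-y), 0.0, 50.0)

# E(X | Y = y) for y = 2: the integral of (x/y) e^{-x/y} should equal y.
y = 2.0
cond_mean = trapezoid(lambda x: (x / y) * math.exp(-x / y), 0.0, 50.0)

print(round(total, 6), round(cond_mean, 6))  # close to 1 and 2
```

The truncation error at 50 is of order $e^{-25}$, so the quadrature recovers both values to many decimal places.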

Exercise 3 We first compute the conditional cumulative distribution function of X given the event {X < 0.5}:

$$F_{X|X<0.5}(x) := P(X \leq x \mid X < 0.5), \quad \forall\, x \in \mathbb{R}.$$

We obtain

$$F_{X|X<0.5}(x) = \frac{P(X \leq x, X < 0.5)}{P(X < 0.5)} = \begin{cases} 0, & \text{if } x \leq 0, \\ 2\, P(X \leq x) = 2x, & \text{if } x \in (0, 0.5), \\ 1, & \text{if } x \geq 0.5, \end{cases}$$

so that the conditional density of X given the event {X < 0.5} equals

$$f_{X|X<0.5}(x) = \begin{cases} 0, & \text{if } x \leq 0, \\ 2, & \text{if } x \in (0, 0.5), \\ 0, & \text{if } x \geq 0.5. \end{cases}$$

Therefore,

$$\mathbb{E}_P(X \mid X < 0.5) = \int_{-\infty}^{\infty} x f_{X|X<0.5}(x)\, dx = \int_{0}^{0.5} 2x\, dx = 0.25.$$
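The computation above uses $P(X \leq x) = x$ on (0, 0.5), which corresponds to X uniform on (0, 1); that distribution is read off from the solution rather than restated here. Assuming it, a seeded Monte Carlo sketch confirms the conditional mean:

```python
import random

random.seed(42)

# Assumption (inferred from the solution): X is uniform on (0, 1).
samples = [random.random() for _ in range(200000)]

# Keep only the outcomes in the conditioning event {X < 0.5}.
conditional = [x for x in samples if x < 0.5]

est = sum(conditional) / len(conditional)
print(round(est, 3))  # close to 0.25
```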

Exercise 4 We have

$$f_X(x) = \frac{1}{\lambda}\, e^{-\frac{x}{\lambda}}, \quad \forall\, x > 0,$$

and thus

$$P(X > x) = 1 - F_X(x) = \int_{x}^{\infty} f_X(u)\, du = e^{-\frac{x}{\lambda}}, \quad \forall\, x > 0.$$

Consequently,

$$P(X > x \mid X > 1) = \frac{P(X > x, X > 1)}{P(X > 1)} = \begin{cases} \dfrac{P(X > x)}{P(X > 1)} = e^{-\frac{x-1}{\lambda}}, & \text{if } x \geq 1, \\ 1, & \text{if } x < 1. \end{cases}$$

Hence the conditional density equals

$$f_{X|X>1}(x) = \begin{cases} \dfrac{1}{\lambda}\, e^{-\frac{x-1}{\lambda}}, & \text{if } x \geq 1, \\ 0, & \text{if } x < 1, \end{cases}$$

and thus

$$\mathbb{E}_P(X \mid X > 1) = \int_{1}^{\infty} \frac{x}{\lambda}\, e^{-\frac{x-1}{\lambda}}\, dx = 1 + \lambda = 1 + \mathbb{E}_P(X).$$
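This is the memoryless property of the exponential distribution: conditioning on survival past 1 simply shifts the mean by 1. A seeded Monte Carlo sketch (the choice λ = 2 is arbitrary):

```python
import random

random.seed(7)
lam = 2.0  # mean of the exponential distribution (arbitrary choice)

# random.expovariate takes the rate 1/lambda, so the sample mean is lam.
samples = [random.expovariate(1.0 / lam) for _ in range(300000)]

# Condition on the event {X > 1}.
conditional = [x for x in samples if x > 1.0]

est = sum(conditional) / len(conditional)
print(round(est, 2))  # close to 1 + lam = 3.0
```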

Exercise 5 We have

$$\mathrm{Cov}(X, Y) = \mathbb{E}_P(XY) - \mathbb{E}_P(X)\, \mathbb{E}_P(Y) = \mathbb{E}_P(X^3) - \mathbb{E}_P(X)\, \mathbb{E}_P(X^2) = 0,$$

since $\mathbb{E}_P(X) = \mathbb{E}_P(X^3) = 0$. Therefore, the random variables X and Y are uncorrelated. They are not independent, however, since, for instance,

$$\mathbb{E}_P(Y \mid X = 1) = 1 \neq 4 = \mathbb{E}_P(Y \mid X = 2).$$

Recall that under independence of X and Y we have $\mathbb{E}_P(Y \mid X) = \mathbb{E}_P(Y)$ and $\mathbb{E}_P(X \mid Y) = \mathbb{E}_P(X)$.
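The identity $\mathbb{E}_P(XY) = \mathbb{E}_P(X^3)$ above indicates that $Y = X^2$ with X distributed symmetrically around 0. A sketch with the assumed choice of X uniform on {−2, −1, 0, 1, 2} (this specific distribution is an illustrative assumption, not taken from the exercise):

```python
from fractions import Fraction as F

# Assumed distribution: X uniform on {-2, -1, 0, 1, 2}, and Y = X^2.
support = [-2, -1, 0, 1, 2]
p = F(1, 5)

E_X = sum(x * p for x in support)
E_Y = sum(x**2 * p for x in support)
E_XY = sum(x**3 * p for x in support)  # E(XY) = E(X^3)

cov = E_XY - E_X * E_Y
print(cov)  # 0: uncorrelated

# Dependence: {X = 2} implies {Y = 4}, so the joint probability is 1/5,
# while P(X = 2) P(Y = 4) = (1/5)(2/5) = 2/25.
p_joint = p
p_prod = p * (2 * p)
print(p_joint, p_prod)  # 1/5 vs 2/25: not independent
```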

Exercise 6 (MATH3975) We have

$$\begin{aligned}
\mathrm{Cov}(X, Y) &= \mathbb{E}_P(XY) - \mathbb{E}_P(X)\, \mathbb{E}_P(Y) \\
&= \mathbb{E}_P\big((U + V)(U - V)\big) - \mathbb{E}_P(U + V)\, \mathbb{E}_P(U - V) \\
&= \mathbb{E}_P\big(U^2 - V^2\big) - \mathbb{E}_P(U + V)\big(\mathbb{E}_P(U) - \mathbb{E}_P(V)\big) = 0,
\end{aligned}$$

since $\mathbb{E}_P(U) = \mathbb{E}_P(V)$ and $\mathbb{E}_P(U^2) = \mathbb{E}_P(V^2)$. Hence the random variables X and Y are uncorrelated.

To check whether X and Y are independent, we first need to specify the joint distribution of X and Y. For instance, if we take U = V, then X = 2U and Y = 0, so that X and Y are independent. However, if we take U and V independent (but not deterministic), then X and Y are not necessarily independent. For instance, we may take U and V to be i.i.d. Bernoulli random variables with P(U = 1) = p = 1 − P(U = 0). It is then easy to check that X and Y are not independent.
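The Bernoulli counterexample can be checked by exhaustive enumeration of the four outcomes of (U, V). A sketch with the choice p = 1/2 (any p strictly between 0 and 1 works; exact rational arithmetic keeps the check precise):

```python
from fractions import Fraction as F
from itertools import product

p = F(1, 2)  # P(U = 1); any value strictly between 0 and 1 works

def prob(u, v):
    """Probability of (U, V) = (u, v) for i.i.d. Bernoulli(p)."""
    return (p if u == 1 else 1 - p) * (p if v == 1 else 1 - p)

# Each outcome gives X = U + V, Y = U - V, with its probability weight.
outcomes = [(u + v, u - v, prob(u, v)) for u, v in product((0, 1), repeat=2)]

E = lambda f: sum(f(x, y) * w for x, y, w in outcomes)
cov = E(lambda x, y: x * y) - E(lambda x, y: x) * E(lambda x, y: y)
print(cov)  # 0: uncorrelated

# Dependence: compare P(X = 2, Y = 0) with P(X = 2) P(Y = 0).
P = lambda pred: sum(w for x, y, w in outcomes if pred(x, y))
joint = P(lambda x, y: x == 2 and y == 0)
prod_ = P(lambda x, y: x == 2) * P(lambda x, y: y == 0)
print(joint, prod_)  # 1/4 vs 1/8: not independent
```

The intuition: knowing X = 2 forces U = V = 1, which pins down Y = 0, so Y cannot be independent of X.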
