Tutorial 01 - Solution Set Problems Discussed in Tutorials


MTH3241 Rand. Proc. in Sci. & Eng.



1. Let $\varepsilon_1, \varepsilon_2, \ldots$ denote the Bernoulli process associated with the flips of the coin, with a head counting as a success. As in lectures, denote $X_n = \varepsilon_1 + \ldots + \varepsilon_n$ and $S_1$ the time/trial of the first success.

(a) $Y = S_1$ is geometric($p$): $p_Y(y) = p(1-p)^{y-1}$, $y = 1, 2, \ldots$

(b) $X$ is one/any of the $\varepsilon$'s: $E[X] = p$, $\mathrm{var}(X) = p(1-p)$.

(c) $K = X_n$ is binomial($n$, $p$):
$$p_K(k) = \binom{n}{k} p^k (1-p)^{n-k}, \qquad k = 0, \ldots, n,$$
with $E[K] = np$ and $\mathrm{var}(K) = np(1-p)$.

(d) This conditional probability can be written as $P(\varepsilon_1 = \varepsilon_7 = 0 \mid X_9 = 6)$, which, by a simple renumbering of the first 9 flips, becomes
$$P(\varepsilon_8 = \varepsilon_9 = 0 \mid X_9 = 6) = \frac{P(\varepsilon_8 = \varepsilon_9 = 0,\, X_9 = 6)}{P(X_9 = 6)} = \frac{P(\varepsilon_8 = \varepsilon_9 = 0,\, X_7 = 6)}{P(X_9 = 6)}.$$
Since $X_7$, $\varepsilon_8$ and $\varepsilon_9$ are independent, we end up with
$$\frac{P(\varepsilon_8 = 0)P(\varepsilon_9 = 0)P(X_7 = 6)}{P(X_9 = 6)} = \frac{(1-p)(1-p)\binom{7}{6} p^6 (1-p)}{\binom{9}{6} p^6 (1-p)^3} = \frac{1}{12}.$$

(e) $H = X_{20} = X_{18} + \varepsilon_{19} + \varepsilon_{20}$ and $C = \{X_{18} = 10\}$. Using the independence of $X_{18}$, $\varepsilon_{19}$ and $\varepsilon_{20}$, we get
$$E[H \mid C] = E[X_{18} + \varepsilon_{19} + \varepsilon_{20} \mid X_{18} = 10] = 10 + E[\varepsilon_{19} + \varepsilon_{20}] = 10 + 2p$$
and
$$\mathrm{var}(H \mid C) = \mathrm{var}(X_{18} + \varepsilon_{19} + \varepsilon_{20} \mid X_{18} = 10) = \mathrm{var}(\varepsilon_{19} + \varepsilon_{20}) = 2p(1-p).$$

2. (a) The number of additional trials up to and including the next trial on which 2 heads result is geometric($p_D p_G$) with PMF $p_D p_G (1 - p_D p_G)^{k-1}$, $k = 1, 2, \ldots$

(b) Let $D$ be the event that Don flipped a head on that trial, and $G$ the event that Greg flipped a head on that trial. Then
$$P(D \mid D \cup G) = \frac{P(D)}{P(D \cup G)} = \frac{p_D}{p_D + p_G - p_D p_G}.$$
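The value $1/12$ in 1(d) can be sanity-checked by brute-force enumeration of all $2^9$ flip sequences; a small illustrative sketch (the choice of $p$ is arbitrary, since the conditional probability does not depend on it):

```python
from itertools import product

# Brute-force check of 1(d): P(flips 1 and 7 are tails | 6 heads in 9 flips) = 1/12.
p = 0.4  # arbitrary value in (0, 1); the answer does not depend on p
num = den = 0.0
for seq in product([0, 1], repeat=9):
    w = 1.0
    for b in seq:
        w *= p if b else (1 - p)      # probability of this particular sequence
    if sum(seq) == 6:                 # condition on X_9 = 6
        den += w
        if seq[0] == 0 and seq[6] == 0:  # flips 1 and 7 are tails
            num += w
print(num / den)  # ≈ 1/12 ≈ 0.08333
```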

(c) Brute-force approach. Let $T_D$ be the time of Don's first head, and $T_G$ be the time of Greg's first head.
$$P(T_D < T_G) = \sum_{t=1}^{\infty} P(T_D < T_G \mid T_D = t)\, P(T_D = t) = \sum_{t=1}^{\infty} P(t < T_G \mid T_D = t)\, P(T_D = t)$$
$$= \sum_{t=1}^{\infty} P(T_G > t)\, P(T_D = t) \qquad (T_D,\ T_G \text{ independent})$$
$$= \sum_{t=1}^{\infty} (1-p_G)^t\, p_D (1-p_D)^{t-1} = p_D(1-p_G) \sum_{s=0}^{\infty} (1-p_G)^s (1-p_D)^s$$
$$= \frac{p_D(1-p_G)}{1 - (1-p_G)(1-p_D)} = \frac{p_D(1-p_G)}{p_D + p_G - p_D p_G}.$$

Intuitive approach. Think of this game as a race between Don and Greg; a race that ends when either of them gets a head. As the outcome of the race is independent of when it ends, the probability that Don's next flip of a head will occur before Greg's next flip of a head is simply
$$P(D \cap G^c \mid D \cup G) = \frac{p_D(1-p_G)}{p_D + p_G - p_D p_G}.$$
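The geometric-series step in (c) can be checked numerically by comparing a truncated version of the series against the closed form; a sketch with arbitrarily chosen values of $p_D$ and $p_G$:

```python
# Compare the truncated series for P(T_D < T_G) with the closed form
# p_D (1 - p_G) / (p_D + p_G - p_D p_G).
def series(pD, pG, terms=5000):
    """Truncated sum of P(T_G > t) P(T_D = t) over t = 1, ..., terms."""
    return sum((1 - pG) ** t * pD * (1 - pD) ** (t - 1) for t in range(1, terms + 1))

def closed(pD, pG):
    return pD * (1 - pG) / (pD + pG - pD * pG)

print(series(0.3, 0.4), closed(0.3, 0.4))  # both ≈ 0.31034
```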

(d) Let $X_D$ be the number of heads in Don's first $n$ flips, and $X_G$ be the number of heads in Greg's first $n$ flips. Then the amount of money earned by Don is $dX_D$, the amount of money earned by Greg is $gX_G$, and the total amount of money earned by the two players during the first $n$ trials, $Z$, has transform (MGF)
$$E[e^{tZ}] = E[e^{t(dX_D + gX_G)}] = E[e^{tdX_D} e^{tgX_G}] = E[e^{tdX_D}]\,E[e^{tgX_G}] = \left[(1-p_D) + p_D e^{td}\right]^n \left[(1-p_G) + p_G e^{tg}\right]^n.$$

3. Let $p$ be the probability of a single opportunity to cross the lane:
$$p = P(T > d) = \int_d^{\infty} f_T(t)\,dt.$$
Let $\varepsilon_1, \varepsilon_2, \ldots$ be the Bernoulli process that describes the opportunities to cross, $X_n = \varepsilon_1 + \ldots + \varepsilon_n$, let $S_1, S_2, \ldots$ be the trial numbers of the consecutive successes, and let $T_k$ be the time between the arrivals of the $(k-1)$st and the $k$th cars.

(a) The probability that we can cross for the first time just before the $n$th car goes by is
$$P(S_1 = n) = p(1-p)^{n-1} = \left(\int_d^{\infty} f_T(t)\,dt\right)\left(\int_0^d f_T(t)\,dt\right)^{n-1}.$$


(b) The probability that we shall have had exactly $m$ opportunities by the instant the $n$th car goes by is
$$P(X_n = m) = \binom{n}{m} p^m (1-p)^{n-m} = \binom{n}{m}\left(\int_d^{\infty} f_T(t)\,dt\right)^m \left(\int_0^d f_T(t)\,dt\right)^{n-m}.$$

(c) The probability that the occurrence of the $m$th opportunity is immediately followed by the arrival of the $n$th car is
$$P(S_m = n) = \binom{n-1}{m-1} p^m (1-p)^{n-m} = \binom{n-1}{m-1}\left(\int_d^{\infty} f_T(t)\,dt\right)^m \left(\int_0^d f_T(t)\,dt\right)^{n-m}.$$
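To make problem 3 concrete, one can assume (purely hypothetically, the problem leaves $f_T$ general) exponential interarrival times $f_T(t) = \lambda e^{-\lambda t}$, so that $p = \int_d^{\infty} f_T(t)\,dt = e^{-\lambda d}$. The sketch below checks the tail integral numerically and verifies that the geometric PMF for $S_1$ sums to 1:

```python
import math

lam, d = 0.5, 2.0  # hypothetical rate and crossing time, for illustration only

# Midpoint Riemann sum of the tail integral p = ∫_d^∞ f_T(t) dt over [d, d + 60]
dt = 1e-4
p_numeric = sum(lam * math.exp(-lam * (d + (i + 0.5) * dt)) * dt
                for i in range(int(60 / dt)))
p_closed = math.exp(-lam * d)  # closed form for the exponential case

# P(S_1 = n) = p (1 - p)^(n-1) must sum to 1 over n = 1, 2, ...
total = sum(p_closed * (1 - p_closed) ** (n - 1) for n in range(1, 2000))
```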

4.
$$\sum_{l=42}^{\infty} p_{Y_{17}}(l) = P(Y_{17} \geq 42) = P(\text{17th arrival in a Bernoulli}(p)\text{ process occurs at or after trial } 42)$$
$$= P(\text{in the first 41 trials, at most 16 arrivals occurred}) = \sum_{k=0}^{16} \binom{41}{k} p^k (1-p)^{41-k}.$$
In other words, $a = 16$ and $b = 41$.
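The identity between the Pascal tail and the binomial CDF can be verified numerically by summing the Pascal (negative binomial) PMF directly; a sketch with an arbitrary $p$:

```python
from math import comb

p = 0.5  # arbitrary illustrative value

# Pascal PMF: P(Y_17 = l) = C(l-1, 16) p^17 (1-p)^(l-17),
# i.e. the 17th success lands exactly on trial l.
tail = sum(comb(l - 1, 16) * p**17 * (1 - p) ** (l - 17) for l in range(42, 600))

# P(at most 16 successes in the first 41 trials)
cdf = sum(comb(41, k) * p**k * (1 - p) ** (41 - k) for k in range(17))
```

Truncating the tail sum at trial 600 is harmless here: the neglected mass is astronomically small for $p = 0.5$.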

5. Let $\varepsilon_1, \varepsilon_2, \ldots$ denote the Bernoulli process that describes Fred's dog food distribution, $X_n = \varepsilon_1 + \ldots + \varepsilon_n$ and $T_k$ the time/trial of the $k$th success.

(a) The probability that Fred gives away his first sample on his third call is $P(T_1 = 3) = 1/8$, since $T_1$ is geometric($1/2$).

(b) Look at the process started on the ninth call. The conditional probability that Fred will give away his fifth sample on his eleventh call, given that he has given away exactly four samples on his first eight calls, simply equals the unconditional probability that Fred gives away his first sample on his third call; that is, $1/8$.

(c) The probability that he gives away his second sample on his fifth call is $P(T_2 = 5) = \binom{4}{1}(1/2)^2(1/2)^3 = 1/8$.

(d) Look at the process started on the third call. The conditional probability that he will leave his second sample on his fifth call is the probability that Fred gives away his first sample on his third call, $1/8$.

(e) The probability that he completes at least five calls before he needs a new supply is $P(T_2 \geq 5) = P(X_4 \leq 1) = P(X_4 = 0) + P(X_4 = 1) = 1/16 + 4/16 = 5/16$.

(f) Consider the Bernoulli process that describes, for homes with dogs, whether or not the call was answered. Here a call without a dog is not counted (no missed opportunity). The probability of success is $p = 3/4$. Let $H_m$ be the time (home with a dog) of the $m$th success (the call is answered). Then $D_m = H_m - m$ and $H_m$ has a Pascal distribution of order $m$ with parameter $3/4$. Then
$$E[D_m] = E[H_m] - m = \frac{m}{p} - m = \frac{m}{3} \qquad \text{and} \qquad \mathrm{var}(D_m) = \mathrm{var}(H_m) = \frac{m(1-p)}{p^2} = \frac{4m}{9}.$$

6. Challenging problem.

(a) The total number of rounds played (until the first time where a loss by Bob immediately follows a loss by Alice) is simply the first time the Bernoulli process that describes the rounds (two plays) has a success (a success occurs when a loss by Bob immediately follows a loss by Alice). Since the probability of success is $(1/3)^2 = 1/9$, this time has a geometric distribution with parameter $p = 1/9$ and PMF
$$p(1-p)^{t-1} = \frac{8^{t-1}}{9^t}, \qquad t = 1, 2, \ldots$$

(b) Let $T_1$ be the round at which Bob loses for the first time. Then $T_1$ is geometric with parameter $q = 1/3$. Alice's gain up to round $n$ is $\sum_{k=1}^{n} G_{2k-1}$ and therefore her net gain up to the time of the first loss by Bob has MGF
$$E\!\left[e^{s\sum_{k=1}^{T_1} G_{2k-1}}\right] = E\!\left[E\!\left[e^{s\sum_{k=1}^{T_1} G_{2k-1}} \,\Big|\, T_1\right]\right].$$
Since $T_1$ is independent of the $G_k$'s and the $G_k$'s are independent amongst each other,
$$E\!\left[e^{s\sum_{k=1}^{T_1} G_{2k-1}} \,\Big|\, T_1 = t\right] = E\!\left[e^{s\sum_{k=1}^{t} G_{2k-1}} \,\Big|\, T_1 = t\right] = E\!\left[e^{s\sum_{k=1}^{t} G_{2k-1}}\right] = E[e^{sG}]^t = \left(\frac{1}{3}e^{2s} + \frac{1}{2}e^{s} + \frac{1}{6}e^{3s}\right)^t.$$
Thus the MGF is
$$E\!\left[\left(\frac{1}{3}e^{2s} + \frac{1}{2}e^{s} + \frac{1}{6}e^{3s}\right)^{T_1}\right] = \frac{\frac{1}{3}\left(\frac{1}{3}e^{2s} + \frac{1}{2}e^{s} + \frac{1}{6}e^{3s}\right)}{1 - \frac{2}{3}\left(\frac{1}{3}e^{2s} + \frac{1}{2}e^{s} + \frac{1}{6}e^{3s}\right)}.$$

(c) The round $T_3$ at which Bob has his third loss has a Pascal distribution of order 3 with parameter $q = 1/3$. $Z = 2T_3$ and its PMF is
$$p_Z(z) = \binom{z/2 - 1}{2} \left(\frac{1}{3}\right)^3 \left(\frac{2}{3}\right)^{z/2 - 3}, \qquad z = 6, 8, 10, \ldots$$
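As a consistency check of the MGF in (b): at $s = 0$ it must equal 1, and by Wald's identity its derivative at 0 must equal $E[T_1]\,E[G] = 3 \times \frac{5}{3} = 5$ (with the per-round gain PMF as transcribed here, $E[G] = \frac{1}{3}\cdot 2 + \frac{1}{2}\cdot 1 + \frac{1}{6}\cdot 3 = \frac{5}{3}$). A numerical sketch:

```python
import math

def u(s):
    # per-round gain MGF E[e^{sG}], as given in (b)
    return math.exp(2 * s) / 3 + math.exp(s) / 2 + math.exp(3 * s) / 6

def M(s):
    # compound MGF for geometric(1/3) T_1: E[u(s)^{T_1}] = (1/3)u / (1 - (2/3)u)
    return (u(s) / 3) / (1 - 2 * u(s) / 3)

h = 1e-6
deriv = (M(h) - M(-h)) / (2 * h)  # central difference, ≈ E[T_1] E[G] = 5
```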

(d) Let $\alpha$ and $\beta$ be the rounds at which Alice, respectively Bob, wins for the first time. Then $N = \max(\alpha, \beta)$, $\{N > n\} = \{\alpha > n\} \cup \{\beta > n\}$ and
$$P(N > n) = P(\alpha > n) + P(\beta > n) - P(\alpha > n)P(\beta > n) = 2q^n - q^{2n}.$$
Finally,
$$E[N] = \sum_{n=0}^{\infty} P(N > n) = \sum_{n=0}^{\infty} \left[2q^n - q^{2n}\right] = \frac{2}{1-q} - \frac{1}{1-q^2} = 3 - \frac{9}{8} = \frac{15}{8}.$$
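The value $E[N] = 15/8$ can be confirmed by summing the tail probabilities numerically:

```python
q = 1 / 3
# E[N] = Σ_{n ≥ 0} P(N > n) = Σ_{n ≥ 0} (2 q^n - q^{2n}); 200 terms is ample here
EN = sum(2 * q**n - q ** (2 * n) for n in range(200))
print(EN)  # ≈ 15/8 = 1.875
```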

7. Let p = 1/5 be the probability of a robbery attempt and q = 3/4 the probability that a robbery attempt is successful.

(a) $K$ is geometric with parameter $q = 3/4$:
$$p_K(k) = q(1-q)^{k-1} = \frac{3}{4}\left(\frac{1}{4}\right)^{k-1}, \qquad k = 1, 2, \ldots$$

(b) $D$ is the sum of three independent random variables: the number of days up to (and including) the first successful robbery, the off time $O$ after the first successful robbery and, the number of days that follow, up to the second successful robbery, the distribution of which is that of the first. As a sum of independent random variables, the distribution of $D$ is more conveniently described by its MGF, the product of the MGFs of the three summands:
$$E[e^{zD}] = \left(\frac{pq\,e^z}{1 - (1-pq)e^z}\right)^2 \left(\frac{1}{2}e^{2z} + \frac{1}{2}e^{4z}\right).$$
Alternatively, writing $R$ for the sum of the two geometric($pq$) counts (so $R$ is Pascal of order 2), the PMF of $D$ is:
$$P(D = 4) = P(R = 2,\, O = 2) = \tfrac{1}{2}(pq)^2, \qquad P(D = 5) = P(R = 3,\, O = 2) = \tfrac{1}{2} \cdot 2\,(pq)^2(1-pq) = (pq)^2(1-pq)$$
and
$$P(D = d) = \tfrac{1}{2}(d-3)(pq)^2(1-pq)^{d-4} + \tfrac{1}{2}(d-5)(pq)^2(1-pq)^{d-6}, \qquad d \geq 6.$$
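The PMF of $D$ in (b) can be checked for internal consistency: it should sum to 1 and its boundary values should match $\frac{1}{2}(pq)^2$ and $(pq)^2(1-pq)$. A sketch:

```python
pq = (1 / 5) * (3 / 4)  # per-day probability of a successful robbery

def pmf_D(d):
    """PMF of D: a Pascal-of-order-2 PMF shifted by the off time O ∈ {2, 4}."""
    out = 0.0
    if d - 2 >= 2:  # O = 2 contribution: R = d - 2
        out += 0.5 * (d - 3) * pq**2 * (1 - pq) ** (d - 4)
    if d - 4 >= 2:  # O = 4 contribution: R = d - 4
        out += 0.5 * (d - 5) * pq**2 * (1 - pq) ** (d - 6)
    return out

total = sum(pmf_D(d) for d in range(4, 3000))  # truncation error is negligible
```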

(c) $S$ is the sum of two independent random variables taking values 1, 2 and 3 with equal probabilities (discrete uniform over [1, 3]). The PMF of $S$ is: $P(S = 2) = (1/3)^2$, $P(S = 3) = 2(1/3)^2$, $P(S = 4) = 3(1/3)^2$, $P(S = 5) = 2(1/3)^2$ and $P(S = 6) = (1/3)^2$.

(d) Let $X_1, \ldots, X_{10}$ be the number of candies stolen at each robbery attempt. $X_1, \ldots, X_{10}$ are independent and have PMF
$$p_X(0) = 1 - q = \frac{1}{4} \qquad \text{and} \qquad p_X(1) = p_X(2) = p_X(3) = \frac{q}{3} = \frac{1}{4}.$$
Therefore $X$ is discrete uniform over [0, 3]. Again, since $Q = X_1 + \ldots + X_{10}$, the distribution of $Q$ is more conveniently described by its MGF:
$$M_Q(z) = E[e^{zQ}] = E[e^{zX}]^{10} = \frac{(1 + e^z + e^{2z} + e^{3z})^{10}}{4^{10}}.$$
Therefore,
$$E[Q] = 10E[X] = 20q = 15 \qquad \text{and} \qquad \mathrm{var}(Q) = 10\left(\frac{14q}{3} - 4q^2\right) = \frac{25}{2}.$$
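The moments of $Q$ follow directly from the uniform distribution of $X$ over $\{0, 1, 2, 3\}$; a quick arithmetic check:

```python
vals = [0, 1, 2, 3]                          # X is discrete uniform over {0, 1, 2, 3}
EX = sum(vals) / 4                           # E[X] = 1.5
varX = sum(v * v for v in vals) / 4 - EX**2  # E[X^2] - E[X]^2 = 3.5 - 2.25 = 1.25
EQ, varQ = 10 * EX, 10 * varX                # Q = X_1 + ... + X_10, independent terms
print(EQ, varQ)  # 15.0 12.5
```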

8. Let $X$ be the number of fatalities in 200 operations. $X$ is binomial with $n = 200$ and $p = 0.01$, and the probability that there will be at least 2 fatalities in 200 operations is
$$P(X \geq 2) = 1 - P(X = 0) - P(X = 1),$$
which can be approximated by
$$1 - P(Y = 0) - P(Y = 1) = 1 - e^{-2} - 2e^{-2} = 1 - 3e^{-2},$$
where $Y$ is Poisson with parameter $\lambda = np = 200 \times 0.01 = 2$.

9. $X$ is binomial with $n = 50 \times 5 = 250$ and $p = 0.02$.

(a) $E[X] = np = 5$ and
$$P(X = E[X]) = P(X = 5) = \binom{250}{5}(0.02)^5(0.98)^{245} = 0.1772476.$$

(b) $E[X] = np = 5$ and
$$P(X = E[X]) = P(X = 5) \approx P(Y = 5) = e^{-5}\,\frac{5^5}{5!} = 0.1754674,$$
where $Y$ is Poisson with parameter $\lambda = np = 5$.
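Both Poisson approximations can be compared against the exact binomial values; a sketch:

```python
from math import comb, exp, factorial

# Problem 8: X ~ Bin(200, 0.01); P(X >= 2) vs the Poisson(2) approximation
n, p = 200, 0.01
exact8 = 1 - (1 - p) ** n - n * p * (1 - p) ** (n - 1)
approx8 = 1 - 3 * exp(-2)

# Problem 9: X ~ Bin(250, 0.02); P(X = 5) vs the Poisson(5) approximation
binom5 = comb(250, 5) * 0.02**5 * 0.98**245
poiss5 = exp(-5) * 5**5 / factorial(5)
```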

(c) Let $Z_1, \ldots, Z_n$ be the fines paid on each of the $n = 250$ days. The PMF of $Z_k$ is $p(0) = 0.98$, $p(10) = 0.01$, $p(20) = 0.006$ and $p(50) = 0.004$. The total amount you pay for traffic tickets during the year is $W = Z_1 + \ldots + Z_n$, with
$$E[W] = 250 \times (0.98 \times 0 + 0.01 \times 10 + 0.006 \times 20 + 0.004 \times 50) = 250 \times 0.42 = 105$$
and
$$\mathrm{var}(W) = 250\,\mathrm{var}(Z) = 250 \times (13.4 - 0.42^2) = 3305.9.$$

(d) The standard deviation of $P$ is $s = \sqrt{\dfrac{p(1-p)}{250}}$. For $p$ to be within $5s$ of $P$, $p$ must satisfy $p \in [0, 1]$ and
$$|p - P| \leq 5\sqrt{\frac{p(1-p)}{250}}, \quad \text{that is,} \quad (p - P)^2 \leq 25\,\frac{p(1-p)}{250}.$$
Solving for $p$, we find that $p$ must be in $(0.002929, 0.124346)$.
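Parts (c) and (d) reduce to straightforward arithmetic, which the sketch below reproduces, including the quadratic that yields the interval in (d) (taking the observed fraction $P = 5/250 = 0.02$):

```python
from math import sqrt

# (c) fine distribution on a single day
pmf = {0: 0.98, 10: 0.01, 20: 0.006, 50: 0.004}
EZ = sum(z * pr for z, pr in pmf.items())       # 0.42
EZ2 = sum(z * z * pr for z, pr in pmf.items())  # 13.4
EW = 250 * EZ                                   # 105
varW = 250 * (EZ2 - EZ**2)                      # 3305.9

# (d) solve (p - P)^2 = p(1 - p)/10, i.e. 1.1 p^2 - (2P + 0.1) p + P^2 = 0
P = 0.02
a, b, c = 1.1, -(2 * P + 0.1), P**2
disc = sqrt(b * b - 4 * a * c)
lo, hi = (-b - disc) / (2 * a), (-b + disc) / (2 * a)  # endpoints of the interval
```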
