Ch06 - Probability and Random Variables
6.1 INTRODUCTION
Thus far we have discussed the transmission of deterministic signals over a channel, and we have not
emphasized the central role played by the concept of "randomness" in communication. The word random
means unpredictable. If the receiver at the end of a channel knew in advance the message output from the
originating source, there would be no need for communication. So there is randomness in the message
source. Moreover, transmitted signals are always accompanied by noise introduced in the system. These
noise waveforms are also unpredictable. The objective of this chapter is to present the mathematical
background essential for further study of communication.
6.2 PROBABILITY
A. Random Experiments
In the study of probability, any process of observation is referred to as an experiment. The results of an
observation are called the outcomes of the experiment. An experiment is called a random experiment if its
outcome cannot be predicted. Typical examples of a random experiment are the roll of a die, the toss of a
coin, drawing a card from a deck, or selecting a message signal for transmission from several messages.
C. Algebra of Events
1. The complement of event A, denoted Ā, is the event containing all sample points in S but not in A.
2. The union of events A and B, denoted A ∪ B, is the event containing all sample points in either A or B or both.
3. The intersection of events A and B, denoted A ∩ B, is the event containing all sample points in both A and B.
4. The event containing no sample point is called the null event, denoted ∅. Thus ∅ corresponds to an impossible event.
5. Two events A and B are called mutually exclusive or disjoint if they contain no common sample point, that is, A ∩ B = ∅.
By the preceding set of definitions, we obtain the following identities:
S̄ = ∅   ∅̄ = S
S ∪ A = S   S ∩ A = A
A ∪ Ā = S   A ∩ Ā = ∅   (Ā)‾ = A
D. Probabilities of Events
An assignment of real numbers to the events defined on S is known as the probability measure. In the
axiomatic definition, the probability P(A) of the event A is a real number assigned to A that satisfies the
following three axioms:
Axiom 1: P(A) ≥ 0 (6.1)
Axiom 2: P(S) = 1 (6.2)
Axiom 3: P(A ∪ B) = P(A) + P(B) if A ∩ B = ∅ (6.3)
With the preceding axioms, the following useful properties of probability can be obtained (Solved
Problems 6.1-6.4):
1. P(Ā) = 1 − P(A) (6.4)
2. P(∅) = 0 (6.5)
3. P(A) ≤ P(B) if A ⊂ B (6.6)
4. P(A) ≤ 1 (6.7)
5. P(A ∪ B) = P(A) + P(B) − P(A ∩ B) (6.8)
Note that property 4 can be easily derived from axiom 2 and property 3. Since A ⊂ S, we have
P(A) ≤ P(S) = 1
Thus, combining with axiom 1, we obtain
0 ≤ P(A) ≤ 1 (6.9)
Property 5 implies that
P(A ∪ B) ≤ P(A) + P(B) (6.10)
since P(A ∩ B) ≥ 0 by axiom 1.
One can also define P(A) intuitively, in terms of relative frequency. Suppose that a random experiment
is repeated n times. If an event A occurs n_A times, then its probability P(A) is defined as
P(A) = lim_{n→∞} (n_A/n) (6.11)
Alternatively, let P_i denote the probability of the ith sample point, so that
P_1 + P_2 + ⋯ + P_n = 1 (6.12)
If all sample points are equally likely, then
P_1 = P_2 = ⋯ = P_n
and
P(A) = n(A)/n (6.15)
where n(A) is the number of outcomes belonging to event A and n is the number of sample points in S.
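The relative-frequency definition (6.11) lends itself to a quick numerical check. The sketch below is a minimal illustration in Python with NumPy (tools assumed here for illustration only; they are not part of the text). It estimates the probability of the event A = {a fair-die roll is even}, whose classical value by Eq. (6.15) is n(A)/n = 3/6 = 0.5.

    import numpy as np

    rng = np.random.default_rng(seed=1)

    # Event A: a fair-die roll is even; classical probability n(A)/n = 3/6
    for n in (100, 10_000, 1_000_000):
        rolls = rng.integers(1, 7, size=n)           # outcomes 1..6, equally likely
        n_A = np.count_nonzero(rolls % 2 == 0)       # number of times A occurs
        print(f"n = {n:>9}: n_A/n = {n_A / n:.4f}")  # tends to 0.5 as n grows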
F. Conditional Probability
The conditional probability of an event A given the event B, denoted by P(A|B), is defined as
P(A|B) = P(A ∩ B)/P(B), P(B) > 0 (6.16)
Similarly,
P(B|A) = P(A ∩ B)/P(A), P(A) > 0 (6.17)
is the conditional probability of an event B given event A. From Eqs (6.16) and (6.17) we have
P(A ∩ B) = P(A|B)P(B) = P(B|A)P(A) (6.18)
Equation (6.18) is often quite useful in computing the joint probability of events.
From Eq. (6.18) we can obtain the following Bayes rule:
P(A|B) = P(B|A)P(A)/P(B) (6.19)
G. Independent Events
Two events A and B are said to be (statistically) independent if
P(A|B) = P(A) and P(B|A) = P(B) (6.20)
This, together with Eq. (6.19), implies that for two statistically independent events
P(A ∩ B) = P(A)P(B) (6.21)
We may also extend the definition of independence to more than two events. The events A_1, A_2, ..., A_n are
independent if and only if for every subset {A_{i1}, A_{i2}, ..., A_{ik}} (2 ≤ k ≤ n) of these events,
P(A_{i1} ∩ A_{i2} ∩ ⋯ ∩ A_{ik}) = P(A_{i1})P(A_{i2}) ⋯ P(A_{ik}) (6.22)
H. Total Probability
The events A_1, A_2, ..., A_n are called mutually exclusive and exhaustive if
⋃_{i=1}^{n} A_i = A_1 ∪ A_2 ∪ ⋯ ∪ A_n = S and A_i ∩ A_j = ∅, i ≠ j (6.23)
Let B be any event in S. Then
P(B) = Σ_{i=1}^{n} P(B ∩ A_i) = Σ_{i=1}^{n} P(B|A_i)P(A_i) (6.24)
which is known as the total probability of event B (Prob. 6.13). Let A = A_i in Eq. (6.19); using Eq. (6.24)
we obtain
P(A_i|B) = P(B|A_i)P(A_i) / [Σ_{j=1}^{n} P(B|A_j)P(A_j)] (6.25)
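A minimal numeric sketch of Eqs (6.24) and (6.25); the partition {A_1, A_2, A_3} and the conditional probabilities below are invented values, chosen only to illustrate the two formulas.

    # Hypothetical partition of S and conditional probabilities of an event B
    P_A = [0.5, 0.3, 0.2]             # P(A_i), mutually exclusive and exhaustive
    P_B_given_A = [0.10, 0.40, 0.25]  # P(B | A_i)

    # Total probability, Eq. (6.24): P(B) = sum_i P(B|A_i) P(A_i)
    P_B = sum(pb * pa for pb, pa in zip(P_B_given_A, P_A))

    # Bayes' rule, Eq. (6.25): P(A_i|B) = P(B|A_i) P(A_i) / P(B)
    P_A_given_B = [pb * pa / P_B for pb, pa in zip(P_B_given_A, P_A)]

    print(P_B)          # 0.22
    print(P_A_given_B)  # posteriors; they sum to 1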
B. Distribution Function
The distribution function [or cumulative distribution function (cdf)] of X is the function defined by
F_X(x) = P(X ≤ x), −∞ < x < ∞ (6.26)
Properties of F_X(x):
1. 0 ≤ F_X(x) ≤ 1 (6.27a)
2. F_X(x_1) ≤ F_X(x_2) if x_1 < x_2 (6.27b)
3. F_X(∞) = 1 (6.27c)
4. F_X(−∞) = 0 (6.27d)
5. F_X(a⁺) = F_X(a), where a⁺ = lim_{ε→0⁺}(a + ε) (6.27e)
From definition (6.26) we can compute other probabilities:
P(a < X ≤ b) = F_X(b) − F_X(a) (6.28)
P(X > a) = 1 − F_X(a) (6.29)
Properties of p_X(x_i):
1. 0 ≤ p_X(x_i) ≤ 1, i = 1, 2, ... (6.33a)
2. p_X(x) = 0 if x ≠ x_i (i = 1, 2, ...) (6.33b)
3. Σ_i p_X(x_i) = 1 (6.33c)
The function f_X(x) is called the probability density function (pdf) of the continuous r.v. X.
Properties of f_X(x):
1. f_X(x) ≥ 0 (6.37a)
2. ∫_{−∞}^{∞} f_X(x) dx = 1 (6.37b)
3. f_X(x) is piecewise continuous.
every element of S. The joint cumulative distribution function (or joint cdf) of X and Y, denoted by
F_XY(x, y), is the function defined by
F_XY(x, y) = P(X ≤ x, Y ≤ y) (6.39)
Two r.v.'s X and Y will be called independent if
F_XY(x, y) = F_X(x)F_Y(y) (6.40)
Setting y = ∞ or x = ∞ in Eq. (6.39) gives
F_X(x) = F_XY(x, ∞) (6.41a)
F_Y(y) = F_XY(∞, y) (6.41b)
The cdf's F_X(x) and F_Y(y), when obtained by Eqs (6.41a) and (6.41b), are referred to as the marginal
cdf's of X and Y, respectively.
The function p_XY(x_i, y_j) is called the joint probability mass function (joint pmf) of (X, Y).
Properties of p_XY(x_i, y_j):
1. 0 ≤ p_XY(x_i, y_j) ≤ 1 (6.43a)
2. Σ_{x_i} Σ_{y_j} p_XY(x_i, y_j) = 1 (6.43b)
The marginal pmf's are obtained by summing out one variable:
p_X(x_i) = Σ_{y_j} p_XY(x_i, y_j) (6.45a)
Similarly, p_Y(y_j) = Σ_{x_i} p_XY(x_i, y_j) (6.45b)
The pmf's p_X(x_i) and p_Y(y_j), when obtained by Eqs (6.45a) and (6.45b), are referred to as the marginal
pmf's of X and Y, respectively. If X and Y are independent r.v.'s, then
p_XY(x_i, y_j) = p_X(x_i)p_Y(y_j) (6.46)
For continuous r.v.'s, the function
f_XY(x, y) = ∂²F_XY(x, y)/(∂x ∂y) (6.47)
is called the joint probability density function (joint pdf) of (X, Y). By integrating Eq. (6.47) over one
variable, we have
f_X(x) = ∫_{−∞}^{∞} f_XY(x, y) dy (6.50a)
f_Y(y) = ∫_{−∞}^{∞} f_XY(x, y) dx (6.50b)
The pdf's f_X(x) and f_Y(y), when obtained by Eqs (6.50a) and (6.50b), are referred to as the marginal pdf's
of X and Y, respectively. If X and Y are independent r.v.'s, then
f_XY(x, y) = f_X(x)f_Y(y) (6.51)
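As a numerical sketch of Eqs (6.50) and (6.51), the code below integrates the assumed joint pdf f_XY(x, y) = ab e^{−(ax+by)} for x, y ≥ 0 (the same form met again in Prob. 6.26) by the trapezoidal rule, and confirms that the marginals are exponential and that the joint pdf factors, so X and Y are independent.

    import numpy as np

    a, b = 1.0, 2.0
    dx = dy = 0.01
    x = np.arange(0.0, 20.0, dx)
    y = np.arange(0.0, 20.0, dy)
    X, Y = np.meshgrid(x, y, indexing="ij")
    f_xy = a * b * np.exp(-(a * X + b * Y))   # assumed joint pdf on x, y >= 0

    # Marginal pdf's by trapezoidal integration, Eqs (6.50a) and (6.50b)
    f_x = (f_xy.sum(axis=1) - 0.5 * (f_xy[:, 0] + f_xy[:, -1])) * dy
    f_y = (f_xy.sum(axis=0) - 0.5 * (f_xy[0, :] + f_xy[-1, :])) * dx

    print(np.allclose(f_x, a * np.exp(-a * x), atol=1e-3))  # True
    print(np.allclose(f_y, b * np.exp(-b * y), atol=1e-3))  # True
    # Here f_xy = f_x(x) f_y(y), so X and Y are independent, Eq. (6.51)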
defines a new r.v. Y. With y a given number, we denote by D_y the subset of R_X (the range of X) such that g(x) ≤ y.
Then
(Y ≤ y) = [g(X) ≤ y] = (X ∈ D_y)
where (X ∈ D_y) is the event consisting of all outcomes λ such that the point X(λ) ∈ D_y. Hence,
F_Y(y) = P(Y ≤ y) = P[g(X) ≤ y] = P(X ∈ D_y) (6.54)
If X is a continuous r.v. with pdf f_X(x), then
F_Y(y) = ∫_{D_y} f_X(x) dx (6.55)
Note that if g(x) is a continuous monotonic increasing or decreasing function, then the transformation
y = g(x) is one-to-one. If the transformation y = g(x) is not one-to-one, f_Y(y) is obtained as follows:
Denoting the real roots of y = g(x) by x_k, that is,
y = g(x_k), k = 1, 2, ...
we have
f_Y(y) = Σ_k f_X(x_k)/|g′(x_k)| (6.58)
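Equation (6.58) can be checked by simulation. In the sketch below (an illustration, not part of the text), X = N(0; 1) and y = g(x) = x² is not one-to-one: its roots are ±√y with |g′(±√y)| = 2√y, so Eq. (6.58) predicts f_Y(y) = [f_X(√y) + f_X(−√y)]/(2√y), the density derived again in Eq. (6.119).

    import numpy as np

    rng = np.random.default_rng(seed=2)
    x = rng.standard_normal(1_000_000)
    y = x**2                                   # Y = g(X) = X^2, not one-to-one

    def f_Y(t):
        # Density predicted by Eq. (6.58): roots +/- sqrt(t), |g'| = 2 sqrt(t)
        f_X = lambda u: np.exp(-u**2 / 2) / np.sqrt(2 * np.pi)
        return (f_X(np.sqrt(t)) + f_X(-np.sqrt(t))) / (2 * np.sqrt(t))

    counts, edges = np.histogram(y, bins=np.linspace(0.1, 4.0, 40))
    centers = 0.5 * (edges[:-1] + edges[1:])
    emp = counts / (len(y) * np.diff(edges))   # empirical density of Y
    print(np.max(np.abs(emp - f_Y(centers))))  # small; histogram matches (6.58)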
Z = g(X, Y) (6.59)
is a new random variable. With z a given number, we denote by D_z the region of the xy-plane such that
g(x, y) ≤ z. Then
(Z ≤ z) = [g(X, Y) ≤ z] = {(X, Y) ∈ D_z}
where {(X, Y) ∈ D_z} is the event consisting of all outcomes λ such that the point {X(λ), Y(λ)} is in D_z.
Hence,
F_Z(z) = P(Z ≤ z) = P{(X, Y) ∈ D_z} (6.60)
If X and Y are continuous r.v.'s with joint pdf f_XY(x, y), then
F_Z(z) = ∫∫_{D_z} f_XY(x, y) dx dy (6.61)
C. Two Functions of Two Random Variables
Given two r.v.'s X and Y and two functions g(x, y) and h(x, y), the expressions
Z = g(X, Y), W = h(X, Y) (6.62)
define two new r.v.'s Z and W. With z and w two given numbers, we denote by D_zw the subset of R_XY [the range
of (X, Y)] such that g(x, y) ≤ z and h(x, y) ≤ w. Then
(Z ≤ z, W ≤ w) = [g(X, Y) ≤ z, h(X, Y) ≤ w] = {(X, Y) ∈ D_zw}
where {(X, Y) ∈ D_zw} is the event consisting of all outcomes λ such that the point {X(λ), Y(λ)} ∈ D_zw.
Hence,
F_ZW(z, w) = P{(X, Y) ∈ D_zw} (6.63)
If the system z = g(x, y), w = h(x, y) has solutions (x_i, y_i), then
f_ZW(z, w) = Σ_i f_XY(x_i, y_i)/|J(x_i, y_i)| (6.67)
where
J(x, y) = | ∂z/∂x  ∂z/∂y |
          | ∂w/∂x  ∂w/∂y | (6.68)
which is the Jacobian of the transformation (6.65).
μ_X = E(X) = Σ_i x_i p_X(x_i) (X discrete)
μ_X = E(X) = ∫_{−∞}^{∞} x f_X(x) dx (X continuous) (6.69)
The expectation of Y = g(X) is given by
E(Y) = E[g(X)] = Σ_i g(x_i)p_X(x_i) (discrete case)
E(Y) = E[g(X)] = ∫_{−∞}^{∞} g(x)f_X(x) dx (continuous case) (6.70)
The expectation of Z = g(X, Y) is given by
E(Z) = E[g(X, Y)] = Σ_{x_i} Σ_{y_j} g(x_i, y_j)p_XY(x_i, y_j) (discrete case)
E(Z) = E[g(X, Y)] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} g(x, y)f_XY(x, y) dx dy (continuous case) (6.71)
Note that the expectation operation is linear, that is,
E[X + Y] = E[X] + E[Y] (6.72)
E[cX] = cE[X] (6.73)
where c is a constant (Solved Problem 6.45).
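A quick simulation check of Eqs (6.72) and (6.73), with deliberately dependent X and Y, since linearity of expectation does not require independence (a sketch assuming NumPy):

    import numpy as np

    rng = np.random.default_rng(seed=3)
    x = rng.exponential(scale=2.0, size=500_000)   # E[X] = 2
    y = x + rng.standard_normal(500_000)           # Y depends on X; E[Y] = 2
    c = 5.0

    print(np.mean(x + y), np.mean(x) + np.mean(y))  # both ~4.0, Eq. (6.72)
    print(np.mean(c * x), c * np.mean(x))           # both ~10.0, Eq. (6.73)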
B. Moments
The nth moment of a r.v. X is defined by
E(Xⁿ) = Σ_i x_iⁿ p_X(x_i) (X discrete)
E(Xⁿ) = ∫_{−∞}^{∞} xⁿ f_X(x) dx (X continuous) (6.74)
C. Variance
The variance of a r.v. X, denoted by σ_X² or Var(X), is defined by
Var(X) = σ_X² = E[(X − μ_X)²] (6.75)
Thus,
σ_X² = E(X²) − [E(X)]² (6.76)
The joint moments of (X, Y) are defined by
m_kn = E(XᵏYⁿ) = Σ_{x_i} Σ_{y_j} x_iᵏ y_jⁿ p_XY(x_i, y_j) (X, Y discrete)
m_kn = E(XᵏYⁿ) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} xᵏ yⁿ f_XY(x, y) dx dy (X, Y continuous) (6.78)
The correlation coefficient is defined by
ρ(X, Y) = ρ_XY = σ_XY/(σ_X σ_Y) (6.83)
where σ_XY = Cov(X, Y) = E[(X − μ_X)(Y − μ_Y)] is the covariance of X and Y.
It can be shown that (Solved Problem 6.53)
|ρ_XY| ≤ 1 or −1 ≤ ρ_XY ≤ 1 (6.84)
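The sketch below estimates ρ_XY from samples via Eq. (6.83); the linear model Y = 2X + noise is an arbitrary choice that produces a strong positive correlation, and the estimate stays inside [−1, 1] as Eq. (6.84) requires.

    import numpy as np

    rng = np.random.default_rng(seed=4)
    x = rng.standard_normal(200_000)
    y = 2.0 * x + 0.5 * rng.standard_normal(200_000)   # correlated with x

    cov_xy = np.mean((x - x.mean()) * (y - y.mean()))  # sigma_XY
    rho = cov_xy / (x.std() * y.std())                 # Eq. (6.83)
    print(rho)  # close to 2 / sqrt(4.25) ~ 0.970, and |rho| <= 1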
A. Binomial Distribution
A r.v. X is called a binomial r.v. with parameters (n, p) if its pmf is given by
p_X(k) = P(X = k) = C(n, k) pᵏ(1 − p)ⁿ⁻ᵏ, k = 0, 1, ..., n (6.85)
where 0 ≤ p ≤ 1 and
C(n, k) = n!/[k!(n − k)!]
which is known as the binomial coefficient. The corresponding cdf of X is
F_X(x) = Σ_{k=0}^{m} C(n, k) pᵏ(1 − p)ⁿ⁻ᵏ, m ≤ x < m + 1 (6.86)
A point in the sample space is a sequence of n A's and Ā's. A point with k A's and n − k Ā's will be assigned
a probability of pᵏqⁿ⁻ᵏ, where q = 1 − p. Thus, if X is the random variable associated with the number of times that
A occurs in n trials, then the values of X are the integers k = 0, 1, ..., n.
In the study of communications, the binomial distribution applies to digital transmission when
X stands for the number of errors in a message of n digits. (See Solved Problems 6.17 and 6.41.)
B. Poisson Distribution
A r.v. X is called a Poisson r.v. with parameter α (> 0) if its pmf is given by
p_X(k) = P(X = k) = e^{−α} αᵏ/k!, k = 0, 1, 2, ... (6.88)
The mean and variance of the Poisson r.v. X are (Solved Problem 6.42)
μ_X = α, σ_X² = α (6.90)
The Poisson distribution arises in some problems involving counting, for example, monitoring the
number of telephone calls arriving at a switching center during various intervals of time. In digital
communication, the Poisson distribution is pertinent to the problem of the transmission of many data
bits when the error rates are low. The binomial distribution becomes awkward to handle in such cases.
However, if the mean value of the error rate remains finite and equal to α, we can approximate the
binomial distribution by the Poisson distribution. (See Solved Problems 6.19 and 6.20.)
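The approximation is easy to see numerically. This sketch compares the binomial pmf (6.85) with the Poisson pmf (6.88), taking α = np, for a low-error-rate link with n = 1000 digits and p = 0.001 (illustrative values):

    from math import comb, exp, factorial

    n, p = 1000, 0.001        # many digits, low per-digit error rate
    alpha = n * p             # Poisson parameter; by Eq. (6.90), mean = alpha

    for k in range(5):
        binom = comb(n, k) * p**k * (1 - p)**(n - k)      # Eq. (6.85)
        poisson = exp(-alpha) * alpha**k / factorial(k)   # Eq. (6.88)
        print(f"k={k}: binomial={binom:.6f}  poisson={poisson:.6f}")
    # The two columns agree to about three decimal places in this regime.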
C. Normal (or Gaussian) Distribution
A r.v. X is called a normal (or gaussian) r.v. if its pdf is given by
f_X(x) = [1/(√(2π)σ)] e^{−(x−μ)²/(2σ²)} (6.91)
The corresponding cdf of X is
F_X(x) = [1/(√(2π)σ)] ∫_{−∞}^{x} e^{−(ξ−μ)²/(2σ²)} dξ (6.92)
This integral cannot be evaluated in a closed form and must be evaluated numerically. It is convenient to
use the function Q(z) defined as
Q(z) = (1/√(2π)) ∫_{z}^{∞} e^{−ξ²/2} dξ (6.94)
The function Q(z) is known as the complementary error function or simply the Q function. The function
Q(z) is tabulated in Table C-1 (App. C). Figure 6.3 illustrates a normal distribution. The mean and vari-
ance of X are (Solved Problem 6.43)
μ_X = μ, σ_X² = σ²
The normal (or gaussian) distribution has played a significant role in the study of random phenomena
in nature. Many naturally occurring random phenomena are approximately normal. Another reason for
the importance of the normal distribution is a remarkable theorem called the central-limit theorem. This
theorem states that the sum of a large number of independent random variables, under certain conditions,
can be approximated by a normal distribution.
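Both the Q function and the central-limit theorem can be illustrated with a short simulation (a sketch; the 48-term uniform sum is an arbitrary choice). Standardized sums of independent uniform r.v.'s produce tail probabilities close to Q(a), where Q(z) = erfc(z/√2)/2 in terms of the standard library's complementary error function.

    import numpy as np
    from math import erfc, sqrt

    rng = np.random.default_rng(seed=5)
    n_terms, n_sums = 48, 200_000

    # Sum of 48 iid Uniform(0, 1) r.v.'s: mean 24, variance 48/12 = 4
    sums = rng.random((n_sums, n_terms)).sum(axis=1)
    z = (sums - n_terms / 2) / sqrt(n_terms / 12.0)   # standardized sums

    for a in (1.0, 2.0):
        emp = np.mean(z > a)              # simulated tail P(Z > a)
        Q = 0.5 * erfc(a / sqrt(2.0))     # Q(a) for N(0; 1)
        print(f"a={a}: simulated {emp:.4f}  vs  Q(a) = {Q:.4f}")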
SOLVED PROBLEMS

Probability
6.1 Verify Eq. (6.4).
Since A ∪ Ā = S and A and Ā are mutually exclusive, by axioms 2 and 3,
P(S) = 1 = P(A) + P(Ā)
Thus, P(Ā) = 1 − P(A)
6.2 Verify Eq. (6.5).
Since A ∪ ∅ = A and A ∩ ∅ = ∅, by axiom 3,
P(A) = P(A ∪ ∅) = P(A) + P(∅)
Thus, P(∅) = 0
6.3 Verify Eq. (6.6).
Let A ⊂ B. Then from the Venn diagram shown in Fig. 6.4, we see that
B = A ∪ (B ∩ Ā) and A ∩ (B ∩ Ā) = ∅
Hence, from axiom 3,
P(B) = P(A) + P(B ∩ Ā) ≥ P(A)
since P(B ∩ Ā) ≥ 0 by axiom 1.
6.4 Verify Eq. (6.8).
From the Venn diagram of Fig. 6.5 (shaded region: Ā ∩ B), each of the sets A ∪ B and B can be expressed, respectively, as
a union of mutually exclusive sets as follows:
A ∪ B = A ∪ (Ā ∩ B) and B = (A ∩ B) ∪ (Ā ∩ B)
Thus, by axiom 3,
P(A ∪ B) = P(A) + P(Ā ∩ B) (6.96)
and P(B) = P(A ∩ B) + P(Ā ∩ B) (6.97)
From Eq. (6.97), P(Ā ∩ B) = P(B) − P(A ∩ B); substituting into Eq. (6.96) gives Eq. (6.8).
6.5 Let P(A) = 0.9 and P(B) = 0.8. Show that P(A ∩ B) ≥ 0.7.
From Eq. (6.8) and property 4 [P(A ∪ B) ≤ 1],
P(A ∩ B) = P(A) + P(B) − P(A ∪ B) ≥ 0.9 + 0.8 − 1 = 0.7 (6.99)
Equation (6.99) is known as Bonferroni's inequality.
6.6 Show that P(A ∩ B̄) = P(A) − P(A ∩ B).
Since A = (A ∩ B) ∪ (A ∩ B̄) and the two events on the right are mutually exclusive,
P(A) = P(A ∩ B) + P(A ∩ B̄)
6.7 Consider a telegraph source generating two symbols: dot and dash. We observed that the dots
were twice as likely to occur as the dashes. Find the probabilities of the dot's occurring and the
dash's occurring.
From the observation, we have
P(dot) = 2P(dash)
Then, by Eq. (6.12),
P(dot) + P(dash) = 3P(dash) = 1
Thus, P(dash) = 1/3 and P(dot) = 2/3.
6.8 Show that P(A|B) defined by Eq. (6.16) satisfies the three axioms of probability, that is,
(a) P(A|B) ≥ 0, (b) P(S|B) = 1, and (c) P(A ∪ C|B) = P(A|B) + P(C|B) if A ∩ C = ∅.
(a) By axiom 1, P(A ∩ B) ≥ 0. Thus,
P(A|B) = P(A ∩ B)/P(B) ≥ 0
(b) P(S|B) = P(S ∩ B)/P(B) = P(B)/P(B) = 1
(c) If A ∩ C = ∅, then
(A ∩ B) ∩ (C ∩ B) = ∅
Hence, by axiom 3,
P(A ∪ C|B) = P[(A ∪ C) ∩ B]/P(B) = [P(A ∩ B) + P(C ∩ B)]/P(B) = P(A|B) + P(C|B)
6.9 Find P(A|B) if (a) A ∩ B = ∅, (b) A ⊂ B, and (c) B ⊂ A.
(a) If A ∩ B = ∅, then
P(A|B) = P(A ∩ B)/P(B) = P(∅)/P(B) = 0
(b) If A ⊂ B, then A ∩ B = A and
P(A|B) = P(A ∩ B)/P(B) = P(A)/P(B)
(c) If B ⊂ A, then A ∩ B = B and
P(A|B) = P(A ∩ B)/P(B) = P(B)/P(B) = 1
6.10 Show that if A and B are independent, then
P(B|A) = P(A ∩ B)/P(A) = P(A)P(B)/P(A) = P(B)
6.11 Let A and B be independent events in a sample space S. Show that the following pairs are also
independent: (a) Ā and B, and (b) Ā and B̄.
(a) Since B = (A ∩ B) ∪ (Ā ∩ B) and the two events on the right are mutually exclusive,
P(Ā ∩ B) = P(B) − P(A ∩ B) = P(B) − P(A)P(B) = P(B)[1 − P(A)] = P(B)P(Ā)
which indicates that Ā and B are independent.
(b) In a similar manner, P(Ā ∩ B̄) = P(Ā)P(B̄), so Ā and B̄ are independent.
6.12 Let A and B be events defined in a sample space S. Show that if both P(A) and P(B) are nonzero,
then events A and B cannot be both mutually exclusive and independent.
Let A and B be mutually exclusive events with P(A) ≠ 0 and P(B) ≠ 0. Then P(A ∩ B) = P(∅) = 0,
but P(A)P(B) ≠ 0. Therefore
P(A ∩ B) ≠ P(A)P(B)
That is, A and B are not independent.
6.13 Verify Eq. (6.24).
Since B = ⋃_{k=1}^{n} (B ∩ A_k) and the events B ∩ A_k are mutually exclusive, axiom 3 and Eq. (6.18) give
P(B) = Σ_{k=1}^{n} P(B ∩ A_k) = Σ_{k=1}^{n} P(B|A_k)P(A_k)
6.15 A binary source sends message m_0 or m_1 over a channel with transition probabilities
P(r_0|m_0) = 0.9, P(r_1|m_0) = 0.1, P(r_0|m_1) = 0.4, and P(r_1|m_1) = 0.6, where r_0 and r_1 are the
received symbols. Using the MAP (maximum a posteriori) decision rule, find the range of P(m_0) for
which we decide (a) m_0 if r_0 is received, (b) m_1 if r_1 is received, (c) m_0 no matter what is received,
and (d) m_1 no matter what is received.
(a) By the total probability, Eq. (6.24),
P(r_0) = P(r_0|m_0)P(m_0) + P(r_0|m_1)P(m_1) = 0.9P(m_0) + 0.4[1 − P(m_0)] = 0.5P(m_0) + 0.4
and by Bayes' rule,
P(m_0|r_0) = 0.9P(m_0)/[0.5P(m_0) + 0.4] and P(m_1|r_0) = [0.4 − 0.4P(m_0)]/[0.5P(m_0) + 0.4]
Now by the MAP decision rule, we decide m_0 if r_0 is received if P(m_0|r_0) > P(m_1|r_0), that is,
0.9P(m_0) > 0.4 − 0.4P(m_0) or 1.3P(m_0) > 0.4 or P(m_0) > 0.4/1.3 ≈ 0.31
Thus, the range of P(m_0) for which the MAP criterion prescribes that we decide m_0 if r_0 is received is
0.31 < P(m_0) ≤ 1
(b) Similarly, we have
P(r_1) = P(r_1|m_0)P(m_0) + P(r_1|m_1)P(m_1) = 0.1P(m_0) + 0.6[1 − P(m_0)] = −0.5P(m_0) + 0.6
P(m_0|r_1) = P(r_1|m_0)P(m_0)/P(r_1) = 0.1P(m_0)/[−0.5P(m_0) + 0.6]
P(m_1|r_1) = P(r_1|m_1)P(m_1)/P(r_1) = 0.6[1 − P(m_0)]/[−0.5P(m_0) + 0.6]
Now by the MAP decision rule, we decide m_1 if r_1 is received if P(m_1|r_1) > P(m_0|r_1), that is,
0.6 − 0.6P(m_0) > 0.1P(m_0) or P(m_0) < 0.6/0.7 ≈ 0.86
Thus, the range of P(m_0) for which the MAP criterion prescribes that we decide m_1 if r_1 is received is
0 ≤ P(m_0) < 0.86
(c) From the result of (b) we see that the range of P(m_0) for which we decide m_0 if r_1 is received is
P(m_0) ≥ 0.86
Combining with the result of (a), the range of P(m_0) for which we decide m_0 no matter what is
received is given by
0.86 ≤ P(m_0) ≤ 1
(d) Similarly, from the result of (a) we see that the range of P(m_0) for which we decide m_1 if r_0
is received is
P(m_0) ≤ 0.31
Combining with the result of (b), the range of P(m_0) for which we decide m_1 no matter what is
received is given by
0 ≤ P(m_0) ≤ 0.31
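The four comparisons above all reduce to one rule: pick the message with the larger posterior, equivalently the larger P(r|m)P(m), since P(r) is common to both. A sketch using this problem's channel numbers:

    # Channel transition probabilities from the problem statement
    P_r_given_m = {("r0", "m0"): 0.9, ("r1", "m0"): 0.1,
                   ("r0", "m1"): 0.4, ("r1", "m1"): 0.6}

    def map_decide(r, P_m0):
        # MAP decision for received symbol r, given the prior P(m0)
        priors = {"m0": P_m0, "m1": 1.0 - P_m0}
        # P(r) cancels in the comparison, so compare P(r|m) P(m) directly
        return max(("m0", "m1"), key=lambda m: P_r_given_m[(r, m)] * priors[m])

    for P_m0 in (0.2, 0.5, 0.9):
        print(P_m0, map_decide("r0", P_m0), map_decide("r1", P_m0))
    # P(m0)=0.2 (< 0.31): decide m1 for either received symbol
    # P(m0)=0.5:          decide m0 on r0, m1 on r1
    # P(m0)=0.9 (> 0.86): decide m0 for either received symbol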
6.16 Consider an experiment consisting of the observation of six successive pulse positions on a com-
munication link. Suppose that at each of the six possible pulse positions there can be a positive
pulse, a negative pulse, or no pulse. Suppose also that the individual experiments that determine
the kind of pulse at each possible position are independent. Let us denote the event that the ith
pulse is positive by {x_i = +1}, that it is negative by {x_i = −1}, and that it is zero by {x_i = 0}.
Assume that
P(x_i = +1) = p = 0.4 and P(x_i = −1) = q = 0.3 for all i
(a) Find the probability that all pulses are positive.
(b) Find the probability that the first three pulses are positive, the next two are zero, and the last is negative.
(a) Since the individual experiments are independent, by Eq. (6.22) the probability that all pulses
are positive is given by
P[(x_1 = +1) ∩ (x_2 = +1) ∩ ⋯ ∩ (x_6 = +1)]
= P(x_1 = +1)P(x_2 = +1) ⋯ P(x_6 = +1)
= p⁶ = (0.4)⁶ ≈ 0.0041
(b) From the given assumptions, we have
P(x_i = 0) = 1 − p − q = 0.3
Thus, the probability that the first three pulses are positive, the next two are zero, and the last is
negative is given by
P[(x_1 = +1) ∩ (x_2 = +1) ∩ (x_3 = +1) ∩ (x_4 = 0) ∩ (x_5 = 0) ∩ (x_6 = −1)]
= P(x_1 = +1)P(x_2 = +1)P(x_3 = +1)P(x_4 = 0)P(x_5 = 0)P(x_6 = −1)
= p³(1 − p − q)²q = (0.4)³(0.3)²(0.3) ≈ 0.0017
Random Variables
6.17 A binary source generates digits 1 and 0 randomly with probabilities 0.6 and 0.4, respectively.
(a) What is the probability that two 1s and three 0s will occur in a five-digit sequence?
(b) What is the probability that at least three 1s will occur in a five-digit sequence?
(a) Let X be the random variable denoting the number of 1s generated in a five-digit sequence.
Since there are only two possible outcomes (1 or 0), the probability of generating 1 is
constant, and there are five digits, it is clear that X has a binomial distribution described by
Eq. (6.85) with n = 5 and k = 2. Hence, the probability that two 1s and three 0s will occur in
a five-digit sequence is
P(X = 2) = C(5, 2)(0.6)²(0.4)³ ≈ 0.23
(b) The probability that at least three 1s will occur in a five-digit sequence is
P(X ≥ 3) = Σ_{k=3}^{5} C(5, k)(0.6)ᵏ(0.4)⁵⁻ᵏ ≈ 0.683
6.18 Show that the binomial pmf, Eq. (6.85), satisfies Eq. (6.33c).
Recall the binomial expansion
(a + b)ⁿ = Σ_{k=0}^{n} C(n, k) aᵏ bⁿ⁻ᵏ
Thus, by Eq. (6.85),
Σ_{k=0}^{n} p_X(k) = Σ_{k=0}^{n} C(n, k) pᵏ(1 − p)ⁿ⁻ᵏ = (p + 1 − p)ⁿ = 1
6.19 Show that when n is very large (n >> k) and p very small (p << 1), the binomial distribution
[Eq. (6.85)] can be approximated by the following Poisson distribution [Eq. (6.88)]:
p_X(k) ≈ e^{−np} (np)ᵏ/k! (6.103)
6.20 A noisy transmission channel has a per-digit error probability p = 0.01.
(a) Calculate the probability of more than one error in 10 received digits.
(b) Repeat (a), using the Poisson approximation, Eq. (6.103).
(a) Let X be a binomial random variable denoting the number of errors in 10 received digits.
Then using Eq. (6.85), we obtain
P(X > 1) = 1 − P(X = 0) − P(X = 1) = 1 − (0.99)¹⁰ − 10(0.01)(0.99)⁹ ≈ 0.0043
(b) Using Eq. (6.103) with α = np = 10(0.01) = 0.1, we obtain
P(X > 1) = 1 − e^{−0.1}(0.1)⁰/0! − e^{−0.1}(0.1)¹/1! ≈ 0.0047
6.21 Let X be a Poisson r.v. with parameter α. Show that p_X(k) given by Eq. (6.88) satisfies Eq. (6.33c).
By Eq. (6.88),
Σ_{k=0}^{∞} p_X(k) = e^{−α} Σ_{k=0}^{∞} αᵏ/k! = e^{−α} e^{α} = 1
6.22 Verify Eq. (6.35).
From Eqs (6.6) and (6.28), we have
P(X = x) ≤ P(x − ε < X ≤ x) = F_X(x) − F_X(x − ε)
for any ε ≥ 0. As F_X(x) is continuous, the right-hand side of the preceding expression approaches
0 as ε → 0. Thus, P(X = x) = 0 for a continuous r.v. X.
6.23 The pdf of a random variable X is given by
f_X(x) = k, a < x < b
f_X(x) = 0, otherwise
where k is a constant.
(a) Determine the value of k.
(b) Let a = −1 and b = 2. Calculate P(|X| ≤ c) for c = 1/2.
(a) From property 1 of f_X(x) [Eq. (6.37a)], k must be a positive constant. From property 2 of f_X(x)
[Eq. (6.37b)],
∫_{−∞}^{∞} f_X(x) dx = ∫_{a}^{b} k dx = k(b − a) = 1
from which we obtain k = 1/(b − a). Thus,
f_X(x) = 1/(b − a), a < x < b
f_X(x) = 0, otherwise (6.105)
(b) With a = −1 and b = 2, we have
f_X(x) = 1/3, −1 < x < 2
f_X(x) = 0, otherwise
and
P(|X| ≤ 1/2) = P(−1/2 ≤ X ≤ 1/2) = ∫_{−1/2}^{1/2} (1/3) dx = 1/3
6.24 The pdf of a random variable X is given by
f_X(x) = k e^{−ax} u(x), a > 0
where u(x) is the unit step function. Determine the value of k.
From property 1 of f_X(x) [Eq. (6.37a)], we must have k ≥ 0. From property 2 of f_X(x)
[Eq. (6.37b)],
∫_{0}^{∞} k e^{−ax} dx = k/a = 1
so k = a and
f_X(x) = a e^{−ax} u(x) (6.106)
A random variable X with the pdf given by Eq. (6.106) is called an exponential random variable
with parameter a.
6.25 All manufactured devices and machines fail to work sooner or later. If the failure rate is
constant, the time to failure T is modeled as an exponential random variable. Suppose that a
particular class of computer memory chips has been found to obey the exponential failure law of
Eq. (6.106), with time measured in hours.
(a) Measurements show that the probability that the time to failure exceeds 10⁴ hours (h) for
chips in the given class is e⁻¹ (≈ 0.368). Calculate the value of parameter a for this case.
(b) Using the value of parameter a determined in part (a), calculate the time t_0 such that the
probability is 0.05 that the time to failure is less than t_0.
(a) Using Eqs (6.38) and (6.106), we see that the distribution function of T is given by
F_T(t) = ∫_{−∞}^{t} f_T(τ) dτ = (1 − e^{−at})u(t)
Then P(T > 10⁴) = 1 − F_T(10⁴) = e^{−10⁴a} = e⁻¹, from which a = 10⁻⁴ per hour.
(b) We require F_T(t_0) = 1 − e^{−at_0} = 0.05, so t_0 = −(1/a) ln 0.95 ≈ 513 h.
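A two-line numeric recheck of (a) and (b) under the exponential law (6.106); only the Python standard library is assumed:

    from math import log

    # (a) P(T > 1e4) = exp(-a * 1e4) = exp(-1)  =>  a = 1e-4 per hour
    a = 1.0 / 1e4

    # (b) solve F_T(t0) = 1 - exp(-a * t0) = 0.05 for t0
    t0 = -log(1 - 0.05) / a
    print(a, round(t0, 1))   # 0.0001 and about 512.9 h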
6.26 The joint pdf of (X, Y) is given by
f_XY(x, y) = k e^{−(ax+by)} u(x)u(y), a, b > 0
Determine the value of k. Applying Eq. (6.37b) to the joint pdf,
1 = ∫_{−∞}^{∞} ∫_{−∞}^{∞} f_XY(ξ, η) dξ dη = k ∫_{0}^{∞} e^{−aξ} dξ ∫_{0}^{∞} e^{−bη} dη = k/(ab)
Hence, k = ab.
6.27 The joint pdf of X and Y is given by
f_XY(x, y) = xy e^{−(x² + y²)/2} u(x)u(y)
Determine whether X and Y are independent.
6.28 Random variables X and Y are said to be jointly normal random variables if their joint pdf is
given by
f_XY(x, y) = [1/(2πσ_Xσ_Y√(1 − ρ²))] exp{ −[1/(2(1 − ρ²))] [((x − μ_X)/σ_X)² − 2ρ((x − μ_X)/σ_X)((y − μ_Y)/σ_Y) + ((y − μ_Y)/σ_Y)²] } (6.107)
(a) Find the marginal pdf's of X and Y.
(b) Show that X and Y are independent when ρ = 0.
(a) The marginal pdf of X is
f_X(x) = ∫_{−∞}^{∞} f_XY(x, y) dy
Completing the square in y in the exponent of Eq. (6.107), we can write
f_XY(x, y) = [1/(√(2π)σ_X)] exp[−(x − μ_X)²/(2σ_X²)]
× [1/(√(2π)σ_Y√(1 − ρ²))] exp{ −[y − μ_Y − ρ(σ_Y/σ_X)(x − μ_X)]²/(2σ_Y²(1 − ρ²)) }
Comparing the second factor with Eq. (6.91), we see that it is a normal pdf with mean
μ_Y + ρ(σ_Y/σ_X)(x − μ_X) and variance σ_Y²(1 − ρ²).
Thus, its integral over y must be unity, and we obtain
f_X(x) = [1/(√(2π)σ_X)] exp[−(x − μ_X)²/(2σ_X²)] (6.108)
In a similar manner, the marginal pdf of Y is
f_Y(y) = [1/(√(2π)σ_Y)] exp[−(y − μ_Y)²/(2σ_Y²)] (6.109)
(b) When ρ = 0, Eq. (6.107) reduces to
f_XY(x, y) = [1/(2πσ_Xσ_Y)] exp{ −(1/2)[((x − μ_X)/σ_X)² + ((y − μ_Y)/σ_Y)²] }
= [1/(√(2π)σ_X)] exp[−(1/2)((x − μ_X)/σ_X)²] × [1/(√(2π)σ_Y)] exp[−(1/2)((y − μ_Y)/σ_Y)²]
= f_X(x)f_Y(y)
Hence, X and Y are independent.
6.29 If X is N(μ; σ²), then show that Z = (X − μ)/σ is a standard normal r.v., that is, N(0; 1).
The cdf of Z is
F_Z(z) = P(Z ≤ z) = P(X ≤ σz + μ) = [1/(√(2π)σ)] ∫_{−∞}^{σz+μ} e^{−(x−μ)²/(2σ²)} dx
With the change of variable y = (x − μ)/σ,
F_Z(z) = (1/√(2π)) ∫_{−∞}^{z} e^{−y²/2} dy
and
f_Z(z) = dF_Z(z)/dz = (1/√(2π)) e^{−z²/2}
which indicates that Z = N(0; 1).
6.30 Verify Eq. (6.111).
Assume that y = g(x) is a continuous monotonically increasing function [Fig. 6.10(a)]. Then it has
an inverse that we denote by x = g⁻¹(y) = h(y). Then
F_Y(y) = P(Y ≤ y) = P[X ≤ h(y)] = F_X[h(y)] (6.110)
and
f_Y(y) = (d/dy)F_X[h(y)] = f_X[h(y)] dh(y)/dy
If y = g(x) is monotonically decreasing [Fig. 6.10(b)], then F_Y(y) = P[X ≥ h(y)] = 1 − F_X[h(y)]
and f_Y(y) = −f_X[h(y)] dh(y)/dy. Since dh/dy > 0 in the first case and dh/dy < 0 in the second,
both cases are covered by
f_Y(y) = f_X(x)|dx/dy|, x = h(y) (6.111)
[Fig. 6.10: (a) monotonically increasing g(x); (b) monotonically decreasing g(x).]
6.31 Let Y = 2X + 3. If a random variable X is uniformly distributed over [−1, 2], find f_Y(y).
From Eq. (6.105) (Solved Problem 6.23), we have
f_X(x) = 1/3, −1 < x < 2
f_X(x) = 0, otherwise
The equation y = g(x) = 2x + 3 has a single solution x_1 = (y − 3)/2, the range of y is [1, 7], and
g′(x) = 2. Thus, by Eq. (6.58),
f_Y(y) = f_X(x_1)/|g′(x_1)| = 1/6, 1 < y < 7
f_Y(y) = 0, otherwise
6.32 Let Y = aX + b, a ≠ 0. Find f_Y(y) in terms of f_X(x), and evaluate it when X = N(μ; σ²).
The equation y = g(x) = ax + b has a single solution x_1 = (y − b)/a, and g′(x_1) = a. The range of
y is (−∞, ∞). Hence, by Eq. (6.58),
f_Y(y) = (1/|a|) f_X[(y − b)/a]
If X = N(μ; σ²), then
f_Y(y) = [1/(√(2π)|a|σ)] exp[−(y − b − aμ)²/(2a²σ²)]
which is the pdf of N(aμ + b; a²σ²).
6.33 Let Y = X². For y > 0 the equation y = x² has the two roots ±√y, with |g′(±√y)| = 2√y, so
Eq. (6.58) gives
f_Y(y) = [f_X(√y) + f_X(−√y)]/(2√y) u(y) (6.117)
6.34 Let Y = X², where X = N(0; 1), that is,
f_X(x) = (1/√(2π)) e^{−x²/2} (6.118)
Since f_X(x) is an even function, from Eq. (6.117) we have
f_Y(y) = (1/√y) f_X(√y) u(y) = [1/√(2πy)] e^{−y/2} u(y) (6.119)
6.35 The input to a noisy communication channel is a binary random variable X with P(X = 0) =
P(X = 1) = 1/2. The output of the channel Z is given by Z = X + Y, where Y is the additive noise
introduced by the channel. Assuming that X and Y are independent and Y = N(0; 1), find the
density function of Z.
Since Z = X + Y,
P(Z ≤ z|X = 0) = P(X + Y ≤ z|X = 0) = P(Y ≤ z) = F_Y(z)
Similarly, P(Z ≤ z|X = 1) = P(X + Y ≤ z|X = 1) = P(Y ≤ z − 1) = F_Y(z − 1)
Hence,
F_Z(z) = (1/2)F_Y(z) + (1/2)F_Y(z − 1)
Since Y = N(0; 1),
f_Y(y) = (1/√(2π)) e^{−y²/2}
and
f_Z(z) = dF_Z(z)/dz = (1/2)f_Y(z) + (1/2)f_Y(z − 1) = [1/(2√(2π))][e^{−z²/2} + e^{−(z−1)²/2}] (6.120)
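The two-lobe density (6.120) can be confirmed by simulation; the sketch below (assuming NumPy) compares a histogram of Z = X + Y against Eq. (6.120).

    import numpy as np

    rng = np.random.default_rng(seed=6)
    n = 1_000_000
    x = rng.integers(0, 2, size=n)        # X = 0 or 1, each with probability 1/2
    z = x + rng.standard_normal(n)        # Z = X + Y with Y = N(0; 1)

    def f_Z(t):
        # Eq. (6.120): equal mixture of N(0; 1) and N(1; 1) densities
        g = lambda u: np.exp(-u**2 / 2) / np.sqrt(2 * np.pi)
        return 0.5 * g(t) + 0.5 * g(t - 1.0)

    counts, edges = np.histogram(z, bins=np.linspace(-3.0, 4.0, 71))
    centers = 0.5 * (edges[:-1] + edges[1:])
    emp = counts / (n * np.diff(edges))
    print(np.max(np.abs(emp - f_Z(centers))))  # small; simulation matches (6.120)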
6.36 Let Z = aX + bY and W = cX + dY, where ad − bc ≠ 0. By Eq. (6.68),
J(x, y) = | ∂z/∂x  ∂z/∂y | = | a  b | = ad − bc
          | ∂w/∂x  ∂w/∂y |   | c  d |
Eq. (6.67) yields
f_ZW(z, w) = [1/|ad − bc|] f_XY(x, y) (6.122)
where x = (dz − bw)/(ad − bc), y = (aw − cz)/(ad − bc) is the unique solution of z = ax + by, w = cx + dy.
Now let Z = X + Y and W = Y. Then
J(x, y) = | 1  1 | = 1
          | 0  1 |
and Eq. (6.67) yields [or, setting a = b = d = 1 and c = 0 in Eq. (6.122)],
f_ZW(z, w) = f_XY(z − w, w) (6.123)
Hence, by Eq. (6.50a), we obtain
f_Z(z) = ∫_{−∞}^{∞} f_XY(z − w, w) dw (6.124)
6.37 Let Z = X + Y, where X and Y are independent N(0; 1) r.v.'s:
f_X(x) = (1/√(2π)) e^{−x²/2}, f_Y(y) = (1/√(2π)) e^{−y²/2}
Find f_Z(z). By Eq. (6.124) and independence,
f_Z(z) = ∫_{−∞}^{∞} f_X(z − w)f_Y(w) dw
= (1/(2π)) ∫_{−∞}^{∞} exp[−(z − w)²/2] exp(−w²/2) dw
= (1/(2π)) ∫_{−∞}^{∞} exp[−(w² − zw + z²/2)] dw
= (1/(2π)) e^{−z²/4} ∫_{−∞}^{∞} exp[−(w − z/2)²] dw
= [1/(2√π)] e^{−z²/4} ∫_{−∞}^{∞} (1/√π) exp[−(w − z/2)²] dw
Since the integrand in the last integral is the pdf of N(z/2; 1/2), the integral is equal to unity, and we obtain
f_Z(z) = [1/(2√π)] e^{−z²/4} = [1/(√(2π)√2)] e^{−z²/(2·2)} (6.126)
which is the pdf of N(0; 2).
Thus, Z is a normal random variable with zero mean and variance 2.
6.38 Consider the transformation
R = √(X² + Y²), Θ = tan⁻¹(Y/X) (6.127)
Find f_RΘ(r, θ) in terms of f_XY(x, y).
For r ≥ 0, the system
√(x² + y²) = r, tan⁻¹(y/x) = θ
has a single solution:
x = r cos θ, y = r sin θ
Since [Eq. (6.68)]
| ∂x/∂r  ∂x/∂θ | = | cos θ  −r sin θ | = r (6.128)
| ∂y/∂r  ∂y/∂θ |   | sin θ   r cos θ |
we have |J(x, y)| = 1/r, and Eq. (6.67) yields
f_RΘ(r, θ) = r f_XY(r cos θ, r sin θ) (6.129)
In particular, if X and Y are independent zero-mean normal r.v.'s with common variance σ², so that
f_XY(x, y) = [1/(2πσ²)] e^{−(x²+y²)/(2σ²)}
then X cos ωt + Y sin ωt = R cos(ωt − Θ) (6.132), and the densities of the envelope R and phase Θ
follow from Eq. (6.129).
Statistical Averages
6.41 Binary data are transmitted over a noisy communication channel in blocks of 16 binary digits.
The probability that a received digit is in error due to channel noise is 0.01. Assume that the
errors occurring in various digit positions within a block are independent.
(a) Find the mean (average number of) errors per block.
(b) Find the variance of the number of errors per block.
(c) Find the probability that the number of errors per block is greater than or equal to 4.
(a) Let X be the random variable representing the number of errors per block. Then X has a
binomial distribution with n = 16 and p = 0.01. By Eq. (6.87) the average number of errors
per block is
E(X) = np = (16)(0.01) = 0.16
(b) Similarly, the variance is σ_X² = np(1 − p) = (16)(0.01)(0.99) ≈ 0.158
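A short check of (a) and (b), plus the exact tail probability needed in (c), computed directly from the binomial pmf (6.85):

    from math import comb

    n, p = 16, 0.01
    pmf = [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]

    mean = sum(k * pk for k, pk in enumerate(pmf))              # = n p
    var = sum(k**2 * pk for k, pk in enumerate(pmf)) - mean**2  # = n p (1 - p)
    tail = sum(pmf[4:])                                         # P(X >= 4)

    print(f"mean = {mean:.4f}")       # 0.1600
    print(f"var  = {var:.4f}")        # 0.1584
    print(f"P(X >= 4) = {tail:.2e}")  # ~1.7e-05: four or more errors are rare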
6.42 Verify Eq. (6.90).
Using Eq. (6.88),
E[X] = Σ_{k=0}^{∞} k e^{−α} αᵏ/k! = α e^{−α} Σ_{k=1}^{∞} α^{k−1}/(k − 1)! = α e^{−α} Σ_{m=0}^{∞} αᵐ/m! = α e^{−α} e^{α} = α
Similarly,
E[X(X − 1)] = Σ_{k=0}^{∞} k(k − 1) e^{−α} αᵏ/k! = α² e^{−α} Σ_{k=2}^{∞} α^{k−2}/(k − 2)! = α² e^{−α} e^{α} = α²
Thus, E[X²] = E[X(X − 1)] + E[X] = α² + α, and
σ_X² = E[X²] − (E[X])² = α² + α − α² = α
6.43 Show that the mean and variance of the normal r.v. X = N(μ; σ²) are μ and σ², respectively.
E[X] = [1/(√(2π)σ)] ∫_{−∞}^{∞} x e^{−(x−μ)²/(2σ²)} dx = μ
since, with the change of variable y = x − μ, the integrand splits into an odd part that integrates
to zero plus μ times the pdf, which integrates to one. For the variance, start from
∫_{−∞}^{∞} e^{−(x−μ)²/(2σ²)} dx = √(2π) σ
Differentiating both sides with respect to σ and multiplying both sides by σ²/√(2π), we have
[1/(√(2π)σ)] ∫_{−∞}^{∞} (x − μ)² e^{−(x−μ)²/(2σ²)} dx = E[(X − μ)²] = σ_X² = σ²
6.44 Find the nth moment of X = N(0; σ²).
m_n = E[Xⁿ] = 0, n = 2k + 1
m_n = E[Xⁿ] = 1·3·⋯·(n − 1)σⁿ, n = 2k (6.138)
The odd moments vanish by symmetry. For the even moments, start from the known integral
∫_{−∞}^{∞} e^{−ax²} dx = √(π/a) (6.139)
Differentiating k times with respect to a when n = 2k, we obtain
∫_{−∞}^{∞} x^{2k} e^{−ax²} dx = [1·3·⋯·(2k − 1)/2ᵏ] √(π/a^{2k+1})
Setting a = 1/(2σ²) and dividing by the normalization √(2π)σ, we obtain Eq. (6.138).
6.45 Verify Eq. (6.72).
Let f_XY(x, y) be the joint density function of X and Y. Then using Eq. (6.71), we have
E[X + Y] = ∫∫ (x + y) f_XY(x, y) dx dy
Now, by Eq. (6.50a),
∫_{−∞}^{∞} ∫_{−∞}^{∞} x f_XY(x, y) dx dy = ∫_{−∞}^{∞} x [∫_{−∞}^{∞} f_XY(x, y) dy] dx = ∫_{−∞}^{∞} x f_X(x) dx = E[X]
and similarly for the y term; hence E[X + Y] = E[X] + E[Y].
6.47 Find the covariance of X and Y (a) if they are independent and (b) if Y is related to X by
Y = aX + b.
(a) If X and Y are independent, then by Eqs (6.81) and (6.140),
σ_XY = E[XY] − E[X]E[Y] = E[X]E[Y] − E[X]E[Y] = 0
(b) If Y = aX + b,
σ_XY = E[X(aX + b)] − E[X]E[aX + b] = aE[X²] + bμ_X − μ_X(aμ_X + b) = a(E[X²] − μ_X²) = aσ_X²
6.48 Let Z = aX + bY, where a and b are arbitrary constants. Show that if X and Y are independent,
then
σ_Z² = a²σ_X² + b²σ_Y²
6.49 Let X = cos Θ and Y = sin Θ, where Θ is uniformly distributed over (0, 2π]:
f_Θ(θ) = 1/(2π), 0 < θ ≤ 2π
f_Θ(θ) = 0, otherwise
Show that X and Y are uncorrelated.
Using Eqs (6.69) and (6.70), we have
E[X] = ∫_{−∞}^{∞} x f_X(x) dx = ∫_{0}^{2π} (cos θ) f_Θ(θ) dθ = (1/(2π)) ∫_{0}^{2π} cos θ dθ = 0
Similarly,
E[Y] = (1/(2π)) ∫_{0}^{2π} sin θ dθ = 0
and
E[XY] = (1/(2π)) ∫_{0}^{2π} cos θ sin θ dθ = (1/(4π)) ∫_{0}^{2π} sin 2θ dθ = 0 = E[X]E[Y]
Thus, from Eq. (6.82), X and Y are uncorrelated.
6.50 If f_X(x) = 0 for x < 0, then show that, for any a > 0,
P(X ≥ a) ≤ μ_X/a
which is known as the Markov inequality.
Since f_X(x) = 0 for x < 0,
μ_X = ∫_{0}^{∞} x f_X(x) dx ≥ ∫_{a}^{∞} x f_X(x) dx ≥ a ∫_{a}^{∞} f_X(x) dx
Hence,
∫_{a}^{∞} f_X(x) dx = P(X ≥ a) ≤ μ_X/a
6.51 Show that, for any ε > 0,
P(|X − μ_X| ≥ ε) ≤ σ_X²/ε²
where μ_X = E[X] and σ_X² is the variance of X. This is known as the Chebyshev inequality.
By Eq. (6.75),
σ_X² = ∫_{−∞}^{∞} (x − μ_X)² f_X(x) dx ≥ ∫_{|x−μ_X|≥ε} (x − μ_X)² f_X(x) dx ≥ ε² ∫_{|x−μ_X|≥ε} f_X(x) dx
Hence,
∫_{|x−μ_X|≥ε} f_X(x) dx = P(|X − μ_X| ≥ ε) ≤ σ_X²/ε²
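A numeric sketch contrasting the exact deviation probability of an exponential r.v. (mean 1, variance 1, an arbitrary test case) with the Chebyshev bound σ_X²/ε²; the bound is loose but never violated.

    import numpy as np

    rng = np.random.default_rng(seed=7)
    x = rng.exponential(scale=1.0, size=1_000_000)   # mean 1, variance 1

    for eps in (1.0, 2.0, 3.0):
        prob = np.mean(np.abs(x - 1.0) >= eps)   # simulated P(|X - mu| >= eps)
        bound = 1.0 / eps**2                     # Chebyshev bound sigma^2/eps^2
        print(f"eps={eps}: P = {prob:.4f} <= bound = {bound:.4f}")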
6.52 Let X and Y be real random variables with finite second moments. Show that
(E[XY])² ≤ E[X²]E[Y²] (6.148)
This is known as the Cauchy-Schwarz inequality.
Because the mean-square value of a random variable can never be negative,
E[(X − aY)²] ≥ 0
for any value of a. Expanding this, we obtain
E[X²] − 2aE[XY] + a²E[Y²] ≥ 0
Choose a = E[XY]/E[Y²], which results in the inequality
E[X²] − (E[XY])²/E[Y²] ≥ 0
or (E[XY])² ≤ E[X²]E[Y²]
6.53 Verify Eq. (6.84).
Applying the Cauchy-Schwarz inequality (6.148) to the zero-mean r.v.'s X − μ_X and Y − μ_Y gives
{E[(X − μ_X)(Y − μ_Y)]}² ≤ E[(X − μ_X)²]E[(Y − μ_Y)²]
Then
σ_XY² ≤ σ_X²σ_Y²
from which it follows that
|ρ_XY| ≤ 1
SUPPLEMENTARY PROBLEMS

6.1 For any three events A, B, and C, show that
P(A ∪ B ∪ C) = P(A) + P(B) + P(C) − P(A ∩ B) − P(B ∩ C) − P(C ∩ A) + P(A ∩ B ∩ C)
[Hint: Write A ∪ B ∪ C = A ∪ (B ∪ C) and then apply Eq. (6.8).]

6.2 Given that P(A) = 0.9, P(B) = 0.8, and P(A ∩ B) = 0.75, find (a) P(A ∪ B); (b) P(A ∩ B̄); and (c) P(Ā ∩ B̄).
[Ans. (a) 0.95; (b) 0.15; (c) 0.05.]

6.3 Show that if events A and B are independent, then P(A ∩ B̄) = P(A)P(B̄).
[Hint: Use Eq. (6.102) and the relation A = (A ∩ B) ∪ (A ∩ B̄).]

6.4 Let A and B be events defined in a sample space S. Show that if both P(A) and P(B) are nonzero, then the events A and B cannot be both mutually exclusive and independent.
[Hint: Show that condition (6.21) will not hold.]

6.5 A certain computer becomes inoperable if two components A and B both fail. The probability that A fails is 0.01, and the probability that B fails is 0.005. However, the probability that B fails increases by a factor of 3 if A has failed.
(a) Calculate the probability that the computer becomes inoperable.
(b) Find the probability that A will fail if B has failed.
[Ans. (a) 0.00015; (b) 0.03.]

6.6 A certain binary PCM system transmits the two binary states X = +1, X = −1 with equal probability. However, because of channel noise, the receiver makes recognition errors. Also, as a result of path distortion, the receiver may lose necessary signal strength to make any decision. Thus, there are three possible receiver states: Y = +1, Y = 0, and Y = −1, where Y = 0 corresponds to "loss of signal". Assume that P(Y = −1|X = +1) = 0.1, P(Y = +1|X = −1) = 0.2, and P(Y = 0|X = +1) = P(Y = 0|X = −1) = 0.05.
(a) Find the probabilities P(Y = +1), P(Y = −1), and P(Y = 0).
(b) Find the probabilities P(X = +1|Y = +1) and P(X = −1|Y = −1).
[Ans. (a) P(Y = +1) = 0.525, P(Y = −1) = 0.425, P(Y = 0) = 0.05; (b) P(X = +1|Y = +1) ≈ 0.81, P(X = −1|Y = −1) ≈ 0.88.]

6.7 Suppose 10,000 digits are transmitted over a noisy channel having per-digit error probability p = 5 × 10⁻⁵. Find the probability that there will be no more than two-digit errors.
[Ans. 0.9856]
6.8 Show that Eq. (6.91) does in fact define a true probability density; in particular, show that
∫_{−∞}^{∞} f_X(x) dx = 1
[Hint: Make a change of variable y = (x − μ)/σ and show that I = ∫_{−∞}^{∞} e^{−y²/2} dy = √(2π), which can be proved by evaluating I² by using polar coordinates.]

6.9 A noisy resistor produces a voltage V_n(t). At t = t_1, the noise level X = V_n(t_1) is known to be a gaussian random variable with density
f_X(x) = [1/(√(2π)σ)] e^{−x²/(2σ²)}
Compute the probability that |X| > kσ for k = 1, 2, 3.
[Ans. P(|X| > σ) = 0.3173, P(|X| > 2σ) = 0.0455, P(|X| > 3σ) = 0.0027]

6.10 Consider the transformation Y = 1/X.
(a) Find f_Y(y) in terms of f_X(x).
(b) If f_X(x) = a/[π(a² + x²)], find f_Y(y).
[Ans. (a) f_Y(y) = (1/y²) f_X(1/y); (b) f_Y(y) = (1/a)/[π(1/a² + y²)]]
Note that X and Y are known as Cauchy random variables.

6.11 Let X and Y be two independent random variables with
f_X(x) = αe^{−αx} u(x), f_Y(y) = βe^{−βy} u(y)
Find the density function of Z = X + Y.
[Ans. f_Z(z) = [αβ/(β − α)](e^{−αz} − e^{−βz})u(z) for β ≠ α; f_Z(z) = α²z e^{−αz} u(z) for β = α]

6.12 Let X be a random variable uniformly distributed over [a, b]. Find the mean and the variance of X.
[Ans. μ_X = (b + a)/2, σ_X² = (b − a)²/12]

6.13 Let (X, Y) be a bivariate r.v. If X and Y are independent, show that X and Y are uncorrelated.
[Hint: Use Eqs (6.78) and (6.51).]

6.14 Let (X, Y) be a bivariate r.v. with the joint pdf
f_XY(x, y) = [(x² + y²)/(4π)] e^{−(x²+y²)/2}, −∞ < x, y < ∞
Show that X and Y are not independent but are uncorrelated.
[Hint: Use Eqs (6.50) and (6.69).]
(a) Find/r(y) in terms of f*(x). l.
6.16 Define random variables Z and Wby
(b) If f*(x1 : -3b-findfy1).
e.' + x' Z_X*aY W:X_aY
where a is a reai number. Determine a such
1v0): +
v-f.fL)
Ans.(a) that Z and W are orthogonal.
\v)
(b) fr(y) -
tt(o,tr)
- .:
uaz + y2 lo*,
6.17 The moment generating functron
Note that X and Y are known as Cauchv
defined by
random variables.
6.17 The moment generating function of X is defined by
M_X(λ) = E[e^{λX}] = ∫_{−∞}^{∞} f_X(x) e^{λx} dx
where λ is a real variable. Then
m_k = E[Xᵏ] = M_X^{(k)}(0), k = 1, 2, ...
where M_X^{(k)}(0) = dᵏM_X(λ)/dλᵏ evaluated at λ = 0.
(a) Find the moment generating function of X uniformly distributed over (a, b).
(b) Using the result of (a), find E[X], E[X²], and E[X³].
[Ans. (a) M_X(λ) = (e^{λb} − e^{λa})/[λ(b − a)];
(b) E[X] = (1/2)(b + a), E[X²] = (1/3)(a² + ab + b²), E[X³] = (1/4)(a³ + a²b + ab² + b³)]

6.18 Show that if X and Y are zero-mean jointly normal random variables, then
E[X²Y²] = E[X²]E[Y²] + 2(E[XY])²
[Hint: Use the moment generating function of X and Y given by
M_XY(λ_1, λ_2) = E[e^{λ_1 X + λ_2 Y}] = exp[(1/2)(σ_X²λ_1² + 2σ_XY λ_1 λ_2 + σ_Y²λ_2²)]]