Chapter 6
1 Introduction
2 Information sources
3 Source Coding
4 Entropy Coding
$$R_b = R_B \log_2 M, \qquad M = 2^N \quad (1)$$
Example:
Transmit 7 symbols in 1 second:
{Red, Orange, Yellow, Green, Cyan, Blue, Purple}
$R_B = 7$ Baud
Encoding each symbol with $N = 3$ bits:
{000, 001, 010, 011, 100, 101, 110}
$R_b = 7 \times 3 = 21$ bps
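A minimal Python sketch of this example (variable names are illustrative):

```python
# Symbol-rate to bit-rate conversion, as in Eq. (1).
# The 7-colour alphabet is padded to N = 3 bits per symbol (2^3 = 8 >= 7).
from math import ceil, log2

symbols = ["Red", "Orange", "Yellow", "Green", "Cyan", "Blue", "Purple"]

R_B = 7                       # symbol rate in Baud (7 symbols per second)
N = ceil(log2(len(symbols)))  # bits needed per symbol: ceil(log2 7) = 3

codewords = {s: format(i, f"0{N}b") for i, s in enumerate(symbols)}
R_b = R_B * N                 # bit rate in bps

print(codewords)  # {'Red': '000', 'Orange': '001', ..., 'Purple': '110'}
print(R_b)        # 21
```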
If the information transmission rate of M-ary PSK is 1500 bps and $M = 8$, determine the symbol transmission rate.
If the symbol transmission rate of QPSK is 3000 Baud, determine the information transmission rate.
Bandwidth Efficiency
Bit rate: $n_b = R_b/B$, measured in bps/Hz
Symbol rate: $n_B = R_B/B$, measured in Baud/Hz
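For illustration (assumed numbers, not from the text): QPSK delivering $R_b = 3000$ bps in a bandwidth of $B = 1500$ Hz gives $n_b = 3000/1500 = 2$ bps/Hz; since QPSK carries 2 bits per symbol, $R_B = 1500$ Baud and $n_B = 1$ Baud/Hz.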
What is Information?
An information source may be viewed as an object which produces an
event, the outcome of which is selected at random according to a
probability distribution.
A discrete information source is a source which has only a finite set of
symbols as possible outputs.
The set of source symbols is called the source alphabet, and the
elements of the set are called symbols or letters.
Information sources can be classified as having memory or being memoryless. In a source with memory, each emitted symbol depends on the preceding symbols; in a memoryless source, the symbols are statistically independent.
And since
$$p(y_j|x_i) = \frac{p(x_i, y_j)}{p(x_i)}$$
$$I(x_i; y_j) = \log_2 \frac{p(y_j|x_i)}{p(y_j)} = I(y_j; x_i)$$
The information provided by $y_j$ about $x_i$ is the same as that provided by $x_i$ about $y_j$.
$$I(x_i|y_j) = \log_2 \frac{1}{p(x_i|y_j)} = -\log_2 p(x_i|y_j)$$
Which leads to
$$I(x_i; y_j) = I(x_i) - I(x_i|y_j)$$
Properties of $I(x_i)$
The information content of a symbol $x_i$, denoted by $I(x_i)$, satisfies the following properties:
$I(x_i) = 0$ for $P(x_i) = 1$
$I(x_i) \ge 0$
$I(x_i) > I(x_j)$ if $P(x_i) < P(x_j)$
$I(x_i, x_j) = I(x_i) + I(x_j)$ if $x_i$ and $x_j$ are independent
$I(x_i)$ must satisfy the following requirements:
$I(x_i)$ must approach 0 as $P(x_i)$ approaches 1.
The information content $I(x_i)$ must be a non-negative quantity, since each message contains some information; in the worst case, $I(x_i)$ can be equal to zero.
The information content of a message with a higher probability of occurrence is less than that of a message with a lower probability.
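These properties can be checked numerically; a minimal sketch (the probabilities used are illustrative assumptions):

```python
# Numeric check of the properties of I(x) = -log2 p(x).
from math import log2

def info(p: float) -> float:
    """Self-information I(x) = -log2 p(x), in bits."""
    return -log2(p)

print(info(1.0))               # -0.0 == 0: a certain symbol carries no information
print(info(0.5), info(0.25))   # 1.0 2.0: the less probable symbol carries more
# Additivity for independent symbols: I(p*q) = I(p) + I(q)
print(info(0.5 * 0.25), info(0.5) + info(0.25))  # 3.0 3.0
```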
Exercise-1:
1. A source produces one of four possible symbols during each interval, with probabilities $P(x_1) = 0.5$, $P(x_2) = 0.25$, $P(x_3) = P(x_4) = 0.125$. Obtain the information content of each of these symbols.
2. Calculate the amount of information if it is given that $P(x_i) = 0.25$.
3. In binary PCM, if '0' occurs with probability 0.25 and '1' occurs with probability 0.75, calculate the amount of information carried by each digit.
$$I(X;Y) = \sum_{i=1}^{n}\sum_{j=1}^{m} p(x_i, y_j)\, I(x_i; y_j) = \sum_{i=1}^{n}\sum_{j=1}^{m} p(x_i, y_j) \log_2 \frac{p(x_i, y_j)}{p(x_i)\, p(y_j)} \ge 0 \quad (4)$$
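A minimal sketch of Eq. (4) in Python; the $2 \times 2$ joint distribution below is an illustrative assumption, not from the text:

```python
# Average mutual information from a joint PMF, per Eq. (4).
from math import log2

p_xy = [[0.40, 0.10],   # p(x1, y1), p(x1, y2)
        [0.10, 0.40]]   # p(x2, y1), p(x2, y2)

p_x = [sum(row) for row in p_xy]        # marginal p(x_i)
p_y = [sum(col) for col in zip(*p_xy)]  # marginal p(y_j)

I_XY = sum(p_xy[i][j] * log2(p_xy[i][j] / (p_x[i] * p_y[j]))
           for i in range(2) for j in range(2))
print(I_XY)  # about 0.278 bits, and always >= 0
```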
Similarly, average self-information is given by:
$$H(X) = \sum_{i=1}^{n} p(x_i) I(x_i) = -\sum_{i=1}^{n} p(x_i) \log_2 p(x_i) \quad (5)$$
If $p(x_i) = 1/n$ for all $i$, then the entropy of the source becomes:
$$H(X) = \sum_{i=1}^{n} \frac{1}{n} \log_2 n = \log_2 n$$
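A minimal sketch of Eq. (5), including the uniform special case (the probability lists are illustrative):

```python
# Source entropy in bits/symbol, per Eq. (5).
from math import log2

def entropy(probs):
    """Entropy in bits/symbol; terms with p = 0 contribute nothing."""
    return -sum(p * log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75 bits/symbol
print(entropy([0.25] * 4), log2(4))        # 2.0 2.0: a uniform source attains log2 n
```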
The conditional entropy is:
$$H(X|Y) = \sum_{i=1}^{n}\sum_{j=1}^{m} p(x_i, y_j) \log_2 \frac{1}{p(x_i|y_j)} \quad (6)$$
It follows that
$$I(X;Y) = \sum_{i=1}^{n}\sum_{j=1}^{m} p(x_i, y_j)\bigl[\log_2 p(x_i|y_j) - \log_2 p(x_i)\bigr] = H(X) - H(X|Y) \quad (7)$$
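A numeric check of Eqs. (6)-(7), reusing the illustrative joint PMF from the sketch above:

```python
# Verify I(X;Y) = H(X) - H(X|Y) for an assumed 2x2 joint PMF.
from math import log2

p_xy = [[0.40, 0.10],
        [0.10, 0.40]]
p_x = [sum(row) for row in p_xy]
p_y = [sum(col) for col in zip(*p_xy)]

H_X = -sum(p * log2(p) for p in p_x)
# H(X|Y) per Eq. (6), using p(x|y) = p(x,y) / p(y)
H_XgY = sum(p_xy[i][j] * log2(p_y[j] / p_xy[i][j])
            for i in range(2) for j in range(2))
print(H_X - H_XgY)  # about 0.278 bits, matching I(X;Y) computed earlier
```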
The information rate of the source is
$$R = r\, H(X) \quad (8)$$
where $r$ is the symbol rate in symbols per second, so $R$ is in bits per second.
Exercise-2:
1. A DMS X has four symbols $x_1, x_2, x_3, x_4$ with probabilities $P(x_1) = 0.4$, $P(x_2) = 0.3$, $P(x_3) = 0.2$, $P(x_4) = 0.1$. Calculate
a. $H(X)$.
b. Find the amount of information contained in the messages $x_1 x_2 x_1 x_3$ and $x_4 x_3 x_3 x_2$, and compare with the $H(X)$ obtained in part (a).
2. For a source transmitting two independent messages $m_1$ and $m_2$ having probabilities $p$ and $(1 - p)$ respectively, prove that the entropy is maximum when both messages are equally likely. Also, plot the variation of the entropy $H$ as a function of the probability $p$ of one of the messages.
A discrete source emits one of five symbols once every millisecond with
probabilities 1/2, 1/4, 1/8, 1/16 and 1/16 respectively. Determine the
source entropy and information rate.
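A quick numeric check for this exercise, assuming $r = 1000$ symbols/s (one symbol per millisecond) and Eqs. (5) and (8):

```python
# Source entropy and information rate for the five-symbol source above.
from math import log2

probs = [1/2, 1/4, 1/8, 1/16, 1/16]
H = -sum(p * log2(p) for p in probs)  # entropy in bits/symbol
r = 1000                              # symbols per second
print(H, r * H)                       # 1.875 bits/symbol, 1875.0 bps
```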
The average codeword length $L$ is bounded below by the source entropy:
$$L \ge H(X)$$
$$\text{Code Efficiency} = \frac{H(X)}{L} \times 100\% \quad (9)$$
Exercise-5:
A Discrete Memoryless Source X has five equally likely symbols.
a. Construct a Shannon-Fano code for X, and calculate the efficiency of
the code.
b. Construct another Shannon-Fano code and compare the results.
c. Repeat for the Huffman code and compare the results.
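A minimal Huffman-coding sketch for part (c) (the heap bookkeeping and names are illustrative, not a prescribed implementation); Shannon-Fano coding for parts (a) and (b) would instead recursively split the probability-sorted symbol list into near-equal halves. Efficiency follows Eq. (9):

```python
# Huffman codeword lengths via repeated merging of the two least likely subtrees.
import heapq
from math import log2

def huffman_lengths(probs):
    """Return Huffman codeword lengths for a list of symbol probabilities."""
    # Heap entries: (probability, unique id, symbol indices in this subtree).
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    uid = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)  # merge the two least likely subtrees
        p2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:                # every symbol in the merge gains one bit
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, uid, s1 + s2))
        uid += 1
    return lengths

probs = [0.2] * 5                        # five equally likely symbols
lengths = huffman_lengths(probs)
L = sum(p * l for p, l in zip(probs, lengths))  # average codeword length
H = -sum(p * log2(p) for p in probs)            # source entropy, log2 5
print(lengths, L, 100 * H / L)  # lengths of 2 or 3 bits, L = 2.4, efficiency ~96.7%
```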