Class: Communication
By Dr. Ahmed A. Hamad
Recommended Books:
1. B. P. Lathi, Modern Digital and Analog Communication Systems, 3rd Ed., Oxford University Press, 1998.
2. H. P. Hsu, Theory and Problems of Analog and Digital Communications, Schaum's Outline Series, McGraw-Hill, 1993.
3. T. M. Cover and J. A. Thomas, Elements of Information Theory, John Wiley & Sons, Inc., 1991.
4. S. Lin, Error Control Coding: Fundamentals and Applications, Prentice Hall, 1983.
5. C. B. Schlegel and L. C. Perez, Trellis and Turbo Coding, John Wiley & Sons, Inc., 2004.
6. T. K. Moon, Error Correction Coding: Mathematical Methods and Algorithms, John Wiley & Sons, Inc., 2005.
[Figure: block diagram of a digital communication system — Source coder → Channel coder → Channel → Channel decoder → Source decoder]
The amount of information (self-information) gained from the occurrence of a symbol x_i with probability p(x_i) is

    I(x_i) = \log_b \frac{1}{p(x_i)} = -\log_b p(x_i)        (2.1)

with the properties:

    I(x_i) = 0        if p(x_i) = 1
    I(x_i) \ge 0      for 0 \le p(x_i) \le 1
    I(x_i) > I(x_j)   if p(x_i) < p(x_j)
The unit of I(x_i) is the bit if b = 2, the hartley if b = 10, and the nat (natural unit) if b = e. It is standard to use b = 2. Conversion between these units follows from the change-of-base rule:

    \log_2 a = \frac{\ln a}{\ln 2} = \frac{\log_{10} a}{\log_{10} 2}
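As a quick numerical sketch (mine, not from the notes), the self-information formula (2.1) and the unit conversions above can be checked in Python:

```python
import math

def self_information(p, base=2):
    """Self-information I(x) = log_base(1/p) of an event with probability p."""
    return math.log(1.0 / p, base)

# Self-information of an event with probability 1/2, in each unit
p = 0.5
bits = self_information(p, 2)        # bits (b = 2)
hartleys = self_information(p, 10)   # hartleys (b = 10)
nats = self_information(p, math.e)   # nats (b = e)

# Change-of-base checks: log2 a = ln a / ln 2 = log10 a / log10 2
assert abs(bits - nats / math.log(2)) < 1e-12
assert abs(bits - hartleys / math.log10(2)) < 1e-12
print(bits, hartleys, nats)  # ~1.0 bit, ~0.3010 hartley, ~0.6931 nat
```

The same function reproduces Example 2.1: `self_information(1/6)` gives about 2.585 bits.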
Example 2.1: How much information do you gain by learning that the number 2 appears when rolling a fair die?

Solution:
Since the die is fair, p(no. 2) = 1/6, so

    I(no. 2) = \log_2 \frac{1}{1/6} = \log_2 6 = 2.5849 bits
Example 2.2: In a black-and-white TV picture, each dot takes one of 8 equiprobable levels of brightness. How much information is carried per dot?

Solution:

    Information/dot = \log_2 8 = 3 bits
H.W.: Solve the above example for a colored TV in which each dot has 16 equiprobable colors and each color has 8 equiprobable levels of brightness.
3. SOURCE ENTROPY

The mean value of I(x_i) over the alphabet of source X with m different symbols is called the source entropy H(X), given by

    H(X) = E[I(x_i)] = \sum_{i=1}^{m} p(x_i) I(x_i)        (3.1)

    H(X) = -\sum_{i=1}^{m} p(x_i) \log_2 p(x_i)  bits/symbol        (3.2)

The entropy is bounded by

    0 \le H(X) \le \log_2 m
where m is the size (number of symbols) of the alphabet of source X. The lower bound corresponds to no uncertainty, which occurs when one symbol has probability p(x_i) = 1 while p(x_j) = 0 for all j ≠ i, so X emits the same symbol x_i all the time. The upper bound corresponds to maximum uncertainty, which occurs when p(x_i) = 1/m for all i, that is, when all symbols are equally likely to be emitted by X.
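The bounds above are easy to verify numerically. A minimal sketch (mine, not from the notes) of the entropy formula (3.2), checked at both extremes:

```python
import math

def entropy(probs, base=2):
    """Source entropy H(X) = -sum p_i log_base p_i; terms with p_i = 0 contribute 0."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

m = 4
certain = [1.0, 0.0, 0.0, 0.0]      # one symbol always emitted: lower bound H = 0
uniform = [1.0 / m] * m             # all symbols equally likely: upper bound H = log2 m
skewed  = [0.5, 0.25, 0.125, 0.125] # anything else falls strictly in between

print(entropy(certain))  # 0.0
print(entropy(uniform))  # 2.0 = log2 4
print(entropy(skewed))   # 1.75, between 0 and log2 4
```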
4. SOURCE ENTROPY RATE (INFORMATION RATE)
The source entropy rate (average information rate) is

    R = r H(X) = H(X)/\tau   bits/sec        (4.1)

where r = 1/\tau is the symbol rate (symbols/sec) and \tau is the average time duration of a symbol.
Example 4.1: A source produces dots and dashes; the probability of a dot is twice the probability of a dash. The duration of a dot is 10 ms, and the duration of a dash is three times that of a dot. Find the source entropy rate.
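The notes break off before the worked solution. As an illustrative sketch (my computation, not the author's), equation (4.1) can be applied directly: p(dot) = 2/3 and p(dash) = 1/3 follow from the stated 2:1 ratio, and the average symbol duration weights each duration by its probability.

```python
import math

# Symbol probabilities: p(dot) = 2 p(dash) and p(dot) + p(dash) = 1
p_dot, p_dash = 2 / 3, 1 / 3

# Source entropy H(X) in bits/symbol, from equation (3.2)
H = -(p_dot * math.log2(p_dot) + p_dash * math.log2(p_dash))

# Average symbol duration: dot = 10 ms, dash = 3 * 10 ms = 30 ms
tau = p_dot * 0.010 + p_dash * 0.030   # seconds

# Entropy rate from equation (4.1): R = H(X) / tau
R = H / tau
print(round(H, 4), round(tau, 4), round(R, 1))  # ~0.9183 bits/symbol, ~0.0167 s, ~55.1 b/s
```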