
Information Theory

Prepared by:
Amit Degada
Teaching Assistant,
ECED, NIT Surat
Goal of Today's Lecture
• Information Theory: Some Introduction
• Information Measure
• Function Determination for Information
• Average Information per Symbol
• Information Rate
• Coding
• Shannon-Fano Coding
Information Theory
• It is the study of communication engineering combined with mathematics.

• A communication engineer has to fight with:

• Limited power
• Inevitable background noise
• Limited bandwidth
Information Theory deals with
• The measure of source information
• The information capacity of the channel
• Coding

If the rate of information from a source does not exceed the capacity of the channel, then there exists a coding scheme such that information can be transmitted over the communication channel with an arbitrarily small amount of error, despite the presence of noise.
[Block diagram] Source Encoder → Channel Encoder → Noisy Channel → Channel Decoder → Source Decoder; the encoder/decoder pair wrapped around the noisy channel behaves as an equivalent noiseless channel.
Information Measure
• This is used to determine the information rate of discrete sources.

Consider two messages:

"A dog bites a man" → high probability → less information

"A man bites a dog" → low probability → high information

So we can say that

Information ∝ 1 / (probability of occurrence)

Information Measure
• We can also state three rules from intuition.

Rule 1: The information I(mk) approaches 0 as Pk approaches 1.

Mathematically, I(mk) → 0 as Pk → 1

e.g. "The sun rises in the east."

Information Measure

Rule 2: The information content I(mk) must be a non-negative quantity.

It may be zero.

Mathematically, I(mk) ≥ 0 for 0 ≤ Pk ≤ 1

e.g. "The sun rises in the west."


Information Measure

Rule 3: The information content of a message with higher probability is less than the information content of a message with lower probability.

Mathematically, I(mk) > I(mj) when Pk < Pj


Information Measure
We can also state, for two messages taken together, that the information content of the combined message equals the sum of the information contents of the individual messages, provided their occurrences are mutually independent.

e.g. "There will be sunny weather today."

"There will be cloudy weather tomorrow."

Mathematically,

I(mk and mj) = I(mk mj) = I(mk) + I(mj)
Information Measure
• So the question is: which function can we use to measure information?

Information = F(1/Probability)

Requirements the function must satisfy:

1. Its output must be a non-negative quantity.
2. Its minimum value is 0.
3. It should turn products into summations.

Information: I(mk) = log_b(1/Pk)

Here b may be 2, e, or 10:

If b = 2, the unit is bits
If b = e, the unit is nats
If b = 10, the unit is decits
Conversion Between Units

log2(v) = ln(v) / ln(2) = log10(v) / log10(2)
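As a quick illustration of this change-of-base rule, here is a minimal Python sketch (added here; it is not part of the original slides) that expresses the same information value in bits, nats, and decits:

import math

def info_bits(p):
    # Information content of a message with probability p, in bits.
    return math.log2(1 / p)

def bits_to_nats(bits):
    # 1 bit = ln(2) nats, by the change-of-base rule.
    return bits * math.log(2)

def bits_to_decits(bits):
    # 1 bit = log10(2) decits.
    return bits * math.log10(2)

# A message with probability 1/4 carries 2 bits, which is about 1.386 nats or 0.602 decits.
b = info_bits(1 / 4)
print(b, bits_to_nats(b), bits_to_decits(b))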
Example
• A source generates one of four symbols during each interval with probabilities P1 = 1/2, P2 = 1/4, P3 = P4 = 1/8. Find the information content of each of these messages.
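The slide leaves the calculation to the reader; applying I(mk) = log2(1/Pk) gives:

I(m1) = log2(2) = 1 bit
I(m2) = log2(4) = 2 bits
I(m3) = I(m4) = log2(8) = 3 bits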
Average Information Content
• It is necessary to define the average information content per symbol, since the communication channel deals with symbols.

• Here we make the following assumptions:

1. The source is stationary, so the probabilities remain constant with time.

2. Successive symbols are statistically independent and are emitted at an average rate of r symbols per second.
Average Information Content
• Suppose a source emits M possible symbols s1, s2, …, sM with probabilities of occurrence p1, p2, …, pM, where

Σ (i = 1 to M) pi = 1

• For a long message of N symbols (N >> M):

s1 will occur p1·N times,
s2 will occur p2·N times, and so on.
Average Information Content
• Since s1 occurs p1·N times, the information contributed by s1 is p1·N·log(1/p1).
• Similarly, the information contributed by s2 is p2·N·log(1/p2), and so on.

• Hence the total information content is

Itotal = Σ (i = 1 to M) N·pi·log2(1/pi)

• The average information per symbol is then

H = Itotal / N = Σ (i = 1 to M) pi·log2(1/pi)   bits/symbol

This means that in a long message we can expect H bits of information per symbol. Another name for H is entropy.
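As an illustration of the entropy formula above, here is a minimal Python sketch (added for illustration; not part of the original slides) that computes H for a discrete source:

import math

def entropy(probs):
    # Average information per symbol, H = sum(p * log2(1/p)), in bits/symbol.
    # Symbols with probability 0 contribute nothing and are skipped.
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

# Four-symbol source from the earlier example: H = 1.75 bits/symbol.
print(entropy([1/2, 1/4, 1/8, 1/8]))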
Information Rate
• Information rate = total information / time taken

• Here the time taken is T = n / r

• where n symbols are transmitted at an average rate of r symbols per second.

• The total information is nH.

• Information rate:

R = nH / (n / r) = rH   bits/sec
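As a quick illustration (the symbol rate below is chosen only for this example, not taken from the slides): the four-symbol source considered earlier has H = 1.75 bits/symbol, so if it emits r = 1000 symbols per second, the information rate is R = rH = 1750 bits/sec.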
Some Maths
• H satisfies the following inequality:

0 ≤ H ≤ log2(M)

The maximum H occurs when all the messages are equally probable. Hence H also measures the uncertainty about which symbol will occur: as H approaches its maximum value, we cannot predict which message will occur next.

• Consider a system that transmits only 2 messages, each with probability of occurrence 0.5. In that case H = 1 bit/symbol, and at every instant we cannot say which of the two messages will occur.

• So what happens for a source with more than two symbols?
Variation of H vs. p
• Let us consider a binary source, i.e. M = 2.

• Let the two symbols occur with probabilities p and 1 − p respectively, where 0 < p < 1.

So the entropy is

H = p·log2(1/p) + (1 − p)·log2(1/(1 − p)) = Ω(p)

Ω(p) is called the horseshoe function.
Variation of H vs. p
Now we want to obtain the shape of the curve. Setting the derivative to zero:

dH/dp = dΩ(p)/dp = 0

⇒ log((1 − p)/p) = 0, which gives p = 1/2.

Verify that this is a maximum by double differentiation:

d²H/dp² = −1/p − 1/(1 − p) < 0

[Plot: H versus p rises from 0 at p = 0 to its maximum of 1 bit at p = 0.5 and falls back to 0 at p = 1.]
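A small Python sketch (added here for illustration, not from the slides) that evaluates the horseshoe function and confirms its peak at p = 0.5:

import math

def binary_entropy(p):
    # Horseshoe function: H(p) = p*log2(1/p) + (1-p)*log2(1/(1-p)).
    if p in (0.0, 1.0):
        return 0.0
    return p * math.log2(1 / p) + (1 - p) * math.log2(1 / (1 - p))

# H rises from 0, peaks at 1 bit when p = 0.5, and falls back to 0.
for p in (0.1, 0.3, 0.5, 0.7, 0.9):
    print(p, round(binary_entropy(p), 3))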
Maximum Information Rate
We know that

R = rH

Also

Hmax = log2(M)

Hence

Rmax = r·log2(M)
Coding for Discrete Memoryless Source

• Here "discrete" means the source emits symbols drawn from a fixed alphabet.

• "Memoryless" means the occurrence of the present symbol is independent of the previous symbols.

• Average code length:

N̄ = Σ (i = 1 to M) pi·Ni

where Ni is the code length of symbol i in binary digits (binits).
Coding for Discrete Memoryless Source

Efficiency:

η = R / rb = H / N̄ ≤ 1
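A minimal Python sketch (illustrative only; the helper names are mine, not from the slides) that computes the average code length N̄ and the efficiency η = H / N̄ for given symbol probabilities and code-word lengths:

import math

def avg_code_length(probs, lengths):
    # N_bar = sum(p_i * N_i), in binits per symbol.
    return sum(p * n for p, n in zip(probs, lengths))

def efficiency(probs, lengths):
    # eta = H / N_bar, which is at most 1 for a uniquely decipherable code.
    H = sum(p * math.log2(1 / p) for p in probs if p > 0)
    return H / avg_code_length(probs, lengths)

# Example source {1/2, 1/4, 1/4} with code-word lengths {1, 2, 2}: H = 1.5, N_bar = 1.5, eta = 1.0.
print(efficiency([1/2, 1/4, 1/4], [1, 2, 2]))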
Coding for Discrete Memoryless Source

Kraft's inequality:

K = Σ (i = 1 to M) 2^(−Ni) ≤ 1

Only if this inequality is satisfied can the code be uniquely decipherable (separable).
Example
Find the efficiency and check Kraft's inequality for each of the following codes. (Note: Code II is not uniquely decipherable.)

mi   pi    Code I   Code II   Code III   Code IV
A    1/2   00       0         0          0
B    1/4   01       1         01         10
C    1/4   10       10        011        110
D    1/4   11       11        0111       111
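To make the check concrete, here is a small Python sketch (added for illustration, not from the slides) that evaluates Kraft's inequality for the four codes in the table:

def kraft_sum(codewords):
    # K = sum of 2^(-Ni) over all code words; K <= 1 is necessary for unique decipherability.
    return sum(2 ** -len(word) for word in codewords)

codes = {
    "Code I":   ["00", "01", "10", "11"],
    "Code II":  ["0", "1", "10", "11"],
    "Code III": ["0", "01", "011", "0111"],
    "Code IV":  ["0", "10", "110", "111"],
}

for name, words in codes.items():
    K = kraft_sum(words)
    print(name, K, "satisfies Kraft" if K <= 1 else "violates Kraft")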
Shannon-Fano Coding Technique
Algorithm:

Step 1: Arrange all messages in descending order of probability.

Step 2: Divide the sequence into two groups such that the sums of the probabilities in the two groups are as nearly equal as possible.

Step 3: Assign 0 to the upper group and 1 to the lower group.

Step 4: Repeat Steps 2 and 3 within each group, and so on, until every group contains a single message (see the sketch below).
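A compact Python sketch of the procedure above (an illustrative implementation, not code from the slides). The split rule here makes the two groups' probability sums as nearly equal as possible; the eighth symbol of the example that follows is unlabeled on the slide and is called M8 here:

def shannon_fano(symbols):
    # symbols: list of (name, probability) pairs.
    # Returns a dict mapping each name to its Shannon-Fano code word.
    codes = {name: "" for name, _ in symbols}

    def split(group):
        # Groups of one symbol need no further splitting.
        if len(group) <= 1:
            return
        total = sum(p for _, p in group)
        # Find the split point that makes the two partial sums as equal as possible.
        running, best_i, best_diff = 0.0, 1, float("inf")
        for i in range(1, len(group)):
            running += group[i - 1][1]
            diff = abs(running - (total - running))
            if diff < best_diff:
                best_i, best_diff = i, diff
        upper, lower = group[:best_i], group[best_i:]
        for name, _ in upper:
            codes[name] += "0"   # Step 3: 0 for the upper group
        for name, _ in lower:
            codes[name] += "1"   # Step 3: 1 for the lower group
        split(upper)             # Step 4: repeat within each group
        split(lower)

    # Step 1: descending order of probability.
    ordered = sorted(symbols, key=lambda s: s[1], reverse=True)
    split(ordered)
    return codes

# Source from the example slide that follows.
probs = [("M1", 1/2), ("M2", 1/8), ("M3", 1/8), ("M4", 1/16),
         ("M5", 1/16), ("M6", 1/16), ("M7", 1/32), ("M8", 1/32)]
print(shannon_fano(probs))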
Example

Messages Mi   Pi     Coding Procedure   No. of Bits   Code
M1            1/2    0                  1             0
M2            1/8    1 0 0              3             100
M3            1/8    1 0 1              3             101
M4            1/16   1 1 0 0            4             1100
M5            1/16   1 1 0 1            4             1101
M6            1/16   1 1 1 0            4             1110
M7            1/32   1 1 1 1 0          5             11110
M8            1/32   1 1 1 1 1          5             11111
This presentation can be downloaded from www.amitdegada.weebly.com/download after 5:30 today.

Questions?

Thank You
