Information Theory: Mohamed Hamada
Mohamed Hamada
Software Engineering Lab
The University of Aizu
Email: hamada@u-aizu.ac.jp
URL: http://www.u-aizu.ac.jp/~hamada
Today’s Topics
• Channel Coding/Decoding
• Hamming Method:
- Hamming Distance
- Hamming Weight
• Hamming (7, 4)
Digital Communication Systems

[Block diagram: Information Source → Source Encoder → Channel Encoder → Modulator → Channel → De-Modulator → Channel Decoder → Source Decoder. Source coding examples: 1. Huffman Code. 3. Lempel-Ziv Code.]
Digital Communication Systems

[Block diagram: Information Source → Source Encoder → Channel Encoder → Modulator → Channel → De-Modulator → Channel Decoder → Source Decoder → User of Information. The information source may be: 1. Memoryless, 2. Stochastic, 3. Markov, 4. Ergodic.]
INFORMATION TRANSFER ACROSS CHANNELS

[Block diagram: sent messages from the Information source pass through Source coding → Channel coding → Channel (symbols) → Channel decoding → Source decoding → receiver (received messages). Theme: Capacity vs Efficiency.]
Channel Coding/Decoding
Hamming Method
Channel Coding/Decoding
Hamming Method
- It was the first complete error-detecting and error-correcting procedure.
- It is one of the simplest and most common methods for transmitting information in the presence of noise.
Hamming Codes
• The (7, 4) Hamming code detects all one- and two-bit errors
• Corrects all 1-bit errors
• Magic: Any two different codewords differ in at least 3 places!
0000000 0001011 0010111 0011100
0100110 0101101 0110001 0111010
1000101 1001110 1010010 1011001
1100011 1101000 1110100 1111111
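The "differ in at least 3 places" claim can be checked mechanically against the table above. A minimal Python sketch (the function and variable names are my own):

```python
# Sketch: verify that any two of the 16 listed Hamming (7, 4) codewords
# differ in at least 3 places (i.e. the code has minimum distance 3).
from itertools import combinations

codewords = [
    "0000000", "0001011", "0010111", "0011100",
    "0100110", "0101101", "0110001", "0111010",
    "1000101", "1001110", "1010010", "1011001",
    "1100011", "1101000", "1110100", "1111111",
]

def hamming_distance(a, b):
    """Number of places in which two bit strings differ."""
    return sum(x != y for x, y in zip(a, b))

min_dist = min(hamming_distance(a, b) for a, b in combinations(codewords, 2))
print(min_dist)  # 3
```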
Hamming Distance
• Number of places in which two bit strings differ
• Example: 1 0 0 0 1 0 1 and 1 0 0 1 1 1 0 differ in 3 places → Hamming distance 3
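This connects to the Hamming weight from today's topics: the distance between two bit strings equals the weight (number of 1s) of their bitwise XOR. A quick sketch using the example strings above:

```python
# Sketch: Hamming distance via XOR; the distance between two bit strings
# equals the Hamming weight (count of 1 bits) of their XOR.
a = 0b1000101
b = 0b1001110

distance = bin(a ^ b).count("1")  # positions where the strings differ
print(distance)  # 3
```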
Performance
A code with minimum distance dmin is capable of correcting t errors if
dmin ≥ 2t + 1.
For example, the Hamming (7, 4) code has dmin = 3, so it corrects t = 1 error.
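A small sketch of this bound, solving dmin ≥ 2t + 1 for t (the function name is my own):

```python
# Sketch: largest number of correctable errors t implied by
# dmin >= 2t + 1, i.e. t = floor((dmin - 1) / 2).
def correctable_errors(d_min):
    return (d_min - 1) // 2

print(correctable_errors(3))  # 1, e.g. the Hamming (7, 4) code
print(correctable_errors(5))  # 2
```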
Hamming codes
We assume that the sequence of symbols generated by
the information source is divided up into blocks of k
symbols.
• Minimum distance 3
• Construction: G = [Ik P], with codeword length n = 2^m − 1, where m = n − k is the number of parity symbols
• Ik is the k x k identity matrix
Example: Hamming (7, 4) codes
We assume that the sequence of symbols generated by
the information source is divided up into blocks of 4
symbols.
• Generating matrix: G = [I4 P]
• P is a 4x3 matrix determined by:
  C1 = u2 + u3 + u4
  C2 = u1 + u3 + u4
  C3 = u1 + u2 + u4
where + is addition modulo 2 (0+0 = 1+1 = 0 and 1+0 = 0+1 = 1) and the ui are the information symbols (the elements multiplied by I4).
Example: Hamming (7, 4) codes
We assume that the sequence of symbols generated by
the information source is divided up into blocks of 4
symbols. Codewords have length 7.

[Venn diagram: information symbols u1, u2, u3, u4 and check symbols C1, C2, C3]

• Generating matrix:
  G = [I4 P] = 1000 011
               0100 101
               0010 110
               0001 111

Let G = [Ik P]. For the Hamming (7, 4) code: n = 7 and k = 4.
Example: Suppose that y = (1111011) is received.
For the Hamming (7, 4) code: n = 7 and k = 4.
Step 1: Form G = [Ik P] and the parity-check matrix H = [PT In-k]:

  G = 1000 011        H = 0111 100
      0100 101            1011 010
      0010 110            1101 001
      0001 111
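Step 1 sets up syndrome decoding; the remaining steps for this y can be sketched as follows, assuming standard syndrome decoding (compute s = H·yT mod 2, then flip the bit whose column of H matches s):

```python
# Sketch of syndrome decoding for the received word y = (1111011),
# using H = [P^T | I3] from the slide.
H = [
    [0, 1, 1, 1, 1, 0, 0],
    [1, 0, 1, 1, 0, 1, 0],
    [1, 1, 0, 1, 0, 0, 1],
]
y = [1, 1, 1, 1, 0, 1, 1]

# Syndrome s = H y^T (mod 2); nonzero means an error occurred.
s = [sum(H[i][j] * y[j] for j in range(7)) % 2 for i in range(3)]
print(s)  # [1, 0, 0]

# For a single-bit error, s equals the column of H at the error position:
# here s matches column 5 of H, so flip bit 5 (index 4).
columns = [tuple(H[i][j] for i in range(3)) for j in range(7)]
error_pos = columns.index(tuple(s))
y[error_pos] ^= 1
print(y)  # corrected codeword [1, 1, 1, 1, 1, 1, 1]
```

The corrected word is the all-ones codeword, so the decoded information block is u = (1111).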