For Sync Class - Module 7 - 3TSY2223

DIGITAL COMMUNICATIONS / COMMUNICATIONS 2
Chapter 7: Source Coding

■ Subtopics: 1) Fixed-Length and Variable-Length Source Codes, 2) Huffman Coding and Shannon-Fano Coding
■ At the end of the chapter, the learner should be able to:
– Define and discuss types of fixed-length and variable-length source coding
– Solve problems involving Huffman coding and Shannon-Fano coding
Fixed Length and Variable Length
Source Codes
Subtopic #1
• Data communications codes are often used to represent characters and symbols, such as letters, digits, and punctuation marks.
• Therefore, data communications codes are called character codes, character sets, symbol codes, or character languages.
• Three major data communications codes:
  • Baudot Code
  • ASCII Code
  • EBCDIC Code
Baudot Code
• The first fixed-length character code, also known as the Telex code
• Developed by Thomas Murray in 1875 and named after Emile Baudot, a pioneer in telegraph printing
• A five-bit character code, used primarily for low-speed teletype equipment
ASCII Code
• American Standard Code for Information Interchange
• Initially developed in 1963; the latest version (1997) is recommended by the ITU and ISO
• A seven-bit fixed-length character set, and probably the code most often used in data communications networks today
• Bit b7 is not part of the ASCII code but is generally reserved for an error-detection bit called the parity bit
• The LSB is transmitted first during serial transmission


• Capital letters: 41 to 5A (Hex)
• Small letters: 61 to 7A (Hex)
• Numbers: 30 to 39 (Hex)
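As a quick sanity check on the hex ranges above and the parity bit mentioned earlier, here is a small Python sketch. The helper name `even_parity_bit` is our own choice for illustration, not part of the ASCII standard:

```python
def even_parity_bit(seven_bits: int) -> int:
    """Return the bit that makes the total number of 1s even."""
    return bin(seven_bits).count("1") % 2

# The ASCII ranges quoted above:
assert hex(ord("A")) == "0x41" and hex(ord("Z")) == "0x5a"
assert hex(ord("a")) == "0x61" and hex(ord("z")) == "0x7a"
assert hex(ord("0")) == "0x30" and hex(ord("9")) == "0x39"

# 'A' = 0x41 = 100 0001 has two 1s, so even parity appends a 0 bit.
assert even_parity_bit(ord("A")) == 0
# 'C' = 0x43 = 100 0011 has three 1s, so even parity appends a 1 bit.
assert even_parity_bit(ord("C")) == 1
```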
EBCDIC Code
• Extended Binary-Coded Decimal Interchange Code
• Developed in 1962 by International Business Machines (IBM) for exclusive use with IBM computers
• An eight-bit fixed-length character set, similar to the binary sequence of binary-coded decimal (BCD)
• Each character of a message is encoded as a binary
string or codeword.
• We would like to find a binary code that encodes the file
using as few bits as possible, i.e., compresses it as
much as possible.
• Options:
• Fixed-Length Source Codes
• Variable-Length Source Codes
• In a fixed-length source code, each codeword has the same length. Examples: Baudot, ASCII, and EBCDIC codes.
• In a variable-length source code, codewords may have different lengths. Examples: prefix-free codes, Huffman coding, Shannon-Fano coding.
Problem: Suppose we want to store messages made up of 4 characters a,
b, c, d with frequencies 60%, 5%, 30%, and 5% respectively. What are the
fixed-length codes and prefix-free codes that use the least space?
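One way to approach the problem above is to compare the average number of bits per character under each option. The sketch below is our own worked check, assuming one optimal prefix-free code built by hand with Huffman's procedure (merge the two least-frequent symbols repeatedly):

```python
import math

freqs = {"a": 0.60, "b": 0.05, "c": 0.30, "d": 0.05}

# Fixed-length option: every codeword needs ceil(log2(#symbols)) bits.
fixed_bits = math.ceil(math.log2(len(freqs)))
assert fixed_bits == 2

# One optimal prefix-free code: frequent symbols get short codewords.
prefix_code = {"a": "0", "c": "10", "b": "110", "d": "111"}
avg_bits = sum(p * len(prefix_code[s]) for s, p in freqs.items())

# 1.5 bits/char on average, versus 2 bits/char for the fixed-length code.
assert abs(avg_bits - 1.5) < 1e-9
```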
Huffman Coding and
Shannon-Fano Coding
Subtopic #2
Huffman Coding
• Developed by David Huffman in 1951
• There are two major parts in Huffman coding:
1) Build a Huffman tree from the input characters.
2) Traverse the Huffman tree and assign codes to the characters.
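The two steps above can be sketched with the standard-library `heapq` module. This is a minimal illustration, not a reference implementation; tie-breaking among equal probabilities may yield a different tree than the worked example below, but the codeword lengths (and hence the average length) are the same:

```python
import heapq
from itertools import count

def huffman_code(probs):
    """Build a Huffman tree bottom-up, then read codes off it.

    probs: dict mapping symbol -> probability (or frequency).
    Returns a dict mapping symbol -> bit string.
    """
    if len(probs) == 1:
        return {next(iter(probs)): "0"}
    tiebreak = count()  # avoids comparing trees when probabilities tie
    heap = [(p, next(tiebreak), sym) for sym, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:                    # Step 1: build the tree
        p1, _, left = heapq.heappop(heap)   # two least-probable nodes
        p2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (p1 + p2, next(tiebreak), (left, right)))
    codes = {}
    def walk(node, prefix):                 # Step 2: traverse, assign codes
        if isinstance(node, tuple):
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
        else:
            codes[node] = prefix
    walk(heap[0][2], "")
    return codes

codes = huffman_code({"a": 20, "b": 15, "c": 5, "d": 15, "e": 45})
# Codeword lengths match the worked example: e gets 1 bit, the rest 3 bits.
assert sorted(len(c) for c in codes.values()) == [1, 3, 3, 3, 3]
assert len(codes["e"]) == 1
```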
Given the probability distribution of 5 letters, implement
Huffman coding to generate the optimal variable-length prefix
code.

Characters        a    b    c    d    e
Probability (%)  20   15    5   15   45
Huffman tree (node totals in parentheses; left branch = 0, right branch = 1):

n4 (100)
├─ 0: n3 (55)
│     ├─ 0: n2 (35)
│     │     ├─ 0: a (20)
│     │     └─ 1: b (15)
│     └─ 1: n1 (20)
│           ├─ 0: c (5)
│           └─ 1: d (15)
└─ 1: e (45)
The resulting optimal variable-length prefix code:

Characters        a    b    c    d    e
Probability (%)  20   15    5   15   45
Huffman Code    000  001  010  011    1
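As a check on the table above: the code is prefix-free, and its average length (2.1 bits/symbol) lies within one bit of the source entropy (about 2.02 bits/symbol), as Huffman coding guarantees. A small sketch:

```python
import math

probs = {"a": 0.20, "b": 0.15, "c": 0.05, "d": 0.15, "e": 0.45}
codes = {"a": "000", "b": "001", "c": "010", "d": "011", "e": "1"}

# Prefix-free check: no codeword is a prefix of another.
words = list(codes.values())
assert all(not w2.startswith(w1)
           for w1 in words for w2 in words if w1 != w2)

avg = sum(probs[s] * len(codes[s]) for s in probs)        # 2.10 bits/symbol
entropy = -sum(p * math.log2(p) for p in probs.values())  # ~2.02 bits/symbol
assert abs(avg - 2.10) < 1e-9
assert entropy <= avg < entropy + 1   # Huffman is within 1 bit of entropy
```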
Shannon-Fano Coding
• Named after Claude Shannon and Robert Fano
• Codes by repeatedly splitting the sorted symbol set into groups of approximately equal total probability, assigning 0 to one group and 1 to the other

Given the probability distribution of 5 letters, implement Shannon-Fano coding to perform compression:

Characters        a    b    c    d    e
Probability (%)  22   28   15   30    5
Xi   P(Xi) [%]   Stage 1   Stage 2   Stage 3   Code
D       30          0         0                 00
B       28          0         1                 01
A       22          1         0                 10
C       15          1         1         0      110
E        5          1         1         1      111

(Stage 1 splits the sorted list into groups totaling 58% {D, B} and 42% {A, C, E}; Stage 2 splits {A, C, E} into 22% {A} and 20% {C, E}.)
The resulting Shannon-Fano code:

Characters           a    b    c    d    e
Probability (%)     22   28   15   30    5
Shannon-Fano Code   10   01  110   00  111
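The splitting procedure behind these tables can be sketched recursively. This is a minimal illustration (the function name and the way ties are broken are our own choices), and it reproduces the codes in the table above:

```python
def shannon_fano(probs):
    """Sort symbols by probability, split into two groups with (nearly)
    equal totals, assign 0/1 to the groups, and recurse into each."""
    symbols = sorted(probs, key=probs.get, reverse=True)
    codes = {s: "" for s in symbols}

    def split(group):
        if len(group) <= 1:
            return
        total = sum(probs[s] for s in group)
        best, cut = None, 1
        for i in range(1, len(group)):
            running = sum(probs[s] for s in group[:i])
            imbalance = abs(2 * running - total)  # how unequal this cut is
            if best is None or imbalance < best:
                best, cut = imbalance, i
        for s in group[:cut]:
            codes[s] += "0"
        for s in group[cut:]:
            codes[s] += "1"
        split(group[:cut])
        split(group[cut:])

    split(symbols)
    return codes

codes = shannon_fano({"a": 22, "b": 28, "c": 15, "d": 30, "e": 5})
assert codes == {"d": "00", "b": "01", "a": "10", "c": "110", "e": "111"}
```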
QUESTIONS?
