1st Unit ITC
Information Theory & Coding, Unit 1
INTRODUCTION TO INFORMATION THEORY

Q. A discrete source emits symbols with probabilities 1/2, 1/4, 1/8, 1/16 and 1/16 at a rate of 1000 symbols/sec. Determine the entropy and the information rate. (R.T.U. 2017)

Ans. The entropy is

    H(S) = (1/2)log₂(2) + (1/4)log₂(4) + (1/8)log₂(8) + (1/16)log₂(16) + (1/16)log₂(16)
         = 0.5 + 0.5 + 0.375 + 0.25 + 0.25
         = 1.875 bits/symbol

Information rate: R = r H(S) = 1000 × 1.875 = 1875 bits/sec.

Q3. Define Joint Entropy. (R.T.U. 2016)

Ans. Joint Entropy: The joint entropy of two discrete random variables X and Y is merely the entropy of their pairing, (X, Y). This implies that if X and Y are independent, their joint entropy is the sum of their individual entropies. For example, if (X, Y) represents the position of a chess piece (X the row and Y the column), then the joint entropy of the row of the piece and the column of the piece is the entropy of the position of the piece. Formally,

    H(X, Y) = E_{X,Y}[-log₂ p(x, y)] = -Σₓ Σᵧ p(x, y) log₂ p(x, y)

Q4. Define Information Rate. (R.T.U. 2016)

Ans. Information Rate: The information rate is represented by R and is given as

    R = rH

where R is the information rate (bits/sec), H is the entropy or average information (bits/message), and r is the rate at which messages are generated (messages/sec).

Q. A continuous signal is band limited to 5 kHz. The signal is quantized in 8 levels of a PCM system with probabilities 0.25, 0.2, 0.2, 0.1, 0.1, 0.05, 0.05 and 0.05. Calculate the entropy and the rate of information. (R.T.U. 2016)

Ans. By the sampling theorem, the signal should be sampled at a frequency ≥ 2 × 5 kHz = 10 kHz. Each sample is then quantized to one of the eight levels. Looking at each quantized level as a message,

    H = -(0.25 log₂ 0.25 + 0.2 log₂ 0.2 + 0.2 log₂ 0.2 + 0.1 log₂ 0.1 + 0.1 log₂ 0.1
          + 0.05 log₂ 0.05 + 0.05 log₂ 0.05 + 0.05 log₂ 0.05)
      = 2.74 bits/message

As the sampling frequency is 10 kHz, the message rate r = 10,000 messages/sec. Hence the rate of information is

    R = rH = 10,000 × 2.74 = 27,400 bits/sec

The main communication channels, ranked from richest to leanest:
• Richest channels: face-to-face meeting; oral presentation.
• Rich channels: online meeting; video conference.
• Lean channels: teleconference; messaging; video call (e.g., FaceTime).
• Leanest channels: blog; report; newsletter; flier; email; phone text; social media posts (e.g., Twitter, Facebook).

Oral communications tend to be richer because information can be conveyed verbally as well as nonverbally, through tone of voice and body language. Oral forms of communication range from a casual conversation with a colleague to a presentation in front of many employees, and are well suited to complex information, since the speaker can clarify meaning, reiterate information, and answer questions. While written communication does not have the advantage of immediacy and interaction, it is an effective means of conveying large amounts of information, and it is an effective channel when context, supporting data, and detailed explanation are necessary to inform or persuade. A drawback of written communication is that it can be misunderstood or misinterpreted by an audience that cannot ask follow-up questions or otherwise respond; it is best suited for sharing details (such as contact information, e.g., a phone number).

Q. A TV picture is composed of 2 × 10⁶ picture elements with 16 different brightness levels, and pictures are repeated at the rate of 32 per second. All picture elements are assumed independent, and all brightness levels have equal likelihood of occurrence. Calculate the average rate of information conveyed by this TV picture.

Ans. Given: picture elements per frame = 2 × 10⁶, symbols (brightness levels) M = 16, repetition rate = 32 frames/sec. Since all levels are equally likely,

    H = log₂ M = log₂ 16 = 4 bits/element

The element rate is r = 2 × 10⁶ × 32 = 64 × 10⁶ elements/sec. Then the information rate is

    R = rH = 64 × 10⁶ × 4 = 256 × 10⁶ bits/sec
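As a quick numerical check of the entropy and information-rate examples worked above, here is a short Python sketch. The helper name entropy_bits and the variable names are mine, purely illustrative, not from the notes.

    import math

    def entropy_bits(probs):
        """Average information H = -sum(p * log2 p), in bits per symbol."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # R.T.U. 2017 example: probabilities 1/2, 1/4, 1/8, 1/16, 1/16 at r = 1000 symbols/sec
    H1 = entropy_bits([1/2, 1/4, 1/8, 1/16, 1/16])
    print(H1, 1000 * H1)                      # 1.875 bits/symbol, 1875.0 bits/sec

    # PCM example: 8 quantization levels, sampled at 10 kHz (Nyquist rate for 5 kHz)
    H2 = entropy_bits([0.25, 0.2, 0.2, 0.1, 0.1, 0.05, 0.05, 0.05])
    print(round(H2, 2), round(10_000 * H2))   # 2.74 bits/message, R ≈ 27,400 bits/sec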
If a dog bites a man, it is a common occurrence with high probability, so the message conveys little information. But the probability of a man biting a dog is very small, so this becomes news; quite a large amount of information is communicated by the message. Thus, we see that there should be some sort of inverse relationship between the probability of an event and the information associated with it: if xᵢ is an event with probability P(xᵢ), the amount of information associated with it is

    I(xᵢ) = log₂ (1 / P(xᵢ))

(ii) Mutual Information: Mutual information is a quantity that measures a relationship between two random variables that are sampled simultaneously. In particular, it measures how much information is communicated, on average, in one random variable about another. Intuitively, one might ask: how much does one random variable tell me about another? For example, suppose X represents the roll of a fair 6-sided die, and Y represents whether the roll is even (0 if even, 1 if odd). Clearly, the value of Y tells us something about the value of X and vice versa; that is, these variables share mutual information. Mutual information thus measures the amount of information that can be obtained about one random variable by observing another, and it is an important quantity in communication. The mutual information of X relative to Y is given by

    I(X; Y) = Σₓ Σᵧ p(x, y) log₂ [ p(x, y) / (p(x) p(y)) ]

Q. For the discrete memoryless binary channel shown in the figure [figure not reproduced; its transition probabilities appear in the channel matrix below]:
(i) Find the channel matrix of the channel.
(ii) Find P(y₁) and P(y₂) when P(x₁) = P(x₂) = 0.5.
(iii) Find P(x₁, y₂) and P(x₂, y₁) when P(x₁) = P(x₂) = 0.5.

Ans.
(i) Channel matrix:

    P(Y|X) = | P(y₁|x₁)  P(y₂|x₁) | = | 0.9  0.1 |
             | P(y₁|x₂)  P(y₂|x₂) |   | 0.2  0.8 |

(ii) Output probabilities:

    [P(Y)] = [P(X)] [P(Y|X)] = [0.5  0.5] | 0.9  0.1 | = [0.55  0.45]
                                          | 0.2  0.8 |

Hence P(y₁) = 0.55 and P(y₂) = 0.45.

(iii) Joint probabilities, using the diagonal input matrix:

    [P(X, Y)] = [P(X)]_d [P(Y|X)] = | 0.5   0  | | 0.9  0.1 | = | 0.45  0.05 |
                                    |  0   0.5 | | 0.2  0.8 |   | 0.10  0.40 |

Hence P(x₁, y₂) = 0.05 and P(x₂, y₁) = 0.10.

Q. A TV picture consists of 525 lines of 525 picture elements each, with 256 brightness levels per element, transmitted at 30 frames/sec with all levels equally likely. Find the rate of information.

Ans. We know that entropy is given by

    H(X) = -Σᵢ P(xᵢ) log₂ P(xᵢ) bits/symbol    ...(i)

Here, given that m = 256 and all levels are equally likely, P(xᵢ) = 1/256. From Eq. (i) we get

    H(X) = log₂(256) = 8 bits/element

The rate at which symbols are generated is r = (525)(525)(30) = 8.268 × 10⁶ elements/sec. Hence the average rate of information conveyed is

    R = rH(X) = 8.268 × 10⁶ × 8 = 66.144 × 10⁶ bit/s ≈ 66 Mb/s

Q10. Consider the binary channel shown below. [Figure not reproduced in this extract.] Expanding the definition of mutual information,

    I(X; Y) = Σᵢ Σⱼ P(xᵢ, yⱼ) log₂ [ P(xᵢ|yⱼ) / P(xᵢ) ]
            = Σᵢ Σⱼ P(xᵢ, yⱼ) log₂ (1 / P(xᵢ)) - Σᵢ Σⱼ P(xᵢ, yⱼ) log₂ (1 / P(xᵢ|yⱼ))

but we know that Σⱼ P(xᵢ, yⱼ) = P(xᵢ), so the first sum is H(X) and the second is H(X|Y), giving

    I(X; Y) = H(X) - H(X|Y)
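To make the matrix bookkeeping concrete, here is a minimal NumPy sketch (variable names are mine) that reproduces parts (i) to (iii) of the channel problem above and, as an extension not worked in the notes, evaluates I(X; Y) directly from its defining double sum.

    import numpy as np

    # Channel transition matrix P(y|x): row i gives P(y1|xi), P(y2|xi)
    P_y_given_x = np.array([[0.9, 0.1],
                            [0.2, 0.8]])
    P_x = np.array([0.5, 0.5])          # equiprobable inputs, as in the problem

    # (ii) Output probabilities: [P(y)] = [P(x)] [P(y|x)]
    P_y = P_x @ P_y_given_x             # -> [0.55, 0.45]

    # (iii) Joint probabilities: [P(x,y)] = diag(P(x)) [P(y|x)]
    P_xy = np.diag(P_x) @ P_y_given_x   # -> [[0.45, 0.05], [0.10, 0.40]]

    # Extension: I(X;Y) = sum over x,y of p(x,y) log2[ p(x,y) / (p(x) p(y)) ]
    I_XY = np.sum(P_xy * np.log2(P_xy / np.outer(P_x, P_y)))

    print(P_y)     # [0.55 0.45]
    print(P_xy)    # [[0.45 0.05]
                   #  [0.1  0.4 ]]
    print(I_XY)    # ≈ 0.397 bits for this channel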
Averaging the information I(xᵢ) over all symbols of the source gives

    H(X) = E[I(xᵢ)] = -Σᵢ P(xᵢ) log₂ P(xᵢ) bits/symbol

The quantity H(X) is called the entropy of source X. It is a measure of the average information content per symbol. The source entropy H(X) can be considered as the average amount of uncertainty within source X that is resolved by use of the alphabet.

Note that for a binary source X that generates independent symbols 0 and 1 with equal probability, the entropy is

    H(X) = -(1/2) log₂(1/2) - (1/2) log₂(1/2) = 1 bit/symbol

The source entropy H(X) satisfies the relation

    0 ≤ H(X) ≤ log₂ m

where m is the size (number of symbols) of the alphabet of source X. The lower bound corresponds to no uncertainty, which occurs when one symbol has probability one and all others probability zero.

More generally, for a binary source emitting symbols with probabilities p and q = 1 - p,

    H(X) = -p log₂ p - q log₂ q = H(p) = H(q)    ...(i)

A plot of H as a function of p is shown in the figure. [Figure not reproduced in this extract.] The condition for maximum entropy and its value can be found as follows. Differentiating Eq. (i) w.r.t. p and equating it to zero yields

    dH/dp = -log₂ p - 1/ln 2 + log₂(1 - p) + 1/ln 2 = log₂((1 - p)/p) = 0

i.e. log p = log(1 - p), i.e. p = 0.5.

This establishes that there is either a maximum or a minimum at p = 0.5. If the second derivative of H is positive there is a minimum, and if it is negative, a maximum. Now,

    d²H/dp² = -1/(p(1 - p) ln 2) < 0

Hence H has a maximum at p = 0.5, where H(0.5) = 1 bit/symbol.
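The maximum at p = 0.5 is easy to verify numerically. A small Python sketch follows; the function name H and the sample points are mine, chosen only to illustrate the shape of the curve.

    import math

    def H(p):
        """Binary entropy H(p) = -p log2 p - (1-p) log2(1-p), in bits."""
        if p in (0.0, 1.0):
            return 0.0
        return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

    # Symmetric about p = 0.5 (H(p) = H(1-p)), peaking at H(0.5) = 1 bit
    for p in (0.1, 0.3, 0.5, 0.7, 0.9):
        print(p, round(H(p), 4))

    # dH/dp = log2((1-p)/p): positive below p = 0.5, zero at p = 0.5,
    # negative above, confirming the maximum found analytically above
    dH = lambda p: math.log2((1 - p) / p)
    print(dH(0.4) > 0, dH(0.5) == 0, dH(0.6) < 0)   # True True True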