Information Theory & Coding (ITC), Unit 1
INTRODUCTION TO INFORMATION THEORY

Q. A discrete source emits five symbols with probabilities 1/2, 1/4, 1/8, 1/16 and 1/16 at a rate of 1000 symbols/sec. Determine the entropy and the information rate. (R.T.U. 2017)

Ans. The entropy of the source is
H(S) = (1/2) log2 2 + (1/4) log2 4 + (1/8) log2 8 + (1/16) log2 16 + (1/16) log2 16
     = 0.5 + 0.5 + 0.375 + 0.25 + 0.25 = 1.875 bits/symbol
The information rate is
R = r H(S) = 1000 × 1.875 = 1875 bits/sec.

Q.3 Define Joint Entropy. (R.T.U. 2016)

Ans. Joint Entropy : The joint entropy of two discrete random variables X and Y is simply the entropy of the pair (X, Y). This implies that if X and Y are independent, their joint entropy is the sum of their individual entropies. For example, if (X, Y) represents the position of a chess piece, with X the row and Y the column, then the joint entropy of the row and the column of the piece is the entropy of the position of the piece. It is defined as
H(X, Y) = E_{X,Y}[-log p(x, y)] = -Σ_{x,y} p(x, y) log2 p(x, y)

Q.4 Define Information Rate. (R.T.U. 2016)

Ans. Information Rate : The information rate is represented by R and is given by
R = r H
where R is the information rate in bits/sec, H is the entropy (average information) in bits/message, and r is the rate at which messages are generated in messages/sec.

Q. A continuous signal is band limited to 5 kHz. The signal is quantized into 8 levels of a PCM system with probabilities 0.25, 0.2, 0.2, 0.1, 0.1, 0.05, 0.05 and 0.05. Calculate the entropy and the rate of information. (R.T.U. 2016)

Ans. By the sampling theorem, the signal should be sampled at a frequency of at least 2 × 5 kHz = 10 kHz. Each sample is then quantized to one of the eight levels. Treating each quantization level as a message, the entropy is
H = -(0.25 log2 0.25 + 0.2 log2 0.2 + 0.2 log2 0.2 + 0.1 log2 0.1 + 0.1 log2 0.1 + 0.05 log2 0.05 + 0.05 log2 0.05 + 0.05 log2 0.05)
  = 2.74 bits/message
As the sampling frequency is 10 kHz, the message rate is r = 10,000 messages/sec. Hence, the rate of information is
R = rH = 10,000 × 2.74 = 27,400 bits/sec.

Q. A TV picture consists of about 2 × 10^5 picture elements and 16 different brightness levels. Pictures are repeated at the rate of 32 per second. All picture elements are assumed to be independent and all brightness levels are equally likely. Calculate the average rate of information conveyed by this TV picture source.

Ans. Given: picture elements per frame = 2 × 10^5, number of brightness levels M = 16, repetition rate = 32 frames/sec. Since the levels are equally likely,
H = log2 M = log2 16 = 4 bits/element
The element rate is
r = 2 × 10^5 × 32 = 64 × 10^5 elements/sec
so the information rate is
R = rH = 64 × 10^5 × 4 = 256 × 10^5 bits/sec = 25.6 Mb/s.
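Both rate calculations above follow the same two-step pattern: compute the entropy H from the symbol probabilities, then multiply by the symbol rate r. The short Python sketch below is not part of the original notes (the function name entropy is my own); it simply reproduces the 2.74 bits/message and 25.6 Mb/s figures. Note that 10,000 × 2.7415 ≈ 27,415 bits/sec; the 27,400 in the worked answer comes from rounding H to 2.74 before multiplying.

```python
# Minimal sketch (not from the original notes): checking the entropy and
# information-rate calculations above.
import math

def entropy(probs):
    """Source entropy H = -sum(p * log2 p), in bits per message/symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# PCM example: 8 quantization levels, sampled at 10 kHz (Nyquist rate for 5 kHz).
pcm_probs = [0.25, 0.2, 0.2, 0.1, 0.1, 0.05, 0.05, 0.05]
H_pcm = entropy(pcm_probs)          # ~2.74 bits/message
R_pcm = 10_000 * H_pcm              # ~27,400 bits/sec

# TV example: 16 equally likely levels, 2e5 elements/frame, 32 frames/sec.
H_tv = entropy([1 / 16] * 16)       # log2(16) = 4 bits/element
r_tv = 2e5 * 32                     # 64e5 elements/sec
R_tv = r_tv * H_tv                  # 25.6e6 bits/sec

print(f"PCM: H = {H_pcm:.2f} bits/message, R = {R_pcm:.0f} bits/sec")
print(f"TV : H = {H_tv:.1f} bits/element, R = {R_tv / 1e6:.1f} Mb/s")
```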
If a dog bites a man, the event is not news, because its probability is high and little information is conveyed. If, however, a man bites a dog, this becomes news: quite a lot of information is communicated by the message "a man bites a dog". Thus, we see that there should be some sort of inverse relationship between the probability of an event and the amount of information associated with it. If x_i is an event with probability p(x_i), the amount of information associated with it is
I(x_i) = log2 [1 / p(x_i)]

(ii) Mutual Information : Mutual information is a quantity that measures the relationship between two random variables that are sampled simultaneously. In particular, it measures how much information is communicated, on average, in one random variable about another. Intuitively, one might ask: how much does one random variable tell me about another? For example, suppose X represents the roll of a fair 6-sided die, and Y represents whether the roll is even (0 if even, 1 if odd). Clearly, the value of Y tells us something about the value of X and vice versa; that is, these variables share mutual information. Mutual information therefore measures the amount of information that can be obtained about one random variable by observing another, and it is an important quantity in communication. The mutual information of X relative to Y is given by
I(X; Y) = Σ_{x,y} p(x, y) log2 [ p(x, y) / (p(x) p(y)) ]

Q. Consider the discrete memoryless binary channel shown in the figure below. (The channel diagram is not reproduced in this extract.)
(i) Find the channel matrix of the channel.
(ii) Find P(y1) and P(y2) when P(x1) = P(x2) = 0.5.
(iii) Find P(x1, y2) and P(x2, y1) when P(x1) = P(x2) = 0.5.

Ans.
(i) From the transition probabilities of the channel, the channel matrix is
[P(Y|X)] = | P(y1|x1)  P(y2|x1) | = | 0.9  0.1 |
           | P(y1|x2)  P(y2|x2) |   | 0.2  0.8 |
(ii) [P(Y)] = [P(X)] [P(Y|X)] = [0.5  0.5] | 0.9  0.1 | = [0.55  0.45]
                                           | 0.2  0.8 |
Hence P(y1) = 0.55 and P(y2) = 0.45.
(iii) [P(X, Y)] = [P(X)]_d [P(Y|X)] = | 0.5  0  | | 0.9  0.1 | = | 0.45  0.05 |
                                      | 0   0.5 | | 0.2  0.8 |   | 0.10  0.40 |
Hence P(x1, y2) = 0.05 and P(x2, y1) = 0.10.

Q. A TV picture is composed of 525 × 525 picture elements per frame. Each element can take one of 256 brightness levels, all equally likely, and 30 frames are transmitted per second. Find the average rate of information conveyed.

Ans. We know that the entropy is given by
H(X) = -Σ_i P(x_i) log2 P(x_i) bits/symbol   ...(i)
Here m = 256 and P(x_i) = 1/256, so from equation (i),
H(X) = log2 256 = 8 bits/element
The rate at which symbols are generated is
r = (525)(525)(30) = 8.268 × 10^6 elements/sec
Hence the average rate of information conveyed is
R = r H(X) = 8.268 × 10^6 × 8 = 66.144 × 10^6 bit/s ≈ 66 Mb/s.

Entropy of a Source : The quantity H(X) = -Σ_i P(x_i) log2 P(x_i) is called the entropy of source X. It is a measure of the average information content per symbol; the source entropy H(X) can be considered as the average amount of uncertainty within source X that is resolved by use of the alphabet. The source entropy satisfies the relation
0 ≤ H(X) ≤ log2 m
where m is the size (number of symbols) of the alphabet of source X. The lower bound corresponds to no uncertainty, which occurs when one symbol has probability 1 and all the others have probability 0. Note that for a binary source X that generates independent symbols 0 and 1 with equal probability, the entropy is
H(X) = -(1/2) log2 (1/2) - (1/2) log2 (1/2) = 1 bit/symbol.

Q.10 Consider the binary channel shown below. (The channel figure is not reproduced in this extract.)

Ans. Let the two input symbols occur with probabilities p and q = 1 - p. The entropy of this binary source is
H(X) = E[I(x_i)] = Σ_i p(x_i) log2 [1 / p(x_i)] = -p log2 p - (1 - p) log2 (1 - p) = H(p) = H(q)   ...(i)
A plot of H as a function of p can be drawn (figure not reproduced here). The condition for maximum entropy, and its value, can be found as follows. Differentiating eq. (i) with respect to p and equating it to zero yields
dH/dp = -log2 p - 1/ln 2 + log2 (1 - p) + 1/ln 2 = log2 [(1 - p)/p] = 0
i.e. log2 p = log2 (1 - p), i.e. p = 0.5.
This shows that there is either a maximum or a minimum at p = 0.5: if the second derivative of H is positive there is a minimum, and if it is negative there is a maximum. Now
d²H/dp² = -1/(p ln 2) - 1/[(1 - p) ln 2] < 0
Hence H has a maximum at p = 0.5, where H_max = 1 bit/symbol.
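As a quick check on the binary-channel example and the binary-entropy maximum above, here is a small Python sketch that is not part of the original notes. The transition matrix [[0.9, 0.1], [0.2, 0.8]] is the one used in the worked example (the original channel diagram itself is not reproduced), and the function name H_binary is my own.

```python
# Minimal sketch (not from the original notes): binary-channel probabilities
# and the maximum of the binary entropy function.
import math

P_y_given_x = [[0.9, 0.1],   # row i = input x_i, column j = output y_j
               [0.2, 0.8]]
P_x = [0.5, 0.5]

# Output distribution: P(y_j) = sum_i P(x_i) * P(y_j | x_i)
P_y = [sum(P_x[i] * P_y_given_x[i][j] for i in range(2)) for j in range(2)]

# Joint distribution: P(x_i, y_j) = P(x_i) * P(y_j | x_i)
P_xy = [[P_x[i] * P_y_given_x[i][j] for j in range(2)] for i in range(2)]

print("P(y1), P(y2)       =", [round(v, 4) for v in P_y])            # [0.55, 0.45]
print("P(x1,y2), P(x2,y1) =", round(P_xy[0][1], 4), round(P_xy[1][0], 4))  # 0.05 0.1

# Binary entropy H(p) = -p log2 p - (1 - p) log2 (1 - p), maximum of 1 bit at p = 0.5.
def H_binary(p):
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

best_p = max((p / 100 for p in range(101)), key=H_binary)
print("H(p) is largest at p =", best_p, "with H =", round(H_binary(best_p), 4))  # 0.5, 1.0
```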

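The die-and-parity example in the mutual-information definition can also be worked numerically. The sketch below, again not part of the original notes, evaluates I(X;Y) = Σ p(x,y) log2[p(x,y)/(p(x)p(y))] for X a fair six-sided die and Y its parity; the result is exactly 1 bit, since knowing whether the roll is even or odd resolves one bit of uncertainty about the face.

```python
# Minimal sketch (not from the original notes): mutual information between a
# fair die roll X and its parity Y, using the definition above.
import math
from collections import defaultdict

p_xy = defaultdict(float)
for x in range(1, 7):                 # each face has probability 1/6
    y = 0 if x % 2 == 0 else 1        # 0 if even, 1 if odd
    p_xy[(x, y)] += 1 / 6

p_x = defaultdict(float)
p_y = defaultdict(float)
for (x, y), p in p_xy.items():
    p_x[x] += p                       # marginal of X: 1/6 per face
    p_y[y] += p                       # marginal of Y: 1/2 per parity

I_xy = sum(p * math.log2(p / (p_x[x] * p_y[y])) for (x, y), p in p_xy.items())
print(f"I(X;Y) = {I_xy:.3f} bit")     # 1.000
```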