IT2022 Final Answers
Second Term: 2021/2022
Course: Information Theory | Course Code: 1820 | Lecturer: Dr. Osama Farouk
Date: 29/6/2022 | Total marks: 80 | Time allowed: 3 h

Answer the following questions.

Question (Select the correct answer) (80 marks)

1. A [illegible] code is a [illegible] decodable code, but the converse is true.
A. True B. False

2. A code that satisfies Kraft-McMillan's inequality is necessarily a prefix code.
A. True B. False

3. The event with minimum probability has the least number of bits.
A. True B. False

4. The variable-length code words should depend on the probability.
A. True B. False

5. A significant amount of data compression can be realized when there is a wide difference in the probabilities of the symbols.
A. True B. False

6. Can the outputs of an information source be represented by a source code whose average length is less than the source entropy?
A. True [B. False]

7. A code is not uniquely decodable if two symbols have the same codeword.
[A. True] B. False

8. …… is the representation of information in another form.
A. Code [B. Encoder] C. Decoder D. Source

9. For 64 equally likely and independent messages, find the information content (in bits) in each of the messages.
A. 6 B. 7 C. [illegible] D. 5

10. A source sends 3 symbols X, Y, Z. If the entropy H = 0.48 and the self-information is 0.6, find the probability P(XYZ).
A. [illegible] B. 0.288 C. [illegible] D. 0.2

11. The method of converting a word to a stream of bits is called
A. [illegible] coding B. Binary coding C. Bit coding D. Cipher coding

12. For M equally likely messages, the average amount of information H is
A. H = 2 log10 M B. H = log2 M C. H = log10 M D. H = log10 M

13. How many bits per symbol are needed to encode 32 different symbols?
(options largely illegible in the scan)

14. An event has two possible outcomes with probabilities [illegible]. The rate of information with 16 outcomes per second is:
A. [illegible] B. 38/64 bits/sec C. 38/2 bits/sec D. 38/32 bits/sec

15. Let (X1, X2) be independent random variables. X1 has mean 0 and variance 1, while X2 has mean 1 [remainder illegible]. The mutual information I(X1; X2) between X1 and X2 in bits is:
[A. 0 bits] B. 1 bit C. 2 bits D. 4 bits

16. A binary random variable X takes the value +2 or -2. The probability P(X = +2) [remainder of the question illegible] …… bits/symbol
A. 0.25 bits/symbol B. [illegible] C. 1 bit/symbol D. 2 bits/symbol

17. For a system having 16 distinct symbols, maximum entropy is obtained when the probabilities [remainder illegible].
A. 1/8 bits/symbol B. 1/4 bits/symbol C. 1/3 bits/symbol D. [illegible]

18. The information rate R for a given average information H = 2.0, for an analog signal band-limited to B Hz, is
A. 8B bits/sec B. 4B bits/sec C. 2B bits/sec D. 16B bits/sec

19. Discrete source S1 has [illegible] equally probable symbols while discrete source S2 has 16 equally probable symbols. When the entropy of these two sources is compared, the entropy of:
A. S1 is greater than S2 B. S1 is less than S2 C. S1 is equal to S2 D. Depends on rate of symbols/second

20. In a binary source, zeros occur three times as often as ones. What is the information contained in the ones?
A. 0.415 bit B. 0.333 bit C. 3 bit [D. 2 bit]

21. Processing done before placing information into the channel:
A. Code B. Encoder C. Decoder D. Source

22. Read the following expressions regarding mutual information I(X;Y). Which of the following expressions is/are correct?
(1) [illegible] (2) [illegible] (3) I(X;Y) = H(X) + H(Y) - H(X,Y) (4) I(X;Y) = H(X) + H(Y) + H(X,Y)
A. (1) and (3) B. (2) and (3) C. (3) and (4) D. (1) and (4)

23. For designing a communication system, which among the following parameters should be maximum?
(1) Transmission rate (2) Received signal-to-noise ratio (3) Error probability (4) Bandwidth requirement
[A. (1) and (2)] B. (3) and (4) C. (2) and (3) D. (1) and (4)

24. ASCII code is a
[A. Fixed length code] B. Variable length code C. Fixed and Variable length code D. None of the mentioned
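Several of the questions above (for example 9, 12, 13 and 20) turn on two standard relationships: the self-information of an outcome with probability p is log2(1/p) bits, and the entropy of M equally likely messages is log2(M) bits. The short Python sketch below is not part of the original exam; it simply evaluates those formulas for the numbers used in questions 9, 13 and 20.

from math import log2

# Self-information of an outcome with probability p: I(p) = log2(1/p) bits.
def self_information(p):
    return log2(1 / p)

# Entropy of M equally likely messages: H = log2(M) bits/message.
def entropy_equally_likely(M):
    return log2(M)

print(self_information(1 / 64))    # 6.0 bits  (Q9: 64 equally likely messages)
print(self_information(1 / 4))     # 2.0 bits  (Q20: ones occur with probability 1/4)
print(entropy_equally_likely(32))  # 5.0 bits  (Q13: 32 different symbols)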
25. Which reduces the size of the data?
[A. Source coding] B. Channel coding C. Source & Channel coding D. None of the mentioned

26. Which achieves greater compression?
A. Lossless coding [B. Lossy coding] C. Lossless & Lossy coding D. None of the mentioned

27. A code is a mapping from
A. Binary sequence to discrete set of symbols B. Discrete set of symbols to binary sequence C. All of the mentioned D. None of the mentioned

28. Which are uniquely decodable codes?
[A. Fixed length code] B. Variable length code C. Fixed and Variable length code D. None of the mentioned

29. Mutual information should be
A. Positive B. Negative C. [illegible] D. [illegible]

30. What is the entropy of a communication source that consists of six messages with probabilities 1/8, 1/8, 1/8, [1/8,] 1/4 and 1/4 respectively?
A. 1 bit/message B. [illegible] C. [illegible] D. 4.5 bits/message

31. The method of converting a word to a stream of bits is called
A. Binary coding B. [illegible] C. Bit coding D. Cipher coding

32. When X and Y are statistically independent, then I(X;Y) is
A. [illegible] [B. 0] C. ln 2 D. Cannot be determined

33. Sometimes the output of the source decoder must be an exact replica of the information (e.g. computer data). This is called ……
A. Variable Length Code B. Fixed code [C. Noiseless Coding] D. Coding with Distortion

34. Other times the output of the source decoder can be approximately equal to the information (e.g. music, TV, speech). This is called ……
A. Variable Length Code B. Fixed code C. Noiseless Coding D. Coding with Distortion

35. Assume N is the number of symbols. If the symbols have the same probability and code length (L), the entropy (H) is ……
A. [illegible] B. [illegible] C. [illegible] D. Cannot be determined

36. When the source symbols are not equally probable, we use a
A. Fixed length code [B. Variable length code] C. Fixed and Variable length code D. None of the mentioned

37. To be efficient, one uses knowledge of the statistics of the source such that frequent source symbols are assigned SHORT code words and rare source symbols are assigned LONGER code words. This minimizes the
A. Variance Code Length B. Average Code Length C. [illegible] D. None of the mentioned

38. Simple discrete source, alphabet size of 4 (A, B, C, D), P(A) = P(B) = P(C) = P(D) = p. Calculate the average length of the code L.
A. 0 bit B. 1 bit [C. 2 bits] D. 4 bits

39. Calculate the average code word length for the following symbols:

Symbol   P(s)   Code
A        0.25   1 [illegible]
B        0.30   [illegible]
C        0.12   010
D        0.15   01
E        0.18   10

A. 2.97 B. 2.27 C. [illegible] D. [illegible]

40. If the minimum possible code word length equals the average code word length, then the code efficiency is
A. 0 B. 1 C. [illegible] D. 2

41. Determine whether the following codes are uniquely decodable: Code 1: {0, 01, 11}, Code 2: {0, 01, 10}
A. Both Code 1 and Code 2 are uniquely decodable B. Code 1 is uniquely decodable C. Code 2 is uniquely decodable D. Neither Code 1 nor Code 2 is uniquely decodable

42. The difference between the average length and the entropy is the
A. Self Information [B. Redundancy] C. Probability D. None of the mentioned

43. Determine whether the following codes are prefix and uniquely decodable: Code 1: {0, 1, 00, 11}, Code 2: {0, 01, 011, 0111}, Code 3: {0, 10, 110, 111}
A. Code 1 B. Code 2 C. Code 3 D. [illegible]

44. Calculate the information capacity for 100 words, each containing 6 symbols chosen from a 33-symbol alphabet (23 letters, 10 numerical digits and one blank space).
A. 2068 B. 4652 C. [illegible] D. [illegible]

45. …… is concerned with the theoretical limitations and potentials of systems.
A. Probability theory [B. Information theory] C. [illegible] D. [illegible]

46. When the probability of error during transmission is [illegible], it indicates that
A. [illegible] [B. Channel is very noisy & no information is received] C. None of the mentioned D. [illegible]

47. [Question text largely illegible: "…… of a signal"]
A. [illegible] B. [illegible] C. Information in a signal D. All of the mentioned
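Questions 41 and 43 ask whether given codes are uniquely decodable or prefix codes. The sketch below is not part of the original exam; it is a minimal Python implementation of the standard Sardinas-Patterson test, applied here to the two codes of question 41. The function name and structure are illustrative only.

def is_uniquely_decodable(code):
    # Sardinas-Patterson test: the code is uniquely decodable iff no dangling
    # suffix generated from the code words is itself a code word.
    code = set(code)
    suffixes = {b[len(a):] for a in code for b in code if a != b and b.startswith(a)}
    seen = set()
    while suffixes:
        if suffixes & code:       # a dangling suffix equals a code word -> ambiguity
            return False
        seen |= suffixes
        nxt = set()
        for s in suffixes:
            for w in code:
                if w.startswith(s) and w != s:
                    nxt.add(w[len(s):])
                if s.startswith(w) and s != w:
                    nxt.add(s[len(w):])
        suffixes = nxt - seen     # only explore suffixes not seen before
    return True

print(is_uniquely_decodable(["0", "01", "11"]))  # Code 1 of Q41 -> True
print(is_uniquely_decodable(["0", "01", "10"]))  # Code 2 of Q41 -> False ("010" is ambiguous)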
48. Consider a source transmitting six symbols with probabilities as given: e ([illegible]), x ([illegible]), y (1/16), z (1/16), m (1/8), k (1/32). Find the average information.
A. [illegible] B. [illegible] C. [illegible] D. 1.37

49. The …… is represented in the average number of bits of information per symbol.
A. [illegible] B. entropy C. joint entropy D. [illegible]

50. For a given event with entropy H_e(X) = 1.56, the units should be
A. bits B. [illegible] C. e-its [D. nats]

51. The following statements describe which quantity?
"It is the amount of information one random variable contains about another random variable."
"It is a measure of dependence between two random variables."
"It is the reduction in uncertainty of one random variable due to another random variable."
A. [illegible] B. H(X,Y) C. I(X;Y) D. H(Y)

52. Which of the following is NOT representative of the mutual information?
A. [illegible] B. I(X;Y) = I(Y;X) C. I(X;Y) = H(Y) - H(X| [illegible]) D. I(X;Y) = H(X) + H(Y) - H(X,Y)

53. The input source to a noisy communication channel is a random variable X over the three symbols a, b and c. The output from this channel is a random variable Y over these same three symbols. The joint distribution of these two variables is as follows:

         x = a   x = b   x = c
y = a      0      1/6     1/6
y = b     1/6      0      1/6
y = c     1/6     1/6      0

What is the value of H(Y|X)?
A. [illegible] B. H(Y|X) = 1.52832 C. [illegible] D. H(Y|X) = 2.58496

54. Information rate is defined as
A. Information per unit time B. Average number of bits of information per second C. [illegible] [D. All of the above]

55. [Question text illegible]
A. [illegible] B. Always non-negative C. [illegible] D. None of the above

56. The information contained in a message with probability of occurrence P is given by
A. [illegible] B. I = k log2 P C. [illegible] D. [illegible]

57. The expected information contained in a message is called
[A. Entropy] B. Efficiency C. Coded signal D. [illegible]

58. [Information] rate basically gives an idea about the generated information per ……
[A. Second] B. Minute C. Hour D. None of these

59. [Question text partially illegible: "…… variable is"]
A. [illegible] B. 0 [C. Infinite] D. Cannot be determined

60. The …… of a random variable is a measure of central location.
[A. Mean] B. Variance C. Standard Deviation D. [illegible]

61. [Question text largely illegible; visible fragments: "…… refers to …… given by (k is constant)", "k log(1/P²)", "B. No message storage", "D. None of the above"]

62. Consider a random variable X which takes on the values -1, 0 and 1 with probabilities P(-1) = [illegible], P(0) = 1/2, P(1) = [illegible]. (In later parts we consider a sequence of 8 throws of this random variable.) Find the entropy of this random variable.
A. [illegible] B. [illegible] C. [illegible] D. 1.54

63. Find the entropy in base 2 of this random variable.
A. [illegible] B. 6.26 C. 4.34 D. 4.57

An analog baseband signal, band-limited to 100 Hz, is sampled at the Nyquist rate. The samples are quantized into four message symbols that occur independently with probabilities p1 = p4 = 0.125 and p2 = p3.

64. The probability p2 = ……
A. 0.25 B. 0.375 C. [illegible] D. 0

65. Average information H = …… bits/symbol
A. 1.233 B. 1.812 C. 1.54 D. 2.375

66. Sampling rate is …… samples/sec
A. 50 B. 100 [C. 200] D. 400

67. The information rate (bits/sec) of the message source is ……
[A. 362.2] B. 18 C. [illegible] D. 0.375

A binary communication system makes use of the symbols "zero" and "one". There are channel errors. Consider the following events:
x0: a "zero" is transmitted; x1: a "one" is transmitted; y0: a "zero" is received; y1: a "one" is received.
The following probabilities are given: P(x0) = 0.5, P(y0|x0) = 0.75 and P(y1|x1) = 0.5.

68. The information in bits that you obtain when you learn which symbol has been received (while you know that a "zero" has been transmitted) is ……
A. 0 B. 1.812 C. [illegible] D. [illegible]

69. The information in bits that you obtain when you learn which symbol has been received (while you know that a "one" has been transmitted) is ……
A. 0 B. 1.812 C. [illegible] D. 0.811
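Questions 53, 68 and 69 all come down to conditional entropies. The following Python sketch is not part of the original exam; it evaluates H(Y|X) for the joint table of question 53 and the binary entropies that questions 68 and 69 ask about, under the transition probabilities stated above.

from math import log2

# Joint distribution from Q53 (rows: y = a, b, c; columns: x = a, b, c).
joint = [[0,   1/6, 1/6],
         [1/6, 0,   1/6],
         [1/6, 1/6, 0  ]]

# H(Y|X) = sum over x of P(x) * H(Y | X = x).
h_y_given_x = 0.0
for x in range(3):
    column = [joint[y][x] for y in range(3)]
    px = sum(column)
    h_y_given_x += px * -sum((p / px) * log2(p / px) for p in column if p > 0)
print(h_y_given_x)  # 1.0 bit: each column normalises to (0, 1/2, 1/2)

# Binary entropy function; the uncertainty about the received symbol given the
# transmitted one (Q68 and Q69) is h2 of the corresponding transition probability.
def h2(p):
    return -p * log2(p) - (1 - p) * log2(1 - p)

print(h2(0.75))  # ~0.811 bits, given a transmitted "zero" (P(y0|x0) = 0.75)
print(h2(0.5))   # 1.0 bit, given a transmitted "one" (P(y1|x1) = 0.5)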
70. We flip a fair coin twice. Let X denote the outcome of the first flip and Y denote the outcome of the second flip. Find H(X,Y).
A. [illegible] B. [illegible] C. 3 D. [illegible]

[Questions between 70 and 93 are not legible in this extraction, except for the following.]

In a high school having an equal number of boy students and girl students, 75% of the students study Science and the remaining 25% of the students study Commerce. Commerce students are two times more likely to be boys than are Science students.

[Question number illegible] The probability that a randomly selected girl student studies Commerce is:
A. 1/5 B. 1/4 C. [illegible] D. 1

93. The amount of information gained in knowing that a randomly selected girl student studies [Commerce] is …… bits.
(options illegible in the scan)

For the following source code, for the following symbols:

Source Symbol   Probability   Code   Length
S0              1/2           0      1
S1              1/4           10     2
S2              1/8           110    3
S3              1/8           111    4

94. Calculate the entropy.
A. [illegible] B. 1.96 bits/symbol C. 1.875 bits/symbol D. 2.11 bits/symbol

95. Calculate the minimum length.
A. 1.96 bits/symbol B. 1.875 bits/symbol C. 2.11 bits/symbol D. 2.11 bits/symbol

96. Calculate the compression ratio.
(options partially illegible: … 1.965 … D. 1.77)

[Question number illegible] For the following source codes, for the following symbols, calculate Kraft-McMillan's inequality.
A. [illegible] B. 1/2 C. [illegible] D. 9/8

[The referenced code table and the remaining pages, which contain scanned handwritten worked solutions, are not legible in this extraction.]
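As a closing illustration for questions 94 through the Kraft-McMillan question above, the Python sketch below computes the entropy, average code length and Kraft sum for the S0-S3 table, using the code words as printed (the separate length column in the scan is ambiguous). It is not part of the original exam, and the compression-ratio definition used here (bits of a fixed-length encoding divided by the average code length) is an assumption, since the exam's intended definition is not legible.

from math import log2

# Probabilities and code words of the S0-S3 table (as printed in the scan).
table = {"S0": (1/2, "0"), "S1": (1/4, "10"), "S2": (1/8, "110"), "S3": (1/8, "111")}

H = -sum(p * log2(p) for p, _ in table.values())              # entropy, bits/symbol
L = sum(p * len(code) for p, code in table.values())          # average code length
kraft = sum(2 ** -len(code) for _, code in table.values())    # Kraft-McMillan sum

# Assumed compression-ratio definition: fixed-length bits per symbol (2 bits for
# four symbols) divided by the average code length of the variable-length code.
ratio = 2 / L

print(H, L, kraft, ratio)  # 1.75  1.75  1.0  ~1.14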
