NED UNIVERSITY OF ENGINEERING AND TECHNOLOGY

M.Engg. (Telecommunication Engineering)


TC-502 Information Theory
Assignment number 2

1. The joint probability matrix for two random variables X and Y is given below. Compute H(X),
H(Y), H(X, Y), H(X|Y), H(Y|X), and I(X;Y).
X\Y     y1      y2      y3      y4
x1      0.05    0       0.2     0.05
x2      0       0.1     0.1     0
x3      0       0       0.2     0.1
x4      0.05    0.05    0       0.1
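
A minimal Python/NumPy sketch of these computations, using the chain rule H(X|Y) = H(X,Y) − H(Y) and the identity I(X;Y) = H(X) + H(Y) − H(X,Y):

    import numpy as np

    # Joint probability matrix P(x_i, y_j), copied from the table above.
    P = np.array([[0.05, 0.00, 0.20, 0.05],
                  [0.00, 0.10, 0.10, 0.00],
                  [0.00, 0.00, 0.20, 0.10],
                  [0.05, 0.05, 0.00, 0.10]])

    def entropy(p):
        """Shannon entropy in bits, with 0 log 0 taken as 0."""
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    px, py = P.sum(axis=1), P.sum(axis=0)   # marginals P(x), P(y)
    H_X, H_Y = entropy(px), entropy(py)
    H_XY = entropy(P.ravel())               # joint entropy H(X,Y)
    H_X_given_Y = H_XY - H_Y                # chain rule
    H_Y_given_X = H_XY - H_X
    I_XY = H_X + H_Y - H_XY                 # mutual information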

2. Given H(B|A) = 0.83, H(A) = 0.92, and H(A|B) = 0.73, find H(B), H(A, B), and I(A;B).
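
All three quantities follow from the chain rule H(A,B) = H(A) + H(B|A); a quick arithmetic check in the same style:

    # H(A,B) = H(A) + H(B|A); H(B) = H(A,B) - H(A|B); I(A;B) = H(A) - H(A|B)
    H_B_given_A, H_A, H_A_given_B = 0.83, 0.92, 0.73
    H_AB = H_A + H_B_given_A      # = 1.75 bits
    H_B  = H_AB - H_A_given_B     # = 1.02 bits
    I_AB = H_A - H_A_given_B      # = 0.19 bits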

3. A binary channel transmits a 0 correctly (as a 0) twice as often as it transmits it
incorrectly (as a 1), and transmits a 1 correctly (as a 1) three times as often as it transmits
it incorrectly (as a 0). The input to the channel can be assumed equiprobable.
(a) What is the channel matrix P? Sketch the channel.
(b) Calculate the output probabilities, P(b).
(c) Calculate the backward channel probabilities, P(a|b).
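
A sketch of the computation, reading the statement as P(b=0|a=0) = 2/3 and P(b=1|a=1) = 3/4:

    import numpy as np

    # Rows = inputs a in {0, 1}, columns = outputs b in {0, 1}.
    P_b_given_a = np.array([[2/3, 1/3],
                            [1/4, 3/4]])
    p_a = np.array([0.5, 0.5])                 # equiprobable inputs

    p_b = p_a @ P_b_given_a                    # output probabilities P(b)
    # Bayes' rule: P(a|b) = P(b|a) P(a) / P(b)
    P_a_given_b = P_b_given_a * p_a[:, None] / p_b[None, :]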

4. Two Binary Symmetric Channels are connected in cascade, as shown in the figure below.

(a) Find the channel matrix of the resultant channel.
(b) Draw the channel diagram for the resultant channel.
(c) Find P(z1) and P(z2) if P(x1) = 0.6 and P(x2) = 0.4.
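
The matrix of the cascade is the product of the two BSC matrices. A sketch with hypothetical crossover probabilities, since the figure (and hence the actual values) is not reproduced here:

    import numpy as np

    e1, e2 = 0.1, 0.2                  # hypothetical values; take them from the figure
    def bsc(e):
        return np.array([[1 - e, e],
                         [e, 1 - e]])

    P_cascade = bsc(e1) @ bsc(e2)      # channel matrix of the resultant channel
    p_x = np.array([0.6, 0.4])         # P(x1), P(x2) as given
    p_z = p_x @ P_cascade              # P(z1), P(z2)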

5. Consider the Discrete Memoryless Channel shown in the figure below.


(a) Find the output probabilities if P(x1) = and P(x2) = P(x3) = .
(b) Find the output entropy H(Y).
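
The figure and the numerical input probabilities did not survive extraction, but the pattern is the same as above: the output distribution is the input row vector times the channel matrix. A sketch with hypothetical values:

    import numpy as np

    # Hypothetical DMC matrix and input law; substitute the values from the figure.
    P = np.array([[0.8, 0.1, 0.1],
                  [0.1, 0.8, 0.1],
                  [0.1, 0.1, 0.8]])
    p_x = np.array([0.5, 0.25, 0.25])

    p_y = p_x @ P                                        # output probabilities
    H_Y = -np.sum(p_y[p_y > 0] * np.log2(p_y[p_y > 0]))  # output entropy, bits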
6. Consider a binary symmetric communication channel, whose input source is the alphabet X =
{0, 1} with probabilities {0.5, 0.5}; whose output alphabet is Y = {0, 1}; and whose channel
matrix is

P = | 1−ε    ε  |
    |  ε    1−ε |

where ε is the probability of transmission error.
(a) What is the entropy of the source, H(X)?
(b) What is the probability distribution of the outputs, p(Y), and the entropy of this output
distribution, H(Y)?
(c) What is the joint probability distribution for the source and the output, p(X, Y), and
what is the joint entropy, H(X, Y)?
(d) What is the mutual information of this channel, I(X;Y)?
(e) How many values of ε are there for which the mutual information of this channel is
maximal? What are those values, and what then is the capacity of such a channel in bits?
(f) For what value of ε is the capacity of this channel minimal? What is the channel
capacity in that case?
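
With equiprobable inputs the BSC output is also equiprobable, so H(Y) = 1 bit and I(X;Y) = 1 − H_b(ε), where H_b is the binary entropy function. A sketch that evaluates this and locates the extremes asked about in (e) and (f):

    import numpy as np

    def Hb(e):
        """Binary entropy function in bits, with Hb(0) = Hb(1) = 0."""
        if e in (0.0, 1.0):
            return 0.0
        return -e * np.log2(e) - (1 - e) * np.log2(1 - e)

    def I_bsc(e):
        # H(Y) = 1 for equiprobable inputs, H(Y|X) = Hb(e)
        return 1.0 - Hb(e)

    print(I_bsc(0.0), I_bsc(1.0))   # maximal: 1 bit at eps = 0 and eps = 1
    print(I_bsc(0.5))               # minimal: 0 bits at eps = 0.5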

7. A channel is described by the following channel matrix.


P = | 1/2    1/2    0 |
    |  0      0     1 |
(a) Draw the channel diagram.
(b) Find the channel capacity.
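
The capacity of a DMC can also be checked numerically with the Blahut–Arimoto iteration; a minimal sketch (the helper names are mine, not from the course material):

    import numpy as np

    def kl_rows_bits(P, q):
        """D( P(.|x) || q ) in bits for every input row x, with 0 log 0 = 0."""
        D = np.zeros(P.shape[0])
        for x in range(P.shape[0]):
            nz = P[x] > 0
            D[x] = np.sum(P[x, nz] * np.log2(P[x, nz] / q[nz]))
        return D

    def blahut_arimoto(P, iters=500):
        """Capacity in bits of a DMC with row-stochastic matrix P (rows = inputs)."""
        p = np.full(P.shape[0], 1.0 / P.shape[0])   # start from uniform inputs
        for _ in range(iters):
            D = kl_rows_bits(P, p @ P)
            p = p * np.exp2(D)                      # multiplicative update
            p /= p.sum()
        D = kl_rows_bits(P, p @ P)
        return float(np.sum(p * D))                 # = I(X;Y) at the final input law

    P = np.array([[0.5, 0.5, 0.0],
                  [0.0, 0.0, 1.0]])
    print(blahut_arimoto(P))                        # converges to 1.0 bit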
