Comsysf311 L39

This document summarizes key concepts from a lecture on information theory and coding. It discusses: 1. Channel capacity for discrete memoryless systems and additive white Gaussian noise channels. 2. Entropy calculations for discrete and continuous random variables. 3. Mutual information and how it relates to channel capacity. 4. The conditions that must be met for a continuous channel input to maximize channel capacity, including constraints on the probability density function and average power.

Lecture 39: Module 6

Information Theory and Coding:
Channel Capacity: Discrete and AWGN

Dr. S. M. Zafaruddin
Assistant Professor
Dept. of EEE, BITS Pilani, Pilani Campus
Dr. Zafar (BITS Pilani) CommSys: L39 IT
Objectives of Today's Lecture

1 Channel Capacity for DMS

2 Channel Capacity for AWGN


Module 6: Information Theory

• First part: Source coding
• Second part: Channel capacity
• Third part: Channel coding


Discrete Memoryless Channel: Transition Matrix

• Discrete: both the input and output alphabets are discrete.
• Memoryless: the current output depends only on the current input, not on previous ones.
• X = {x1, x2, ..., xi}, Y = {y1, y2, ..., yj}
• P(yj | xi): transition probability, i.e., the probability that yj is received when xi is transmitted.


Example: BSC

• Sketch the BSC and find its capacity.
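As a worked instance of this exercise: the BSC with crossover probability p has capacity C = 1 − H(p), attained with equiprobable inputs. A minimal sketch in Python (the probability values passed in are illustrative):

```python
import math

def binary_entropy(p):
    """H(p) = p*log2(1/p) + (1-p)*log2(1/(1-p)), with H(0) = H(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover probability p."""
    return 1.0 - binary_entropy(p)

print(bsc_capacity(0.0))  # noiseless channel: 1 bit per symbol
print(bsc_capacity(0.5))  # useless channel: 0 bits per symbol
```

A crossover of p = 0.5 makes the output independent of the input, which is why the capacity collapses to zero.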


Entropy: Discrete

• Entropy: H(X) = ∑_i P(xi) log2 (1/P(xi)) bits per message
• Entropy: H(Y) = ∑_j P(yj) log2 (1/P(yj)) bits per message
• Conditional entropy: H(X|Y) = ∑_i ∑_j P(xi, yj) log2 (1/P(xi|yj)) bits per message
• Conditional entropy: H(Y|X) = ∑_j ∑_i P(xi, yj) log2 (1/P(yj|xi)) bits per message
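These discrete formulas can be evaluated directly from a joint distribution. A short sketch, using a hypothetical 2×2 joint P(xi, yj) (not a value from the lecture):

```python
import math

def entropy(probs):
    """H = sum p*log2(1/p) bits, skipping zero-probability messages."""
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

# Hypothetical joint distribution P(xi, yj) for a 2-input, 2-output channel.
P = [[0.4, 0.1],
     [0.1, 0.4]]

Px = [sum(row) for row in P]                              # marginal P(xi)
Py = [sum(P[i][j] for i in range(2)) for j in range(2)]   # marginal P(yj)

HX = entropy(Px)
HY = entropy(Py)
# H(X|Y) = sum_ij P(xi,yj) log2(1/P(xi|yj)), with P(xi|yj) = P(xi,yj)/P(yj)
HX_given_Y = sum(P[i][j] * math.log2(Py[j] / P[i][j])
                 for i in range(2) for j in range(2) if P[i][j] > 0)
print(HX, HY, HX_given_Y)
```

Note that H(X|Y) comes out smaller than H(X): observing Y reduces the uncertainty about X.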


Entropy: Continuous

• Entropy: H(X) = ∫_{−∞}^{∞} fX(x) log2 (1/fX(x)) dx bits per message
• Conditional entropy: H(X|Y) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} fXY(x, y) log2 (1/fX|Y(x|y)) dx dy bits per message


Mutual Information I(X; Y): Discrete

• Mutual information: I(X; Y) = H(X) − H(X|Y) = H(Y) − H(Y|X)
• I(X; Y) = ∑_i ∑_j P(xi, yj) log2 [P(xi, yj) / (P(xi) P(yj))]
• I(X; Y) = ∑_i ∑_j P(xi) P(yj|xi) log2 [P(yj|xi) / ∑_i P(xi) P(yj|xi)]
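A small sketch of the first summation form, evaluated on a hypothetical joint distribution (a BSC with crossover 0.2 and equiprobable inputs; the numbers are illustrative):

```python
import math

def mutual_information(P):
    """I(X;Y) = sum_ij P(xi,yj) log2[ P(xi,yj) / (P(xi) P(yj)) ] in bits."""
    nx, ny = len(P), len(P[0])
    Px = [sum(P[i]) for i in range(nx)]                       # marginal P(xi)
    Py = [sum(P[i][j] for i in range(nx)) for j in range(ny)] # marginal P(yj)
    return sum(P[i][j] * math.log2(P[i][j] / (Px[i] * Py[j]))
               for i in range(nx) for j in range(ny) if P[i][j] > 0)

# Hypothetical joint P(xi, yj) = P(xi) P(yj|xi): BSC, p = 0.2, equiprobable inputs.
P = [[0.4, 0.1],
     [0.1, 0.4]]
print(mutual_information(P))  # equals 1 - H(0.2), about 0.278 bits
```

For independent X and Y the joint factors as P(xi)P(yj), every log term is zero, and I(X; Y) = 0.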


Mutual Information I(X; Y): Continuous

• Mutual information: I(X; Y) = H(X) − H(X|Y) = H(Y) − H(Y|X)
• I(X; Y) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} fXY(x, y) log2 [fXY(x, y) / (fX(x) fY(y))] dx dy bits per message


Channel Capacity C

• Channel capacity C = max_{P(xi)} I(X; Y) bits per symbol (discrete input).
• Channel capacity C = max_{fX(x)} I(X; Y) bits per symbol (continuous input).
• For a symmetric discrete channel, equiprobable messages maximize the capacity.
• What should the condition be for a continuous channel?
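For a symmetric channel such as the BSC, the maximization C = max_{P(xi)} I(X; Y) is attained at equiprobable inputs. A brute-force sweep over the input distribution illustrates this (the crossover probability p = 0.1 is an illustrative choice, not from the lecture):

```python
import math

def mi_bsc(q, p):
    """I(X;Y) for a BSC with crossover p and input distribution (q, 1-q)."""
    # Joint distribution P(xi, yj) = P(xi) P(yj|xi)
    P = [[q * (1 - p), q * p],
         [(1 - q) * p, (1 - q) * (1 - p)]]
    Px = [q, 1 - q]
    Py = [P[0][0] + P[1][0], P[0][1] + P[1][1]]
    return sum(P[i][j] * math.log2(P[i][j] / (Px[i] * Py[j]))
               for i in range(2) for j in range(2) if P[i][j] > 0)

p = 0.1  # hypothetical crossover probability
best_q = max((q / 100 for q in range(1, 100)), key=lambda q: mi_bsc(q, p))
print(best_q)  # 0.5: equiprobable inputs maximize I(X;Y)
```

The resulting maximum, mi_bsc(0.5, 0.1), matches the closed form 1 − H(0.1) for the BSC.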


Condition for Maximum Entropy: Continuous

• Maximize H(X) = ∫_{−∞}^{∞} fX(x) log2 (1/fX(x)) dx
• subject to the constraints:
• Constraint 1: ∫_{−∞}^{∞} fX(x) dx = 1
• Any other constraint?
• The input signal should have limited power.
• Another constraint: the mean square value of x is fixed.
• Constraint 2: ∫_{−∞}^{∞} x² fX(x) dx = σ²


Condition for Maximum Entropy: Continuous

• The solution of the above optimization problem:
• fX(x) = (1/√(2πσ²)) e^{−x²/(2σ²)}
• What is the maximum entropy?
• H(X) = ∫_{−∞}^{∞} fX(x) log2 (1/fX(x)) dx = (1/2) log2(2πeσ²)
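The closed form H(X) = (1/2) log2(2πeσ²) can be sanity-checked by numerically integrating fX(x) log2(1/fX(x)) for the Gaussian density (σ = 2 is an arbitrary illustrative choice):

```python
import math

sigma = 2.0  # hypothetical standard deviation
f = lambda x: math.exp(-x * x / (2 * sigma**2)) / math.sqrt(2 * math.pi * sigma**2)

# Riemann sum of f(x) log2(1/f(x)) over [-10*sigma, 10*sigma]
dx = 1e-4
xs = (-10 * sigma + k * dx for k in range(int(20 * sigma / dx)))
H_numeric = sum(f(x) * math.log2(1.0 / f(x)) * dx for x in xs if f(x) > 0)

H_closed = 0.5 * math.log2(2 * math.pi * math.e * sigma**2)
print(H_numeric, H_closed)  # both close to 3.047 bits
```

Unlike discrete entropy, this differential entropy depends on σ and can even be negative for small σ; only the Gaussian achieves the maximum under the fixed-σ² constraint.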


Channel Capacity of Band-Limited AWGN

• y(t) = x(t) + n(t)
• Sampled signal: y = x + n
• H(y|x)?
• What is the PDF of y given x?
• It is Gaussian with shifted mean x.
• H(y|x) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} fXY(x, y) log2 (1/fY|X(y|x)) dx dy bits per message


Channel Capacity of Band-Limited AWGN

• H(y|x) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} fXY(x, y) log2 (1/fY|X(y|x)) dx dy
• H(y|x) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} fX(x) fY|X(y|x) log2 (1/fY|X(y|x)) dx dy
• H(y|x) = ∫_{−∞}^{∞} fX(x) dx · ∫_{−∞}^{∞} fY|X(y|x) log2 (1/fY|X(y|x)) dy (the inner integral turns out not to depend on x)
• H(y|x) = ∫_{−∞}^{∞} fY|X(y|x) log2 (1/fY|X(y|x)) dy, since ∫ fX(x) dx = 1
• fY|X(y|x) = fn(y − x)
• H(y|x) = ∫_{−∞}^{∞} fn(y − x) log2 (1/fn(y − x)) dy
• Let z = y − x: H(y|x) = ∫_{−∞}^{∞} fn(z) log2 (1/fn(z)) dz, which is indeed independent of x
• H(y|x) = H(n)


Channel Capacity of Band-Limited AWGN

• I(x; y) = H(y) − H(y|x) = H(y) − H(n)
• Capacity: For a given H(n), I(x; y) is maximum when H(y) is maximum.
• y = x + n
• For a given mean square value ȳ² = S + N, H(y) is maximum when y is Gaussian.
• Maximum H(y): (1/2) log2[2πe(S + N)]
• Since n is Gaussian, y will be Gaussian only if x is Gaussian.
• fX(x) = (1/√(2πS)) e^{−x²/(2S)}


Shannon Channel Capacity of Band-Limited AWGN

• I(x; y) = H(y) − H(y|x) = H(y) − H(n)
• Maximum H(y): (1/2) log2[2πe(S + N)]
• H(n) = (1/2) log2[2πeN]
• For the signal: the best distribution is Gaussian.
• For the noise: the worst distribution is Gaussian.
• C = (1/2) log2[2πe(S + N)] − (1/2) log2[2πeN]
• C = (1/2) log2(1 + S/N) bits per symbol
• For a bandwidth of B: 2B symbols per second (Nyquist rate)
• C = 2B · (1/2) log2(1 + S/N) = B log2(1 + S/N) = B log2(1 + SNR) bits per second
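The final formula C = B log2(1 + S/N) is straightforward to evaluate. The numbers below are illustrative telephone-channel-like values, not values from the lecture:

```python
import math

def shannon_capacity(B, S, N):
    """C = B * log2(1 + S/N) in bits per second, for bandwidth B in Hz."""
    return B * math.log2(1 + S / N)

# Hypothetical example: B = 3 kHz, SNR = 30 dB (S/N = 1000)
print(shannon_capacity(3000, 1000, 1))  # about 29.9 kbit/s
```

Note the asymmetry the formula implies: capacity grows linearly with bandwidth but only logarithmically with SNR.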


Shannon Channel Capacity: Infinite BW

• C = B log2(1 + S/(N0 B)), where N0 is the noise PSD and N = N0 B is the noise power.
• lim_{B→∞} C = lim_{B→∞} B log2(1 + S/(N0 B))
• lim_{B→∞} B log2(1 + S/(N0 B)) = lim_{B→∞} (S/N0) · (N0 B/S) log2(1 + S/(N0 B))
• lim_{B→∞} B log2(1 + S/(N0 B)) = (S/N0) log2 e ≈ 1.44 S/N0
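The limit C → 1.44 S/N0 can also be observed numerically by evaluating B log2(1 + S/(N0 B)) for growing B (S = N0 = 1 is an arbitrary normalization):

```python
import math

S, N0 = 1.0, 1.0  # hypothetical signal power and noise PSD

def capacity(B):
    """C(B) = B * log2(1 + S/(N0*B)): capacity of a band-limited AWGN channel."""
    return B * math.log2(1 + S / (N0 * B))

limit = (S / N0) * math.log2(math.e)  # about 1.4427 * S/N0
for B in (1, 10, 100, 1e4, 1e6):
    print(B, capacity(B))  # increases monotonically toward the limit
print(limit)
```

Capacity saturates: past a point, extra bandwidth buys almost nothing, because the noise power N0·B grows along with B.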


System Adaptation

• Nyquist versus Shannon



HW39

Problem 12.5-3, B. P. Lathi, 4th SE Edition.

