Lecture 6 - Communication Channels and Channel Capacity
6.1 Introduction
In telecommunications and computer networking, a communication channel refers to
a physical transmission medium, such as a wire, or a logical connection over a
multiplexed medium, such as a radio channel. A channel is used to convey an
information signal, for example, a digital bit stream, from one or several senders (or
transmitters) to one or several receivers.
6.2 Types of channel
A channel has a specific capacity for transmitting information, often measured by its bandwidth in Hz or its data rate in bits per second.
1. Symmetric channel
The symmetric channel satisfies the following conditions:
a- X and Y have equal numbers of symbols, i.e., P(Y|X) is a square matrix.
b- Every row of the P(Y|X) matrix is a permutation of every other row.
For example, the conditional probability matrices of several channel types are shown below:

a- P(Y|X) = [0.9  0.1]
            [0.1  0.9]

This is a BSC (binary symmetric channel): the matrix is square and each row is a permutation of the other.
For a BSC with crossover probability P:
P(Y=0|X=0) = 1 − P
P(Y=0|X=1) = P
P(Y=1|X=0) = P
P(Y=1|X=1) = 1 − P
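As a quick numerical check of conditions (a) and (b) above, the short Python sketch below (illustrative, not part of the original lecture) builds the BSC matrix for a crossover probability P = 0.1 and verifies that it is square with rows that are permutations of each other:

```python
import numpy as np

def bsc_matrix(p):
    """Transition matrix P(Y|X) of a BSC with crossover probability p."""
    return np.array([[1 - p, p],
                     [p, 1 - p]])

P = bsc_matrix(0.1)
# Condition (a): square; condition (b): rows are permutations of each other
square = P.shape[0] == P.shape[1]
permuted_rows = all(sorted(row) == sorted(P[0]) for row in P)
print(P, square, permuted_rows)   # [[0.9 0.1] [0.1 0.9]] True True
```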
The TSC (ternary symmetric channel) is symmetric but impractical, since in practice x1 and x3 are not affected as much as x2: the interference between x1 and x3 is much less than the interference between x1 and x2 or between x2 and x3.
Hence, the more practical but non-symmetric channel has the transition probability matrix:

P(Y|X) = [1−P    P     0 ]
         [ P   1−2P    P ]
         [ 0     P   1−P ]

where x1 interferes with x2 exactly as x2 interferes with x3, while x1 and x3 do not interfere.
Note that the binary erasure channel (BEC) has zero probability of bit error. In other words, the following conditional probabilities hold for any BEC model:
P(Y="erasure"|X=0) = P
P(Y="erasure"|X=1) = P
P(Y=0|X=0) = 1 − P
P(Y=1|X=1) = 1 − P
P(Y=0|X=1) = 0
P(Y=1|X=0) = 0
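To make the model concrete, here is a minimal sketch of the BEC transition matrix implied by these probabilities, assuming an illustrative erasure probability P = 0.1 and the output ordering y = 0, "erasure", 1:

```python
import numpy as np

p = 0.1                             # illustrative erasure probability
# Rows: x = 0, 1;  columns: y = 0, "erasure", 1
P_bec = np.array([[1 - p, p, 0.0],
                  [0.0,   p, 1 - p]])

print(P_bec.sum(axis=1))            # rows sum to 1 -> [1. 1.]
print(P_bec[0, 2], P_bec[1, 0])     # bit-error probabilities are both 0
```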
6. Special Channels
A. Lossless channel: It has only one nonzero element in each column of the transition matrix P(Y|X). This channel has H(X|Y) = 0 and I(X,Y) = H(X), i.e., zero loss entropy.
B. Deterministic channel: It has only one nonzero element in each row of the transition matrix P(Y|X). This channel has H(Y|X) = 0 and I(X,Y) = H(Y), i.e., zero noise entropy.
C. Ideal channel: It has only one nonzero element in each row and each column of the transition matrix P(Y|X), i.e., it is an identity matrix. For example:

P(Y|X) = [1  0  0]
         [0  1  0]   (rows: x1, x2, x3; columns: y1, y2, y3)
         [0  0  1]

This channel has H(X|Y) = H(Y|X) = 0 and I(X,Y) = H(X) = H(Y).
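Since these three special channels are defined purely by the pattern of nonzero entries in P(Y|X), they can be recognized mechanically. The helper below is an illustrative sketch (the function classify is ours, not the lecture's):

```python
import numpy as np

def classify(P, tol=1e-12):
    """Label P(Y|X) as ideal, lossless, deterministic, or general."""
    nz_rows = (np.abs(P) > tol).sum(axis=1)   # nonzeros per row
    nz_cols = (np.abs(P) > tol).sum(axis=0)   # nonzeros per column
    lossless = bool(np.all(nz_cols == 1))
    deterministic = bool(np.all(nz_rows == 1))
    if lossless and deterministic:
        return "ideal"
    if lossless:
        return "lossless"
    if deterministic:
        return "deterministic"
    return "general"

print(classify(np.eye(3)))                    # -> ideal
print(classify(np.array([[0.5, 0.5, 0.0],
                         [0.0, 0.0, 1.0]])))  # -> lossless
```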
The channel capacity is defined as C = max I(X,Y), where the maximization is over the input probabilities. For a symmetric channel this maximum is easy to find:

I(X,Y) = H(Y) − H(Y|X), with H(Y|X) = −Σi Σj P(xi, yj) log2 P(yj|xi)

But we have P(xi, yj) = P(xi) P(yj|xi); substituting into the above equation yields:

H(Y|X) = −Σ(i=1..n) P(xi) Σ(j=1..m) P(yj|xi) log2 P(yj|xi)

For a symmetric channel, the inner sum K = Σj P(yj|xi) log2 P(yj|xi) is a constant, independent of the row number i, so the equation becomes:

H(Y|X) = −K Σi P(xi) = −K, and hence I(X,Y) = H(Y) + K
Max of I(X,Y) = max[H(Y) + K] = max[H(Y)] + K
When Y has equiprobable symbols, then max[H(Y)] = log2 m
Then:
I(X,Y)max = log2 m + K, or: C = log2 m + K      (4)
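Equation (4) is straightforward to evaluate numerically. A minimal sketch, applied to the BSC matrix shown earlier (P = 0.1, so m = 2):

```python
import numpy as np

P = np.array([[0.9, 0.1],
              [0.1, 0.9]])     # symmetric: every row is a permutation

# K = sum_j P(yj|xi) log2 P(yj|xi), the same for any row i
K = np.sum(P[0] * np.log2(P[0]))
m = P.shape[1]                 # number of output symbols
C = np.log2(m) + K             # equation (4)
print(K, C)                    # ≈ -0.469, ≈ 0.531 bits/symbol
```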
The channel efficiency: η = I(X,Y)/C
Example 2: Given the BSC below, determine the channel capacity and the channel efficiency if P(x1) = 0.4.
Solution:
P(Y|X) = [0.9  0.1]
         [0.1  0.9]
Since the channel is symmetric:
K = 0.9 log2 0.9 + 0.1 log2 0.1 = −0.469
C = log2 m + K = log2 2 − 0.469 = 0.531 bits/symbol
P(y1) = 0.4 × 0.9 + 0.6 × 0.1 = 0.42, P(y2) = 0.58
H(Y) = −0.42 log2 0.42 − 0.58 log2 0.58 = 0.981 bits/symbol
H(Y|X) = −K = 0.469 bits/symbol
I(X,Y) = H(Y) − H(Y|X) = 0.981 − 0.469 ≈ 0.513 bits/symbol
η = I(X,Y)/C × 100% = 0.513/0.531 × 100% = 96.61%
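The figures above can be reproduced with a few lines of Python (a verification sketch, not part of the original solution; last-digit differences are rounding):

```python
import numpy as np

P = np.array([[0.9, 0.1],
              [0.1, 0.9]])          # P(Y|X) of the BSC
px = np.array([0.4, 0.6])           # P(x1) = 0.4, P(x2) = 0.6

py = px @ P                         # P(Y) = [0.42, 0.58]
H_Y = -np.sum(py * np.log2(py))                 # marginal entropy H(Y)
row_ent = -np.sum(P * np.log2(P), axis=1)       # H(Y|X=xi) for each row
H_Y_given_X = np.sum(px * row_ent)              # noise entropy H(Y|X)

I = H_Y - H_Y_given_X               # transinformation ≈ 0.512 bits/symbol
C = 1 - row_ent[0]                  # BSC capacity = log2(2) + K ≈ 0.531
eta = I / C                         # efficiency ≈ 96.5%
print(round(I, 3), round(C, 3), f"{eta:.2%}")
```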
For information channels connected in series (cascade), the overall capacity does not exceed the capacity of any individual channel.
Example 3: Find the transition matrix P(Z|X) for the cascaded channel shown:
Solution: For two cascaded channels, the overall transition matrix is the matrix product P(Z|X) = P(Y|X) P(Z|Y); a numerical sketch follows the homework below.
Homework: For the previous example (3), find P(Y) and P(Z) if P(X) = [0.7  0.3].
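Because the figure's matrices are not reproduced here, the sketch below uses placeholder values; it shows the general computation P(Z|X) = P(Y|X) P(Z|Y) together with the homework quantities P(Y) and P(Z):

```python
import numpy as np

P_yx = np.array([[0.8, 0.2],        # P(Y|X) -- placeholder values,
                 [0.3, 0.7]])       # not the figure's actual matrix
P_zy = np.array([[0.9, 0.1],        # P(Z|Y) -- placeholder values
                 [0.2, 0.8]])

P_zx = P_yx @ P_zy                  # cascade: P(Z|X) = P(Y|X) P(Z|Y)
px = np.array([0.7, 0.3])           # P(X) from the homework
py = px @ P_yx                      # P(Y) = P(X) P(Y|X)
pz = px @ P_zx                      # P(Z) = P(X) P(Z|X)
print(P_zx, py, pz, sep="\n")
```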
The Shannon-Hartley theorem states that the channel capacity is given by:
C = B log2(1 + S/N)      (7)
Where C is the capacity in bits per second, B is the channel's bandwidth in Hertz,
and S/N is the signal-to-noise ratio.
The thermal noise n(t) has a Gaussian probability density function with zero mean and variance σ²:

p(n) = (1/√(2πσ²)) e^(−0.5 n²/σ²)      (8)

This noise has two-sided power spectral density Gn(f) = η/2 W/Hz and one-sided power spectral density Gn(f) = η W/Hz.
Since the spectrum is flat, we call this noise white noise. This white noise affects the signal x(t) as an additive term, i.e., the received signal is y(t) = x(t) + n(t). Such thermal noise is therefore commonly called Additive White Gaussian Noise (AWGN). The figure below shows how this AWGN affects an equiprobable bipolar ±A signal.
The differential entropy of a Gaussian variable with variance σ² is:

H(X) = −∫ p(x) ln[(1/√(2πσ²)) e^(−x²/2σ²)] dx   nats/sample

H(X) = ∫ p(x) ln√(2πσ²) dx + ∫ p(x) (x²/2σ²) dx

Using ∫ x² p(x) dx = σ² and ∫ p(x) dx = 1, then:

H(X) = ln√(2πσ²) + 0.5 = ln√(2πσ²) + ln√e = ln√(2πeσ²)   nats/sample
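The closed form H(X) = ln√(2πeσ²) can be verified numerically; the sketch below uses an arbitrary σ = 2:

```python
import numpy as np

sigma = 2.0                         # arbitrary standard deviation
dx = 0.001
x = np.arange(-10 * sigma, 10 * sigma, dx)
p = np.exp(-0.5 * (x / sigma) ** 2) / np.sqrt(2 * np.pi * sigma**2)

H_num = -np.sum(p * np.log(p)) * dx          # -∫ p ln p dx (Riemann sum)
H_closed = np.log(np.sqrt(2 * np.pi * np.e * sigma**2))
print(H_num, H_closed)                       # both ≈ 2.112 nats/sample
```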
It should be noted that the maximization is already included when we take the case of Gaussian noise, then:

C = log2(√(2πe) σy) − log2(√(2πe) σn)

Using the previous expression of H(X) for a Gaussian signal, applied to the received signal y with variance σy² and to the noise n(t) with variance σn² (the noise power):

C = log2(σy/σn) = (1/2) log2(σy²/σn²)

But σy² = σx² + σn² (sum of powers), so:

C = (1/2) log2((S + N)/N) = (1/2) log2(1 + S/N)   bits/sample
For an analog signal sampled at the Nyquist rate, the sampling frequency is fs = 2B samples/sec, where B is the bandwidth of the signal; hence:
C = (1/2) log2(1 + S/N) × 2B   bits/sec
Or:
C = B log2(1 + S/N)   bps
This is a very important formula, the Shannon equation, named after C. E. Shannon. It is sometimes called the Shannon-Hartley equation.
Care must be taken regarding the units: here B is in Hz, SNR is the absolute (not dB) signal-to-noise power ratio, and C is in bits/sec. If the SNR is given in dB, then:
SNR(absolute) = 10^(SNR dB / 10)
The ratio C/B = log2(1 + SNR) gives what is called the channel utilization ratio (bps per Hz), which increases with SNR.
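A small helper (illustrative; the function name is ours) that applies the dB conversion and equation (7), using the telephone channel of Example 5 below as a test case:

```python
import math

def shannon_capacity(bandwidth_hz, snr_db):
    """C = B log2(1 + SNR), with the SNR supplied in dB."""
    snr = 10 ** (snr_db / 10)          # dB -> absolute power ratio
    return bandwidth_hz * math.log2(1 + snr)

C = shannon_capacity(3500, 15)         # telephone channel of Example 5
print(C)          # ≈ 17600 bps (Example 5 rounds the SNR to 31 -> 17500)
print(C / 3500)   # channel utilization ratio ≈ 5.03 bps/Hz
```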
The equation C = B log2(1 + SNR) gives the maximum theoretical performance: the maximum bit rate that can be transmitted over a channel having bandwidth B at the given SNR.
Example 4: Find the channel capacity of a Gaussian channel if its bandwidth increases
without a limit.
Solution:
B increases without limit means B → ∞. Since the noise power is N = ηB, then:
lim(B→∞) C = lim(B→∞) B log2(1 + S/(ηB))
Using ln(1 + x) → x as x → 0, with x = S/(ηB):
lim(B→∞) C = (S/η) log2 e ≈ 1.44 S/η
Note:
The result of the previous example indicates that the channel capacity C approaches the limit 1.44 S/η even if B is very large. This result is very important for bandwidth-unlimited but power-limited channels, such as satellite channels, where the bandwidth may be large but the signal power is the critical parameter.
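Numerically, with S/η = 1 chosen arbitrarily so that the limit is log2 e ≈ 1.44, capacity climbs toward this limit as B grows (an illustrative sketch):

```python
import numpy as np

S_over_eta = 1.0     # S/eta chosen arbitrarily, so the limit is log2(e)
for B in [1e3, 1e6, 1e9, 1e12]:
    C = B * np.log2(1 + S_over_eta / B)
    print(f"B = {B:.0e} Hz  ->  C = {C:.6f} bits/sec")
print("limit:", np.log2(np.e) * S_over_eta)   # 1.442695 = 1.44 S/eta
```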
Example 5: Find the maximum theoretical information rate that can be transmitted over a telephone channel having 3.5 kHz bandwidth and 15 dB SNR.
Solution:
C is the maximum theoretical information rate. Using the Shannon equation with SNR(absolute) = 10^(15/10) = 31.6 ≈ 31:
C = 3500 log2(1 + 31) = 3500 × 5 = 17500 bps
Example 6: A source with an entropy of 4 bits/symbol emits 500 symbols per second. Can its output be transmitted over the telephone channel of the previous example?
Solution:
First, we find the rate of information from the source, which is the source entropy rate R(X):
R(X) = 4 × 500 = 2000 bps. Now, since R(X) = 2000 bps < C = 17500 bps, it is possible to transmit the source output over this channel.
Example 7: Find the minimum theoretical time it would take to transmit 2500 octal digits over the telephone channel of the previous example (Example 6).
Solution:
From the previous example, C = 17500 bps. The minimum theoretical transmission time is obtained when the channel operates at the maximum rate C, then:
Amount of information = 2500 × log2 8 = 7500 bits (each octal digit carries log2 8 = 3 bits of information), then:
Tmin = 7500/17500 ≈ 0.429 sec.
Example 8: Find the minimum theoretical SNR required to transmit compressed video information at a rate of 27 Mbps over a channel having 5 MHz bandwidth.
Solution:
For the minimum theoretical SNR, set C = source bit rate = 27 Mbps, then:
C = B log2(1 + S/N)
27 × 10^6 = 5 × 10^6 log2(1 + SNR), or:
log2(1 + SNR) = 5.4
SNR = 2^5.4 − 1 ≈ 41.2 (about 16.2 dB)
Example 9: If the channel capacity is 2000 bps, find the bandwidth required if the SNR is 3.
Solution:
C = B log2(1 + S/N)
2000 = B log2(1 + 3)
2000 = B log2 4 = 2B
B = 1000 Hz = 1 kHz
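Examples 8 and 9 are the Shannon equation solved for different unknowns; a minimal sketch (helper names are ours):

```python
import math

def required_snr(C, B):
    """Minimum absolute SNR for capacity C (bps) over bandwidth B (Hz)."""
    return 2 ** (C / B) - 1

def required_bandwidth(C, snr):
    """Bandwidth (Hz) needed for capacity C (bps) at absolute SNR."""
    return C / math.log2(1 + snr)

print(required_snr(27e6, 5e6))       # Example 8: ≈ 41.2 (≈ 16.2 dB)
print(required_bandwidth(2000, 3))   # Example 9: 1000.0 Hz
```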
Example 10: Determine the SNR required to achieve a channel capacity of 1585 bps if the bandwidth is 1 kHz.
Solution:
C = B log2(1 + S/N)
1585 = 1000 log2(1 + S/N)
1585/1000 = 1.585 = log2(1 + SNR)
1 + SNR = 2^1.585 = 3
SNR = 2
H.W
Q1: For the channel below having a transition matrix:
Q2: For the channel model shown below with P(x1) = 0.6 and P(x2) = 0.4, determine:
1- Marginal entropy.
2- Joint entropy.
3- Noise entropy H(Y|X).
4- Loss entropy H(X|Y).
Q3: A binary source sends x1 with probability 0.4 and x2 with probability 0.6 through a channel with error probabilities of 0.1 for x1 and 0.2 for x2. Determine:
1- Source entropy.
2- Marginal entropy.
3- Joint entropy.
4- Conditional entropy H(Y|X).
5- Loss entropy H(X|Y).
6- Transinformation.
Q4: For the channel having the transition matrix:
P(Y|X) = [2/3  1/3]
         [ 0    1 ]
         [0.5  0.5]
Q5: Consider a communication system that operates over an additive white Gaussian noise (AWGN) channel with a bandwidth of 4 kHz. The received signal power S is 10 mW, and the noise power spectral density is N0/2 = 10^(−9) W/Hz. Find the channel capacity.
Q6: What is the capacity of an AWGN channel with a bandwidth of 1 MHz, signal power of 10 W, and noise power spectral density N0/2 = 10^(−9) W/Hz?