
Gaussian Channel

• A discrete-time, continuous-alphabet channel with additive Gaussian noise:

Yi = Xi + Zi,  Zi ∼ N(0, N)

where EZi = 0 and EZi² = N, and Xi and Zi are statistically independent

• Motivation: noise at the input of a communication receiver is the aggregate effect of many independent random sources

Then, as the central-limit theorem predicts, the pdf of the noise is close to Gaussian

1. Capacity

What is the capacity of the Gaussian channel?

• Infinite if the noise variance N = 0

• Also infinite if the input Xi is unconstrained

– we can choose arbitrarily large input amplitudes, making it possible to recover them without error at the output

– but this needs infinite transmitter power

• Hence, consider channels with finite average input power:

EXi² ≤ P

– Then the capacity is finite

– The useful input alphabet is discrete (with noise, continuous inputs cannot be distinguished error-free at the output, but with coding a discrete input alphabet can be)

2. Example

We can transmit at least 1 − H2(Pe) bits per channel use over a Gaussian channel using the following scheme:

• Restrict the channel input to two values, Xi ∈ {−√P, +√P} (so that EXi² = P)

• Decode the output as follows:

X̂i = −√P if Y < 0,  X̂i = +√P if Y ≥ 0

• The probability of error of the resulting DMC is

Pe = Q(√(P/N))

and hence C ≥ 1 − H2(Pe)

However, as we will see, the capacity is much higher
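
As a quick numerical check, here is a minimal Python sketch (the SNR value and names are illustrative) comparing the rate bound 1 − H2(Pe) of this two-level scheme with the capacity (1/2) log2(1 + P/N) derived in the next section:

    import math

    def h2(p):
        # binary entropy in bits
        if p in (0.0, 1.0):
            return 0.0
        return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

    def q_func(x):
        # Gaussian tail probability Q(x) via the complementary error function
        return 0.5 * math.erfc(x / math.sqrt(2))

    snr = 1.0                           # P/N (illustrative)
    pe = q_func(math.sqrt(snr))         # Pe = Q(sqrt(P/N))
    print("two-level bound:", 1 - h2(pe))           # ~0.37 bits/use
    print("capacity       :", 0.5 * math.log2(1 + snr))  # 0.5 bits/use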

3. Information Capacity

C = max_{p(x): EX² ≤ P} I(X; Y)

I(X; Y) = h(Y) − h(Y|X)
        = h(Y) − h(Z|X)   (since Y = X + Z, h(Y|X) = h(Z|X))
        = h(Y) − h(Z)     (since Z is independent of X)

Since EY² = EX² + EZ² ≤ P + N,

h(Y) ≤ (1/2) log 2πe(P + N)

I(X; Y) ≤ (1/2) log 2πe(P + N) − (1/2) log 2πeN
        = (1/2) log(1 + P/N)

Equality is achieved when Y ∼ N(0, P + N), i.e., when X ∼ N(0, P)

∴ C = (1/2) log(1 + P/N) bits/use

Note: P/N is the signal-to-noise ratio (SNR)


4. Channel Coding Theorem for the Gaussian Channel

• First define an (M, n) code for the Gaussian channel with a power constraint P

– encoding function:

X^n : {1, . . . , M} → X^n

such that

Σ_{i=1}^n x_i²(w) ≤ nP,  w = 1, . . . , M

where w ∈ {1, . . . , M} (the message set) and X^n = {x^n(1), . . . , x^n(M)} (the codebook)

– decoding function:

g : Y^n → {1, . . . , M}

– Rate:

R = (log2 M)/n bits per use

– Error probabilities λ_w, λ^(n) and P_e^(n) defined as in the case of the DMC

• A rate R is said to be achievable for a Gaussian channel with a power constraint P if we can find a sequence of (⌈2^(nR)⌉, n) codes (for n → ∞) satisfying the power constraint such that the maximal error probability λ^(n) → 0

(by choosing a sufficiently large block size n, we can find a code with rate > R and λ^(n) arbitrarily small)

• Theorem: if n → ∞ is allowed, one can find (2^(nR), n) codes with λ^(n) → 0 whenever R < I(X; Y)

That is, all rates

R < C = (1/2) log(1 + P/N) bits/use

are achievable

• Converse: if there is a sequence of (2^(nR), n) codes with P_e^(n) → 0, then necessarily R ≤ C

Proof: parallels that of the DMC (pp. 266–270, Cover & Thomas)
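
To make the power constraint in this definition concrete, here is a small Python sketch, in the spirit of the random-coding argument used in the proof (the rate, block lengths, and the power back-off factor are all illustrative assumptions): codewords drawn i.i.d. N(0, 0.8P) satisfy Σ x_i² ≤ nP with probability approaching 1 as n grows.

    import math, random

    P, R = 1.0, 0.25
    for n in (10, 100, 1000):
        M = min(math.ceil(2 ** (n * R)), 200)  # cap codebook size for the demo
        ok = 0
        for _ in range(M):
            # draw one codeword with i.i.d. N(0, 0.8*P) entries
            x = [random.gauss(0.0, math.sqrt(0.8 * P)) for _ in range(n)]
            ok += sum(v * v for v in x) <= n * P  # power-constraint check
        print(f"n = {n:4d}: {ok}/{M} codewords satisfy the constraint")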

5. A Plausibility Argument

• Consider each codeword as a point in R^n

• The possible set of channel outputs for each codeword X^n(w) is a set of points centered around the point X^n(w) in R^n

⇒ Y^n for X^n(w) lies inside a sphere of radius ≈ √(nN) with very high probability for large n

• The spheres corresponding to all codewords lie inside a sphere of radius ≈ √(n(P + N))

• Let the volume of a sphere with radius r be A_n r^n

Then, the total number of input codewords one can have such that the outputs do not overlap (error prob. → 0) is

M = A_n [n(P + N)]^(n/2) / (A_n (nN)^(n/2)) = ((P + N)/N)^(n/2)

• Maximum rate:

(log M)/n = (1/n) log((P + N)/N)^(n/2) = (1/2) log(1 + P/N) bits per use

in agreement with the capacity formula
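
The concentration of the noise vector around radius √(nN) is easy to check numerically; here is a minimal Python sketch (the noise variance and block lengths are illustrative):

    import math, random

    N = 2.0  # noise variance (illustrative)
    for n in (10, 100, 1000, 10000):
        z = [random.gauss(0.0, math.sqrt(N)) for _ in range(n)]
        radius = math.sqrt(sum(v * v for v in z))
        # the ratio tends to 1 as n grows: ||Z^n|| concentrates at sqrt(n*N)
        print(f"n = {n:6d}: ||Z|| / sqrt(n*N) = {radius / math.sqrt(n * N):.3f}")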

6. Sphere Packing

(figure: codeword spheres of radius ≈ √(nN) packed inside the output sphere of radius ≈ √(n(P + N)))

7. Black Swans

Robert W. Lucky, IEEE Spectrum, Nov. 2010

8. Parallel Gaussian Channels
Yj = Xj + Zj , j = 1, . . . , k
where Z1, . . . , Zk are independent Gaussian random variables, Zj ∼ N(0, Nj)

Can also be viewed as a vector channel:

Y = X + Z

Power allocation problem:

Given the constraint

E‖X‖² ≤ P

on the total average input power, we wish to maximize the total capacity of the vector channel by allocating P1, . . . , Pk among the channels, such that

Σ_i Pi = P,  where Pi = EXi²

The capacity of the vector channel is

C = max_{p(x): E‖X‖² ≤ P} I(X; Y)

We need to solve

max_{P1, . . . , Pk} C

subject to

Σ_i Pi ≤ P
Pi ≥ 0 ∀i

• First show that

I(X; Y) ≤ Σ_i [h(Yi) − h(Zi)]
        ≤ Σ_i (1/2) log(1 + Pi/Ni)

Equality is achieved when the Yi ∼ N(0, Pi + Ni) and are independent

⇒ i.e., when the Xi ∼ N(0, Pi) and are independent

• Next find

Cmax = max_{P1, . . . , Pk} Σ_i (1/2) log(1 + Pi/Ni)

subject to

Σ_i Pi − P ≤ 0
Pi ≥ 0 ∀i

Maximize the Lagrangian

J(P) = Σ_i (1/2) log(1 + Pi/Ni) − λ(Σ_i Pi − P)

where P = (P1, . . . , Pk) and λ ≥ 0

Setting

∂J(P)/∂Pj = 0

gives

Pj = ν − Nj,  j = 1, . . . , k

where ν = 1/(2λ)

That is, when every channel gets positive power, we have to find ν such that ν > Ni ∀i and

Σ_i (ν − Ni) = P

Interpret the solution:

If all Pj > 0, then ν = P̄ + N̄ and

Nj < P̄ + N̄,  j = 1, . . . , k

where

N̄ = (1/k) Σ_i Ni,  P̄ = P/k

⇒ to have Pj > 0, we must have Nj < P̄ + N̄

Otherwise Pj = 0

i.e., not all channels may be allocated power, and ν is P̄ + N̄ computed over the channels with Pj > 0

9. Water-Filling

Choose ν such that

Σ_i (ν − Ni)+ = P

where

(x)+ = x, if x ≥ 0
       0, if x < 0
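
This is a one-dimensional root-finding problem, since Σ_i (ν − Ni)+ is increasing in ν. A minimal Python sketch (bisection on the water level ν; the noise levels and total power in the example are illustrative):

    import math

    def water_fill(noise, total_power, tol=1e-9):
        # find nu such that sum_i max(nu - N_i, 0) = total_power
        lo, hi = min(noise), max(noise) + total_power
        while hi - lo > tol:
            nu = (lo + hi) / 2
            if sum(max(nu - n, 0.0) for n in noise) < total_power:
                lo = nu
            else:
                hi = nu
        nu = (lo + hi) / 2
        return nu, [max(nu - n, 0.0) for n in noise]

    # Example: the noisiest channel (N3 = 4) is left unused
    noise = [1.0, 2.0, 4.0]
    nu, powers = water_fill(noise, total_power=3.0)
    cap = sum(0.5 * math.log2(1 + p / n) for p, n in zip(powers, noise))
    print(nu, powers, cap)  # nu = 3.0, powers = [2.0, 1.0, 0.0]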

10. Time-Varying Channels

• The channel output conditional distribution p(Y|X, Q) varies with time, where the random variable Q reflects the channel variations

e.g., wireless (radio) channels (due to signal fading)

• The capacity C(Q) is a function of Q

If Q is unknown and Q → 0 is possible, then the Shannon capacity is zero

This is the case with commonly used wireless channel models (e.g., Rayleigh, Ricean, or Nakagami fading)

• In practice, the transmitter and/or the receiver may have some knowledge of Q

e.g., we may know the distribution of Q, or we can measure Q (channel side-information)

Then two definitions of capacity may make sense:

– Ergodic capacity: Ce = E{C(Q)}

– Outage capacity

11. Example: Fading Channel

• The signal arriving at the receiver suffers from time-dependent multi-path fading

• We can model this channel as an AWGN channel with a time-dependent gain:

Yi = Qi Xi + Zi

where Qi is the random channel gain during the i-th transmission (Zi is Gaussian noise)

C(Q) = (1/2) log(1 + Q²P/N) bits/use

• Note that if Qi cannot be measured and Qi → 0 is also probable, no fixed rate exists at which reliable transmission can be guaranteed for all Qi, and hence C = 0

12. An Interpretation of Ergodic Capacity

• Suppose Qi in the above example is always known to both the receiver and the transmitter

– for each channel use, we can then adapt the transmitter and receiver to achieve capacity

(but need to vary the transmission rate)

• What is the maximum average rate at which reliable communication is possible?

13. Outage Capacity

Suppose fixed-rate transmission is required

For every rate R, two situations can occur:

• C(Q) > R (reliable transmission feasible)

• C(Q) < R (channel in “outage”)

Suppose we wish to be in outage only a% of the time

The outage capacity Cout is then the largest R for which

P(C(Q) < R) ≤ a%
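
Both definitions are easy to estimate by simulation. A Monte Carlo sketch, assuming (purely for illustration) Rayleigh fading, i.e., power gain Q² exponentially distributed with unit mean, and the per-use capacity C(Q) = (1/2) log2(1 + Q²P/N) from the fading example above:

    import math, random

    snr = 10.0          # P/N (illustrative)
    samples = 100_000
    # Rayleigh fading: Q^2 ~ Exp(1)
    caps = sorted(0.5 * math.log2(1 + random.expovariate(1.0) * snr)
                  for _ in range(samples))

    ergodic = sum(caps) / samples      # Ce = E{C(Q)}
    a = 5                              # tolerate outage 5% of the time
    outage = caps[samples * a // 100]  # largest R with P(C(Q) < R) <= a%
    print(f"Ce ~ {ergodic:.3f} bits/use, Cout(5%) ~ {outage:.3f} bits/use")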

14. Band-limited Channels

• So far we have considered discrete-time, continuous-alphabet channels

– the limit is then on the amount of information that can be transferred per channel use (no limit on channel uses per sec.)

• Now consider a continuous-time channel with finite bandwidth

– inputs and outputs are “waveforms”

– the bandwidth of the channel puts a limit on the allowed channel uses per second and hence on the information rate

15. Channel Model
                  Z(t)
                   |
                   v
    X(t) ------>  (+)  ------>  Linear Filter  ------>  Y(t)

• A linear time-invariant channel with impulse response h(t):

Y(t) = (X(t) + Z(t)) ∗ h(t)

where H(f) = F{h(t)} is an ideal bandpass filter with bandwidth W

• The channel noise Z(t) is white Gaussian noise (WGN) with power spectral density (psd) N0/2 watts/Hz

⇒ average noise power in 1 Hz of bandwidth = N0 watts

16. Capacity

Suppose we can reliably distinguish at most M distinct waveforms of duration T at the output of the band-limited AWGN channel

We can then send log2 M bits per T seconds

The capacity of the channel is

C = lim_{T→∞} (log2 M)/T bits/sec.

17. Line of Thinking

• A signal band-limited to W Hz can be exactly described by samples taken 1/(2W) sec. apart

If the signal has a duration of T sec., then it can be described by 2WT samples

This maps each band-limited, time-limited continuous-time signal into a unique point in R^(2WT)

• However, a band-limited signal cannot be exactly time-limited

Hence, think of a signal with most of its energy in a bandwidth of W and a time interval of T

• We can thus view the transmission of the signal during T sec. as 2WT uses of a discrete channel

18. Capacity of Band-limited Channels

Show that the continuous-time Gaussian channel with bandwidth W Hz, average power constraint P watts, and white noise psd N0/2 watts/Hz has the capacity

C = W log(1 + P/(N0 W)) bits per second

by establishing the following facts about the channel output waveform, sampled at 2W samples per sec:

• Noise samples are independent and have variance N0/2

• Power per signal sample is P/(2W)

Each sample is then one use of a discrete-time Gaussian channel, so

C = 2W · (1/2) log(1 + (P/2W)/(N0/2)) = W log(1 + P/(N0 W)) bits per second

C = W log(1 + P/(N0 W)) bits per second

where P/(N0 W) is the SNR at the channel output

• When the SNR is high, the capacity is linear in W and logarithmic in SNR (bandwidth-limited region):

C ≈ W log(P/(N0 W))

• When W is large, the capacity is linear in power (power-limited region):

lim_{W→∞} C = (P/N0) log2 e = 1.44 P/N0

(note: lim_{x→0} (1 + x)^(1/x) = e)
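
A small numerical sketch of this saturation (the power and psd values are illustrative), evaluating C(W) = W log2(1 + P/(N0 W)) for growing W:

    import math

    P, N0 = 1.0, 1.0                      # illustrative values
    limit = (P / N0) * math.log2(math.e)  # = 1.44 P/N0 bits per second
    for W in (1, 10, 100, 1000, 10000):
        C = W * math.log2(1 + P / (N0 * W))
        print(f"W = {W:6d} Hz: C = {C:.4f} bits/sec (limit {limit:.4f})")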

Question: when W → ∞, we can vary the input signal at any rate and hence the capacity increases, but why doesn't the capacity increase to infinity with W?

