
Convolutional Codes

COS 463: Wireless Networks


Lecture 9
Kyle Jamieson

[Parts adapted from H. Balakrishnan]


Today
1. Encoding data using convolutional codes
– Encoder state
– Changing code rate: Puncturing

2. Decoding convolutional codes: Viterbi Algorithm

Convolutional Encoding
• Don’t send message bits, send only parity bits

• Use a sliding window to select which message bits may participate in the parity calculations

Message bits: 1 0 1 1 0 1 0 0 1 0 1

Constraint length K

Sliding Parity Bit Calculation

K = 4
Message bits (indices -3 onward): 0 0 0 0 1 1 0 1 0 0 1 0 1

Window over indices -3 … 0: P[0] = 0
• Output: 0
Sliding Parity Bit Calculation

K = 4
Message bits (indices -3 onward): 0 0 0 0 1 1 0 1 0 0 1 0 1

Window over indices -2 … 1: P[1] = 1
• Output: 01
Sliding Parity Bit Calculation

K = 4
Message bits (indices -3 onward): 0 0 0 0 1 1 0 1 0 0 1 0 1

Window over indices -1 … 2: P[2] = 0
• Output: 010
Sliding Parity Bit Calculation

K = 4
Message bits (indices -3 onward): 0 0 0 0 1 1 0 1 0 0 1 0 1

Window over indices 0 … 3: P[3] = 1
• Output: 0101
Multiple Parity Bits

Message bits (indices -3 onward): 0 0 0 0 1 1 0 1 0 0 1 0 1

Window over indices 0 … 3: P1[3] = 1, P2[3] = 1
• Output: ….11
Multiple Parity Bits

Message bits (indices -3 onward): 0 0 0 0 1 1 0 1 0 0 1 0 1

Window over indices 1 … 4: P1[4] = 0, P2[4] = 0
• Output: ….1100
Multiple Parity Bits

Message bits (indices -3 onward): 0 0 0 0 1 1 0 1 0 0 1 0 1

Window over indices 2 … 5: P1[5] = 0, P2[5] = 1
• Output: ….110001
Encoder State
• Input bit and K-1 bits of current state determine state on next clock cycle
– Number of states: 2^(K-1)

[Figure: constraint-length-K window sliding over the message bits 0 0 0 0 1 1 0 1 0 0 1 0 1; the newest bit in the window is the input bit, and the remaining K-1 bits are the encoder state]
Constraint Length
• K is the constraint length of the code

• Larger K:
– Greater redundancy
– Better error correction possibilities (usually, not always)

Transmitting Parity Bits
• Transmit the parity sequences, not the message itself
– Each message bit is “spread across” K bits of the output parity bit sequence

– If using multiple generators, interleave the bits of each generator
• e.g. (two generators): p0[0], p1[0], p0[1], p1[1], p0[2], p1[2]
Transmitting Parity Bits
• Code rate is 1 / #_of_generators
– e.g., 2 generators → rate = ½

• Engineering tradeoff:
– More generators improve bit-error correction
• But decrease the rate of the code (the number of message bits/s that can be transmitted)
Shift Register View

• One message bit x[n] in, two parity bits out
– Each timestep: message bits shifted right by one; the incoming bit moves into the left-most register
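As a sketch only (function and variable names are mine, not the lecture’s), the shift-register encoder for the K = 3, rate-½ code with generators g0 = 111, g1 = 101 used throughout these slides can be written as:

```python
# A minimal encoder sketch, assuming the K = 3, rate-1/2 code from
# these slides (g0 = 111, g1 = 101); generators[i][j] is generator
# i's tap on x[n-j].

def conv_encode(msg_bits, generators=((1, 1, 1), (1, 0, 1))):
    K = len(generators[0])
    window = [0] * K                   # x[n], x[n-1], ..., x[n-K+1]
    out = []
    for bit in msg_bits:
        window = [bit] + window[:-1]   # shift right; new bit enters left
        for g in generators:           # one parity bit per generator
            out.append(sum(t * x for t, x in zip(g, window)) % 2)
    return out

# msg = 101100 from the state-machine slides that follow:
print(conv_encode([1, 0, 1, 1, 0, 0]))
# -> [1,1, 1,0, 0,0, 0,1, 0,1, 1,1], i.e. 11 10 00 01 01 11
```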
Today
1. Encoding data using convolutional codes
– Encoder state machine
– Changing code rate: Puncturing

2. Decoding convolutional codes: Viterbi Algorithm

State-Machine View
• Example: K = 3, code rate = ½, convolutional code
– There are 2^(K-1) states
– States labeled with (x[n-1], x[n-2])
– Arcs labeled with x[n]/p0[n]p1[n]
– Generators: g0 = 111, g1 = 101
– msg = 101100

Starting state: 00. Arcs of the state diagram:
00 → 00 on 0/00; 00 → 10 on 1/11
01 → 00 on 0/11; 01 → 10 on 1/00
10 → 01 on 0/10; 10 → 11 on 1/01
11 → 01 on 0/01; 11 → 11 on 1/10
State-Machine View
• P0[n] = (1*x[n] + 1*x[n-1] + 1*x[n-2]) mod 2
• P1[n] = (1*x[n] + 0*x[n-1] + 1*x[n-2]) mod 2
• Generators: g0 = 111, g1 = 101
(State diagram as above; starting state 00)

• msg = 101100
• Transmit:
State-Machine View
• Encoding step 1: x[n] = 1, state 00 → 10
• P0[n] = (1*1 + 1*0 + 1*0) mod 2 = 1
• P1[n] = (1*1 + 0*0 + 1*0) mod 2 = 1
• Generators: g0 = 111, g1 = 101

• msg = 101100
• Transmit: 11
State-Machine View
• Encoding step 2: x[n] = 0, state 10 → 01
• P0[n] = (1*0 + 1*1 + 1*0) mod 2 = 1
• P1[n] = (1*0 + 0*1 + 1*0) mod 2 = 0
• Generators: g0 = 111, g1 = 101

• msg = 101100
• Transmit: 11 10
State-Machine View
• Encoding step 3: x[n] = 1, state 01 → 10
• P0[n] = (1*1 + 1*0 + 1*1) mod 2 = 0
• P1[n] = (1*1 + 0*0 + 1*1) mod 2 = 0
• Generators: g0 = 111, g1 = 101

• msg = 101100
• Transmit: 11 10 00
State-Machine View
• Encoding step 4: x[n] = 1, state 10 → 11
• P0[n] = (1*1 + 1*1 + 1*0) mod 2 = 0
• P1[n] = (1*1 + 0*1 + 1*0) mod 2 = 1
• Generators: g0 = 111, g1 = 101

• msg = 101100
• Transmit: 11 10 00 01
State-Machine View
• Encoding step 5: x[n] = 0, state 11 → 01
• P0[n] = (1*0 + 1*1 + 1*1) mod 2 = 0
• P1[n] = (1*0 + 0*1 + 1*1) mod 2 = 1
• Generators: g0 = 111, g1 = 101

• msg = 101100
• Transmit: 11 10 00 01 01
State-Machine View
• Encoding step 6: x[n] = 0, state 01 → 00
• P0[n] = (1*0 + 1*0 + 1*1) mod 2 = 1
• P1[n] = (1*0 + 0*0 + 1*1) mod 2 = 1
• Generators: g0 = 111, g1 = 101

• msg = 101100
• Transmit: 11 10 00 01 01 11
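The same state machine can be tabulated directly from the generators. Below is a sketch (helper names are mine); its step() function reproduces every arc above and is reused by the decoder sketch later in these notes:

```python
# Sketch: next-state / output tables for the K = 3, rate-1/2 code.
# A state packs (x[n-1], x[n-2]) into two bits.

K = 3
G = ((1, 1, 1), (1, 0, 1))      # g0 = 111, g1 = 101
NSTATES = 1 << (K - 1)          # 2^(K-1) = 4 states

def step(state, bit):
    """Return (next_state, (p0, p1)) for input `bit` taken from `state`."""
    window = (bit, (state >> 1) & 1, state & 1)   # (x[n], x[n-1], x[n-2])
    parity = tuple(sum(g[j] * window[j] for j in range(K)) % 2 for g in G)
    return (bit << 1) | (state >> 1), parity

for s in range(NSTATES):
    for b in (0, 1):
        ns, (p0, p1) = step(s, b)
        print(f"{s:02b} --{b}/{p0}{p1}--> {ns:02b}")   # the eight arcs
```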
Today
1. Encoding data using convolutional codes
– Encoder state machine
– Changing code rate: Puncturing

2. Decoding convolutional codes: Viterbi Algorithm

Varying the Code Rate
• How to increase/decrease rate?

• Transmitter and receiver agree on coded bits to omit
– Puncturing table indicates which bits to include (1)
• Contains p columns, N rows (one row per generator)

[Figure: N streams of coded bits pass through the N × p puncturing table, producing the punctured coded bits]


Punctured convolutional codes: example

• Coded bits =
  0 0 1 0 1
  0 0 1 1 1

• With puncturing table P1 =
  1 1 1 0
  1 0 0 1
Punctured convolutional codes: example

• Coded bits =
  0 0 1 0 1
  0 0 1 1 1

• With puncturing table P1 =
  1 1 1 0   ← 3 out of 4 bits are used
  1 0 0 1   ← 2 out of 4 bits are used
Punctured convolutional codes: example

• Coded bits =
  0 0 1 0 1
  0 0 1 1 1

• With puncturing table P1 =
  1 1 1 0
  1 0 0 1

• Punctured, coded bits so far:
  0
  0
Punctured convolutional codes: example

• Coded bits =
  0 0 1 0 1
  0 0 1 1 1

• With puncturing table P1 =
  1 1 1 0
  1 0 0 1

• Punctured, coded bits so far:
  0 0
  0
Punctured convolutional codes: example

• Coded bits =
  0 0 1 0 1
  0 0 1 1 1

• With puncturing table P1 =
  1 1 1 0
  1 0 0 1

• Punctured, coded bits so far:
  0 0 1
  0
Punctured convolutional codes: example

• Coded bits =
  0 0 1 0 1
  0 0 1 1 1

• With puncturing table P1 =
  1 1 1 0
  1 0 0 1

• Punctured, coded bits so far:
  0 0 1
  0 1
Punctured convolutional codes: example

• Coded bits =
  0 0 1 0 1
  0 0 1 1 1

• With puncturing table P1 =
  1 1 1 0
  1 0 0 1

• Punctured, coded bits so far:
  0 0 1 1
  0 1 1
Punctured convolutional codes: example

• Coded bits =
  0 0 1 0 1
  0 0 1 1 1

• Punctured, coded bits:
  0 0 1 1
  0 1 1

• Punctured rate is: R = (1/2) / (5/8) = 4/5
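A small sketch of the puncturing operation (names are mine), reproducing the example above:

```python
# Sketch: puncture per-generator parity streams with a table whose
# entries are 1 = transmit, 0 = omit; columns repeat cyclically.

PUNCTURE = ((1, 1, 1, 0),
            (1, 0, 0, 1))       # N = 2 rows, p = 4 columns

def puncture(streams, table=PUNCTURE):
    p = len(table[0])
    return [[b for j, b in enumerate(row) if table[i][j % p]]
            for i, row in enumerate(streams)]

coded = [[0, 0, 1, 0, 1],
         [0, 0, 1, 1, 1]]
print(puncture(coded))          # -> [[0, 0, 1, 1], [0, 1, 1]]
```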
Today
1. Encoding data using convolutional codes
– Changing code rate: Puncturing

2. Decoding convolutional codes: Viterbi Algorithm


– Hard decision decoding
– Soft decision decoding

Motivation: The Decoding Problem
• Received bits: 000101100110

• Some errors have occurred

• What’s the 4-bit message?

• Most likely: 0111
– Message whose codeword is closest to received bits in Hamming distance

Message   Coded bits      Hamming distance
0000      000000000000    5
0001      000000111011    --
0010      000011101100    --
0011      000011010111    --
0100      001110110000    --
0101      001110001011    --
0110      001101011100    --
0111      001101100111    2
1000      111011000000    --
1001      111011111011    --
1010      111000101100    --
1011      111000010111    --
1100      110101110000    --
1101      110101001011    --
1110      110110011100    --
1111      110110100111    --
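This brute-force search is easy to write down; the sketch below (mine) reuses conv_encode() from the encoder sketch and encodes each 4-bit message with two terminating 0 bits, which reproduces the 12-bit codewords in the table:

```python
# Sketch: brute-force maximum-likelihood decoding by Hamming distance.
from itertools import product

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

received = [int(c) for c in "000101100110"]

best = min(product((0, 1), repeat=4),
           key=lambda m: hamming(conv_encode(list(m) + [0, 0]), received))
print(best)   # -> (0, 1, 1, 1), at Hamming distance 2
```

The trouble is the cost: 2^k encodings for a k-bit message. The Viterbi algorithm finds the same answer with work linear in the message length.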
The Trellis
• Vertically, lists encoder states

• Horizontally, tracks time steps

• Branches connect states in successive time steps

[Trellis diagram: the states 00, 01, 10, 11 (labeled x[n-1] x[n-2]) are listed vertically and repeated at every time step; branches between successive time steps carry the same x[n]/p0[n]p1[n] labels as the state diagram, with starting state 00]
The Trellis: Sender’s View
• At the sender, transmitted bits trace a unique, single path of branches through the trellis
– e.g. transmitted data bits 1 0 1 1

• Recover transmitted bits ⟺ Recover path

[Trellis diagram: for data bits 1 0 1 1 the path is 00 → 10 on 1/11, 10 → 01 on 0/10, 01 → 10 on 1/00, 10 → 11 on 1/01]
Viterbi algorithm
• Andrew Viterbi (USC)

• Want: Most likely sent bit sequence

• Calculates most likely path through trellis

1. Hard Decision Viterbi algorithm: Have possibly-corrupted encoded bits, after reception

2. Soft Decision Viterbi algorithm: Have possibly-corrupted likelihoods of each bit, after reception
– e.g.: “this bit is 90% likely to be a 1.”
Viterbi algorithm: Summary
• Branch metrics score likelihood of each trellis branch

• At any given time there are 2^(K-1) most likely messages we’re tracking (one for each state)
– One message ⟷ one trellis path
– Path metrics score likelihood of each trellis path

• Most likely message is the one that produces the smallest path metric
Today
1. Encoding data using convolutional codes
– Changing code rate: Puncturing

2. Decoding convolutional codes: Viterbi Algorithm


– Hard decision decoding
– Soft decision decoding

Hard-decision branch metric
• Hard decisions → input is bits

• Label every branch of trellis with branch metrics
– Hard Decision Branch metric: Hamming Distance between received and transmitted bits

Received: 00. Branch metrics:
From 00: 0/00 → 0, 1/11 → 2
From 01: 0/11 → 2, 1/00 → 0
From 10: 0/10 → 1, 1/01 → 1
From 11: 0/01 → 1, 1/10 → 1
Hard-decision branch metric

• Suppose we know encoder is in state 00, receive bits: 00

[Trellis, first stage: branch 0/00 into state 00 has branch metric 0; branch 1/11 into state 10 has branch metric 2]
Hard-decision path metric
• Hard-decision path metric: Sum Hamming distance between sent and received bits along path

• Encoder is initially in state 00, receive bits: 00

[Trellis after one stage: state 00 reached via 0/00 with path metric 0; state 10 reached via 1/11 with path metric 2]
Hard-decision path metric
• Right now, each state has a unique predecessor state

• Path metric: Total bit errors along path ending at state
– Path metric of predecessor + branch metric

Received: 00 11

[Trellis after two stages: path metrics 00 → 2, 01 → 3, 10 → 0, 11 → 3]
Hard-decision path metric
• Each state has two predecessor states, two predecessor paths (which to use?)

• Winning branch has lower path metric (fewer bit errors): Prune losing branch

Received: 00 11 01

[Trellis, third stage: each state now has two incoming branches; e.g. state 00 is reachable from state 00 with path metric 2 + 1 = 3 or from state 01 with 3 + 1 = 4, so the branch from 00 wins and the other is pruned]
Hard-decision path metric

• Prune losing branch for each state in trellis

Received: 00 11 01

[Trellis after pruning at the third stage: path metrics 00 → 3, 01 → 2, 10 → 3, 11 → 0]
Pruning non-surviving branches
• Survivor path begins at each state, traces unique path back to beginning of trellis
– Correct path is one of four survivor paths

• Some branches are not part of any survivor: prune them

Received: 00 11 01

[Trellis with only the four survivor paths remaining]
Making bit decisions

• When only one branch remains at a stage, the Viterbi algorithm decides that branch’s input bits:

Received: 00 11 01
Decide: 0

[Trellis: after pruning, only the 0/00 branch remains at the first stage, so the first message bit is decided to be 0]
End of received data
• Trace back the survivor with minimal path metric

• Later stages don’t get benefit of future error correction, had data not ended

Received: 00 11 01 10
Decide: 0 1 1 1

[Trellis after four stages: the survivor with minimal path metric (0) ends at state 11; tracing it back gives the decided bits 0 1 1 1]
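Putting the branch metrics, path metrics, and pruning together: below is a hard-decision Viterbi decoder sketch (mine, reusing step() and NSTATES from the state-machine sketch earlier) that keeps one survivor path and one path metric per state, prunes by keeping the smaller path metric, and finally traces back the minimal-metric survivor:

```python
# Sketch: hard-decision Viterbi decoding over received parity-bit pairs.

def viterbi_decode(received_pairs, start_state=0):
    INF = float("inf")
    pm = [INF] * NSTATES                       # path metric per state
    pm[start_state] = 0
    survivors = [[] for _ in range(NSTATES)]   # input bits per survivor

    for r0, r1 in received_pairs:
        new_pm = [INF] * NSTATES
        new_sv = [None] * NSTATES
        for s in range(NSTATES):
            if pm[s] == INF:                   # state not yet reachable
                continue
            for b in (0, 1):
                ns, (p0, p1) = step(s, b)
                bm = (p0 != r0) + (p1 != r1)   # Hamming branch metric
                if pm[s] + bm < new_pm[ns]:    # winning branch survives
                    new_pm[ns] = pm[s] + bm
                    new_sv[ns] = survivors[s] + [b]
        pm, survivors = new_pm, new_sv

    best = min(range(NSTATES), key=pm.__getitem__)
    return survivors[best], pm[best]           # decided bits, bit errors

# The slides' example: received 00 11 01 10
print(viterbi_decode([(0, 0), (1, 1), (0, 1), (1, 0)]))
# -> ([0, 1, 1, 1], 0)
```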
Terminating the code
• Sender transmits two 0 data bits at end of data

• Receiver uses the following trellis at end:

[Termination trellis, 0-input branches only: 00 → 00 on 0/00; 01 → 00 on 0/11; 10 → 01 on 0/10; 11 → 01 on 0/01; after the two stages every path has merged into state 00]

• After termination only one trellis survivor path remains
– Can make better bit decisions at end of data based on this sole survivor
Viterbi with a Punctured Code
• Punctured bits are never transmitted

• Branch metric measures dissimilarity only between received and transmitted unpunctured bits
– Same path metric, same Viterbi algorithm
– Lose some error correction capability

Received: 0- (second bit punctured). Branch metrics:
From 00: 0/00 → 0, 1/11 → 1
From 01: 0/11 → 1, 1/00 → 0
From 10: 0/10 → 1, 1/01 → 0
From 11: 0/01 → 0, 1/10 → 1
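As a sketch, the only change is a branch metric that skips punctured positions (here marked None; the helper name is mine):

```python
# Sketch: Hamming branch metric over unpunctured positions only.

def branch_metric(expected, received):
    return sum(e != r for e, r in zip(expected, received) if r is not None)

print(branch_metric((1, 1), (0, None)))   # 1/11 vs received "0-"  -> 1
```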
Today
1. Encoding data using convolutional codes
– Changing code rate: Puncturing

2. Decoding convolutional codes: Viterbi Algorithm


– Hard decision decoding
• Error correcting capability
– Soft decision decoding

How many bit errors can we correct?
• Think back to the encoder; linearity property:
– Message m1 → Coded bits c1
– Message m2 → Coded bits c2
– Message m1 ⨁ m2 → Coded bits c1 ⨁ c2
– So the distance between any two codewords equals the weight (number of 1s) of some other codeword

• So, dmin = minimum distance between the 000...000 codeword and the codeword with fewest 1s
!"!#!"!% $"!#!"!%

Calculating dmin for the convolutional code



• Find path with smallest non-zero path metric going from the first 00 state to a future 00 state

• Here, dmin = 4, so can correct 1 error in 8 bits:

[Trellis over six stages of the all-zeros transmission (x[n] = 0 throughout), states labeled x[n-1]x[n-2]; the highlighted minimum-weight detour leaves state 00 and rejoins it, accumulating the smallest non-zero path metric]
Today
1. Encoding data using convolutional codes
– Changing code rate: Puncturing

2. Decoding convolutional codes: Viterbi Algorithm


– Hard decision decoding
– Soft decision decoding

Model for Today
• Coded bits are actually continuously-valued “voltages” between 0.0 V and 1.0 V:

[Voltage scale: 1.0 V = strong “1”; below it, weak “1”; then weak “0”; 0.0 V = strong “0”]
On Hard Decisions
• Hard decisions digitize each voltage to “0” or “1” by comparison against threshold voltage 0.5 V
– Lose information about how “good” the bit is
• Strong “1” (0.99 V) treated equally to weak “1” (0.51 V)

• Hamming distance for branch metric computation

• But throwing away information is almost never a good idea when making decisions
– Find a better branch metric that retains information about the received voltages?
Soft-decision decoding
• Idea: Pass received voltages to decoder before digitizing
– Problem: Hard branch metric was Hamming distance

• “Soft” branch metric
– Euclidean distance between received voltages and voltages of expected bits:

[Figure: received voltage pair plotted in the unit square with corners (0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0); the “soft” metric is its distance to the corner (Vp0, Vp1) of the expected parity bits, here (0, 1)]
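A minimal sketch of this metric (function name mine; the voltage values are illustrative):

```python
# Sketch: soft branch metric = squared Euclidean distance between the
# received voltage pair and the expected parity bits' ideal voltages.

def soft_branch_metric(expected_bits, received_volts):
    return sum((e - v) ** 2 for e, v in zip(expected_bits, received_volts))

# Expected parity bits (0, 1); received voltages 0.1 V and 0.8 V:
print(soft_branch_metric((0, 1), (0.1, 0.8)))   # -> about 0.05
```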
Soft-decision decoding
• Different branch metric, hence different path metric

• Same path metric computation

• Same Viterbi algorithm

• Result: Choose path that minimizes sum of squares of Euclidean distances between received, expected voltages
Putting it together:
Convolutional coding in Wi-Fi

[Block diagram:
Sender: Data bits → Convolutional encoder → Coded bits → Modulation (BPSK, QPSK, …)
Receiver: Demodulation → Coded bits (hard-decision decoding) or voltage levels (soft-decision decoding) → Viterbi Decoder → Data bits]
Thursday Topic:
Rateless Codes

Friday Precept:
Midterm Review

