L09 Viterbi
Convolutional Encoding
• Don't send the message bits; send only parity bits computed from them
Message bits: 1 0 1 1 0 1 0 0 1 0 1
[Figure: a window K bits wide (the constraint length) slides along the message bits]
Sliding Parity Bit Calculation
• K = 4; pad the message with K − 1 = 3 leading zeros so the window is defined from n = 0
Index:        -3 -2 -1  0  1  2  3  4  5  6  7  8 …
Message bits:  0  0  0  0  1  1  0  1  0  0  1  0  1
• Slide the K-bit window one position per step; each step emits one parity bit P[n], a mod-2 sum of selected bits in the window:
P[0] = 0 → Output: 0
P[1] = 1 → Output: 01
P[2] = 0 → Output: 010
P[3] = 1 → Output: 0101
Multiple Parity Bits
• With two generators, each window position emits two parity bits, P1[n] and P2[n], interleaved on the output:
P1[3] = 1, P2[3] = 1 → Output: …11
P1[4] = 0, P2[4] = 0 → Output: …1100
P1[5] = 0, P2[5] = 1 → Output: …110001
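The sliding computation generalizes to any tap pattern. The slides don't state the K = 4 generators used in this example, so this sketch keeps the generator g as a parameter (function name is mine); the check uses the K = 3 generator g0 = 111 and message 101100 that appear later in the deck:

```python
def parity_stream(msg, g):
    """One parity bit per message bit: slide a len(g)-bit window over the
    message (padded with leading zeros), summing the g-selected taps mod 2."""
    K = len(g)
    padded = [0] * (K - 1) + list(msg)
    out = []
    for n in range(len(msg)):
        window = padded[n:n + K][::-1]          # window[i] = x[n-i]
        out.append(sum(gi * xi for gi, xi in zip(g, window)) % 2)
    return out

# g0 = 111 applied to msg = 101100 gives the p0 bits of "11 10 00 01 01 11"
assert parity_stream([1, 0, 1, 1, 0, 0], [1, 1, 1]) == [1, 1, 0, 0, 0, 1]
```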
Encoder State
• The input bit and the K − 1 bits of the current state determine the state on the next clock cycle
– Number of states: 2^(K−1)
[Figure: within the K-bit window, the newest message bit is the input and the remaining K − 1 bits are the state]
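A one-line sketch of that state update, keeping the state as a tuple of the K − 1 most recent bits, newest first (helper name is mine):

```python
def next_state(state, bit):
    """Shift the new input bit in; the oldest state bit falls off."""
    return (bit,) + state[:-1]

assert next_state((0, 0), 1) == (1, 0)   # e.g. state 00, input 1 -> state 10
```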
Constraint Length
• K is the constraint length of the code
• Larger K:
– Greater redundancy
– Better error correction possibilities (usually, not always)
Transmitting Parity Bits
• Transmit the parity sequences, not the message itself
– Each message bit is "spread across" K bits of the output parity bit sequence
• Transmitted stream: p0[0], p1[0], p0[1], p1[1], p0[2], p1[2], …
Transmitting Parity Bits
• Code rate = 1 / (number of generators)
– e.g., 2 generators → rate = ½
• Engineering tradeoff:
– More generators improves bit-error correction
• But decreases the rate of the code (the number of message bits/s that can be transmitted)
Shift Register View
[Figure: the encoder as a K-bit shift register with one mod-2 adder per generator]
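A runnable sketch of the shift-register encoder (function name is mine), using the K = 3, rate-½ code with generators g0 = 111, g1 = 101 from the next slides; it reproduces the transmit sequence worked out there:

```python
def conv_encode(msg, gens):
    """Convolutional encoder: shift each message bit into a register of
    zeros; emit one parity bit per generator per step."""
    K = max(len(g) for g in gens)
    reg = [0] * K                     # reg[0] = x[n], reg[1] = x[n-1], ...
    out = []
    for bit in msg:
        reg = [bit] + reg[:-1]        # shift the new bit in
        for g in gens:
            out.append(sum(gi * xi for gi, xi in zip(g, reg)) % 2)
    return out

# msg = 101100 with g0 = 111, g1 = 101 reproduces the slides' output:
bits = conv_encode([1, 0, 1, 1, 0, 0], [(1, 1, 1), (1, 0, 1)])
assert bits == [1,1, 1,0, 0,0, 0,1, 0,1, 1,1]   # "11 10 00 01 01 11"
```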
Today
1. Encoding data using convolutional codes
– Encoder state machine
– Changing code rate: Puncturing
State-Machine View
• Example: K = 3, rate-½ convolutional code
– There are 2^(K−1) = 4 states
– States labeled with (x[n−1], x[n−2])
– Arcs labeled with x[n]/p0[n]p1[n]
– Generators: g0 = 111, g1 = 101
– msg = 101100; starting state: 00
[State diagram; arcs, written input/output:]
00 → 00 on 0/00;  00 → 10 on 1/11
01 → 00 on 0/11;  01 → 10 on 1/00
10 → 01 on 0/10;  10 → 11 on 1/01
11 → 01 on 0/01;  11 → 11 on 1/10
State-Machine View
• P0[n] = (1·x[n] + 1·x[n−1] + 1·x[n−2]) mod 2
• P1[n] = (1·x[n] + 0·x[n−1] + 1·x[n−2]) mod 2
• Generators: g0 = 111, g1 = 101
• msg = 101100, from starting state 00
• Transmit: computed one step per message bit, below
State-Machine View
n = 0: P0 = (1·1 + 1·0 + 1·0) mod 2 = 1, P1 = (1·1 + 0·0 + 1·0) mod 2 = 1 → Transmit: 11
n = 1: P0 = (1·0 + 1·1 + 1·0) mod 2 = 1, P1 = (1·0 + 0·1 + 1·0) mod 2 = 0 → Transmit: 11 10
n = 2: P0 = (1·1 + 1·0 + 1·1) mod 2 = 0, P1 = (1·1 + 0·0 + 1·1) mod 2 = 0 → Transmit: 11 10 00
n = 3: P0 = (1·1 + 1·1 + 1·0) mod 2 = 0, P1 = (1·1 + 0·1 + 1·0) mod 2 = 1 → Transmit: 11 10 00 01
n = 4: P0 = (1·0 + 1·1 + 1·1) mod 2 = 0, P1 = (1·0 + 0·1 + 1·1) mod 2 = 1 → Transmit: 11 10 00 01 01
n = 5: P0 = (1·0 + 1·0 + 1·1) mod 2 = 1, P1 = (1·0 + 0·0 + 1·1) mod 2 = 1 → Transmit: 11 10 00 01 01 11
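The same encoder, viewed as the table-driven state machine these slides describe; a sketch (names mine) that enumerates every (state, input) pair and reproduces the arc labels above:

```python
def build_fsm(gens):
    """Map (state, input bit) -> (output bits, next state).
    State = (x[n-1], x[n-2]); each generator's first tap weights x[n]."""
    fsm = {}
    for s0 in (0, 1):
        for s1 in (0, 1):
            for b in (0, 1):
                window = (b, s0, s1)          # (x[n], x[n-1], x[n-2])
                out = tuple(sum(g * x for g, x in zip(gen, window)) % 2
                            for gen in gens)
                fsm[((s0, s1), b)] = (out, (b, s0))
    return fsm

fsm = build_fsm([(1, 1, 1), (1, 0, 1)])
# e.g. from state 00, input 1: output 11, next state 10 (the "1/11" arc)
assert fsm[((0, 0), 1)] == ((1, 1), (1, 0))
```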
Today
1. Encoding data using convolutional codes
– Encoder state machine
– Changing code rate: Puncturing
Varying the Code Rate
• How to increase/decrease the rate? Drop some of the coded bits: puncturing
• Coded bits =
  0 0 1 0 1
  0 0 1 1 1
• Puncturing table:
  P1 = 1 1 1 0
       1 0 0 1
Punctured convolutional codes: example
• Coded bits =
  0 0 1 0 1
  0 0 1 1 1
• Puncturing table:
  P1 = 1 1 1 0
       1 0 0 1
– Row k applies to parity stream k; a 1 transmits the bit, a 0 drops it, and the pattern repeats every 4 columns
• Punctured, coded bits:
  0 0 1 1
  0 1 1
• Only 7 of the 10 coded bits are transmitted, raising the code rate
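A minimal sketch of puncturing as described above (function name is mine): each parity stream is filtered by its row of the table, with the pattern repeating every four columns; the asserts reproduce the example's output:

```python
def puncture(stream, pattern):
    """Keep stream[i] where pattern[i % len(pattern)] == 1."""
    return [b for i, b in enumerate(stream) if pattern[i % len(pattern)]]

p0 = [0, 0, 1, 0, 1]
p1 = [0, 0, 1, 1, 1]
assert puncture(p0, [1, 1, 1, 0]) == [0, 0, 1, 1]
assert puncture(p1, [1, 0, 0, 1]) == [0, 1, 1]
```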
Today
1. Encoding data using convolutional codes
2. Decoding convolutional codes: the Viterbi algorithm
Motivation: The Decoding Problem
• Received bits: 000101100110
• One (impractical) approach: compare against the coded bits of every possible message and pick the closest

Message | Coded bits     | Hamming distance
0000    | 000000000000   | 5
0001    | 000000111011   | 6
…       | …              | …
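A sketch of the brute-force decoder this table implies (names mine, building on the conv_encode sketch above): encode every candidate message, plus the two terminating 0 bits covered later, and keep the closest. The cost grows as 2^n encodings, which is exactly the search the Viterbi algorithm performs efficiently:

```python
from itertools import product

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def brute_force_decode(received, n, gens):
    """Try all 2^n data-bit messages; return the one whose coded bits
    (including two terminating zeros) are closest to the received bits."""
    best = min(product((0, 1), repeat=n),
               key=lambda m: hamming(conv_encode(list(m) + [0, 0], gens),
                                     received))
    return list(best)
```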
The Trellis
• Vertically, lists encoder states
• Horizontally, tracks time steps
• Branches connect states in successive time steps
[Trellis figure: rows are the states 00, 01, 10, 11 (labeled x[n−1] x[n−2]); each time step repeats the state-machine arcs, e.g. 0/00 and 1/11 out of state 00, 0/11 and 1/00 out of 01, 0/10 and 1/01 out of 10, 0/01 and 1/10 out of 11]
The Trellis: Sender's View
• At the sender, the transmitted bits trace a unique, single path of branches through the trellis
– e.g., transmitted data bits 1 0 1 1 trace the state path 00 → 10 → 01 → 10 → 11
[Trellis figure with that path highlighted; states labeled x[n−1] x[n−2]]
Viterbi algorithm
• Andrew Viterbi (USC)
Viterbi algorithm: Summary
• Branch metrics score the likelihood of each trellis branch
• At any given time there are 2^(K−1) most likely messages we're tracking (one for each state)
– One message ⟷ one trellis path
– Path metrics score the likelihood of each trellis path
Today
2. Decoding convolutional codes: the Viterbi algorithm
– Hard-decision decoding
Hard-decision branch metric
• Hard decisions → decoder input is bits
• Branch metric = Hamming distance between the branch's expected parity bits and the received bits
Received: 00
Branch 0/00 → metric 0    Branch 1/11 → metric 2
Branch 0/11 → metric 2    Branch 1/00 → metric 0
Branch 0/10 → metric 1    Branch 1/01 → metric 1
Branch 0/01 → metric 1    Branch 1/10 → metric 1
Hard-decision branch metric
Received: 00
[Trellis figure: the first time step, with branch metric 0 on the 0/00 branch and 2 on the 1/11 branch out of the starting state 00]
Hard-decision path metric
• Hard-decision path metric: sum of the Hamming distances between sent and received bits along the path
Received: 00
[Trellis after one step: state 00 reached via 0/00 with path metric 0; state 10 reached via 1/11 with path metric 2]
Hard-decision path metric
• Right now, each state has a unique predecessor state
Received: 00 11
[Trellis after two steps. Path metrics: state 00: 0 → 2; state 10: 2 → 0; states 01 and 11 are first reached now, with metrics 3 and 3]
Hard-decision path metric
• Now each state has two predecessor states, hence two predecessor paths (which to use?)
Received: 00 11 01
[Trellis at the third step: two branches enter each state]
Hard-decision path metric
• Resolve by keeping the predecessor that gives the smaller path metric
Received: 00 11 01
[Trellis: e.g. state 11 keeps metric 0 via the 1/01 branch from state 10, rather than 5 via the 1/10 branch from itself]
Pruning non-surviving branches
• A survivor path begins at each state, tracing a unique path back to the beginning of the trellis
– The correct path is one of the four survivor paths
[Trellis with non-surviving branches pruned; path metrics after three steps: state 00: 3, state 01: 2, state 10: 3, state 11: 0]
Making bit decisions
Received: 00 11 01 → Decide: 0
• All four survivor paths begin with input bit 0, so the decoder can already output the first message bit
[Trellis: path metrics 0 → 2 → 3 along the top row; every survivor starts with the 0/00 branch out of state 00]
End of received data
• Trace back the survivor with the minimal path metric
Received: 00 11 01 10 → Decide: 0 1 1 1
Path metrics (rows = states, columns = received pairs over time):
         rx 00   rx 11   rx 01   rx 10
  00:      0       2       3       3
  01:      –       3       2       2
  10:      2       0       3       3
  11:      –       3       0       0
• The minimum final metric, 0, is at state 11; tracing that survivor back yields the message 0 1 1 1
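Putting the pieces together, a hard-decision Viterbi decoder sketch (names mine): per step, each state keeps its minimum-metric predecessor; at the end, trace back from the state with the minimal path metric. The assert reproduces the slides' example:

```python
from itertools import product

def viterbi_decode(received_pairs, gens):
    """Hard-decision Viterbi: branch metric = Hamming distance between a
    branch's expected parity bits and the received bits; path metric =
    sum of branch metrics along the path."""
    K = max(len(g) for g in gens)
    states = list(product((0, 1), repeat=K - 1))
    INF = float("inf")
    pm = {s: (0 if s == (0,) * (K - 1) else INF) for s in states}  # start 00
    back = []                          # per step: state -> (prev, input bit)
    for rx in received_pairs:
        new_pm = {s: INF for s in states}
        step = {}
        for s in states:
            if pm[s] == INF:
                continue
            for bit in (0, 1):
                window = (bit,) + s    # (x[n], x[n-1], x[n-2], ...)
                out = [sum(g * x for g, x in zip(gen, window)) % 2
                       for gen in gens]
                nxt = (bit,) + s[:-1]
                m = pm[s] + sum(a != b for a, b in zip(out, rx))
                if m < new_pm[nxt]:    # keep the better predecessor
                    new_pm[nxt] = m
                    step[nxt] = (s, bit)
        back.append(step)
        pm = new_pm
    s = min(pm, key=pm.get)            # survivor with minimal path metric
    bits = []
    for stp in reversed(back):         # trace back to the start
        s, bit = stp[s]
        bits.append(bit)
    return bits[::-1]

msg = viterbi_decode([(0, 0), (1, 1), (0, 1), (1, 0)], [(1, 1, 1), (1, 0, 1)])
assert msg == [0, 1, 1, 1]             # the slides' "Decide: 0 1 1 1"
```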
Terminating the code
• Sender transmits two 0 data bits at the end of the data
– This drives the encoder back to the known starting state 00, whatever the message was
[Trellis: with 0-inputs, only the 0/00, 0/11, 0/10, 0/01 branches remain, funneling every state into 00 within two steps]
Viterbi with a Punctured Code
• Punctured bits are never transmitted
– So they cannot contribute to the branch metric: skip those positions when computing the Hamming distance
[Trellis fragment: branch metrics computed only over the bit positions actually received]
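One common way to implement this (an assumption here, not spelled out on the slide; names mine): re-insert placeholders for the punctured positions and make the branch metric ignore them:

```python
def depuncture(rx, pattern, n):
    """Re-expand a punctured stream to length n, marking never-transmitted
    positions as None so the branch metric can skip them."""
    it = iter(rx)
    return [next(it) if pattern[i % len(pattern)] else None
            for i in range(n)]

def branch_metric(expected, received):
    """Hamming distance over transmitted positions only."""
    return sum(e != r for e, r in zip(expected, received) if r is not None)

# Re-expanding the earlier p0 example restores [0, 0, 1, ?, 1]:
assert depuncture([0, 0, 1, 1], [1, 1, 1, 0], 5) == [0, 0, 1, None, 1]
assert branch_metric((0, 0), (0, None)) == 0
```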
Today
2. Decoding convolutional codes: the Viterbi algorithm
– How many bit errors can we correct?
How many bit errors can we correct?
• Think back to the encoder; linearity property:
– Message m1 → Coded bits c1
– Message m2 → Coded bits c2
– Message m1 ⨁ m2 → Coded bits c1 ⨁ c2
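A quick check of the linearity property using the conv_encode sketch from earlier, with g0 = 111, g1 = 101 (m2 chosen arbitrarily for illustration):

```python
gens = [(1, 1, 1), (1, 0, 1)]
m1 = [1, 0, 1, 1, 0, 0]
m2 = [0, 1, 1, 0, 1, 0]
c1 = conv_encode(m1, gens)
c2 = conv_encode(m2, gens)
m3 = [a ^ b for a, b in zip(m1, m2)]          # m1 XOR m2
assert conv_encode(m3, gens) == [a ^ b for a, b in zip(c1, c2)]  # linearity
```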
[Trellis figure: comparing the all-zeros path with a path that diverges from and remerges into it; by linearity, the minimum Hamming distance between any two coded sequences equals the minimum weight of such a diverging path]
Today
2. Decoding convolutional codes: the Viterbi algorithm
– Soft-decision decoding
Model for Today
• Coded bits are actually continuously-valued "voltages" between 0.0 V and 1.0 V
[Figure: voltage scale from 0.0 V to 1.0 V; near 1.0 V is a strong "1", just above the middle a weak "1", just below it a weak "0"]
On Hard Decisions
• Hard decisions digitize each voltage to "0" or "1" by comparison against a threshold voltage of 0.5 V
– This loses information about how "good" the bit is
• A strong "1" (0.99 V) is treated the same as a weak "1" (0.51 V)
Soft-decision decoding
• Idea: pass the received voltages to the decoder before digitizing
– Problem: the hard branch metric was Hamming distance, which is not defined on voltages, so we need a "soft" metric
[Figure: the received voltage pair plotted in the unit square; each branch's expected parity bits (Vp0, Vp1) sit at a corner, e.g. expected bits (0, 1) at (0.0, 1.0), with (0.0, 0.0) and (1.0, 0.0) also marked]
Soft-decision decoding
• Different branch metric, hence different path metric
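The deck doesn't pin the metric down here; a common choice (an assumption in this sketch) is the squared Euclidean distance from the received voltage pair to the branch's expected corner of the unit square, with path metrics summed exactly as before:

```python
def soft_branch_metric(expected_bits, received_volts):
    """Squared Euclidean distance from the received voltages to the corner
    of the unit square given by the branch's expected parity bits."""
    return sum((float(e) - v) ** 2
               for e, v in zip(expected_bits, received_volts))

# A strong and a weak "1" now score differently against an expected 1:
assert soft_branch_metric((1,), (0.99,)) < soft_branch_metric((1,), (0.51,))
```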
Putting it together:
Convolutional coding in Wi-Fi
[Figure: sender pipeline: data bits → convolutional encoder → modulation (BPSK, QPSK, …) → wireless channel → demodulation → Viterbi decoder → data bits at the receiver]
Thursday Topic:
Rateless Codes
Friday Precept:
Midterm Review