Compressive Nonsensing
Richard Baraniuk
Rice University
Chapter 1
The Problem
challenge 1
data too expensive
Case in Point: MR Imaging
• Measurements are very expensive
• $1-3 million per machine
• 30 minutes per scan
Case in Point: IR Imaging
challenge 2
too much data
Case in Point: DARPA ARGUS-IS
• 1.8 Gpixel image sensor
– video rate output: 444 Gbits/s
– comm data rate: 274 Mbits/s
– a factor of 1600x, way out of reach of existing compression technology
• Reconnaissance without conscience
– too much data to transmit to a ground station
– too much data to make effective real-time decisions
Chapter 2
The Promise
COMPRESSIVE SENSING
innovation 1
sparse signal models
Sparsity
[Figure: image pixels → large wavelet coefficients (blue = 0); wideband signal time samples → large Gabor (time-frequency) coefficients]
Sparsity
[Figure: image pixels → large wavelet coefficients (blue = 0)]
• sparse signal = nonlinear signal model with few nonzero coefficients
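To make the sparsity idea concrete, here is a minimal sketch (not from the slides) showing that a signal built from a few tones is sparse in the Fourier domain, in the same spirit as the wavelet and Gabor examples above; the signal, length, and threshold are illustrative choices:

import numpy as np

# A signal made of two tones: 1024 time samples, but only a few
# significant coefficients in the frequency domain (illustrative).
N = 1024
t = np.arange(N) / N
x = np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 120 * t)

X = np.fft.rfft(x) / N                            # frequency-domain coefficients
K = np.sum(np.abs(X) > 0.01 * np.abs(X).max())    # count the "large" ones

print(f"time-domain samples: {N}")
print(f"significant frequency coefficients: {int(K)}")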
innovation 2
dimensionality reduction for sparse signals
Dimensionality Reduction
• When data is sparse/compressible, we can directly acquire a compressed representation with no/little information loss through linear dimensionality reduction
[Figure: measurements y = Φx of a sparse signal x with K nonzero entries]
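A minimal sketch of the linear measurement model above, assuming the standard y = Φx setup with a random Gaussian Φ; the dimensions N, M, K are illustrative:

import numpy as np

rng = np.random.default_rng(0)
N, M, K = 1000, 100, 10          # ambient dimension, measurements, sparsity (illustrative)

# K-sparse signal: K nonzero entries at random locations
x = np.zeros(N)
x[rng.choice(N, K, replace=False)] = rng.standard_normal(K)

# Linear dimensionality reduction: y = Phi @ x with M << N
Phi = rng.standard_normal((M, N)) / np.sqrt(M)   # random Gaussian measurement matrix
y = Phi @ x

print(f"{M} measurements acquired for a length-{N} signal with {K} nonzeros "
      f"({N // M}x reduction)")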
Stable Embedding
• An information-preserving projection Φ preserves the geometry of the set of sparse signals (a union of K-dim subspaces)
• SE ensures that, for all K-sparse x1, x2,
(1 - δ) ||x1 - x2||² ≤ ||Φx1 - Φx2||² ≤ (1 + δ) ||x1 - x2||²
Random Embedding is Stable
• Measurements y = random linear combinations of the entries of the sparse signal x
[Figure: y = Φx with a random Φ; sparse signal x with K nonzero entries]
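The stability claim can be checked empirically. The sketch below (an illustration, not a proof) draws random pairs of sparse signals and verifies that a random Gaussian Φ keeps the ratio ||Φ(x1 - x2)|| / ||x1 - x2|| close to 1; all dimensions are assumptions:

import numpy as np

rng = np.random.default_rng(1)
N, M, K = 1000, 100, 5           # illustrative dimensions

def sparse_vec():
    v = np.zeros(N)
    v[rng.choice(N, K, replace=False)] = rng.standard_normal(K)
    return v

# Rows scaled so that E ||Phi @ x||^2 = ||x||^2
Phi = rng.standard_normal((M, N)) / np.sqrt(M)

# A stable embedding keeps pairwise distances between sparse signals
# nearly unchanged, i.e. these ratios stay close to 1.
ratios = []
for _ in range(1000):
    d = sparse_vec() - sparse_vec()
    ratios.append(np.linalg.norm(Phi @ d) / np.linalg.norm(d))

print(f"distance ratios: min = {min(ratios):.2f}, max = {max(ratios):.2f}")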
innovation 3
sparsity-based signal recovery
Signal Recovery
• Problem: the random projection Φ is not full rank (ill-posed inverse problem)
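One standard way to resolve the ill-posedness is sparsity-based recovery via L1 minimization (basis pursuit). A minimal sketch, assuming noiseless measurements and recasting min ||x||1 s.t. Φx = y as a linear program via the split x = u - v; dimensions are illustrative:

import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(2)
N, M, K = 200, 60, 6             # illustrative dimensions

# Ground-truth K-sparse signal and random measurements y = Phi @ x
x_true = np.zeros(N)
x_true[rng.choice(N, K, replace=False)] = rng.standard_normal(K)
Phi = rng.standard_normal((M, N)) / np.sqrt(M)
y = Phi @ x_true

# Basis pursuit:  minimize ||x||_1  subject to  Phi @ x = y.
# With x = u - v and u, v >= 0, this is the LP: min sum(u) + sum(v)
# subject to [Phi, -Phi] @ [u; v] = y.
c = np.ones(2 * N)
res = linprog(c, A_eq=np.hstack([Phi, -Phi]), b_eq=y, bounds=(0, None))
x_hat = res.x[:N] - res.x[N:]

print(f"recovery error: {np.linalg.norm(x_hat - x_true):.2e}")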
“Single-Pixel” CS Camera (w/ Kevin Kelly)
[Figure: scene → random pattern on DMD array → single photon detector → image reconstruction or processing]
[Figure: spectrogram / sparsogram]
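A toy simulation of the single-pixel measurement process sketched in the figure: each detector reading is the inner product of the scene with one random 0/1 DMD pattern. The scene, sizes, and undersampling factor are illustrative assumptions:

import numpy as np

rng = np.random.default_rng(3)

# Toy "scene": a 32x32 image with a bright rectangle, flattened to a vector
n = 32
scene = np.zeros((n, n))
scene[8:24, 12:20] = 1.0
x = scene.ravel()
N = x.size
M = N // 4                        # 4x fewer detector readings than pixels (illustrative)

# Each measurement: the DMD displays a random 0/1 pattern and the single
# photodetector records the total reflected light (one inner product per pattern).
patterns = rng.integers(0, 2, size=(M, N)).astype(float)
y = patterns @ x

print(f"{M} detector readings for a {n}x{n} ({N}-pixel) scene")
# Image reconstruction would then use a sparsity-based solver, as sketched earlier.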
challenge 1: data too expensive
CS means fewer expensive measurements are needed for the same resolution scan
challenge 2: too much data
CS means we compress on the fly as we acquire data
EXCITING!!!
2004—2014
9797 citations
6640 citations
Chapter 3
The Hype
CS is Growing Up
Gerhard Richter
4096 Colours
muralsoflajolla.com/roy-mcmakin-mural
“L1 is the new L2”
- Stan Osher
Exponential Growth
?
Chapter 4
The Fallout
“L1 is the new L2”
- Stan Osher
CS for “Face Recognition”
From: M. V.
Subject: Interesting application for compressed sensing
Date: June 10, 2011 at 11:37:31 PM EDT
To: candes@stanford.edu, jrom@ece.gatech.edu
I'm excited about the applications CS could have in many fields, but today I was
reminded of a specific application where CS could conceivably settle an
area of dispute between mainstream historians and Roswell UFO theorists.
As outlined in the linked video below, Dr. Rudiak has analyzed photos from 1947
in which a General Ramey appears holding a typewritten letter from which Rudiak
believes he has been able to discern a number of words which he believes
substantiate the extraterrestrial hypothesis for the Roswell Incident. For
your perusal, I've located a "hi-res" copy of the cropped image of the letter in
Ramey's hand.
Back to Reality
• “There's no such thing as a free lunch”
• Dimensionality reduction is no exception
• Result: Compressive Nonsensing
Nonsense 1
Robustness
Measurement Noise
• Stable recovery with additive measurement noise
• Noise is added to the measurements y
• Noise is added to the signal x itself (noise folding)
[Plot: CS recovered signal SNR vs. undersampling; slope = -3]
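The -3 slope can be reproduced in a toy experiment. In the sketch below, white noise sits on the signal itself (not the measurements), and recovery uses oracle least squares on the known support to isolate the noise-folding effect; each halving of M costs roughly 3 dB of recovered SNR. Parameters are illustrative:

import numpy as np

rng = np.random.default_rng(4)
N, K, sigma = 4096, 8, 0.01       # white noise on the signal, not the measurements

x0 = np.zeros(N)
support = rng.choice(N, K, replace=False)
x0[support] = 1.0

for M in [2048, 1024, 512, 256]:  # each step halves the number of measurements
    snrs = []
    for _ in range(10):
        Phi = rng.standard_normal((M, N)) / np.sqrt(M)
        y = Phi @ (x0 + sigma * rng.standard_normal(N))   # signal noise folds into y
        # Oracle recovery: least squares on the true support, isolating the folding effect
        x_hat = np.zeros(N)
        x_hat[support], *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        err = np.linalg.norm(x_hat - x0) ** 2
        snrs.append(10 * np.log10(np.linalg.norm(x0) ** 2 / err))
    print(f"M = {M:4d}: recovered SNR ~ {np.mean(snrs):5.1f} dB")
# The recovered SNR drops by roughly 3 dB each time M is halved (the slope = -3 above).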
“Tail Folding”
• “signal” + “tail”
Dynamic Range
• As the amount of subsampling grows, we can employ an ADC with a lower sampling rate and hence a higher-resolution quantizer
• CS can significantly boost the ENOB of an ADC system for sparse signals
[Plot: stated number of bits; CS ADC w/ sparsity vs. conventional ADC]
• Dynamic range: up to 7.9dB gain for each doubling of the subsampling factor
Adaptivity
• Then we boost the SNR by …
• Motivates adaptive sensing strategies that bypass the noise-folding tradeoff
[Haupt, Castro, Nowak, B 2009; Candes, Davenport 2011]
Nonsense 2
Quantization
CS and Quantization
• Vast majority of work in CS assumes the measurements are real-valued
• Limited progress
– large number of bits per measurement
– 1 bit per measurement
CS and Quantization
[Plot: N=2000, K=20, M = (total bits)/(bits per measurement); curves for 1, 2, 4, 6, 8, 10, 12 bits per measurement]
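A rough sketch of the fixed-bit-budget tradeoff behind the plot: with total bits B = M × b fixed, more bits per measurement means fewer measurements. The sketch quantizes the measurements uniformly and recovers with orthogonal matching pursuit (used here as a stand-in solver); N and K follow the slide, while the bit budget, quantizer, and solver are assumptions:

import numpy as np

rng = np.random.default_rng(5)
N, K = 2000, 20                   # from the slide
total_bits = 2400                 # assumed bit budget

x_true = np.zeros(N)
x_true[rng.choice(N, K, replace=False)] = rng.standard_normal(K)

def omp(Phi, y, K):
    """Plain orthogonal matching pursuit: greedily pick K columns, refit by least squares."""
    residual, support = y.copy(), []
    for _ in range(K):
        support.append(int(np.argmax(np.abs(Phi.T @ residual))))
        coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        residual = y - Phi[:, support] @ coef
    x_hat = np.zeros(Phi.shape[1])
    x_hat[support] = coef
    return x_hat

for b in [2, 4, 6, 8, 12]:        # bits per measurement
    M = total_bits // b           # fixed budget: more bits per measurement -> fewer measurements
    Phi = rng.standard_normal((M, N)) / np.sqrt(M)
    y = Phi @ x_true
    # Uniform b-bit quantizer over the observed measurement range
    delta = 2 * np.abs(y).max() / 2 ** b
    y_q = delta * (np.floor(y / delta) + 0.5)
    x_hat = omp(Phi, y_q, K)
    snr = 10 * np.log10(np.sum(x_true ** 2) / np.sum((x_hat - x_true) ** 2))
    print(f"{b:2d} bits/meas, M = {M:4d}: recovery SNR ~ {snr:5.1f} dB")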
Nonsense 3
Weak Models
Weak Models
[Plot: Spectral CS recovery; best case, average case, worst case; 20dB]
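One simple illustration of why naive spectral sparsity models are weak: a tone that falls exactly on a DFT bin is 1-sparse, while a tone halfway between bins leaks across many coefficients. A minimal sketch with an illustrative length and threshold:

import numpy as np

N = 256
t = np.arange(N)

def dft_sparsity(freq, thresh=0.01):
    """Number of DFT coefficients holding more than `thresh` of the peak magnitude."""
    X = np.abs(np.fft.rfft(np.cos(2 * np.pi * freq * t / N)))
    return int(np.sum(X > thresh * X.max()))

print("on-grid tone  (f = 10.0):", dft_sparsity(10.0), "significant coefficients")
print("off-grid tone (f = 10.5):", dft_sparsity(10.5), "significant coefficients")
# The off-grid tone leaks across many DFT bins, so a "sparse in the DFT basis"
# model can fail badly for worst-case frequencies.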
Nonsense 4
Focus on Recovery
Misguided Focus on Recovery
• Common issue: L unknown articulation parameters
[Plot: classification rate (%) vs. increasing noise]
[Plot: CS-PLL w/ 20x undersampling]
Nonsense 5
Weak Guarantees
Performance Guarantees
• CS performance guarantees: RIP, incoherence, phase transition
12-Step Program
To End Compressive Nonsensing
1. Don’t give in to the hype surrounding CS
2. Resist the urge to blindly apply L1 minimization
3. Face up to robustness issues
4. Deal with measurement quantization
5. Develop more realistic signal models
6. Develop practical sensing matrices beyond random
7. Develop more efficient recovery algorithms
8. Develop rigorous performance guarantees for practical CS systems
9. Exploit signals directly in the compressive domain
10. Don’t give in to the hype surrounding CS
11. Resist the urge to blindly apply L1 minimization
12. Don’t give in to the hype surrounding CS