An Introduction to Adaptive Filtering & Its Applications
Asst. Prof. Dr. Thamer M. Jamel
Department of Electrical Engineering
University of Technology
Baghdad, Iraq
Introduction
Linear filters: the filter output is a linear function of the filter input.
Design methods:
1. The classical approach: frequency-selective filters such as low-pass / band-pass / notch filters, etc.
2. Optimal filter design: the Wiener filter, based on the work of Wiener in 1942 and Kolmogorov in 1939. It relies on a priori statistical information; when such a priori information is not available, which is usually the case, it is not possible to design a Wiener filter in the first place.
Adaptive filter
An adaptive filter is used when the signal and/or noise characteristics are not known a priori and may be time-varying.
Adaptive filter
The basic operation now involves two processes:
1. a filtering process, which produces an output signal in response to a given input signal.
2. an adaptation process, which aims to adjust the filter parameters (the filter transfer function) to the (possibly time-varying) environment.
Often, the (average) square value of the error signal is used as the optimization criterion (see the sketch below).
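As a rough sketch of this two-process loop (illustrative Python, hypothetical names, real-valued FIR filter assumed), the filter output, the error against the desired response, and the parameter update are computed sample by sample:

```python
import numpy as np

def adaptive_filter(x, d, num_taps, update_rule):
    """Generic adaptive FIR filter loop (sketch, not a specific algorithm).

    x, d        : input and desired-response sequences
    update_rule : callable (w, u, e) -> new tap weights (the adaptation process)
    """
    w = np.zeros(num_taps)                 # filter parameters (tap weights)
    u = np.zeros(num_taps)                 # most recent input samples
    y, e = np.zeros(len(x)), np.zeros(len(x))
    for n in range(len(x)):
        u = np.r_[x[n], u[:-1]]            # shift the newest sample into the tap-delay line
        y[n] = w @ u                       # 1. filtering process
        e[n] = d[n] - y[n]                 # error signal (its average square is minimized)
        w = update_rule(w, u, e[n])        # 2. adaptation process
    return y, e, w
```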
Adaptive filter
Because of the complexity of the optimizing algorithms, most adaptive filters are digital. When the signals to be processed are analog, the adaptive filter is then preceded by A/D and followed by D/A converters.
Adaptive filter
The generalization to adaptive IIR filters is also possible, but adaptive FIR filters are preferred because they are inherently stable.
Applications of Adaptive Filters: Identification
Used to provide a linear model of an unknown plant.
Applications:
System identification
Applications of Adaptive Filters: Inverse Modeling
Used to provide an inverse model of an unknown plant.
Applications:
Equalization (communication channels)
Applications of Adaptive Filters: Prediction
Used to provide a prediction of the present value of a random signal.
Applications of Adaptive Filters: Interference Cancellation
Used to cancel unknown interference from a primary signal.
Applications:
Echo / noise cancellation (hands-free car phone, aircraft headphones, etc.)
Acoustic Echo Cancellation
LMS Algorithm
The most popular adaptation algorithm is the LMS (least mean square) algorithm.
It belongs to the family of stochastic gradient algorithms.
It is an approximation of the steepest descent method.
It is based on the MMSE (minimum mean square error) criterion.
The adaptive process has two input signals, the tap inputs u(n-k) and the estimation error e(n), which together drive the adjustment of the tap weights.
Filter output:
    y(n) = \sum_{k=0}^{M-1} w_k^*(n) \, u(n-k)
Estimation error:
    e(n) = d(n) - y(n)
Tap-weight adaptation:
    w_k(n+1) = w_k(n) + \mu \, u(n-k) \, e^*(n), \qquad k = 0, 1, \ldots, M-1
i.e., (update value of the tap-weight vector) = (old value of the tap-weight vector) + (learning-rate parameter) × (tap-input vector) × (error signal).
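A minimal NumPy sketch of this recursion for real-valued signals (so the conjugates drop out); the step size, tap count, and the toy "unknown plant" in the demo are illustrative assumptions:

```python
import numpy as np

def lms(x, d, num_taps=8, mu=0.05):
    """LMS adaptive filter: w_k(n+1) = w_k(n) + mu * u(n-k) * e(n)."""
    w = np.zeros(num_taps)               # tap-weight vector, w(0) = 0
    y = np.zeros(len(x))                 # filter output
    e = np.zeros(len(x))                 # estimation error
    u = np.zeros(num_taps)               # tap-input vector [u(n), ..., u(n-M+1)]
    for n in range(len(x)):
        u = np.r_[x[n], u[:-1]]          # shift in the newest input sample
        y[n] = w @ u                     # filter output y(n)
        e[n] = d[n] - y[n]               # estimation error e(n) = d(n) - y(n)
        w = w + mu * u * e[n]            # tap-weight adaptation
    return y, e, w

# Demo in the identification configuration: model an assumed "unknown" FIR plant
rng = np.random.default_rng(0)
x = rng.standard_normal(5000)            # white input
plant = np.array([0.8, -0.4, 0.2])       # unknown system (assumed for the demo)
d = np.convolve(x, plant)[:len(x)]       # desired response = plant output
_, e, w = lms(x, d, num_taps=3, mu=0.05)
print(np.round(w, 3))                    # converges close to [0.8, -0.4, 0.2]
```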
Stability of LMS
The LMS algorithm is convergent in the mean square provided that the step-size parameter \mu satisfies 0 < \mu < 2/\lambda_{max}, where \lambda_{max} is the largest eigenvalue of the correlation matrix of the tap inputs.
A larger step size speeds up adaptation but increases the residual mean-squared error.
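As a sketch of how this bound can be estimated in practice (assuming a wide-sense stationary input and a biased autocorrelation estimate):

```python
import numpy as np

def lms_step_size_bound(x, num_taps):
    """Estimate the upper bound 2 / lambda_max on the LMS step size."""
    # Biased autocorrelation estimates r(0), ..., r(M-1)
    r = np.array([np.mean(x[:len(x) - k] * x[k:]) for k in range(num_taps)])
    # Correlation matrix of the tap inputs (symmetric Toeplitz)
    R = np.array([[r[abs(i - j)] for j in range(num_taps)] for i in range(num_taps)])
    lam_max = np.max(np.linalg.eigvalsh(R))
    return 2.0 / lam_max

x = np.random.default_rng(1).standard_normal(10_000)
print(lms_step_size_bound(x, num_taps=8))   # close to 2 for unit-variance white noise
```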
Given the function y = C_1^2 + C_2^2, we need to obtain the vector [C_1, C_2] that gives the absolute minimum. It is obvious that C_1 = C_2 = 0 gives the minimum.
The gradient components are
    \frac{dy}{dC_1} = 2C_1, \qquad \frac{dy}{dC_2} = 2C_2
and the steepest-descent update (with step size \mu = 0.05) is
    [C_1, C_2]^T[n+1] = [C_1, C_2]^T[n] - \mu \, [2C_1, 2C_2]^T[n] = [C_1, C_2]^T[n] - 0.1\,[C_1, C_2]^T[n] = 0.9\,[C_1, C_2]^T[n]
Iteration 1 (initial guess):  C_1 = 5,     C_2 = 7
Iteration 2:                  C_1 = 4.5,   C_2 = 6.3
Iteration 3:                  C_1 = 4.05,  C_2 = 5.67
......
Iteration 60:                 C_1 \approx 0.01,  C_2 \approx 0.013
so that
    \lim_{n \to \infty} C_1[n] = 0, \qquad \lim_{n \to \infty} C_2[n] = 0
As we can see, the vector [C_1, C_2] converges to the value that yields the function minimum, and the speed of this convergence depends on the step size \mu.
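The same iteration can be reproduced in a few lines (the step size 0.05 is inferred from the 0.9 contraction factor above):

```python
import numpy as np

# Steepest descent on y = c1**2 + c2**2 with gradient [2*c1, 2*c2] and mu = 0.05
c = np.array([5.0, 7.0])                   # iteration 1: the initial guess
for n in range(2, 61):
    c = c - 0.05 * (2.0 * c)               # c[n+1] = c[n] - mu*grad = 0.9*c[n]
    print(f"Iteration {n}: c1 = {c[0]:.4f}, c2 = {c[1]:.4f}")
# prints 4.5/6.3, 4.05/5.67, ... and approaches the minimum at (0, 0)
```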
SMART ANTENNAS
Adaptive Array Antenna
[Block diagram: adaptive array with a linear combiner adapting the element weights to suppress interference.]
Adaptive Equalization
Introduction
Wireless communication is the most interesting field of communication these days, because it supports mobility (mobile users). However, many applications of wireless communication now require high-speed communications (high data rates).
Cause of ISI
ISI arises from the band-limiting effect of a practical channel and also from multipath propagation (delay spread).
Need for equalization: to mitigate the effects of ISI and so decrease the probability of error that would occur without ISI suppression; however, this reduction of ISI has to be balanced against the prevention of noise-power enhancement.
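As a sketch of training-based adaptive equalization using the LMS update from earlier (the channel model, training length, decision delay, and step size below are illustrative assumptions, not values from the slides):

```python
import numpy as np

rng = np.random.default_rng(2)
num_taps, mu, delay = 11, 0.01, 7              # equalizer length, step size, decision delay

symbols = rng.choice([-1.0, 1.0], size=20_000)             # BPSK training sequence
channel = np.array([0.3, 0.9, 0.4])                        # dispersive channel -> ISI
received = np.convolve(symbols, channel)[:len(symbols)]
received += 0.05 * rng.standard_normal(len(received))      # additive noise

w = np.zeros(num_taps)                         # equalizer tap weights
u = np.zeros(num_taps)                         # equalizer tap-delay line
errors = 0
for n in range(len(received)):
    u = np.r_[received[n], u[:-1]]
    y = w @ u                                  # equalizer output
    d = symbols[n - delay] if n >= delay else 0.0           # delayed training symbol
    e = d - y                                  # error driving the adaptation
    w += mu * u * e                            # LMS tap-weight update
    if n > 5_000:                              # after convergence, count decision errors
        errors += int(np.sign(y) != d)
print("symbol errors after training:", errors)
```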
Types of Equalization Techniques
Non-linear equalization techniques are more complex to implement, but have much less noise enhancement than linear equalizers.
Equalization Techniques
Algorithm     Multiplying operations   Complexity   Convergence   Tracking
LMS           2N + 1                   Low          Slow          Poor
MMSE          N^2 to N^3               Very high    Fast          Good
RLS           2.5N^2 + 4.5N            High         Fast          Good
Fast Kalman   20N + 5                  Fairly low   Fast          Good
RLS-DFE       1.5N^2 + 6.5N            High         Fast          Good
Adaptive Noise Cancellation
[Block diagram: filter input x(n) feeds the adaptive filter, which produces the filter output y(n); the error output is e(n).]
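A minimal sketch of this configuration (the signals, noise path, and filter settings below are made up for illustration): the reference input x(n) carries noise only, the adaptive filter output y(n) estimates the noise reaching the primary input, and the error e(n) becomes the cleaned output.

```python
import numpy as np

rng = np.random.default_rng(3)
N, num_taps, mu = 20_000, 16, 0.005

t = np.arange(N)
speech = np.sin(2 * np.pi * 0.01 * t)                        # stand-in for the wanted signal
noise_src = rng.standard_normal(N)                           # noise source
x = noise_src                                                # reference input: noise only
noise_at_primary = np.convolve(noise_src, [0.6, 0.3, -0.2])[:N]  # noise path to primary input
d = speech + noise_at_primary                                # primary input: signal + noise

w, u = np.zeros(num_taps), np.zeros(num_taps)
e = np.zeros(N)
for n in range(N):
    u = np.r_[x[n], u[:-1]]
    y = w @ u                        # y(n): estimate of the noise in the primary input
    e[n] = d[n] - y                  # e(n): error output = cleaned signal estimate
    w += mu * u * e[n]               # LMS adaptation driven by the error

print("residual noise power:", np.mean((e[N // 2:] - speech[N // 2:]) ** 2))
```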
LMS Convergence vs. μ
[Block diagram: the far microphone supplies the noise reference x(n); the adaptive filter output y(n) estimates the noise; the error e(n) is the speech output.]
Convergence of the LMS algorithm is controlled by the step size (mu). This is the critical variable.
Adaptive Echo Cancellation
Thank You