Chapter 4: THE LMS ALGORITHM


THE LMS ALGORITHM

See:
1. "Adaptive Filter Theory", Fifth Edition, by Simon Haykin.
2. "Digital Signal Processing Fundamentals and Applications", Second Edition, by Li Tan and Jean Jiang, Chapter 10.
WIENER FILTER
• The Wiener filter is a linear filter used in signal
processing to estimate an unknown signal
corrupted by noise.
• The filter minimizes the mean squared error
between the estimated signal and the true signal.
• One common approach to calculating the Wiener
filter uses the autocorrelation function of the noisy
input together with the cross-correlation between the
input and the desired signal. This approach is known
as the autocorrelation method.
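As a rough illustration of this route, the following sketch estimates the correlations from data and solves the Wiener-Hopf equations R w = p with NumPy; the signals, the filter length M and the noise level are assumptions chosen only for this example.

import numpy as np

rng = np.random.default_rng(0)
N, M = 10_000, 8                        # number of samples and filter length (assumed)
d = np.sin(0.05 * np.arange(N))         # "true" desired signal (assumption)
x = d + 0.5 * rng.standard_normal(N)    # observation corrupted by white noise

# Estimate the autocorrelation r[k] of the input and the cross-correlation
# p[k] between the desired signal and the input.
r = np.array([np.dot(x[k:], x[:N - k]) / N for k in range(M)])
p = np.array([np.dot(d[k:], x[:N - k]) / N for k in range(M)])

R = np.array([[r[abs(i - j)] for j in range(M)] for i in range(M)])  # Toeplitz correlation matrix
w = np.linalg.solve(R, p)               # Wiener-Hopf equations: R w = p
d_hat = np.convolve(x, w)[:N]           # MMSE estimate of the desired signal

Solving R w = p directly requires estimating and factorizing the M x M correlation matrix; this is exactly the work the iterative methods described next avoid.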
ITERATIVE METHOD

• Another approach is the iterative method.
• This method involves iteratively refining an initial
estimate of the filter until convergence is achieved.
• The advantage of the iterative method is that it can
handle more complex signal and noise models, and it
can converge to a better filter estimate than the
autocorrelation method. However, it can be
computationally more expensive and may require
careful initialization of the filter.
LMS ALGORITHM
The LMS algorithm, pioneered by Widrow and Hoff (1960), is widely used due to its
simplicity and low computational complexity.
Distinctive features of this algorithm can be summarized as follows:
1. The LMS algorithm is simple, meaning that the computational complexity of
the algorithm scales linearly with the dimensionality of the finite-
duration impulse response (FIR) filter around which the algorithm
operates.
2. Unlike the Wiener filter, the algorithm does not require knowledge of
statistical characteristics of the environment in which it operates.
3. The algorithm is robust in a deterministic sense (i.e., single realization of
the algorithm) in the face of unknown environmental disturbances.
4. Last but by no means least, the algorithm does not require inversion of
the correlation matrix of the input vector, which, therefore, makes it
simpler than its counterpart, namely, the RLS algorithm.
STRUCTURAL DESCRIPTION OF THE LMS
ALGORITHM
THE STEPS OF THE ITERATIVE METHOD

1. Define an initial estimate of the filter, W(0), usually a
zero filter or an identity filter.
2. Use this filter to estimate the signal from the
corrupted signal.
3. Use the estimated signal and the desired (true) signal
to compute the error signal.
4. Calculate the new values of the tap weights, W(n+1),
from the error signal.
5. Repeat steps 2-4 until the filter converges (a code
sketch of these steps follows below).
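A minimal sketch of steps 1-5, using the LMS weight update as the refinement rule; the signal names x (corrupted input) and d (desired signal), the filter length M and the step size mu are assumptions chosen for illustration.

import numpy as np

def lms(x, d, M=8, mu=0.01):
    """Iteratively refine a length-M FIR filter on the signals x and d."""
    N = len(x)
    w = np.zeros(M)                      # step 1: initial estimate W(0) = 0
    e = np.zeros(N)
    for n in range(M, N):
        u = x[n - M + 1:n + 1][::-1]     # most recent M input samples
        y = w @ u                        # step 2: estimate of the desired signal
        e[n] = d[n] - y                  # step 3: error signal
        w = w + mu * e[n] * u            # step 4: updated tap weights W(n+1)
    return w, e                          # step 5: loop over the data until convergence

In practice the loop runs over the available data (or in real time as new samples arrive), and convergence is judged from the decay of the averaged squared error.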
LMS ALGORITHM EQUATIONS
• In the Wiener filter, the cost function is the mean-square value
of the estimation error, J(n) = E[e²(n)].

• The expectation operator E performs ensemble averaging over
a large number of statistically independent realizations of the
instantaneous value of the squared estimation error, e²(n),
evaluated at time n.
LMS ALGORITHM EQUATIONS

Substituting Eq. (5.3) into Eq. (5.4), we obtain:
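The resulting equation is not reproduced on this slide; the following is a sketch of the standard result, stated here as an assumption: taking the instantaneous squared error e²(n) as the cost, its gradient with respect to the weights is -2 e(n) x(n), and (with the factor of 2 absorbed into the step size μ) the stochastic-gradient update becomes

\begin{align}
  e(n) &= d(n) - \hat{\mathbf{w}}^{\mathsf T}(n)\,\mathbf{x}(n) \\
  \hat{\mathbf{w}}(n+1) &= \hat{\mathbf{w}}(n) + \mu\,\mathbf{x}(n)\,e(n)
\end{align}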


Another Way of Deriving the LMS Algorithm
• The update formula of Eq. (5.6) may also be obtained directly
from Eq. (4.10), which describes iterative computation of the
Wiener filter using the method of steepest descent covered in
the previous chapter.

• But the correlation matrix R and the cross-correlation vector p
appearing in that recursion are not known in practice,

• and they are therefore replaced by instantaneous estimates
computed from the data available at time n, as sketched below.
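The two relations are not shown on the slide; the following sketch gives the standard (assumed) forms: the first line is the steepest-descent recursion (presumably Eq. 4.10), the second line the instantaneous estimates of R and p, and the third the resulting LMS update.

\begin{align}
  \mathbf{w}(n+1) &= \mathbf{w}(n) + \mu\left[\mathbf{p} - \mathbf{R}\,\mathbf{w}(n)\right] \\
  \hat{\mathbf{R}}(n) &= \mathbf{x}(n)\,\mathbf{x}^{\mathsf T}(n), \qquad
  \hat{\mathbf{p}}(n) = d(n)\,\mathbf{x}(n) \\
  \hat{\mathbf{w}}(n+1) &= \hat{\mathbf{w}}(n)
    + \mu\,\mathbf{x}(n)\left[d(n) - \mathbf{x}^{\mathsf T}(n)\,\hat{\mathbf{w}}(n)\right]
    = \hat{\mathbf{w}}(n) + \mu\,e(n)\,\mathbf{x}(n)
\end{align}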
Another Way of Deriving the LMS Algorithm

• There is an important point that should be carefully noted here:


• Whatever connection may have existed between the LMS
algorithm and the Wiener filter is completely destroyed once
the expectation operator 𝔼 is removed in deriving the LMS
algorithm from the Wiener filter.
Comparison Between the Steepest Descent Method and the LMS Algorithm
• Steepest descent and LMS (least mean squares) are both
iterative algorithms used to minimize a cost function, for
example the mean-square error in linear filtering or linear
regression. However, they differ in their approach and in how
they update the model parameters.
• Steepest Descent is a gradient descent-based algorithm that
updates the model parameters in the direction of the steepest
descent of the cost function. In other words, it takes small steps
in the direction of the negative gradient of the cost function to
reach the minimum. Steepest Descent requires the computation
of the gradient of the cost function with respect to the model
parameters at each iteration. This can be computationally
expensive, especially for high-dimensional datasets.
Comparison Between the Steepest Descent Method and the LMS Algorithm
• LMS, on the other hand, is a specific form of the stochastic
gradient descent algorithm. It updates the model parameters
based on the gradient of the cost function at each individual data
point in the training set, rather than using the gradient over the
entire dataset. The LMS algorithm is commonly used in adaptive
filters, where the filter coefficients are adjusted iteratively in
response to the input signals.
• The main difference between steepest descent and LMS lies in
convergence behavior and computational complexity. Steepest
descent uses the exact gradient, so it converges smoothly, but
each iteration is expensive and it requires knowledge of the
signal statistics (or of the whole dataset). LMS uses a noisy
single-sample gradient estimate, so its convergence is slower and
noisier and it exhibits misadjustment around the optimum; in
return, each update is very cheap, which makes LMS the preferred
choice for long data records and real-time adaptive filtering.
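The contrast can be seen in a small numerical sketch; the data, the unknown coefficient vector h_true, the filter length M and the step sizes are assumptions made only for this comparison.

import numpy as np

rng = np.random.default_rng(1)
N, M = 5000, 4
x = rng.standard_normal(N)
h_true = np.array([0.6, -0.3, 0.2, 0.1])              # coefficients to be learned (assumption)
d = np.convolve(x, h_true)[:N] + 0.05 * rng.standard_normal(N)

# Data matrix whose n-th row holds x(n), x(n-1), ..., x(n-M+1).
X = np.column_stack([np.concatenate([np.zeros(k), x[:N - k]]) for k in range(M)])
R = X.T @ X / N                                       # estimated correlation matrix
p = X.T @ d / N                                       # estimated cross-correlation vector

# Steepest descent: each iteration uses the exact (estimated) gradient p - R w.
w_sd = np.zeros(M)
for _ in range(200):
    w_sd = w_sd + 0.5 * (p - R @ w_sd)

# LMS: one cheap update per sample, using the instantaneous gradient only.
w_lms = np.zeros(M)
for n in range(N):
    e = d[n] - X[n] @ w_lms
    w_lms = w_lms + 0.01 * e * X[n]

print(np.round(w_sd, 3), np.round(w_lms, 3))          # both approach h_true

Note that steepest descent needs R and p (and hence the whole data record, or known statistics) before it can start, whereas LMS adapts sample by sample.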
Summary of the LMS Algorithm
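The summary table itself is not reproduced here; for reference, a standard real-valued statement of the algorithm, assumed to match the notation used above, is:
• Parameters: filter length M and step size μ, with μ chosen small enough for stability (a common rule of thumb is 0 < μ < 2 / (M · tap-input power)).
• Initialization: ŵ(0) = 0, unless prior knowledge of the weights is available.
• For each time step n: form the tap-input vector x(n), compute the output y(n) = ŵᵀ(n) x(n), the error e(n) = d(n) − y(n), and the update ŵ(n+1) = ŵ(n) + μ e(n) x(n).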
APPLICATIONS: NOISE CANCELLATION
See "Digital Signal Processing Fundamentals and Applications", Second Edition, by
Li Tan and Jean Jiang, Chapter 10.
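As a rough illustration of the noise-cancellation setup (primary input = signal plus correlated noise, reference input = the noise source picked up by a second sensor), the following sketch uses the LMS update; all signals, the noise path, the filter length and the step size are assumptions for this example, not taken from the textbook.

import numpy as np

rng = np.random.default_rng(2)
N, M, mu = 8000, 16, 0.005
s = np.sin(2 * np.pi * 0.01 * np.arange(N))          # clean signal (assumption)
ref = rng.standard_normal(N)                         # reference noise from a second sensor
noise_path = np.array([0.8, 0.4, -0.2])              # unknown path from reference to primary sensor
primary = s + np.convolve(ref, noise_path)[:N]       # primary input: signal + correlated noise

w = np.zeros(M)
cleaned = np.zeros(N)                                # e(n): the noise-cancelled output
for n in range(M, N):
    u = ref[n - M + 1:n + 1][::-1]                   # recent reference samples
    y = w @ u                                        # estimate of the noise in the primary input
    cleaned[n] = primary[n] - y                      # error signal = cleaned signal estimate
    w = w + mu * cleaned[n] * u                      # LMS update
# After the filter converges, cleaned[n] is close to s[n].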
APPLICATIONS: System Modeling
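In system modeling (system identification), the unknown system and the adaptive filter are driven by the same input, and the system output serves as the desired signal. A minimal sketch, with the unknown system, excitation and parameters invented for illustration:

import numpy as np

rng = np.random.default_rng(3)
N, M, mu = 5000, 8, 0.01
unknown = rng.uniform(-1, 1, size=M)                 # unknown FIR system to be modeled (assumption)
x = rng.standard_normal(N)                           # common excitation signal
d = np.convolve(x, unknown)[:N]                      # desired signal = output of the unknown system

w = np.zeros(M)                                      # adaptive model of the system
for n in range(M, N):
    u = x[n - M + 1:n + 1][::-1]
    e = d[n] - w @ u                                 # modeling error
    w = w + mu * e * u                               # LMS update
print(np.max(np.abs(w - unknown)))                   # small once the model has converged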
