
FIR and IIR Report Using MATLAB

The document details the implementation of adaptive filters using MATLAB, focusing on both FIR and IIR filter designs. It includes code for training and testing these filters with various input signals, along with explanations of the algorithms used, such as the Least Mean Squares (LMS) method. Additionally, it provides visualizations of the filter outputs and learning curves to assess performance.

The document was prepared under the supervision of Prof. Dr. Ashraf Abdulmonem.

Prepared by:
Ahmed Mohey Mohamed
Hesham Fathy Saber
Seif Mostafa Kamel
Mina Maher Roshdy
Esmail Khalid Esmail Ali

Faculty of Engineering, Minia University


Adaptive Filters Using MATLAB

• FIR Adaptive filter


 Block Diagram of FIR

 Difference Equation of FIR
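The 5-tap filter implemented in the code below realizes the difference equation (written out in plain-text form, matching the coefficients b1 to b5 used in the code):

y[n] = b1·x[n−1] + b2·x[n−2] + b3·x[n−3] + b4·x[n−4] + b5·x[n−5]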


 Matlab Training code
clc; clear;
close all;

% Parameters
N = 300;           % total samples
Fs = 100;          % sampling frequency (Hz)
Ts = 1/Fs;         % sampling period (s)
n = 1:N;

% Signals
Desired = sin(2 * pi * Ts * n);      % clean desired signal
Noise = 0.5 * (rand(1, N) - 0.5);    % random noise
X = Desired + Noise;                 % input = desired + noise

% FIR adaptive filter (LMS-like)
u = 0.25;                            % learning rate
b1 = 0.2 * ones(1, N);
b2 = 0.3 * ones(1, N);
b3 = 0.4 * ones(1, N);
b4 = 0.2 * ones(1, N);
b5 = 1.0 * ones(1, N);
y = zeros(1, N);
error_history = zeros(1, 75);        % for learning curve

% Training Phase with error tracking
for k = 1:75
    epoch_error = 0;
    for i = 6:N
        y(i) = b1(i)*X(i-1) + b2(i)*X(i-2) + b3(i)*X(i-3) + ...
               b4(i)*X(i-4) + b5(i)*X(i-5);
        e = y(i) - Desired(i);
        epoch_error = epoch_error + abs(e);
        b1(i) = b1(i) - u * X(i-1) * e;
        b2(i) = b2(i) - u * X(i-2) * e;
        b3(i) = b3(i) - u * X(i-3) * e;
        b4(i) = b4(i) - u * X(i-4) * e;
        b5(i) = b5(i) - u * X(i-5) * e;
    end
    error_history(k) = epoch_error / (N - 5);
end

% Plotting the filter output
figure;
plot(n, Desired, 'k--', 'LineWidth', 1.2); hold on;
plot(n, X, 'r:', 'LineWidth', 1);
plot(n, y, 'b', 'LineWidth', 1.5);
legend('Desired', 'Noisy Input', 'FIR Output');
title('FIR Adaptive Filter Output');
xlabel('Sample'); ylabel('Amplitude');
grid on;

% Plotting the Learning Curve
figure;
plot(1:75, error_history, 'LineWidth', 2);
title('Learning Curve');
xlabel('Epoch'); ylabel('Mean Absolute Error');
grid on;

Explanation of the MATLAB Code for Adaptive Filtering

1. Code Initialization
Before starting the simulation, the environment is cleaned to ensure accurate results:
• clc → Clears the command window.
• clear → Removes all variables from the MATLAB workspace.
• close all → Closes all open figure windows.

This ensures that previous data or plots do not interfere with the current session.

2. Simulation Parameters
The following parameters are defined to control the simulation:

• N: Total number of samples to be generated.


• Fs: Sampling frequency (in Hz).
• Ts = 1/Fs: Sampling period (in seconds).
• n: A vector representing discrete time indices (1 to N in this code).

These parameters form the time base for signal generation and filtering.

3. Signal Generation
To simulate a real-world scenario, two signals are created:

• Desired Signal (d[n]):
A clean sine wave that serves as the reference or target signal.
• Input Signal (x[n]):
Created by adding random noise to the clean sine wave.
This noisy signal is the actual input to the adaptive filter.

The goal is to train the filter to recover the clean signal from the noisy input.
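For readers without MATLAB, the same signal setup can be sketched in plain Python (standard library only; the variable names here are illustrative, not taken from the report's code):

```python
import math
import random

N = 300        # total samples
Fs = 100       # sampling frequency (Hz)
Ts = 1 / Fs    # sampling period (s)

# Clean desired signal d[n] = sin(2*pi*Ts*n), n = 1..N
desired = [math.sin(2 * math.pi * Ts * n) for n in range(1, N + 1)]

# Uniform noise in [-0.25, 0.25), matching 0.5*(rand - 0.5)
noise = [0.5 * (random.random() - 0.5) for _ in range(N)]

# Noisy input x[n] = d[n] + noise[n]
x = [d + v for d, v in zip(desired, noise)]
```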

4. Adaptive Filter Initialization


The filter is configured as follows:
• Structure: A 5-tap Finite Impulse Response (FIR) filter.
• Weights: Five coefficients (b1 to b5) are initialized with small constant values (0.2 to 1.0 in the code above).
• Learning Rate (μ): Controls the speed of adaptation.
o Higher μ → faster learning but risk of instability.
o Lower μ → stable learning but slower convergence.
• Error Tracking:
A vector error_history is used to store the average error at each training epoch, helping
evaluate the learning performance.
5. Training Phase Using LMS Algorithm
The filter is trained over multiple iterations using the Least Mean Squares (LMS) algorithm:

• Epochs: The training loop runs for 75 epochs.


• Filter Output:
At each time step n, the filter output y[n] is calculated as a weighted sum of the previous 5
input samples.
• Error Calculation:
e[n] = d[n] - y[n], where d[n] is the desired output.
• Weight Update Rule:
The weights are adjusted using the LMS rule: b_i = b_i + μ · e[n] · x[n−i]

• Averaging Error:
After each epoch, the average error is stored for plotting a learning curve, showing how the
filter improves over time.
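The training steps above can be condensed into a short Python sketch with a single weight vector (a simplification of the MATLAB code's per-sample coefficient arrays; the values of mu and the tap count are illustrative choices):

```python
import math
import random

random.seed(0)
N, order, mu, epochs = 300, 5, 0.01, 75
Ts = 1 / 100

desired = [math.sin(2 * math.pi * Ts * n) for n in range(1, N + 1)]
x = [d + 0.5 * (random.random() - 0.5) for d in desired]

w = [0.0] * order          # 5-tap FIR weights
error_history = []         # mean absolute error per epoch

for epoch in range(epochs):
    epoch_error = 0.0
    for i in range(order, N):
        # FIR output: weighted sum of the previous `order` input samples
        y = sum(w[k] * x[i - 1 - k] for k in range(order))
        e = desired[i] - y              # e[n] = d[n] - y[n]
        epoch_error += abs(e)
        # LMS update: w_k = w_k + mu * e[n] * x[n-k]
        for k in range(order):
            w[k] += mu * e * x[i - 1 - k]
    error_history.append(epoch_error / (N - order))
```

The learning curve (error_history) falls as the weights converge, mirroring the MATLAB plot.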

 Matlab Testing code


- Test with the same input signal used in training
clc; clear; close all;

%% PARAMETERS
N = 300;
Fs = 100;
Ts = 1/Fs;
n = 1:N;
order = 5;
mu = 0.01;

%% TRAINING PHASE
% Training signals
Desired_train = sin(2 * pi * Ts * n);
Noise_train = 0.5 * (rand(1, N) - 0.5);
X_train = Desired_train + Noise_train;

% Initialize
w = zeros(order, 1);
y_train = zeros(1, N);
e_train = zeros(1, N);

% LMS Training
for i = order+1:N
    x_vec = X_train(i-1:-1:i-order)';
    y_train(i) = w' * x_vec;
    e_train(i) = Desired_train(i) - y_train(i);
    w = w + mu * x_vec * e_train(i);
end

% Save weights
save('fir_weights.mat', 'w');

% Plot Training Output


figure;
plot(n, Desired_train, 'g', n, X_train, 'r--', n, y_train, 'b');
legend('Desired', 'Noisy Input', 'FIR Output (Train)');
title('FIR Training Phase');
xlabel('Sample'); ylabel('Amplitude'); grid on;

% Plot Learning Curve


figure;
plot(n, abs(e_train), 'm');
title('Learning Curve - Error during Training');
xlabel('Sample'); ylabel('Absolute Error');
grid on;

%% TESTING PHASE
% New test signals
Desired_test = sin(2 * pi * Ts * n);
Noise_test = 0.5 * (rand(1, N) - 0.5);
X_test = Desired_test + Noise_test;

y_test = zeros(1, N);


for i = order+1:N
    x_vec = X_test(i-1:-1:i-order)';
    y_test(i) = w' * x_vec;
end

% Plot Testing Output


figure;
plot(n, Desired_test, 'g', n, X_test, 'r--', n, y_test, 'b');
legend('Desired', 'Noisy Input', 'FIR Output (Test)');
title('FIR Testing Phase');
xlabel('Sample'); ylabel('Amplitude'); grid on;
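The testing phase simply runs the frozen weights forward with no adaptation. A minimal Python helper (the name fir_apply is illustrative) makes that idea explicit:

```python
def fir_apply(w, x):
    """Run a fixed FIR filter over x: y[i] = sum_k w[k] * x[i-1-k].

    No weight updates; the first len(w) outputs stay zero (warm-up),
    mirroring the MATLAB loop that starts at i = order+1.
    """
    order = len(w)
    y = [0.0] * len(x)
    for i in range(order, len(x)):
        y[i] = sum(w[k] * x[i - 1 - k] for k in range(order))
    return y
```

For example, w = [1, 0, 0] acts as a one-sample delay after the warm-up: fir_apply([1, 0, 0], [1, 2, 3, 4, 5]) returns [0, 0, 0, 3, 4].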

 Matlab Training result


 Matlab Testing result
 Matlab Testing code
- Test with a different input signal
clc;
clear;
close all;

% Parameters
N = 300; % total samples
Fs = 100; % Sampling frequency
Ts = 1/Fs;
n = 1:N;

% Training Signal (composite sine wave)


Desired_train = sin(2*pi*3*Ts*n) + 0.5*sin(2*pi*7*Ts*n);
Noise_train = 0.5 * (rand(1, N) - 0.5);
X_train = Desired_train + Noise_train;

% Normalize training data


X_train = X_train / max(abs(X_train));
Desired_train = Desired_train / max(abs(Desired_train));

% Filter parameters
L = 10; % Filter taps
mu0 = 0.05;
epsilon = 1e-6;

% Initialize weights and output


b = ones(1, L) * 0.1;
y_train = zeros(1, N);
error_train = zeros(1, N);
error_history_train = zeros(1, 75);
% Training phase
for epoch = 1:75
    mu = mu0 / sqrt(epoch);          % learning rate decay
    epoch_error = 0;
    for i = L+1:N
        x_vec = X_train(i-1:-1:i-L); % input vector
        y_train(i) = b * x_vec';
        e = y_train(i) - Desired_train(i);
        epoch_error = epoch_error + abs(e);
        b = b - (mu * x_vec * e) / (epsilon + norm(x_vec)^2); % normalized LMS
    end
    error_history_train(epoch) = epoch_error / (N - L);
end

% Testing Signal (different from training)


Desired_test = sin(2*pi*5*Ts*n);
Noise_test = 0.5 * (rand(1, N) - 0.5);
X_test = Desired_test + Noise_test;

% Normalize testing data


X_test = X_test / max(abs(X_test));
Desired_test = Desired_test / max(abs(Desired_test));

y_test = zeros(1, N);


error_history_test = zeros(1, 75);

% Apply trained filter (testing phase)
% Weights b are fixed here, so every epoch repeats the same pass
for epoch = 1:75
    epoch_error = 0;
    for i = L+1:N
        x_vec = X_test(i-1:-1:i-L);
        y_test(i) = b * x_vec';
        e = y_test(i) - Desired_test(i);
        epoch_error = epoch_error + abs(e);
    end
    error_history_test(epoch) = epoch_error / (N - L);
end

% Plot Training Output


figure;
plot(n, Desired_train, 'k--', n, X_train, 'r:', n, y_train, 'b', 'LineWidth', 1.2);
title('Training Phase Output');
xlabel('Sample'); ylabel('Amplitude');
legend('Desired (Train)', 'Noisy (Train)', 'FIR Output (Train)');
grid on;

% Plot Testing Output


figure;
plot(n, Desired_test, 'k--', n, X_test, 'r:', n, y_test, 'g', 'LineWidth', 1.2);
title('Testing Phase Output');
xlabel('Sample'); ylabel('Amplitude');
legend('Desired (Test)', 'Noisy (Test)', 'FIR Output (Test)');
grid on;

% Plot Learning Curves


figure;
plot(1:75, error_history_train, 'b', 1:75, error_history_test, 'g', 'LineWidth', 2);
title('Learning Curves (Train vs Test)');
xlabel('Epoch'); ylabel('Mean Absolute Error');
legend('Training Error', 'Testing Error');
grid on;
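The normalized-LMS step used above (the raw LMS update divided by the input-vector energy, with a small epsilon guarding against division by zero) can be isolated as a short Python function; the name nlms_update is illustrative:

```python
def nlms_update(w, x_vec, d, mu, eps=1e-6):
    """One normalized-LMS step.

    w     : current weights
    x_vec : most recent input samples (same length as w)
    d     : desired output for this step
    Returns (updated weights, error).
    """
    y = sum(wk * xk for wk, xk in zip(w, x_vec))     # filter output
    e = d - y                                        # error
    energy = eps + sum(xk * xk for xk in x_vec)      # input energy
    w_new = [wk + mu * e * xk / energy for wk, xk in zip(w, x_vec)]
    return w_new, e
```

Because the step is scaled by the input energy, the effective step size is insensitive to the input's amplitude, which is why the script above can also normalize the signals without retuning mu.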

• IIR Adaptive filter

 Block Diagram of IIR

 Difference Equation of IIR
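The 3-tap structure implemented in the code below realizes the difference equation (plain-text form, matching the a and b coefficients used in the code):

y[n] = b1·x[n−1] + b2·x[n−2] + b3·x[n−3] − a1·y[n−1] − a2·y[n−2] − a3·y[n−3]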

 Matlab Training code

clc; clear;
close all;

% Parameters
N = 300;           % number of samples
Fs = 100;          % sampling frequency
Ts = 1/Fs;
n = 1:N;

% Generate signals
Desired = sin(2 * pi * Ts * n);      % clean desired signal
Noise = 0.5 * (rand(1, N) - 0.5);    % random noise
X = Desired + Noise;                 % input = desired + noise

% IIR adaptive filter parameters
u = 0.05;          % learning rate
a1 = zeros(1, N); b1 = zeros(1, N);
a2 = zeros(1, N); b2 = zeros(1, N);
a3 = zeros(1, N); b3 = zeros(1, N);

% Initialize filter coefficients
a1(:) = 0.3; a2(:) = -0.2; a3(:) = 0.1;
b1(:) = 0.5; b2(:) = 0.3; b3(:) = 0.2;

y = zeros(1, N);
error_history = zeros(1, 75);        % for learning curve

% Training Phase
for epoch = 1:75
    epoch_error = 0;
    for i = 4:N
        y(i) = b1(i)*X(i-1) + b2(i)*X(i-2) + b3(i)*X(i-3) ...
               - a1(i)*y(i-1) - a2(i)*y(i-2) - a3(i)*y(i-3);
        e = Desired(i) - y(i);
        epoch_error = epoch_error + abs(e);

        % LMS-like adaptation
        b1(i) = b1(i) + u * e * X(i-1);
        b2(i) = b2(i) + u * e * X(i-2);
        b3(i) = b3(i) + u * e * X(i-3);
        a1(i) = a1(i) - u * e * y(i-1);
        a2(i) = a2(i) - u * e * y(i-2);
        a3(i) = a3(i) - u * e * y(i-3);
    end
    error_history(epoch) = epoch_error / (N - 3);
end

% Plot signals
figure;
plot(n, Desired, 'g', 'LineWidth', 1.5); hold on;
plot(n, X, 'r--');
plot(n, y, 'b');
legend('Desired', 'Noisy Input', 'Filter Output');
title('IIR Adaptive Filter Output');
xlabel('Sample Index'); ylabel('Amplitude');

% Plot learning curve
figure;
plot(1:75, error_history, 'k', 'LineWidth', 2);
xlabel('Epoch'); ylabel('Mean Absolute Error');
title('Learning Curve - IIR Adaptive Filter');
grid on;
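For reference, one forward pass of this 3-tap IIR difference equation (with the initial coefficients from the code above, and adaptation omitted for brevity) looks like this in Python:

```python
import math
import random

random.seed(0)
N = 300
Ts = 1 / 100

desired = [math.sin(2 * math.pi * Ts * n) for n in range(1, N + 1)]
x = [d + 0.5 * (random.random() - 0.5) for d in desired]

# Initial coefficients from the MATLAB code (held fixed here)
b = [0.5, 0.3, 0.2]      # feed-forward taps on past inputs
a = [0.3, -0.2, 0.1]     # feedback taps on past outputs

y = [0.0] * N
for i in range(3, N):
    # y[n] = sum_k b_k*x[n-k] - sum_k a_k*y[n-k]
    y[i] = (sum(b[k] * x[i - 1 - k] for k in range(3))
            - sum(a[k] * y[i - 1 - k] for k in range(3)))
```

Unlike the FIR case, the output feeds back into itself, so stability depends on the a coefficients; the adaptive updates in the MATLAB code must keep them in a stable region.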

 Matlab result
 Matlab Testing code
- Test with a different input signal
clc;
clear;
close all;

% Parameters
N = 300; % Number of samples
Fs = 100; % Sampling frequency
Ts = 1/Fs;
n = 1:N;

%% Generate Training Signals


Desired_train = sin(2 * pi * 2 * Ts * n) + 0.5 * sin(2 * pi * 5 * Ts * n);
Noise_train = 0.5 * (rand(1, N) - 0.5);
X_train = Desired_train + Noise_train;

% IIR adaptive filter parameters


u0 = 0.1; % Base learning rate
a1 = zeros(1, N); b1 = zeros(1, N);
a2 = zeros(1, N); b2 = zeros(1, N);
a3 = zeros(1, N); b3 = zeros(1, N);

% Initialize filter coefficients


a1(:) = 0.3; a2(:) = -0.2; a3(:) = 0.1;
b1(:) = 0.5; b2(:) = 0.3; b3(:) = 0.2;

y_train = zeros(1, N);


train_error_history = zeros(1, 75);

%% Training Phase
for epoch = 1:75
    epoch_error = 0;
    u = u0 / sqrt(epoch);   % decreasing learning rate
    for i = 4:N
        y_train(i) = b1(i)*X_train(i-1) + b2(i)*X_train(i-2) + b3(i)*X_train(i-3) ...
                     - a1(i)*y_train(i-1) - a2(i)*y_train(i-2) - a3(i)*y_train(i-3);
        e = Desired_train(i) - y_train(i);
        epoch_error = epoch_error + abs(e);

        % Normalized LMS-like adaptation
        norm_factor = 1e-6 + X_train(i-1)^2 + X_train(i-2)^2 + X_train(i-3)^2 ...
                      + y_train(i-1)^2 + y_train(i-2)^2 + y_train(i-3)^2;
        b1(i) = b1(i) + (u * e * X_train(i-1)) / norm_factor;
        b2(i) = b2(i) + (u * e * X_train(i-2)) / norm_factor;
        b3(i) = b3(i) + (u * e * X_train(i-3)) / norm_factor;
        a1(i) = a1(i) - (u * e * y_train(i-1)) / norm_factor;
        a2(i) = a2(i) - (u * e * y_train(i-2)) / norm_factor;
        a3(i) = a3(i) - (u * e * y_train(i-3)) / norm_factor;
    end
    train_error_history(epoch) = epoch_error / (N - 3);
end

%% Testing Phase
Desired_test = sin(2 * pi * 3 * Ts * n) + 0.4 * sin(2 * pi * 6 * Ts * n);
Noise_test = 0.5 * (rand(1, N) - 0.5);
X_test = Desired_test + Noise_test;

y_test = zeros(1, N);


test_error_history = zeros(1, 75);

for epoch = 1:75
    epoch_error = 0;
    for i = 4:N
        y_test(i) = b1(end)*X_test(i-1) + b2(end)*X_test(i-2) + b3(end)*X_test(i-3) ...
                    - a1(end)*y_test(i-1) - a2(end)*y_test(i-2) - a3(end)*y_test(i-3);
        e = Desired_test(i) - y_test(i);
        epoch_error = epoch_error + abs(e);
    end
    test_error_history(epoch) = epoch_error / (N - 3);
end

%% Plot outputs
figure;
plot(n, Desired_train, 'g', 'LineWidth', 1.5); hold on;
plot(n, X_train, 'r--');
plot(n, y_train, 'b');
legend('Desired (Train)', 'Noisy Input (Train)', 'IIR Output (Train)');
title('Training Phase Output');
xlabel('Sample Index'); ylabel('Amplitude');

figure;
plot(n, Desired_test, 'k--', 'LineWidth', 1.5); hold on;
plot(n, X_test, 'm:');
plot(n, y_test, 'c');
legend('Desired (Test)', 'Noisy Input (Test)', 'IIR Output (Test)');
title('Testing Phase Output');
xlabel('Sample Index'); ylabel('Amplitude');

%% Plot learning curves


figure;
plot(1:75, train_error_history, 'b', 'LineWidth', 2); hold on;
plot(1:75, test_error_history, 'g', 'LineWidth', 2);
legend('Training Error', 'Testing Error');
xlabel('Epoch'); ylabel('Mean Absolute Error');
title('Learning Curves (Train vs Test)');
grid on;

 Matlab Testing result
