
DEPARTMENT OF COMPUTER SCIENCE & ENGINEERING

Experiment-3.2

Student Name: Samarth Maheshwari UID: 21BCS10260


Branch: CSE Section/Group: SC-906-A
Semester: 6th Date: 03-04-24
Subject Name: Soft Computing LAB Subject Code: 21CSP-377

1. Aim: To write a MATLAB program to train and test a backpropagation neural network
for the generation of the XOR function.

2. Objective: To generate the XOR function using the backpropagation algorithm.


3. Algorithm:
1. Set up input and output data: Define the four input-output pairs of the XOR function.
2. Initialize the neural network: Create a network with two input neurons, two hidden
neurons, and one output neuron. Initialize the weights and biases randomly.
3. Train the neural network (repeat until the error falls below a chosen threshold):
- Feed forward: Pass the inputs through the network to obtain the predicted outputs.
- Compute the error: Measure how far the predicted outputs are from the target outputs.
- Backpropagation: Propagate the error backwards through the network to obtain the
error signal (delta) for each layer.
- Update the weights and biases using the deltas and the learning rate (a single-unit
sketch of this update follows the list).
4. Test the trained network: Use the trained network to predict the outputs for the XOR inputs.
5. Display the predicted output: Show the predictions obtained from the trained network.
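
The update in step 3 is the standard delta rule for sigmoid units, which the script below
implements layer by layer. As a minimal sketch, here is one such update for a single sigmoid
unit on one sample (the values and variable names are illustrative only, not part of the lab script):

% One delta-rule update for a single sigmoid unit (illustrative values)
eta = 0.1;                         % learning rate
a = [0 1];                         % input activations (one sample)
w = [0.5; -0.3];                   % incoming weights
b = 0.1;                           % bias
t = 1;                             % target output
y = 1 / (1 + exp(-(a * w + b)));   % forward pass through the sigmoid
delta = (t - y) * y * (1 - y);     % error times the sigmoid derivative
w = w + eta * a' * delta;          % weight update
b = b + eta * delta;               % bias update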

4. Script and output:

% XOR truth table: four input pairs and their target outputs
inputs = [0 0; 0 1; 1 0; 1 1];
targets = [0; 1; 1; 0];
% Network topology and learning rate
input_neurons = 2;
hidden_neurons = 2;
output_neurons = 1;
learning_rate = 0.1;

% Random initialization of weights and biases
hidden_weights = randn(input_neurons, hidden_neurons);
hidden_bias = randn(1, hidden_neurons);
output_weights = randn(hidden_neurons, output_neurons);
output_bias = randn(1, output_neurons);

% Train until the mean squared error falls below the threshold
error_threshold = 0.01;
mse = Inf;
while mse > error_threshold
    % Forward pass
    hidden_activation = sigmoid(inputs * hidden_weights + hidden_bias);
    output_activation = sigmoid(hidden_activation * output_weights + output_bias);

    % Mean squared error over all four samples
    mse = sum((targets - output_activation).^2) / numel(targets);

    % Backpropagation: error signal (delta) for each layer
    output_error = targets - output_activation;
    output_delta = output_error .* sigmoid_derivative(output_activation);
    hidden_error = output_delta * output_weights';
    hidden_delta = hidden_error .* sigmoid_derivative(hidden_activation);

    % Gradient-descent updates for the weights and biases
    output_weights = output_weights + learning_rate * hidden_activation' * output_delta;
    output_bias = output_bias + learning_rate * sum(output_delta);
    hidden_weights = hidden_weights + learning_rate * inputs' * hidden_delta;
    hidden_bias = hidden_bias + learning_rate * sum(hidden_delta);
end
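% Note: training usually converges for XOR, but an unlucky random
% initialization can stall in a local minimum and leave the loop above
% running indefinitely. A common safeguard is to also cap the iteration
% count, e.g. while mse > error_threshold && epoch < max_epochs.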
disp("Samarth Maheshwari 21BCS10260");
% Final forward pass with the trained weights
hidden_activation = sigmoid(inputs * hidden_weights + hidden_bias);
output_activation = sigmoid(hidden_activation * output_weights + output_bias);
disp('Predicted output:');
disp(output_activation);

% Logistic sigmoid activation function
function sig = sigmoid(x)
    sig = 1 ./ (1 + exp(-x));
end

% Sigmoid derivative, written in terms of the activation value:
% if x = sigmoid(z), then d(sigmoid)/dz = x .* (1 - x)
function sig_prime = sigmoid_derivative(x)
    sig_prime = x .* (1 - x);
end

5. Output:
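The exact values vary from run to run with the random initialization, but because training
stops only once the mean squared error drops below 0.01, the displayed predictions lie close
to the XOR targets [0; 1; 1; 0]. As an optional check, the predictions can be thresholded at
0.5 and compared with the targets by appending the following lines before the local function
definitions (MATLAB requires local functions to stay at the end of a script):

% Optional sanity check: round the outputs and compare with the targets
disp('Rounded predictions:');
disp(round(output_activation));
disp('Matches XOR targets:');
disp(isequal(round(output_activation), targets));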
