KNN_Backpropagation_Assignment

Assignment

1. Working of K-Nearest Neighbour (KNN) Algorithm

Introduction

K-Nearest Neighbour (KNN) is a simple, supervised machine learning algorithm. It assumes that data points that are close in the feature space are similar. KNN is widely used for classification problems.

Working of KNN

1. Choose the number of neighbours (k).

2. Calculate the distance (Euclidean or others) between the test data and all training data.

3. Sort the distances and find the k nearest neighbours.

4. Classify the test data by majority voting among the k nearest neighbours.
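
A minimal Python sketch of these steps (the function name knn_classify and the use of plain lists are illustrative choices, not part of the assignment):

import math
from collections import Counter

def knn_classify(train_points, train_labels, query, k):
    # Steps 1-2: compute the Euclidean distance from the query to every training point
    distances = [(math.dist(query, point), label)
                 for point, label in zip(train_points, train_labels)]
    # Step 3: sort by distance and keep the k nearest neighbours
    nearest = sorted(distances, key=lambda pair: pair[0])[:k]
    # Step 4: classify by majority vote among the k nearest labels
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]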

Example: Predict the class using KNN

Given data:

Fruit  | Weight (g) | Texture (0 = Smooth, 1 = Rough)
-------|------------|--------------------------------
Apple  | 150        | 0
Apple  | 160        | 0
Orange | 140        | 1
Orange | 170        | 1

New Fruit to classify:

- Weight = 155g
- Texture = 0 (Smooth)

Step 1: Calculate Euclidean Distance

d = sqrt((x1 - x2)^2 + (y1 - y2)^2)

Distance to Apple (150, 0):  sqrt((155 - 150)^2 + (0 - 0)^2) = 5

Distance to Apple (160, 0):  sqrt((155 - 160)^2 + (0 - 0)^2) = 5

Distance to Orange (140, 1): sqrt((155 - 140)^2 + (0 - 1)^2) = sqrt(226) ≈ 15.03

Distance to Orange (170, 1): sqrt((155 - 170)^2 + (0 - 1)^2) = sqrt(226) ≈ 15.03

Step 2: Choose k = 3

Nearest neighbours: 2 Apples and 1 Orange

Step 3: Majority Voting

Majority = Apple

Result: The new fruit is classified as Apple.
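
The worked example can be checked with the knn_classify sketch above; with the four fruits from the table and k = 3, the majority vote comes out as Apple:

train_points = [(150, 0), (160, 0), (140, 1), (170, 1)]
train_labels = ["Apple", "Apple", "Orange", "Orange"]
print(knn_classify(train_points, train_labels, query=(155, 0), k=3))  # Apple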

2. Backpropagation Algorithm

Introduction

Backpropagation is a method for updating the weights of a neural network by minimizing the output error using gradient descent.

Steps in Backpropagation

1. Forward Pass: Calculate outputs layer by layer.

2. Error Calculation: Compute the error between the target output and the predicted output (e.g., squared error).

3. Backward Pass: Compute the gradient of the error with respect to each weight using the chain rule.

4. Update Weights: Adjust weights to minimize the error.
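
A minimal Python sketch of one such update for a single linear neuron with squared-error loss (the function name backprop_step is illustrative; a full network repeats this update for every weight in every layer):

def backprop_step(w, x, t, learning_rate):
    # 1. Forward pass: linear activation, so output = w * x
    output = w * x
    # 2. Error calculation: squared error E = 0.5 * (t - output)^2
    error = 0.5 * (t - output) ** 2
    # 3. Backward pass: chain rule gives dE/dw = (output - t) * x
    grad = (output - t) * x
    # 4. Weight update: step against the gradient
    w_new = w - learning_rate * grad
    return w_new, error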

Example: Simple Backpropagation Example

Given:

- Input x = 1.0

- Initial weight w = 0.5

- Target output t = 1.0

- Learning rate = 0.1

Activation Function: Linear

Step 1: Forward Pass

output = w * x = 0.5 * 1.0 = 0.5

Step 2: Calculate Error

E = 0.5 * (t - output)^2 = 0.125

Step 3: Backward Pass

dE/dw = (output - t) * x = -0.5

Step 4: Update Weight

w_new = w_old - learning_rate * dE/dw = 0.5 - 0.1 * (-0.5) = 0.5 + 0.05 = 0.55

Result: New weight = 0.55
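
Plugging the given values into the backprop_step sketch above reproduces these numbers:

w_new, error = backprop_step(w=0.5, x=1.0, t=1.0, learning_rate=0.1)
print(error)  # 0.125
print(w_new)  # 0.55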

Conclusion
- KNN predicts the output by finding nearest neighbours and using voting.

- Backpropagation improves the model by updating weights based on error.
