UNIT 6: Machine Learning

1. The document discusses single-layer and multi-layer perceptrons. A single-layer perceptron uses a step function as its activation function to map inputs to outputs between 0 and 1 or between -1 and 1.
2. A multi-layer perceptron defines a more complex neural network architecture with multiple layers of perceptrons. It uses backpropagation to train the network for supervised learning.
3. Deep learning uses statistical techniques based on neural networks to learn feature hierarchies from data, with applications such as image recognition, video analysis, and time series forecasting.

Uploaded by

Lakshya Kaushik

UNIT-5

Single Layer Perceptron


The perceptron uses a step function that returns +1 if the weighted sum of its inputs is greater than or equal to 0, and -1 otherwise.

The activation function is used to map the input to the required output range, such as (0, 1) or (-1, 1).

A perceptron computes its output in three steps:
a. Multiply each input by its corresponding weight.
b. Add all the weighted values together; this total is called the weighted sum.
c. In the last step, apply an appropriate activation function to the weighted sum.
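The three steps above can be sketched in a few lines of Python. This is a minimal illustration, not code from the document; the weights and bias values below are assumed purely for the example.

```python
# Minimal sketch of a single-layer perceptron's forward pass (pure Python).

def step(z):
    """Step activation: +1 if the weighted sum is >= 0, else -1."""
    return 1 if z >= 0 else -1

def perceptron_output(inputs, weights, bias):
    # a. multiply each input by its weight; b. sum them; c. apply activation
    weighted_sum = sum(x * w for x, w in zip(inputs, weights)) + bias
    return step(weighted_sum)

# Illustrative weights that make the perceptron behave like a logical AND
weights = [1.0, 1.0]
bias = -1.5
print(perceptron_output([1, 1], weights, bias))  # -> 1
print(perceptron_output([1, 0], weights, bias))  # -> -1
```

With these weights, the output is +1 only when both inputs are 1, showing how a single perceptron implements a linearly separable decision rule.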
There are two types of architecture, distinguished by the functionality of the artificial neural network:

1. Single Layer Perceptron


2. Multi-Layer Perceptron
The necessary steps for training logistic regression are as follows:

• The weights are initialized with random values at the start of each training run.

• For each element of the training set, the error is calculated as the difference between the desired output and the actual output. The calculated error is used to adjust the weights.

• The process is repeated until the error on the entire training set is less than a specified limit, or until the maximum number of iterations has been reached.
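The training loop above can be sketched with the classic perceptron learning rule (error-driven weight updates). This is an illustrative sketch; the learning rate, iteration limit, and OR-gate training data are assumptions, not from the document.

```python
import random

def step(z):
    return 1 if z >= 0 else -1

def train_perceptron(data, n_inputs, lr=0.1, max_iters=100):
    # Weights and bias are initialized with random values at the start.
    random.seed(0)
    weights = [random.uniform(-0.5, 0.5) for _ in range(n_inputs)]
    bias = random.uniform(-0.5, 0.5)
    for _ in range(max_iters):
        total_error = 0
        for inputs, desired in data:
            actual = step(sum(x * w for x, w in zip(inputs, weights)) + bias)
            error = desired - actual  # desired output minus actual output
            # The calculated error is used to adjust the weights.
            weights = [w + lr * error * x for w, x in zip(weights, inputs)]
            bias += lr * error
            total_error += abs(error)
        if total_error == 0:  # error limit reached on the entire training set
            break
    return weights, bias

# Learn a logical OR gate (targets encoded as -1 / +1)
data = [([0, 0], -1), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]
weights, bias = train_perceptron(data, n_inputs=2)
```

Because OR is linearly separable, the loop converges and `total_error` reaches 0 well before the iteration limit.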

Logistic regression is considered a form of predictive analysis.

Logistic regression is mainly used to describe data and to explain the relationship between a dependent binary variable and one or more nominal or independent variables.
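To make the relationship concrete: logistic regression passes a weighted sum of the independent variables through the sigmoid function to get the probability that the binary dependent variable equals 1. The weights and inputs below are illustrative assumptions, not fitted values.

```python
import math

def sigmoid(z):
    """Squashes any real number into the (0, 1) probability range."""
    return 1.0 / (1.0 + math.exp(-z))

def predict_proba(inputs, weights, bias):
    # Weighted sum of the independent variables, then sigmoid
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    return sigmoid(z)

# Probability of the positive class for one hypothetical example
p = predict_proba([2.0, -1.0], weights=[0.8, 0.4], bias=-0.5)
print(round(p, 3))
```

A prediction above 0.5 is typically classified as the positive class; note that `sigmoid(0)` is exactly 0.5, the decision boundary.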
Multi-layer Perceptron
The multi-layer perceptron defines a more complex architecture of artificial neural
networks. It is formed from multiple layers of perceptrons.

[Figure: pictorial representation of multi-layer perceptron learning]
MLP networks are used for supervised learning.

A typical learning algorithm for MLP networks is the backpropagation algorithm.

A multilayer perceptron (MLP) is a feedforward artificial neural network that generates a set of outputs from a set of inputs.

An MLP is characterized by several layers of nodes connected as a directed graph between the input and output layers.

MLP uses backpropagation for training the network.

MLP is a deep learning method.


Backpropagation Process in Deep Neural Network
Deep learning is a collection of statistical machine learning techniques, based on artificial neural networks, for learning feature hierarchies. Applications include:
• Identify Faces, Street Signs, Tumors.
• Image Recognition.
• Video Analysis.
• NLP.
• Anomaly Detection.
• Drug Discovery.
• Checkers Game.
• Time Series Forecasting.
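The backpropagation process in an MLP can be sketched end to end on a tiny example. The sketch below trains a one-hidden-layer network on XOR (a task a single-layer perceptron cannot learn); the network size, learning rate, epoch count, and squared-error loss are all illustrative assumptions.

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

random.seed(42)
H = 4  # hidden units (assumed size)
w1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(H)]
b1 = [0.0] * H
w2 = [random.uniform(-1, 1) for _ in range(H)]
b2 = 0.0

data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]  # XOR
lr = 0.5

def forward(x):
    # Feedforward pass: input layer -> hidden layer -> output
    h = [sigmoid(sum(w1[j][i] * x[i] for i in range(2)) + b1[j]) for j in range(H)]
    y = sigmoid(sum(w2[j] * h[j] for j in range(H)) + b2)
    return h, y

def total_loss():
    return sum((forward(x)[1] - t) ** 2 for x, t in data)

loss_before = total_loss()
for epoch in range(5000):
    for x, t in data:
        h, y = forward(x)
        # Backward pass: propagate the error from the output to the hidden layer
        delta_out = (y - t) * y * (1 - y)  # squared-error gradient at the output
        delta_hid = [delta_out * w2[j] * h[j] * (1 - h[j]) for j in range(H)]
        # Gradient-descent weight updates
        for j in range(H):
            w2[j] -= lr * delta_out * h[j]
            for i in range(2):
                w1[j][i] -= lr * delta_hid[j] * x[i]
            b1[j] -= lr * delta_hid[j]
        b2 -= lr * delta_out
loss_after = total_loss()

print([round(forward(x)[1]) for x, _ in data])  # rounded XOR predictions
```

The key step is computing `delta_hid` from `delta_out` via the output weights: errors flow backward through the same connections the activations flowed forward through, which is what gives backpropagation its name.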
