Deep Learning, Chapter 2

The document provides an overview of deep learning, including its fundamentals, neural network types, and training methods. It explains the structure and function of artificial neural networks, activation functions, and various gradient descent techniques. Additionally, it discusses different types of neural networks and their applications in machine learning tasks.


Deep Learning
Ms Mounira Zouaghi
Content

1. Introduction to deep learning
2. Fundamentals of deep learning
3. Deep learning frameworks: PyTorch, TensorFlow, Keras…
4. Types of Neural Networks
5. Use cases of Neural Networks
Chapter 2: Fundamentals of Deep Learning

What is Deep Learning?

• Deep learning is a subset of machine learning based on artificial neural networks.

• The learning process is "deep" because the structure of artificial neural networks consists of multiple layers: input, hidden, and output.

• Each layer contains units that transform the input data into information that the next layer can use for a given predictive task.

• Thanks to this structure, a machine can learn through its own data processing.
Artificial Neural Networks

[Figure: a network with an input layer, a hidden layer, and an output layer; each connection (synapse) carries a weight wi]

• An artificial neural network (ANN) is a computational model used to perform tasks like prediction, classification, and decision making.

• Artificial neurons are modeled on the neurons of the human brain.

• An ANN consists of artificial neurons.

• Neurons in the brain pass signals to perform actions.

• Similarly, artificial neurons are connected in a neural network to perform tasks.

• The connection between artificial neurons is called a synapse.

• Every synapse has a weight.
How does it work, in simple terms?

https://www.youtube.com/watch?v=ER2It2mIagI
What is a Neuron?

Biological Neuron      Artificial Neuron
Dendrite               Inputs
Cell nucleus (Soma)    Nodes
Synapses               Weights
Axon                   Output
How does a Neuron work?

Activation function (1)

• The first mathematical operations done by the neuron happen in two steps.

Step 1: compute the weighted sum of the inputs plus a bias:
y = Σi (wi · xi) + b      (b: bias)

Step 2: apply the activation function:
z = Activation(y) = f(y)
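The two steps can be sketched in plain Python. This is a minimal illustration, not from the slides: the input values, the weights, and the choice of sigmoid as the activation are all assumptions for the example.

```python
import math

def neuron_forward(inputs, weights, bias):
    # Step 1: weighted sum of the inputs plus the bias
    y = sum(w * x for w, x in zip(weights, inputs)) + bias
    # Step 2: apply the activation function (sigmoid, chosen for illustration)
    z = 1.0 / (1.0 + math.exp(-y))
    return z

# Hypothetical inputs, weights, and bias
output = neuron_forward([1.0, 2.0], [0.5, -0.25], 0.1)
```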
Activation function (2)

What?

• An activation function is a mathematical function used to compute the output of a node; it is also called a transfer function.

Why?

• It determines the output of the neural network, e.g. yes or no (the neuron is activated or not). The output is generally between 0 and 1.

Examples?
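The slide leaves the examples open; as an illustration, three commonly used activation functions can be sketched in Python:

```python
import math

def sigmoid(y):
    # squashes any real input into the range (0, 1)
    return 1.0 / (1.0 + math.exp(-y))

def relu(y):
    # passes positive values through, clips negatives to 0
    return max(0.0, y)

def tanh(y):
    # squashes any real input into the range (-1, 1)
    return math.tanh(y)

print(sigmoid(0.0), relu(-2.0), tanh(0.0))  # → 0.5 0.0 0.0
```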
How to train a NN with Backpropagation

[Figure: a single neuron with inputs x1, x2, x3, weights w1, w2, w3, and predicted output ŷ]

w1_new = w1_old − μ · (∂loss/∂w1)

loss = (y − ŷ)²

y: correct result
ŷ: predicted result
μ: learning rate

The same process is repeated for some number of epochs until the loss function is driven toward 0.
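The update rule above can be sketched for a single weight. Everything concrete here (the data point, the initial weight, μ = 0.05, and the choice ŷ = w·x for the neuron) is an assumption made for the illustration:

```python
# Hypothetical one-weight neuron: yhat = w * x, loss = (y - yhat)^2
x, y = 2.0, 8.0   # a single training example (the true weight would be 4)
w = 0.0           # initial weight
mu = 0.05         # learning rate

for epoch in range(100):   # repeat for some number of epochs
    yhat = w * x
    # dloss/dw = 2 * (yhat - y) * x, by the chain rule on the squared error
    grad = 2.0 * (yhat - y) * x
    w = w - mu * grad      # the update rule: w_new = w_old - mu * dloss/dw

print(round(w, 3))  # → 4.0, driving the loss toward 0
```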
How does it work?

https://www.youtube.com/watch?v=bfmFfD2RIcg

https://www.youtube.com/watch?v=ER2It2mIagI
Train a Multilayer NN: Gradient Descent

[Figure: a network with inputs x1…x4, a first hidden layer (units f11, f12, f13 producing O11, O12, O13), a second hidden layer (units f21, f22 producing O21, O22), and an output layer producing ŷ; connections carry weights w11, w21, w31, w41, …]

Loss = (y − ŷ)²

y: expected result
ŷ: calculated result
μ: learning rate
Gradient Descent Optimizer

1. Initialize the NN parameters: weights and biases.
2. Compute the gradient: calculate the derivative of the cost with respect to each parameter.
3. Update the parameters.

Repeat until the minimum of the cost function is reached.
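The three steps can be sketched for a toy linear model ŷ = w·x + b fit by full-batch gradient descent; the dataset, learning rate, and iteration count below are illustrative assumptions, not from the slides:

```python
# Toy data generated by y = 2x + 1
data = [(0.0, 1.0), (1.0, 3.0), (2.0, 5.0)]

w, b = 0.0, 0.0      # step 1: initialize the parameters (weight and bias)
mu = 0.1             # learning rate

for _ in range(2000):
    # step 2: gradient of the mean squared cost w.r.t. each parameter
    grad_w = sum(2 * ((w * x + b) - y) * x for x, y in data) / len(data)
    grad_b = sum(2 * ((w * x + b) - y) for x, y in data) / len(data)
    # step 3: update the parameters
    w -= mu * grad_w
    b -= mu * grad_b
```

Repeating steps 2 and 3 drives (w, b) toward the values (2, 1) that minimize the cost.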
Types of Gradient Descent

• SGD (Stochastic Gradient Descent): a single randomly chosen training example is used to estimate the gradient in each iteration. This introduces randomness, which can lead to faster convergence and help escape local minima.

• Batch Gradient Descent: the entire training dataset is used to compute the gradient in each iteration. It can be computationally expensive for large datasets but is more stable than SGD.

• Mini-Batch Gradient Descent: this approach combines the benefits of full-batch gradient descent and SGD by using a small random subset (mini-batch) of the training data in each iteration.
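The three variants differ only in which examples feed each iteration's gradient estimate. A sketch of that selection step (the toy dataset and batch size are made up for illustration):

```python
import random

data = list(range(10))  # stand-in for 10 training examples

def batch_selection(data):
    # Batch GD: every iteration sees the whole dataset
    return data

def sgd_selection(data):
    # SGD: a single randomly chosen example per iteration
    return [random.choice(data)]

def mini_batch_selection(data, size=4):
    # Mini-batch GD: a small random subset per iteration
    return random.sample(data, size)

print(len(batch_selection(data)),
      len(sgd_selection(data)),
      len(mini_batch_selection(data)))  # → 10 1 4
```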
Chain Rule in Backpropagation

[Figure: the same multilayer network, with layer-superscripted weights w111, w121, … in the first layer and the output-layer weight w311 feeding the output unit, whose activation is O31 = ŷ]

∂loss/∂w311 = (∂loss/∂O31) · (∂O31/∂w311)

Loss = (y − ŷ)²

y: expected result
ŷ: calculated result
μ: learning rate
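The chain-rule factorization can be checked by hand on a one-weight network. This tiny example (o = sigmoid(w·x), squared-error loss, made-up numbers) mirrors the structure on the slide:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

x, y, w = 1.0, 1.0, 0.5       # hypothetical input, target, and weight

o = sigmoid(w * x)            # forward pass: the unit's output
dloss_do = -2.0 * (y - o)     # d(y - o)^2 / do
do_dw = o * (1.0 - o) * x     # sigmoid'(w*x) times d(w*x)/dw
dloss_dw = dloss_do * do_dw   # chain rule: dloss/dw = (dloss/do) * (do/dw)
```

Each local derivative is cheap to compute at the node where it lives; backpropagation just multiplies them along the path from the loss back to the weight.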
Types of NN
 Perceptron

 Feed Forward Neural Network

 Multilayer Perceptron

 Convolutional Neural Network

 Radial Basis Functional Neural Network

 Recurrent Neural Network

 LSTM – Long Short-Term Memory

 Sequence to Sequence Models

 Modular Neural Network


Perceptron (1)

• The perceptron, proposed by Frank Rosenblatt and later analyzed by Minsky and Papert, is one of the simplest and oldest models of a neuron.

• It is the smallest unit of a neural network that performs certain computations to detect features or business intelligence in the input data.

• It accepts weighted inputs and applies the activation function to obtain the final output. The perceptron is also known as a TLU (threshold logic unit).

• The perceptron is a supervised learning algorithm that classifies data into two categories; it is thus a binary classifier.
Perceptron (2)

Advantages:

• Perceptrons can implement logic gates like AND, OR, and NAND.

Disadvantages:

• Perceptrons can only learn linearly separable problems, such as the boolean AND problem.

• For non-linearly separable problems such as the boolean XOR problem, the perceptron does not work.
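The AND/XOR contrast can be demonstrated with the classic perceptron learning rule. A sketch (the learning rate and epoch count are arbitrary choices for the example):

```python
# Perceptron learning rule (TLU with a step activation), sketched on the
# boolean AND problem, which is linearly separable.
def train_perceptron(samples, epochs=20, lr=0.1):
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            out = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0  # threshold unit
            err = target - out
            w[0] += lr * err * x1   # nudge weights toward the correct label
            w[1] += lr * err * x2
            b += lr * err
    return w, b

AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(AND)
preds = [1 if w[0] * x1 + w[1] * x2 + b > 0 else 0 for (x1, x2), _ in AND]
print(preds)  # → [0, 0, 0, 1]: AND is learned
```

Running the same rule on XOR's truth table never settles on correct weights, since no single line separates its two classes.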
Bibliography

• Types of NN, accessible at https://www.mygreatlearning.com/blog/types-of-neural-networks/, last accessed October 2023.
