
Deep Learning

LAU – School of Arts and Science
Department of Computer Science and Maths

Chapter 1: Introduction to Deep Learning

Course Outline
1. Introduction to Deep Learning
2. Training Neural Networks
3. Optimization Using Neural Networks
4. Convolutional Neural Networks (CNNs)
5. Recurrent Neural Networks (RNNs)
6. Natural Language Processing Using Deep Learning
7. Self-Attention and the Transformer
8. Topics in Deep Learning


Evolution of Neural Networks

Deep learning is in fact a new name for an approach to artificial intelligence called neural
networks.


Evolution of Neural Networks


• The idea of neural networks began, unsurprisingly, as a model of
how neurons in the brain function.
• The neuron is the fundamental building block of neural networks.
• Each neuron in the brain has a relatively simple function.
• There are approximately 86 billion neurons in a mature human
brain.
• Each neuron can make connections with more than 1,000 other
neurons, so an adult brain has approximately 60 trillion neuronal
connections.
• These neurons act together to create an incredible processing unit.
• The brain is trained by its environment and learns from experience.


Evolution of Neural Networks


How do biological neurons work?

• The cell body contains the nucleus and metabolic machinery.
• Dendrites receive inputs from other neurons via synapses.
• The axon transmits the signal (information) via its synaptic terminals.
• The axon transports the output signal as electric impulses along its length.
• Each neuron has one axon.

[Figure: Neurons and their connections in brain imagery. Source: Wikipedia]

Evolution of Neural Networks


How do biological neurons work?
• A neuron only fires if its input signal exceeds a certain amount (threshold) within a
short time period.
▪ Connections vary in strength:
▪ Strong connections allow a large signal.
▪ Weak connections allow only a weak signal.

• When a neuron receives a stimulus with a high enough voltage, it emits an action
potential (aka a nerve impulse or spike). It is said to fire.

• The connections between neurons are highly complex, diverse, and very flexible,
and their strength can be modified by a variety of factors, including learning and
experience.


Evolution of Neural Networks


• 1943: McCulloch (a neuroscientist) and Pitts (a mathematician) were the first to define a
mathematical model of computation similar to neural networks, and they put the neuron at
the center of their model as the basic unit of information processing in the brain. In order to
describe how neurons in the brain might work, they modeled a simple neural network
using electrical circuits.

A simple neuronal network in McCulloch and Pitts notation, and its truth table.
The state of each neuron i is indicated by Ni.


Evolution of Neural Networks


The main elements of the McCulloch-Pitts model can be summarized as follows:
• Neuron activation is binary: a neuron either fires or does not fire.
• For a neuron to fire, the weighted sum of its inputs must be equal to or larger than a
predefined threshold.
• When a neuron receives a stimulus with a high enough voltage, it emits an action potential
(aka a nerve impulse or spike). It is said to fire.
• If one or more inhibitory inputs are active, the neuron will not fire.
• A signal takes one fixed time step to pass through a link.
• Neither the structure nor the weights change over time.

McCulloch and Pitts decided on this architecture based on what was known
at the time about the function of biological neurons.


Evolution of Neural Networks


• McCulloch and Pitts developed a mathematical formulation known as the linear
threshold gate, which describes the activity of a single neuron with two
states, firing or not firing.
• A McCulloch-Pitts neuron takes in inputs, computes a weighted sum, and returns
‘0’ if the result is below the threshold and ‘1’ otherwise.


Evolution of Neural Networks


• An input is considered excitatory when its contribution
to the weighted sum is positive, for instance w_i = +1.

• An input is considered inhibitory when its contribution
to the weighted sum is negative, for instance w_i = -1.

• If the weighted sum satisfies ∑_i w_i x_i ≥ θ, the neuron fires; otherwise,
it does not.

It is important to highlight that the only role of the "weights" in the
McCulloch-Pitts model, as presented here, is to determine whether an
input is excitatory or inhibitory. In this sense, the McCulloch-Pitts model is
actually unweighted.
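A minimal sketch of this linear threshold gate in Python (the function name mcp_neuron and the ±1 weight convention are illustrative choices, not from the slide):

def mcp_neuron(inputs, weights, threshold):
    # McCulloch-Pitts linear threshold gate.
    # inputs:  binary values (0 or 1)
    # weights: +1 for excitatory inputs, -1 for inhibitory inputs
    weighted_sum = sum(w * x for w, x in zip(weights, inputs))
    return 1 if weighted_sum >= threshold else 0

# An AND gate: both excitatory inputs must be active to reach the threshold.
for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, mcp_neuron([x1, x2], [1, 1], threshold=2))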


Evolution of Neural Networks


• 1949: Hebb presents in his book, The Organization of Behavior, a learning
hypothesis for biological neurons. He pointed out that neural connections
are strengthened each time they are used, a concept fundamentally essential to the
way in which humans learn. If two nerves fire at the same time, he argued, the
connection between them is enhanced.

Hebbian Learning Rule
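The equation itself is not shown on the slide; the rule is commonly written as (standard notation, ours):

Δw_ij = η · x_i · y_j

where w_ij is the strength of the connection between neurons i and j, x_i and y_j are their activity levels, and η is a learning rate. The connection grows whenever the two neurons are active at the same time ("cells that fire together wire together").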


Evolution of Neural Networks


Limitations of the McCulloch-Pitts Artificial Neuron
• Only binary inputs and outputs are allowed: this is a significant limitation,
since many of the features that are useful for making decisions are continuous
rather than binary. In many instances we may want to attach a continuous
value (e.g., a probability value) to a decision instead of a yes-or-no label.

• No learning is possible: the model has no autonomy whatsoever, restricting
the problems that can be solved to the ones you already know how to solve.
Open question: how do we learn a perceptron model? Frank
Rosenblatt answered this question.


Evolution of Neural Networks


• 1958: Frank Rosenblatt (a research psychologist working at the Cornell Aeronautical
Laboratory), using McCulloch and Pitts' model of a neuron and the findings of Hebb,
went on to develop and introduce the first Perceptron: a single layer of neurons. This
can be seen as the first ancestor of modern neural networks.

The Perceptron. Source: Wikipedia
Rosenblatt's Perceptron. Source: https://commons.wikimedia.org/wiki/File:Rosenblattperceptron.png


Evolution of Neural Networks


Rosenblatt made the following key modifications to the McCulloch-
Pitts neuron:
• Numeric inputs: he replaced the binary inhibitory/excitatory inputs with continuous
equivalents via real-valued inputs and weights. Unlike the binary inputs of the MCP
neuron, the perceptron could process numbers in any range.

• Weighted sum: he changed the MCP neuron's activation rule, which was based on
reaching a fixed threshold, to a rule that fires when a weighted sum plus a bias
crosses zero; the threshold θ is thereby computed automatically.

• Machine learning: he introduced the perceptron learning rule, which allowed
the perceptron to learn from examples, adjust its weights, and make predictions based
on its inputs.


Evolution of Neural Networks


Representation of the Perceptron
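The figure itself is not reproduced here. In standard notation (ours, not the slide's), the perceptron it represents computes

ŷ = step(∑_i w_i x_i + b), where step(z) = 1 if z ≥ 0 and 0 otherwise,

with inputs x_i, learned weights w_i, and bias b.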


Evolution of Neural Networks


The Influence of Bias
• The bias term is an adjustable, numerical term added to a perceptron’s weighted
sum of inputs and weights that can increase accuracy.

• The addition of the bias term is helpful because it serves as another model
parameter (in addition to the weights) that can be tuned to make the model’s
performance on the training data as good as possible. It also provides the perceptron
with additional flexibility in modeling complex patterns in the input data.

• The default input value for the bias is 1, and its weight value is adjustable.
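As a concrete, illustrative example: with weights w = (1, 1), the bias alone decides which logic function the perceptron computes. With b = -1.5 the output step(x1 + x2 - 1.5) is 1 only when both inputs are 1 (the AND function); changing only the bias to b = -0.5 yields the OR function.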


Evolution of Neural Networks


How does the perceptron learn?
• To learn from data, we need a training set of input-output examples from which the
Perceptron ‘learns’ a function: for each example, increase the weights if the Perceptron’s
output for that example’s input is too low compared to the target, and decrease the
weights if the output is too high.

• The neuron makes a prediction, and an error is calculated. The perceptron learning rule
is then applied to update the weights of the model in order to reduce the error for the
given example.

• The perceptron learning rule consists of minimizing the difference between the desired
and the actual output.


Evolution of Neural Networks


A simple algorithm to learn a perceptron
1. Start with random weights in a perceptron.

2. For a training example, compute the output of the perceptron.

3. If the output does not match the correct output:
   i. If the correct output was 0 but the actual output was 1, decrease the weights that
      had an input of 1.
   ii. If the correct output was 1 but the actual output was 0, increase the weights that
      had an input of 1.

4. Repeat steps 2-3 for all the training examples until the perceptron makes no more
mistakes. (A sketch of this loop in Python follows below.)
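A minimal sketch of this algorithm in Python (the function names, the bias handling, and the fixed step size of 1 are our own illustrative choices):

import random

def predict(weights, bias, x):
    # Perceptron output: 1 if the weighted sum plus bias is non-negative.
    s = sum(w * xi for w, xi in zip(weights, x)) + bias
    return 1 if s >= 0 else 0

def train_perceptron(examples, n_inputs, max_epochs=100):
    # examples: list of (input_vector, target) pairs with binary targets.
    weights = [random.uniform(-1, 1) for _ in range(n_inputs)]
    bias = random.uniform(-1, 1)
    for _ in range(max_epochs):
        mistakes = 0
        for x, target in examples:
            error = target - predict(weights, bias, x)  # +1, 0, or -1
            if error != 0:
                mistakes += 1
                # Increase weights if the output was too low, decrease if
                # too high; only inputs equal to 1 actually change.
                weights = [w + error * xi for w, xi in zip(weights, x)]
                bias += error
        if mistakes == 0:  # no more mistakes on the training set
            break
    return weights, bias

# Usage: learn the AND function (linearly separable, so training converges).
data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
w, b = train_perceptron(data, n_inputs=2)
print([predict(w, b, x) for x, _ in data])  # expected: [0, 0, 0, 1]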


Evolution of Neural Networks


Drawbacks of Rosenblatt's Perceptron
• The model implements the functioning of a single neuron that can solve linear
classification problems through a very simple learning algorithm.
• The network is composed of only one neuron.
• This simple single-neuron model has the main limitation of not being able to solve
problems that are not linearly separable.


Evolution of Neural Networks


• 1960: Bernard Widrow and Ted Hoff develop the ADALINE (Adaptive Linear Neuron,
later Adaptive Linear Element) network, a single-layer neural network. The
physical device on which it was implemented carries the same name.

Schematic of a single ADALINE unit. Source: https://en.wikipedia.org/wiki/ADALINE

Photo of an ADALINE machine, with hand-adjustable weights. Source: https://en.wikipedia.org/wiki/ADALINE


Evolution of Neural Networks


AI winter is coming!
• 1969: Marvin Minsky (founder of the MIT AI Lab) and Seymour Papert (director of the lab)
published the book “Perceptrons”, a rigorous analysis of the limitations of perceptrons. They
demonstrated the limitations of single-layer perceptrons.


Evolution of Neural Networks


Limitations of Rosenblatt's Perceptron
• While the perceptron showed promise in early experiments, it had limitations. It could only
solve linearly separable problems and struggled with complex patterns, leading to doubts
about its practical applications.

• It is impossible to learn the XOR function using a single perceptron, because the XOR
function is not linearly separable (see the sketch after this list).

• They basically said: this approach was a dead end. This publication is believed to have led
to the first AI winter - a freeze in funding and publications.

Led to the first AI winter, from the '70s to the '80s.

The development of neural networks was put on hold.
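To make the XOR claim concrete, the check below reuses the illustrative train_perceptron sketch from earlier; on XOR the update loop keeps making mistakes however long it runs (an empirical check, not a proof):

# XOR truth table: no single line separates the 1s from the 0s.
xor_data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]
w, b = train_perceptron(xor_data, n_inputs=2, max_epochs=1000)
print([predict(w, b, x) for x, _ in xor_data])
# At least one of the four outputs is always wrong: no convergence.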

Evolution of Neural Networks


• 1986: Rumelhart, Hinton, and Williams propose and clearly explain the back-
propagation algorithm to train multi-layer perceptrons.

• Neural networks with a hidden layer (multi-layer perceptrons) become popular:
- trained by "backpropagating errors";
- with a non-linearity between layers (allowing the representation of arbitrary functions).

• The introduction of the multilayer perceptron (MLP) and the backpropagation algorithm
brought new life to neural networks. Multi-layer perceptrons can model arbitrarily
complex Boolean functions, as the sketch below shows.
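As a tiny illustration of this extra expressive power, a hand-wired two-layer network of threshold units computes XOR, which no single perceptron can (the weights below are one of many possible choices, picked by hand):

def step(z):
    return 1 if z >= 0 else 0

def xor_mlp(x1, x2):
    # Hidden layer: h1 fires for OR(x1, x2), h2 fires for AND(x1, x2).
    h1 = step(x1 + x2 - 0.5)   # OR
    h2 = step(x1 + x2 - 1.5)   # AND
    # Output fires when OR is true but AND is false, i.e., XOR.
    return step(h1 - h2 - 0.5)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, xor_mlp(a, b))  # 0 0 0 / 0 1 1 / 1 0 1 / 1 1 0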


Evolution of Neural Networks


Second AI winter is coming (in the mid '90s)!
• A misunderstood theorem about the representational ability of networks with a single hidden
layer discouraged research into networks with more layers.

• Multi-layer neural networks trained with back-propagation didn't work well at the time,
especially compared to simpler models.

• This led to the second AI winter, in the mid '90s.


Evolution of Neural Networks


The rise of deep learning
• In 2006, Hinton, Osindero, and Teh showed that back-propagation works if the initial
weights are chosen in a smart way. Deep learning (deep neural networks) gains popularity!

• In 2012, deep learning made significant breakthroughs in many applications (notably
AlexNet's win in the ImageNet image-recognition competition).

• The deep learning tsunami continues today.


AI Ethics and Data Ethics


• Ethics is a set of moral principles which help us discern between right and wrong.

• AI ethics is a multidisciplinary field that studies how to optimize AI's beneficial impact
while reducing risks and adverse outcomes.

• Examples of AI ethics issues include data responsibility and privacy, fairness, explainability,
robustness, transparency, environmental sustainability, inclusion, moral agency, value
alignment, accountability, trust, and technology misuse.


AI Ethics and Data Ethics

How to control a system that is smarter than us?

Who’s responsible for AI’s mistakes?


AI Ethics and Data Ethics

Should we use AI in medicine?


AI Ethics and Data Ethics

Should we use AI in law?


AI Ethics and Data Ethics

Should we use AI in hiring?


AI Ethics and Data Ethics

Should we use AI in law enforcement (police)?


AI Privacy
• Data Collection and Surveillance
• Invasive Technologies
• Profiling and Predictive Analytics
• Voice and Speech Recognition
• Algorithmic Bias and Discrimination
• Regulatory Compliance (GDPR)
• Data Encryption and Security Measures
• User Consent and Control
• Transparency and Explainability


Breakthroughs with Deep Learning: State of the Art


Breakthroughs with Deep Learning: Image Recognition


Breakthroughs with Deep Learning: Image Segmentation

Object detection using “Mask R-CNN”.


Breakthroughs with Deep Learning: Image Generation

Images generated by a GAN from a dataset of faces of celebrities


Breakthroughs with Deep Learning: Captions Generated Entirely by a Neural Network


Breakthroughs with Deep Learning: Board Games


Breakthroughs with Deep Learning: Autonomous Vehicles


Breakthroughs with Deep Learning: Financial Forecasting and Stock Market Analysis


Breakthroughs with Deep Learning: Healthcare and Disease Diagnosis


Breakthroughs with Deep Learning: NLP (Natural Language Processing)


Several big improvements in recent years in NLP:

• Machine Translation
• Sentiment Analysis
• Dialogue Agents
• Question Answering
• Text Classification


Breakthroughs with Deep Learning: Robotics


Main researchers in Deep Learning


• Samy Bengio: https://research.google.com/pubs/bengio.html
• Yoshua Bengio: http://www.iro.umontreal.ca/~bengioy/yoshua_en/research.html
• Thomas Dean: https://research.google.com/pubs/author189.html
• Jeffrey Dean: https://research.google.com/pubs/jeff.html
• Nando de Freitas: https://www.cs.ox.ac.uk/people/nando.defreitas/
• Geoffrey Hinton: http://www.cs.toronto.edu/~hinton/
• Yann LeCun: http://yann.lecun.com/
• Andrew Ng: http://www.andrewng.org/
• Quoc Le
• Honglak Lee
• Tomaso Poggio, ...


Software libraries for Deep Learning
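The slide's figure (library logos) is not reproduced; widely used deep learning libraries include TensorFlow, Keras, PyTorch, and JAX. As a minimal taste of such a library, here is a one-neuron, perceptron-style model in PyTorch (an illustrative sketch; it assumes torch is installed):

import torch
import torch.nn as nn

# A single linear unit followed by a hard threshold: a perceptron-style model.
neuron = nn.Linear(in_features=2, out_features=1)

x = torch.tensor([[1.0, 0.0]])     # one example with two inputs
output = (neuron(x) >= 0).float()  # fire (1.0) if weighted sum + bias >= 0
print(output)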

