[Figure: Deep Learning is a subset of Machine Learning, which is itself a subset of Artificial Intelligence]
[Figure: An artificial neuron with inputs, weights, a bias b, and an activation function producing the output ŷ = g(w · x + b)]
[Figure: A biological neuron showing dendrites, the cell body (soma) with its nucleus, the axon, and the axon terminals]
The hidden layer nodes process these input signals, and the output layer nodes compute the final output by processing the hidden layer's results using activation functions.
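To make this concrete, here is a minimal sketch of the single-neuron computation ŷ = g(w · x + b) described above. It is not from the original text; the NumPy sigmoid activation and the specific input, weight, and bias values are illustrative assumptions.

```python
import numpy as np

def sigmoid(z):
    # A common choice of activation function: g(z) = 1 / (1 + e^(-z))
    return 1.0 / (1.0 + np.exp(-z))

x = np.array([0.5, 0.2, 0.1])   # example inputs (illustrative values)
w = np.array([0.4, 0.3, 0.9])   # one weight per input
b = 0.1                         # bias term

y_hat = sigmoid(np.dot(w, x) + b)   # ŷ = g(w · x + b)
print(y_hat)
```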
Here's a comparison between Biological Neural Networks (BNN) and Artificial Neural Networks (ANN) in tabular form:

Biological Neuron        Artificial Neuron
Dendrite                 Inputs
Cell nucleus or soma     Nodes
Synapses                 Weights
Axon                     Output
Synaptic plasticity      Backpropagation
[Figure: Model of an artificial neuron: inputs scaled by synaptic weights feed a summing junction, and an activation function φ(·) produces the output y]
The McCulloch-Pitts neuron consists of several key components:
Inputs: The model accepts multiple binary inputs (0 or 1), which represent signals from other neurons or sensory inputs.
Weights: Each input is associated with a weight, although in the original model these weights are implicitly considered as being equal (typically binary: either present or absent).
Summation: The inputs are summed together, resulting in a total activation value.
Threshold: A threshold value is defined. If the summed inputs exceed this threshold, the neuron "fires" and produces an output of 1; otherwise, the output is 0.
2. Functionality
The McCulloch-Pitts neuron operates based on the following logic:
- Activation Function: The output y is determined by the equation:
y = 1 if Σ (w_i · x_i) ≥ threshold, and y = 0 otherwise.
Here, w represents the weights, x the input values, and the sum is taken over all inputs.
- Binary Nature: The model operates in a binary fashion, meaning it can only produce outputs of 0 or 1. This characteristic aligns with the idea of simple decision-making processes.
3. Logic Gates
The McCulloch-Pitts neuron can be used to model basic logical functions. By configuring the inputs and thresholds, it can represent various logic gates:
- AND Gate: Requires all inputs to be 1 to produce an output of 1.
- OR Gate: Requires at least one input to be 1 to produce an output of 1.
- NOT Gate: Inverts the input; if the input is 0, the output is 1, and vice versa.
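The threshold rule and the gates above can be captured in a few lines of code. This is a minimal sketch, not part of the original text; the function names and threshold values are assumptions chosen to reproduce the AND, OR, and NOT gates.

```python
# Minimal McCulloch-Pitts neuron: it fires (outputs 1) when the sum of its
# binary inputs meets or exceeds the threshold.
def mp_neuron(inputs, threshold):
    return 1 if sum(inputs) >= threshold else 0

def and_gate(x1, x2):
    return mp_neuron([x1, x2], threshold=2)   # all inputs must be 1

def or_gate(x1, x2):
    return mp_neuron([x1, x2], threshold=1)   # at least one input must be 1

def not_gate(x):
    return 1 - x   # inversion (modelled with an inhibitory input)

for a in (0, 1):
    for b in (0, 1):
        print(f"x=({a},{b})  AND={and_gate(a, b)}  OR={or_gate(a, b)}")
print("NOT(0) =", not_gate(0), " NOT(1) =", not_gate(1))
```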
4. Significance
- Foundation of Neural Networks: The McCulloch-Pitts model was pivotal in establishing the concept of neurons as computational units, influencing later developments in artificial neural networks.
- Theoretical Framework: It provided a simple framework to understand how networks of neurons could be used to perform complex computations.
5. Limitations
While the McCulloch-Pitts neuron was ground-breaking, it has several limitations:
- Binary Inputs and Outputs: The binary nature restricts its ability to model more complex, continuous-valued functions.
- Static Weights: The original model does not incorporate learning or weight adjustment mechanisms.
- Lack of Complexity: It cannot represent more complex functions requiring non-linear combinations of inputs.
6. Conclusion:
The McCulloch-Pitts neuron is a fundamental concept in the history of artificial intelligence and neural networks. By simulating basic logical operations and introducing the idea of neurons as computational units, it paved the way for more advanced models and techniques in the field. Despite its simplicity, the principles established by this model continue to influence modern neural network designs.
• Types of Artificial Neural Networks
Artificial Neural Networks (ANNs) come in various architectures, each suited to specific tasks and data types. Below is a detailed overview of the most common types of ANNs, their structures, and their applications.
[Figure: Schematic ANN architectures showing inputs and outputs connected by feedforward links, competition/inhibition, and feedback]
3. Recurrent Neural Networks (RNN)
[Figure: A layered network with input-layer neurons x_i, hidden-layer neurons y_j, and output-layer neurons z_k]
4. Healthcare
- Medical Diagnosis: ANNs assist in diagnosing diseases by analyzing medical images and patient data.
5. Finance
- Fraud Detection: ANNs analyze transaction patterns to identify potentially fraudulent activities in real-time.
- Credit Scoring: Assessing the creditworthiness of individuals by analyzing historical data.
- Algorithmic Trading: ANNs are used to develop trading strategies based on market data.
6. Autonomous Systems:
- Self-Driving Cars: ANNs process sensor data to understand the environment and make driving decisions.
- Robotics: Used in robotic systems for navigation, manipulation, and interaction with objects.
7. Recommendation Systems
- E-commerce: ANNs analyze user behavior and preferences to recommend products (e.g., Amazon, Netflix).
- Content Recommendation: Used by platforms like YouTube and Spotify to suggest videos or music based on user interests.
8. Time Series Prediction
- Stock Price Prediction: ANNs forecast stock prices based on historical market data.
- Manufacturing: Used to detect defects or anomalies in products during the production process.
10. Gaming:
- Game AI: ANNs are employed to create intelligent agents that can adapt to players' strategies in real-time.
- Procedural Content Generation: Used to generate game environments or levels based on learned patterns.
4.3 Types of Deep Learning Models
Deep learning encompasses a variety of model types, each suited for specific tasks and data types. Here's a detailed overview of the main types of deep learning models:
1. Feedforward Neural Networks (FNNs)
2. Convolutional Neural Networks (CNNs)
3. Recurrent Neural Networks (RNNs)
4. Generative Adversarial Networks (GANs)
5. Autoencoders
6. Transformers
7. Graph Neural Networks (GNNs)
8. Deep Reinforcement Learning Models
1. Feedforward Neural Networks (FNNs)
Structure: The simplest type of neural network, where data flows in one direction from input to output, without cycles or loops.
Usage: Commonly used for basic classification and regression tasks, such as predicting housing prices or classifying images.
[Figure: A feedforward neural network with an input layer, fully connected hidden layers, and an output layer]
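As an illustration of such a feedforward network, here is a minimal sketch using TensorFlow/Keras; the framework, the eight input features, and the layer sizes are assumptions, since the text does not specify them.

```python
import tensorflow as tf

# A small feedforward (fully connected) network for regression,
# e.g., predicting a housing price from 8 numeric features.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(8,)),                     # 8 input features (assumed)
    tf.keras.layers.Dense(16, activation="relu"),   # hidden layer
    tf.keras.layers.Dense(1),                       # single continuous output
])
model.compile(optimizer="adam", loss="mse")
model.summary()
# model.fit(X_train, y_train, epochs=10)  # train once the data is prepared
```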
2. Convolutional Neural Networks (CNNs)
[Figure: A convolutional neural network: input, convolution and pooling layers perform feature extraction, followed by fully connected layers for classification and output]
Usage: Primarily used in computer vision tasks, such as image recognition, object detection, and segmentation.
3. Recurrent Neural Networks (RNNs)
Structure: Designed for sequential data. RNNs have loops allowing them to maintain a memory of previous inputs.
Key Variants:
- Long Short-Term Memory (LSTM): Addresses the vanishing gradient problem, enabling better learning of long-range dependencies.
- Gated Recurrent Unit (GRU): A simplified version of LSTM that is often faster and requires less memory.
[Figure: A recurrent network with an input layer, hidden layers ("deep" if more than one), feedback connections, and an output layer producing the class/target y]
Usage: Ideal for time series analysis, natural language processing (NLP), and speech
recognition.
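As a concrete illustration of an RNN-based model, here is a minimal sketch of an LSTM sequence classifier in TensorFlow/Keras; the framework, vocabulary size, sequence length, and layer sizes are illustrative assumptions, not part of the original text.

```python
import tensorflow as tf

# A small recurrent model for text classification: an embedding layer,
# an LSTM that maintains memory across time steps, and a sigmoid output.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(100,)),                                # sequences of 100 token ids (assumed)
    tf.keras.layers.Embedding(input_dim=10000, output_dim=32),   # 10k-word vocabulary (assumed)
    tf.keras.layers.LSTM(64),                                    # recurrent layer
    tf.keras.layers.Dense(1, activation="sigmoid"),              # e.g., positive/negative sentiment
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
# Replacing LSTM(64) with tf.keras.layers.GRU(64) gives the lighter GRU variant noted above.
```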
4. Generative Adversarial Networks (GANs)
Structure: Comprises two neural networks, the generator and the discriminator, that compete against each other.
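A minimal sketch of these two competing networks is shown below, assuming TensorFlow/Keras; the latent dimension and the flattened 28x28 image size are illustrative assumptions, and the adversarial training loop itself is omitted.

```python
import tensorflow as tf

latent_dim = 64  # size of the random latent vector (assumed)

# Generator: maps a random latent vector to a fake "image" (flattened 28x28).
generator = tf.keras.Sequential([
    tf.keras.Input(shape=(latent_dim,)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(784, activation="sigmoid"),
])

# Discriminator: classifies an image as real (1) or fake (0).
discriminator = tf.keras.Sequential([
    tf.keras.Input(shape=(784,)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])

# During training the two compete: the generator tries to fool the
# discriminator, while the discriminator learns to tell real from fake.
noise = tf.random.normal((16, latent_dim))
fake_images = generator(noise)
print(discriminator(fake_images).shape)  # (16, 1) real/fake scores
```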
[Figure: GAN architecture: a generative network G maps vectors from a low-dimensional latent space to generated (fake) images, while a discriminative network D receives real images from the high-dimensional sample space together with the generated images and classifies each as real or fake]
Comparison between CNNs and RNNs:
Purpose
  CNN: Primarily used for spatial data analysis (e.g., images).
  RNN: Designed for sequential data and time-series analysis.
Architecture
  CNN: Consists of convolutional and pooling layers, followed by fully connected layers.
  RNN: Composed of recurrent layers with loops to maintain memory of previous inputs.
Data Handling
  CNN: Processes data with a fixed input size, suitable for grid-like structures.
  RNN: Handles variable-length input sequences, maintaining context across time steps.
Feature Extraction
  CNN: Automatically learns spatial hierarchies and patterns through convolutions.
  RNN: Learns temporal dependencies and relationships in sequences.
Memory
  CNN: Lacks memory of previous inputs once the input is processed.
  RNN: Maintains hidden states to remember previous inputs, allowing for context.
Training Time
  CNN: Generally faster to train due to parallel processing capabilities.
  RNN: Slower training due to sequential nature; training must occur step-by-step.
Applications
  CNN: Image classification, object detection, and video analysis.
  RNN: Natural language processing, time series forecasting, and speech recognition.
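To ground the comparison in code, here is a minimal sketch of a small CNN image classifier in TensorFlow/Keras; the framework, the 28x28 grayscale input, and the layer sizes are illustrative assumptions.

```python
import tensorflow as tf

# Convolution and pooling layers extract spatial features from a fixed-size
# image; fully connected layers then perform the classification.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(28, 28, 1)),                             # fixed-size grayscale image (assumed)
    tf.keras.layers.Conv2D(16, kernel_size=3, activation="relu"),
    tf.keras.layers.MaxPooling2D(pool_size=2),
    tf.keras.layers.Conv2D(32, kernel_size=3, activation="relu"),
    tf.keras.layers.MaxPooling2D(pool_size=2),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10, activation="softmax"),               # e.g., 10 output classes
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
model.summary()
```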
4.4 Deep Learning Applications:
Deep learning has transformed many industries by enabling sophisticated data analysis and decision-making.
1. Computer Vision
2. Natural Language Processing (NLP)
3. Healthcare
4. Autonomous Vehicles
5. Finance
6. Gaming and Entertainment
7. Manufacturing and Industry
8. Agriculture
1. Computer Vision
• Image Recognition:
- Deep learning models, particularly Convolutional Neural Networks (CNNs), excel in identifying objects within images. Applications include:
- Facial Recognition: Used in security systems and social media platforms for tagging and identification.
- Medical Imaging: Analyzing X-rays, MRIs, and CT scans to detect diseases such as cancer, enabling earlier and more accurate diagnoses.
• Object Detection:
Detecting and classifying multiple objects within an image. This technology is vital in:
- Autonomous Vehicles: Identifying pedestrians, traffic signs, and other vehicles.
- Surveillance Systems: Monitoring public spaces for safety and security.
• Image Generation:
Generative Adversarial Networks (GANs) can create realistic images from random noise, leading to applications like art generation, deepfakes, and virtual reality environments.
2. Natural Language Processing (NLP)
• Sentiment Analysis:
Analyzing text data to determine sentiment (positive, negative, neutral). Businesses use this to gauge public opinion about products or services.
• Machine Translation:
Deep learning models have significantly improved translation services, as seen in tools like Google Translate, enabling accurate translations between multiple languages.
• Chatbots and Conversational Agents:
NLP models power chatbots that can understand and respond to customer queries in real-time, enhancing customer service across various platforms.
• Text Generation:
Models like GPT (Generative Pre-trained Transformer) can create coherent text, useful for content creation, summarization, and even coding assistance.
3. Healthcare:
• Medical Diagnosis:
Deep learning algorithms analyze medical images and patient data to identify conditions such as diabetic retinopathy or pneumonia. They assist doctors by providing second opinions and highlighting areas of concern.
• Drug Discovery:
Deep learning models predict how different compounds might interact, speeding up the process of finding new drugs and treatments.
• Personalized Medicine:
By analyzing genetic information and health records, deep learning can help tailor treatments to individual patients, improving outcomes.
4. Autonomous Vehicles
Deep learning is crucial for the development of self-driving cars, enabling them to:
- Perceive the Environment: Using sensors and cameras to identify objects, road conditions, and obstacles.
- Path Planning: Determining the best route while avoiding collisions and optimizing for time or distance.
- Real-Time Decision Making: Making split-second decisions based on dynamic data from the environment.
5. Finance
• Fraud Detection:
Deep learning models analyze transaction patterns to detect anomalies indicative of fraudulent activity, helping banks and financial institutions protect customers.
• Algorithmic Trading:
Traders use deep learning to predict stock prices and execute trades based on vast amounts of financial data, identifying trends and making decisions in real-time.
• Credit Scoring:
By evaluating a wide array of financial data, deep learning helps assess credit risk more accurately, improving lending processes.
6. Gaming and Entertainment
• Game AI:
Deep learning enhances non-player character (NPC) behaviours, making them more realistic and responsive to player actions.
• Content Creation:
Deep learning algorithms are used to generate music, art, and stories, providing new ways for creators to express themselves.
7. Manufacturing and Industry
• Predictive Maintenance:
By analyzing data from machinery, deep learning predicts when equipment is likely to fail, allowing for timely maintenance and reducing downtime.
• Quality Control:
Deep learning systems can inspect products on production lines to identify defects, ensuring high-quality standards are maintained.
8. Agriculture
• Crop Monitoring:
Using drone imagery and deep learning algorithms, farmers can monitor crop health, detect diseases, and optimize yields.
• Precision Agriculture:
Deep learning models analyze environmental data to inform decisions on planting, watering, and harvesting, leading to more efficient resource use.