Neural Networks

The document compares artificial neurons in neural networks with biological neurons, highlighting their structural and functional differences. It outlines the evolution of neural networks from the 1940s to the present, detailing key developments and applications in various fields. Additionally, it discusses the advantages and disadvantages of neural networks, their learning mechanisms, and the integration of fuzzy logic in neuro-fuzzy systems.


https://www.mygreatlearning.com/blog/types-of-neural-networks/

https://thecleverprogrammer.com/2020/07/19/image-classification-with-ann/

While artificial neurons are inspired by biological neurons and share some key characteristics, they
are simplified mathematical models designed for specific computational tasks. Biological neurons are
highly complex and versatile, serving various functions in the human brain beyond the scope of
artificial neurons.

Similarities and differences between artificial neurons in neural networks and biological neurons:

| Aspect | Artificial Neurons (in Neural Networks) | Biological Neurons |
|---|---|---|
| Basic Structure | Comprise inputs, a processing unit, and an output | Consist of dendrites, cell body (soma), and axon |
| Information Processing | Receive inputs, perform computations, generate outputs | Receive signals, process them, and transmit signals |
| Connections | Connected via weighted connections (synapses) | Connected via synapses |
| Activation Threshold | Use activation functions to determine firing | Require a certain level of input for an action potential |
| Non-Linearity | Activation functions introduce non-linearity | Can exhibit non-linear responses to input signals |
| Learning | Adjust weights during training to learn patterns | Exhibit synaptic plasticity |
| Adaptation | Adapt by changing weights and parameters | Exhibit synaptic plasticity and adaptability |
| Time Period | Key Developments and Milestones |
|---|---|
| 1940s-1950s | McCulloch and Pitts propose the artificial neuron concept |
| 1960s-1970s | Development of the Perceptron for binary classification tasks; limited success leads to a decline in interest |
| 1980s-1990s | Resurgence with advances in training algorithms, including backpropagation; introduction of the Multi-Layer Perceptron (MLP); challenges with training deep networks due to the vanishing gradient problem |
| 2000s-2010s | Breakthroughs in deep learning with CNNs and RNNs; AlexNet (2012) sets a milestone in image recognition; application of deep learning to various fields, including natural language processing |
| 2010s-Present | Continued advancement with GANs and Transformers; deep learning's impact on computer vision, speech recognition, autonomous systems, and more; ongoing research into new architectures and training techniques |
An input layer: The first layer, which accepts input data and passes it on to the subsequent layers.
One or more hidden layers: Layers between the input and output layers that process data via interconnected neurons.
An output layer: The layer at the end of the network that generates predictions or outputs.
Neurons: The fundamental processing units, which take in inputs, apply weights and biases, and then, through an activation function, produce an output.
Weights: Parameters that control the strength of the connections between neurons and thus the flow of information.
Biases: Additional neuron parameters that shift the activation function and adjust the network's behavior.
Activation function: A non-linear function applied to the weighted sum of inputs in each neuron, introducing non-linearity to the network.
Forward propagation: The process of making predictions by passing input data through the network from the input layer to the output layer.
Backpropagation: The process of computing gradients of the error with respect to weights and biases so they can be adjusted during training.
Loss function: A function that gauges the performance of the network by measuring the difference between predicted results and actual labels.
Neural Networks

Pros:
 Can often work more efficiently and for longer than humans
 Can be programmed to learn from prior outcomes and strive to make smarter future calculations
 Are continually being extended to new fields with more difficult problems
 When an element of the NN fails, its parallel nature lets it continue without any problem

Cons:
 Still rely on hardware that may require labour and expertise to maintain
 May take long periods of time to develop the code and algorithms
 Usually report an estimated range or estimated amount that may not actualize
 Need training to operate

1. Data Preparation
2. Network Architecture Design
3. Data Pre-processing
4. Training the ANN
5. Validation and Tuning
6. Deployment and Testing
7. Decision-Making and Insights
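As a loose illustration of steps 1-7, here is a minimal sketch using scikit-learn; the library choice, the synthetic dataset, and the layer sizes are our own assumptions, not something the text prescribes:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler

# 1. Data preparation (a synthetic dataset stands in for real business data)
X, y = make_classification(n_samples=400, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# 3. Data pre-processing: scale features to zero mean and unit variance
scaler = StandardScaler().fit(X_train)
X_train, X_test = scaler.transform(X_train), scaler.transform(X_test)

# 2. Network architecture design + 4. training the ANN
model = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=1000, random_state=0)
model.fit(X_train, y_train)

# 5-7. Validation on held-out data guides tuning, deployment, and decisions
print(f"test accuracy: {model.score(X_test, y_test):.2f}")
```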

1. Non-linearity
2. Adaptability and Learning
3. Parallel Processing
4. Robustness
5. Feature Extraction

1. Non-linearity: ANNs can capture complex non-linear
relationships between input variables and output
predictions. Unlike traditional linear models, ANNs can
handle intricate patterns and interactions within the
data, allowing them to model and predict more
accurately in real-world business scenarios.
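A quick way to see why non-linearity matters: without an activation function between them, two stacked layers collapse into a single linear map and add no expressive power. A minimal sketch, with toy weight values of our own choosing:

```python
import numpy as np

# Fixed toy weights (illustrative values, not from the text).
W1 = np.array([[1.0, -1.0],
               [-1.0, 1.0]])
W2 = np.array([[1.0, 1.0]])
x = np.array([2.0, 0.0])

# Two stacked *linear* layers are identical to one combined linear layer:
stacked = W2 @ (W1 @ x)
collapsed = (W2 @ W1) @ x
assert np.allclose(stacked, collapsed)   # no added expressive power

# Inserting a non-linear activation (ReLU) between the layers breaks the
# collapse, which is what lets multi-layer networks model non-linear patterns.
relu = lambda z: np.maximum(z, 0.0)
nonlinear = W2 @ relu(W1 @ x)
print(stacked, nonlinear)   # [0.] vs [2.]
```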
2. Adaptability and Learning: ANNs can learn from
data and adapt their internal parameters (weights and
biases) to optimize performance. Through training,
ANNs adjust these parameters iteratively, improving
their predictive capabilities over time. This
adaptability enables ANNs to respond to changing
business conditions and refine their predictions
accordingly.
3. Parallel Processing: ANNs can process information
in parallel across multiple neurons and layers. This
parallel processing capability allows for efficient
computation and scalability, making ANNs suitable
for handling the large and complex datasets commonly
encountered in business and finance.
4. Robustness: ANNs can handle noisy or incomplete
data, making them resilient to certain imperfections.
They can generalize patterns from training data to
make predictions on unseen data, thus providing
robust performance even when faced with imperfect
or partial information.
5. Feature Extraction: ANNs can automatically extract
relevant features from the input data. Instead of
relying solely on human-defined features, ANNs can
learn and identify the essential elements within the
data. This ability to extract meaningful features is
valuable in business settings, where large and diverse
datasets may contain hidden patterns that are not
apparent to human analysts.

Artificial Neural Network vs Biological Neural Network

A comparison between Artificial Neural Networks (ANNs)
and Biological Neural Networks (BNNs) is as follows:

| Aspect | Artificial Neural Network (ANN) | Biological Neural Network (BNN) |
|---|---|---|
| Origin | Designed and developed by humans | Found in living organisms |
| Structure | Composed of layers of interconnected nodes (artificial neurons) | Comprises interconnected neurons |
| Complexity | Can have complex architectures and layers | Varies in complexity across species |
| Learning Mechanism | Learns through backpropagation and training | Learns through adaptation and experience |
| Speed | Can perform computations quickly | Slower due to chemical and biological processes |
| Scalability | Can be scaled up or down as needed | Limited by biological constraints |
| Processing Power | Can process vast amounts of data rapidly | Processing power varies across organisms |
| Fault Tolerance | Resistant to noise and incomplete data | Susceptible to noise and errors |
| Replication and Control | Easily replicated and controlled for experiments | Inherent control systems with regulatory mechanisms |

Neuro-Fuzzy Model and its applications


What is Fuzzy logic?

Fuzzy logic refers to logic developed to express degrees
of truth by assigning values between 0 and 1, unlike
traditional Boolean logic, which allows only 0 or 1.
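For instance, a fuzzy membership function can grade how "warm" a temperature is instead of forcing a hard yes/no answer. A minimal sketch; the "warm" set and its 15-25 °C ramp are hypothetical choices of ours:

```python
def warm_membership(temp_c: float) -> float:
    # Hypothetical fuzzy set "warm": degree of truth ramps
    # linearly from 0 at 15 degrees C to 1 at 25 degrees C.
    if temp_c <= 15.0:
        return 0.0
    if temp_c >= 25.0:
        return 1.0
    return (temp_c - 15.0) / 10.0

# Boolean logic would force a hard yes/no; fuzzy logic grades the answer.
print(warm_membership(10))   # 0.0  (definitely not warm)
print(warm_membership(20))   # 0.5  (somewhat warm)
print(warm_membership(30))   # 1.0  (fully warm)
```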

What is the role of fuzzy logic in neural networks?

Fuzzy logic and neural networks have one thing in common:
both can be used to solve pattern recognition problems and
other problems that do not involve a mathematical model.

What are the applications of Neuro-Fuzzy Model?

Systems combining fuzzy logic and neural networks are
called neuro-fuzzy systems. These hybrid systems can
combine the advantages of both neural networks and fuzzy
logic to perform better than either alone. Fuzzy logic and
neural networks have been integrated for use in the
following applications -

 Automotive engineering
 Applicant screening for jobs
 Crane control
 Glaucoma monitoring

In a hybrid (neuro-fuzzy) model, neural network learning
algorithms are fused with the fuzzy reasoning of fuzzy
logic. The neural network determines the values of the
parameters, while the if-then rules are handled by fuzzy
logic.

Neural Network for Machine Learning


 Multilayer Perceptron (supervised classification)
 Back Propagation Network (supervised
classification)
 Hopfield Network (for pattern association)
 Deep Neural Networks (unsupervised clustering)
Neural Networks for data-intensive applications
Neural networks have been successfully applied to a broad
spectrum of data-intensive applications, such as:
| Application | Architecture / Algorithm | Activation Function |
|---|---|---|
| Process modeling and control | Radial Basis Network | Radial Basis |
| Machine Diagnostics | Multilayer Perceptron | Tan-Sigmoid Function |
| Portfolio Management | Classification Supervised Algorithm | Tan-Sigmoid Function |
| Target Recognition | Modular Neural Network | Tan-Sigmoid Function |
| Medical Diagnosis | Multilayer Perceptron | Tan-Sigmoid Function |
| Credit Rating | Logistic Discriminant Analysis with ANN, Support Vector Machine | Logistic function |
| Targeted Marketing | Back Propagation Algorithm | Logistic function |
| Voice recognition | Multilayer Perceptron, Deep Neural Networks (Convolutional Neural Networks) | Logistic function |
| Financial Forecasting | Backpropagation Algorithm | Logistic function |
| Intelligent searching | Deep Neural Network | Logistic function |
| Fraud detection | Gradient Descent Algorithm and Least Mean Square (LMS) algorithm | Logistic function |

Face Recognition using Artificial Neural Networks

Face recognition entails comparing an image with a
database of saved faces to identify the person in the input
picture. It involves dividing images into two parts: one
containing the targets (faces) and one containing the
background. The associated task of face detection is
directly relevant, because images need to be analyzed and
faces located before they can be recognized.


What is a learning rule in a neural network?

A learning rule is a type of mathematical logic. It
encourages a network to learn from current conditions and
to improve its efficiency and performance. The learning
procedure of the brain likewise modifies its neural
structure: the strengthening or weakening of its synaptic
connections depends on their activity. Learning rules in
neural networks:

 Hebbian learning rule: Determines how to adjust the
weights between nodes of a network.
 Perceptron learning rule: The network starts its
learning by assigning a random value to each weight.
 Delta learning rule: The modification of a node's
synaptic weight is equal to the product of the error
and the input.
 Correlation learning rule: It is similar to
supervised learning.
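As an illustration of the delta rule above, the sketch below trains a single linear unit on AND-gate data; the dataset, learning rate, epoch count, and 0.5 decision threshold are all our own choices:

```python
import numpy as np

# Delta rule: weight change = learning_rate * error * input.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # AND-gate inputs
y = np.array([0, 0, 0, 1], dtype=float)                       # AND-gate targets

rng = np.random.default_rng(0)
w = rng.normal(scale=0.1, size=2)   # random small initial weights
b = 0.0
lr = 0.1

for _ in range(200):
    for xi, target in zip(X, y):
        out = w @ xi + b              # linear unit output
        error = target - out          # error signal
        w += lr * error * xi          # delta rule: error times input
        b += lr * error

preds = (X @ w + b > 0.5).astype(int)
print(preds)   # expected to match the AND targets
```

Since the updates minimize squared error, the unit converges toward the least-squares fit, whose thresholded outputs reproduce the AND truth table.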

| Approach | Explanation |
|---|---|
| Weight Analysis | Analyze neuron weights to determine feature importance and connection strengths. |
| Activation Visualization | Visualize neuron activations to understand how information is processed through hidden layers. |
| Feature Visualization | Generate inputs that maximize neuron activations, revealing what each neuron recognizes. |
| Interpretability Techniques | Use LIME and SHAP to provide local and global explanations for model predictions. |
| Knowledge Distillation | Train a smaller model to mimic the MLP's behavior, facilitating insight extraction. |
| Layer-wise Relevance Propagation | Assign relevance scores to neurons and connections, showing their contributions. |
| Activation Maximization | Optimize inputs to maximize neuron activations, revealing neuron preferences. |
| Rule Extraction | Derive human-readable rules or decision trees from the MLP for a transparent representation. |
| Ensemble and Ablation Studies | Analyze ensembles and perform ablation studies to understand model behavior and components. |
| Task-specific Interpretation | Tailor interpretation approaches to the specific problem domain, e.g., visualizing filters. |
