Neuro AI - Intelligent Systems and Neural Networks: Preface
Preface
Introduction
What is an artificial neural network?
An artificial neural network is a system based on the operation of biological
neural networks; in other words, it is an emulation of a biological neural system.
Why would the implementation of artificial neural networks be necessary?
Although computing these days is truly advanced, there are certain tasks
that a program written for a conventional microprocessor is unable to perform
well; for such tasks a software implementation of a neural network can be used,
with its own advantages and disadvantages.
Advantages:
• A neural network can perform tasks that a linear program cannot.
• When an element of the neural network fails, the network can continue
without any problem because of its parallel nature.
• A neural network learns and does not need to be reprogrammed.
• It can be applied in a wide range of applications.
• It can be implemented without great difficulty.
Disadvantages:
• The neural network needs training to operate.
• The architecture of a neural network is different from the architecture
of conventional microprocessors and therefore needs to be emulated in software.
• Large neural networks require substantial processing time.
Another aspect of artificial neural networks is that there are different
architectures, which consequently require different types of algorithms; but
despite being an apparently complex system, a neural network is relatively
simple.
Artificial neural networks are among the newest signal processing
technologies. The field is highly interdisciplinary, but the explanation
given here is restricted to an engineering perspective.
In the world of engineering, neural networks serve two main functions: as
pattern classifiers and as nonlinear adaptive filters. Like its biological
predecessor, an artificial neural network is an adaptive system; by adaptive,
we mean that its parameters are changed during operation, before the network
is deployed to solve the problem at hand. This adjustment period is called the
training phase. An artificial neural network is developed with a systematic
step-by-step procedure that optimizes a criterion commonly known as the
learning rule. The input/output training data are fundamental for these
networks, as they convey the information necessary to discover the optimal
operating point. In addition, their nonlinear nature makes neural network
processing elements very flexible.
The output of the neuron, yk, would therefore be the outcome of some
activation function applied to the value of vk, the neuron's summed input.
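To make this concrete, here is a minimal sketch of a single artificial neuron in Python, assuming the usual model in which vk is the weighted sum of the inputs plus a bias; the function names, the example numbers, and the choice of a step activation are illustrative assumptions, not taken from this text.

def neuron_output(inputs, weights, bias, phi):
    # Net input vk: weighted sum of the inputs plus a bias (assumed model).
    vk = sum(w * x for w, x in zip(weights, inputs)) + bias
    # Output yk: the activation function applied to vk.
    return phi(vk)

# Illustrative step activation and example values.
step = lambda v: 1.0 if v >= 0.0 else 0.0
yk = neuron_output(inputs=[0.5, -1.0, 2.0],
                   weights=[0.4, 0.3, 0.1],
                   bias=-0.2,
                   phi=step)
print(yk)  # 0.0, since vk = 0.2 - 0.3 + 0.2 - 0.2 = -0.1 < 0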
Activation functions
As mentioned previously, the activation function acts as a squashing
function, such that the output of a neuron in a neural network is kept between
certain values (usually 0 and 1, or -1 and 1). In general, there are three
types of activation functions, denoted by Φ(.). First, there is the Threshold
Function, which takes on the value 0 if the summed input v is less than a
certain threshold value, and the value 1 if the summed input is greater
than or equal to the threshold.
Secondly, there is the Piecewise-Linear function. This function can again take
on the values 0 or 1, but it can also take on values in between, depending
on the amplification factor in a certain region of linear operation.
Thirdly, there is the sigmoid function. This function can range between 0 and
1, but it is also sometimes useful to use the -1 to 1 range. An example of a
sigmoid-shaped function with the -1 to 1 range is the hyperbolic tangent function.
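As a rough illustration of these three families, the following Python sketch implements a threshold function, a piecewise-linear function with an amplification factor a, and the logistic sigmoid alongside the hyperbolic tangent; the exact forms and parameter values are my own choices, not definitions taken from this text.

import math

def threshold(v):
    # Threshold function: 0 below the threshold (taken as 0 here), 1 at or above it.
    return 1.0 if v >= 0.0 else 0.0

def piecewise_linear(v, a=1.0):
    # Piecewise-linear: clipped to 0 and 1 outside a linear region of slope a.
    return max(0.0, min(1.0, a * v + 0.5))

def logistic(v):
    # Logistic sigmoid: smooth, ranges between 0 and 1.
    return 1.0 / (1.0 + math.exp(-v))

def tanh_sigmoid(v):
    # Hyperbolic tangent: a sigmoid-shaped function ranging from -1 to 1.
    return math.tanh(v)

for v in (-2.0, 0.0, 2.0):
    print(v, threshold(v), piecewise_linear(v),
          round(logistic(v), 3), round(tanh_sigmoid(v), 3))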
The artificial neural networks which we describe are all variations on the
parallel distributed processing (PDP) idea. The architecture of each neural
network is based on very similar building blocks which perform the
processing. In this chapter we first discuss these processing units and
different neural network topologies, and then learning strategies as a basis
for an adaptive system.
A common learning rule adjusts the connection weights in proportion to the
difference between the desired and the actual activation of a unit:

Δwjk = γ yj (dk − yk),

in which dk is the desired activation provided by a teacher and γ is a
learning rate. This is often called the Widrow-Hoff rule or the delta rule,
and will be discussed in the next chapter. Many variants (often very exotic
ones) have been published in the last few years.
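A minimal sketch of such an update for a single linear unit is given below, assuming the common form of the delta rule in which each weight is changed by γ times the input times the error (dk − yk); the training pair and learning rate are invented for illustration.

def delta_rule_update(weights, inputs, dk, gamma=0.1):
    # Actual output of a linear unit: weighted sum of its inputs.
    yk = sum(w * y for w, y in zip(weights, inputs))
    # Widrow-Hoff / delta rule: change each weight by gamma * input * (dk - yk).
    error = dk - yk
    return [w + gamma * y * error for w, y in zip(weights, inputs)]

# Repeated updates on one invented training pair drive the output toward dk.
weights = [0.0, 0.0]
for _ in range(20):
    weights = delta_rule_update(weights, inputs=[1.0, 0.5], dk=1.0)
print(weights)  # the weighted sum of [1.0, 0.5] is now close to 1.0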
Neural Networks
Neural Network introduction
• Adaptive Resonance Theory
• Backpropagation Neural Network
• Hopfield neural network
• Kohonen
• Neocognitron
• Perceptron and Adaline
Applications
• Robot Control
• Speech Recognition
• Stock market prediction