
CONTROLO E DECISÃO INTELIGENTE 08/09

PL #5 – MatLab Neural Networks Toolbox

Alexandra Moutinho

Example #1 – Create a feedforward backpropagation network with a hidden layer.


Here is a problem consisting of inputs P and targets T that we would like to solve with a network.
>> P = [0 1 2 3 4 5 6 7 8 9 10];
>> T = [0 1 2 3 4 3 2 1 2 3 4];
Here a network is created with one hidden layer of 5 neurons.
>> net = newff(P,T,5);
Here the network is simulated and its output plotted against the targets.
>> Y = sim(net,P);
>> plot(P,T,'rs-',P,Y,'o')
>> legend('T','Y',0),xlabel('P')

[Figure: targets T and untrained network output Y plotted against P]
Here the network is trained for 50 epochs. Again the network's output is plotted.
>> net.trainParam.epochs = 50;
>> net = train(net,P,T);
>> Y = sim(net,P);
>> figure,plot(P,T,'rs-',P,Y,'o')
>> legend('T','Y',0),xlabel('P')
[Figure: targets T and trained network output Y plotted against P]

Type net to see the network:


>> net

First, the architecture parameters and the subobject structures are shown:

net =
Neural Network object:
architecture:
numInputs: 1
numLayers: 2
biasConnect: [1; 1]
inputConnect: [1; 0]
layerConnect: [0 0; 1 0]
outputConnect: [0 1]
numWeightElements: 16 (read-only)
numOutputs: 1 (read-only)
numInputDelays: 0 (read-only)
numLayerDelays: 0 (read-only)
subobject structures:
inputs: {1x1 cell} of inputs
layers: {2x1 cell} of layers
outputs: {1x2 cell} containing 1 output
biases: {2x1 cell} containing 2 biases
inputWeights: {2x1 cell} containing 1 input weight
layerWeights: {2x2 cell} containing 1 layer weight

The subobject structures contain information about the individual objects of the network. Each layer
consists of neurons with the same transfer function net.layers{i}.transferFcn and net input function
net.layers{i}.netInputFcn, which in the case of perceptrons are hardlim and netsum. If neurons should
have different transfer functions, they have to be arranged in different layers. The parameters
net.inputWeights and net.layerWeights specify, among other things, the applied learning
functions and their parameters. The training, initialization and performance functions are listed
below.
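
As an aside, the layer subobjects of the network created in Example #1 can be inspected or changed directly from the command line (a minimal sketch; tansig is just one of the transfer functions shipped with the toolbox):
>> net.layers{1}.transferFcn              % transfer function of the hidden layer
>> net.layers{1}.netInputFcn              % net input function of the hidden layer
>> net.layers{1}.transferFcn = 'tansig';  % assign a different transfer function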

functions:
adaptFcn: 'trains'



divideFcn: 'dividerand'
gradientFcn: 'calcjx'
initFcn: 'initlay'
performFcn: 'mse'
trainFcn: 'trainlm'

The trainFcn and adaptFcn are used for the two different learning types, batch learning and
incremental or on-line learning. By setting the trainFcn parameter you tell MatLab which training
algorithm should be used, which in our case is the Levenberg-Marquardt training function trainlm,
the default of newff. The ANN toolbox includes almost 20 training functions. The performance function
determines how well the ANN is doing its task. For a perceptron it is the mean absolute error
performance function mae; for function fitting and regression the mean squared error performance
function mse is usually used, as is the case here. The initFcn is the function that initializes the
weights and biases of the network. To get a list of the functions available type help nnet. To change
one of these functions to another one in the toolbox, or to one that you have created, just assign the
name of the function to the corresponding parameter, e.g.

>> net.trainFcn = 'mytrainingfun';
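
For instance, to switch to the toolbox's gradient descent training function and re-initialize the network (a minimal sketch; traingd and init are standard toolbox functions):
>> net.trainFcn = 'traingd';   % use batch gradient descent training
>> net = init(net);            % re-initialize weights and biases using net.initFcn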

The parameters that concern these functions are listed in the next paragraph.

parameters:

adaptParam: .passes
divideParam: .trainRatio, .valRatio, .testRatio
gradientParam: (none)
initParam: (none)
performParam: (none)
trainParam: .epochs, .goal, .max_fail, .mem_reduc,
.min_grad, .mu, .mu_dec, .mu_inc,
.mu_max, .show, .time

By changing these parameters you can change the default behavior of the functions mentioned above.
The parameters you will use the most are probably the components of trainParam. The most used of
these are net.trainParam.epochs, which tells the algorithm the maximum number of epochs to
train, and net.trainParam.show, which tells the algorithm how many epochs should pass
between each report of the training performance. Type help train for more information.
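
For example, the training of Example #1 could be tuned as follows (a minimal sketch; the values are arbitrary):
>> net.trainParam.epochs = 100;   % maximum number of training epochs
>> net.trainParam.show = 10;      % report progress every 10 epochs
>> net.trainParam.goal = 1e-3;    % stop early if this performance goal is reached
>> net = train(net,P,T);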

The weights and biases are also stored in the network structure:

weight and bias values:

IW: {2x1 cell} containing 1 input weight matrix


LW: {2x2 cell} containing 1 layer weight matrix
b: {2x1 cell} containing 2 bias vectors
other:
userdata: (user information)

The .IW{i,j} component of the two-dimensional cell array net.IW holds the weight matrix of the
connection from input j to network layer i. The .LW{i,j} component holds the weight matrix of the
connection from network layer j to layer i. The cell array b contains the bias vector of each layer.
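
For the network of Example #1 (one input, 5 hidden neurons, one output), the weights and biases can be inspected directly (a minimal sketch):
>> net.IW{1,1}   % 5x1 weight matrix from the input to the hidden layer
>> net.LW{2,1}   % 1x5 weight matrix from the hidden layer to the output layer
>> net.b{1}      % 5x1 bias vector of the hidden layer
>> net.b{2}      % bias of the output layer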



To implement a Neural Network, 7 steps must be followed (a command-line sketch of these steps is given after the list):
1. Load data source.
2. Select attributes required.
3. Decide training, validation, and testing data.
4. Data manipulations and Target generation (for supervised learning).
5. Neural Network creation (selection of network architecture) and initialization.
6. Network Training and Testing.
7. Performance evaluation.
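
A minimal command-line sketch of these steps, using the iris data set of Example #2 below (the attribute selection, network size and division ratios are arbitrary choices; lowercase p and t are used to keep the variables distinct from those of Example #2):
>> load iris.dat                      % 1. load data source
>> p = iris(:,1:4)'; t = iris(:,5)';  % 2./4. select attributes and targets (samples as columns)
>> net = newff(p,t,10);               % 5. create and initialize a network with 10 hidden neurons
>> net.divideParam.trainRatio = 0.7;  % 3. divide the data into training,
>> net.divideParam.valRatio = 0.15;   %    validation
>> net.divideParam.testRatio = 0.15;  %    and test subsets
>> net = train(net,p,t);              % 6. train (and validate/test) the network
>> y = sim(net,p);                    % 6. simulate the trained network
>> perf = mse(t-y)                    % 7. evaluate performance (mean squared error)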

Example #2 – Iris classification with NFTOOL


Load the data:
>> load iris.dat
>> P = iris(:,1:4);
>> T = iris(:,5);
Open the Neural Network Fitting Tool window with this command:
>> nftool



[nftool screenshots: load the input and target data from the workspace (together with a data description), divide the data set into subsets for training and validation, define the number of hidden neurons, and train the network.]



[nftool results screen: mean-square error of the trained network]



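If the results were not exported from nftool, the variables output and error used below can be recreated at the command line (a minimal sketch, assuming a fitting network trained on the same data; the number of hidden neurons is arbitrary, and the transposes are needed because sim expects samples as columns):
>> net = newff(P',T',20);     % create a fitting network with 20 hidden neurons
>> net = train(net,P',T');    % train on the iris data
>> output = sim(net,P');      % network output for all samples
>> error = T' - output;       % error between targets and outputs
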
Compare the output with the target:
>> figure,plot(T,'bo-'),hold on, plot(output,'g*-')
>> legend('target','output',0)

And plot the error:


>> figure,plot(error)
>> ylabel('error')
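
The overall mean-square error can also be computed directly from the exported (or recreated) error vector (a minimal sketch):
>> perf = mse(error)   % mean squared error, the network's performance function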

