Generalized Feed Forwards and Support Vector Machines SVM

The document evaluates 70 Generalized Feed Forward (GFF) and 14 Support Vector Machines (SVM) models to identify optimal classifiers for portfolio selection. It discusses the efficiency of GFF networks compared to traditional Multi-Layer Perceptrons (MLPs) and outlines various hybrid SVM models utilizing Genetic Algorithms (GAs) for optimization. The findings suggest that hybrid GFF networks demonstrate high classification performance and low processing time, while SVMs, despite being efficient, may suffer from overfitting.


GENERALIZED FEED FORWARDS AND SUPPORT VECTOR MACHINES (SVM)


We evaluate the performance of 70 Generalized Feed Forward (GFF) and 14 Support Vector Machine (SVM) models, in plain and hybrid form, to define the optimal classifier:

i) 20 plain GFF neural networks and 50 GFF hybrids,

ii) 2 plain SVMs and 12 SVM hybrids of alternative topologies,

seeking the most efficient classifier in portfolio selection.
THE GENERALIZED FEEDFORWARD
NETWORKS AND THEIR HYBRIDS

The Generalized Feedforward (GFF) networks are a generalization of the MLP in which connections can jump over one or more layers.

For a given number of layers, the MLP requires significantly more training time over the epochs than a generalized feedforward network containing the same number of neurons.

An MLP can solve any problem that a GFF network can solve, but the GFFs often solve the problem much more efficiently; a classic example of this is the two-spiral problem.

Identifying the important inputs demands multiple trainings of the network.
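The skip connections that distinguish a GFF from a plain MLP can be sketched as follows. This is a minimal illustrative forward pass, not the paper's implementation: the weight names and activations are hypothetical, and the key point is the w_in_out term, through which the raw input jumps over the hidden layer directly into the output.

```python
import math

def gff_forward(x, w_in_hidden, w_hidden_out, w_in_out, b_hidden, b_out):
    """Forward pass of a minimal generalized feed-forward net:
    the input feeds the hidden layer AND jumps directly to the output."""
    # hidden activations (tanh)
    h = [math.tanh(sum(wi * xi for wi, xi in zip(row, x)) + b)
         for row, b in zip(w_in_hidden, b_hidden)]
    # output combines the hidden units and the skip connection from the raw input
    z = (sum(wh * hi for wh, hi in zip(w_hidden_out, h))
         + sum(ws * xi for ws, xi in zip(w_in_out, x))   # the "jump over" path
         + b_out)
    return 1.0 / (1.0 + math.exp(-z))  # sigmoid output for binary classification
```

Setting the skip weights w_in_out to zero recovers an ordinary one-hidden-layer MLP, which is why an MLP can solve anything a GFF can, while the extra direct path often lets the GFF do it with fewer neurons.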


Fig. 3. The Generalized Feed Forward Network (input layer, hidden layers, output layer).

Fig. 4. The Hybrid GFF Net with GA optimization and Cross Validation in all the layers, Loukeris et al. (2013).
THE SUPPORT VECTOR MACHINES
AND THEIR HYBRIDS
The Support Vector Machines (SVM) generally regress and classify functions from a set of labeled training data, Cortes and Vapnik (1995), producing a binary output, whilst the input is categorical.

Training of the SVMs is short with a sequential minimal optimization technique. Courtis (1978) noted that for instances xi, i = 1,…,l, with labels yi ∈ {1, −1}, the SVMs are trained by optimizing:

    min_α f(α) = ½ αᵀQα − eᵀα    (1)

under: 0 ≤ αi ≤ C, i = 1,…,l, and yᵀα = 0,

where e is the vector of all ones and Q is an l × l symmetric matrix with

    Qi,j = yi yj K(xi, xj).    (2)
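As a concrete check of equations (1) and (2), the sketch below builds the matrix Q and evaluates the dual objective f(α). The RBF kernel is an illustrative choice, not one prescribed by the text.

```python
import math

def rbf_kernel(a, b, gamma=0.5):
    """Gaussian (RBF) kernel K(x, x') = exp(-gamma * ||x - x'||^2)."""
    return math.exp(-gamma * sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

def build_Q(X, y, kernel=rbf_kernel):
    """Q_ij = y_i * y_j * K(x_i, x_j), the l x l symmetric matrix of equation (2)."""
    l = len(X)
    return [[y[i] * y[j] * kernel(X[i], X[j]) for j in range(l)] for i in range(l)]

def dual_objective(alpha, Q):
    """f(alpha) = 1/2 * alpha^T Q alpha - e^T alpha, equation (1)."""
    l = len(alpha)
    quad = sum(alpha[i] * Q[i][j] * alpha[j] for i in range(l) for j in range(l))
    return 0.5 * quad - sum(alpha)
```

Note that Q inherits its symmetry from the kernel, and with an RBF kernel every diagonal entry is yi² · K(xi, xi) = 1.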
In accordance with Min, Lee, and Han (2006), where the optimization emphasizes the feature subset and the SVM parameters, we move forth examining multiple hybrid SVM models.

Specifically, the GAs were elaborated in different hybrids on:

i) the SVM inputs only,
ii) the SVM outputs only, and
iii) both the inputs and outputs.

Batch learning was preferred, updating the weights after the presentation of the whole training set.

All models were tested on 500 and 1000 epochs respectively, to optimize the number of iterations upon convergence.
Figure 3. The Support Vector Machines: inputs xi(1),…, xi(D) feed kernel nodes K(x, x1),…, K(x, xn), whose weighted sum Σ (with a bias) yields the output f(x).

Figure 4. The Hybrid Genetic Support Vector Machines optimized in GAs on all the layers, with or without Cross Validation, Loukeris et al. (2013).
The wrapper approach selects the optimal SVM feature subset via the GA; the increase function terminates training when the MSE on the Cross Validation set increases, avoiding overtraining, whilst the best training weights are loaded on the test set.

The GA solved the optimal-values problem for:

a) the number of neurons,
b) the Step size, and
c) the Momentum rate,

requiring multiple trainings of the network to conclude the lowest-error mode.

In the case of the models with a GA on the output layer, it optimized the values of the Step size and the Momentum.
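The stopping rule described above can be sketched as follows. This is an illustration only: `train_step` and `cv_mse` are hypothetical stand-ins for one training pass and the Cross Validation scoring, and `patience=1` encodes "stop the first time the CV error increases".

```python
def train_with_early_stopping(train_step, cv_mse, max_epochs=500, patience=1):
    """Early-stopping sketch: halt when the Cross Validation MSE increases,
    and keep the best weights seen so far for use on the test set."""
    best_weights, best_mse, bad = None, float("inf"), 0
    for epoch in range(max_epochs):
        weights = train_step(epoch)      # one pass over the training set
        mse = cv_mse(weights)            # score on the Cross Validation set
        if mse < best_mse:
            best_weights, best_mse, bad = weights, mse, 0
        else:
            bad += 1                     # CV error increased
            if bad >= patience:
                break                    # terminate training, avoid overtraining
    return best_weights, best_mse
```

Returning `best_weights` rather than the final weights is what "best training weights are loaded on the test set" corresponds to here.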
CONCLUDING REMARKS
The hybrid GFF nets have a promising, highly calibrated performance that can allow them to be part of this model or of its future developments.

Furthermore, the hybrid GFF of 1 hidden layer, optimized by GAs on the inputs and outputs only, is a very reliable model of high classification performance and low processing time, without partiality exposure.

On the contrary, the SVM of 500 epochs, whilst in a marginally lower rank, is quite efficient, although underperforming, overfitted, and partial.
