Optimization GFF, SVM Hybrids
For a given number of layers, the MLP requires significantly more training time per epoch than a generalized feedforward (GFF) network containing the same number of neurons.
The GFF network generalizes the MLP by allowing each layer to feed layers beyond the adjacent one; a classic example of the advantage this brings is the two-spiral problem.
An MLP can solve any problem that a GFF network can solve, but a GFF network often solves the same problem much more efficiently.
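The efficiency gain comes from the GFF's skip connections: a layer may feed any later layer, not only the next one. A minimal forward-pass sketch (the two-hidden-layer shape, weights, and tanh activation are illustrative assumptions, not taken from the paper):

```python
import math
import random

random.seed(0)

def layer(inputs, weights, biases):
    """One dense layer with tanh activation (pure-Python, no NumPy)."""
    return [math.tanh(sum(w * x for w, x in zip(ws, inputs)) + b)
            for ws, b in zip(weights, biases)]

def rand_weights(n_out, n_in):
    return [[random.uniform(-1, 1) for _ in range(n_in)] for _ in range(n_out)]

def gff_forward(x, w1, b1, w2, b2, w_out, b_out):
    """GFF forward pass: the output layer sees the raw input AND both
    hidden layers, i.e. connections are allowed to jump over layers."""
    h1 = layer(x, w1, b1)
    h2 = layer(h1, w2, b2)
    # skip connections: concatenate input and both hidden activations
    return layer(x + h1 + h2, w_out, b_out)

x = [0.5, -0.3]
w1, b1 = rand_weights(3, 2), [0.0] * 3
w2, b2 = rand_weights(3, 3), [0.0] * 3
w_out, b_out = rand_weights(1, 2 + 3 + 3), [0.0]
out = gff_forward(x, w1, b1, w2, b2, w_out, b_out)
print(out)
```

Dropping the `x + h1` terms from the final concatenation recovers a plain MLP, which makes the structural difference between the two architectures explicit.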
Following Min, Lee, and Han (2006), where optimization targets both the feature subset and the SVM parameters, we proceed to examine several hybrid SVM models.
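In such hybrids, a single GA chromosome typically encodes the feature mask together with the SVM hyperparameters (commonly C and the kernel width gamma). The sketch below is a hypothetical encoding with illustrative ranges; none of the names or bounds come from the paper:

```python
import random

random.seed(42)

N_FEATURES = 8  # hypothetical input dimension

def random_chromosome():
    """One GA individual: a binary feature mask plus real-valued
    SVM hyperparameters C and gamma, drawn log-uniformly."""
    mask = [random.randint(0, 1) for _ in range(N_FEATURES)]
    log_c = random.uniform(-2, 3)      # C in [1e-2, 1e3]
    log_gamma = random.uniform(-4, 1)  # gamma in [1e-4, 1e1]
    return {"mask": mask, "C": 10 ** log_c, "gamma": 10 ** log_gamma}

def crossover(a, b):
    """Single-point crossover on the mask; the positive hyperparameters
    are combined by geometric mean to stay on the log scale."""
    point = random.randrange(1, N_FEATURES)
    return {"mask": a["mask"][:point] + b["mask"][point:],
            "C": (a["C"] * b["C"]) ** 0.5,
            "gamma": (a["gamma"] * b["gamma"]) ** 0.5}

population = [random_chromosome() for _ in range(4)]
child = crossover(population[0], population[1])
print(child)
```

Fitness evaluation would then train an SVM on the masked features with the decoded (C, gamma) pair and score it on a validation set, which is exactly the wrapper loop described below.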
Figure 4. The hybrid genetic Support Vector Machines optimized by GAs on all layers, with or without cross-validation, Loukeris et al. (2013). [Diagram: inputs xi(1)…xi(D) feed kernel nodes K(x, x1)…K(x, xn); their weighted sum Σ plus a bias produces the output f(x), with GA blocks acting on each layer.]
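The kernel-sum architecture in Figure 4 corresponds to the standard SVM decision function, where the α_i are the dual coefficients, the y_i the training labels, and b the bias node shown in the diagram:

```latex
f(x) = \operatorname{sign}\!\left( \sum_{i=1}^{n} \alpha_i \, y_i \, K(x, x_i) + b \right)
```

The GA blocks in the figure act on the inputs (feature selection) and on the kernel layer (SVM parameter tuning), while the Σ and bias nodes are fixed by the SVM training itself.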
The wrapper approach selects the optimal SVM feature subset via the GA; training terminates when the MSE on the cross-validation set begins to increase, avoiding overtraining, and the best training weights are then loaded for evaluation on the test set.
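The early-stopping rule in this wrapper can be sketched as follows. The validation-MSE curve here is a synthetic stand-in (it falls, then rises, mimicking overtraining); in the real model it would come from evaluating the trained SVM on the cross-validation set each epoch:

```python
import random

random.seed(1)

def validation_mse(epoch):
    """Illustrative cross-validation MSE curve: decreases until epoch 6,
    then increases again as the model overtrains."""
    return (epoch - 6) ** 2 / 36 + random.uniform(0, 0.01)

best_mse, best_epoch = float("inf"), None
for epoch in range(1, 21):
    mse = validation_mse(epoch)
    if mse < best_mse:
        # snapshot the "best weights" seen so far
        best_mse, best_epoch = mse, epoch
    else:
        # MSE on the cross-validation set increased: stop training
        break

print(best_epoch, round(best_mse, 4))
```

The weights snapshotted at `best_epoch` are the ones carried forward to the test set, so the final evaluation never uses an overtrained model.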