Research Article
Modified Grey Wolf Optimizer for Global Engineering Optimization
Copyright © 2016 Nitin Mittal et al. This is an open access article distributed under the Creative Commons Attribution License,
which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Nature-inspired algorithms are becoming popular among researchers due to their simplicity and flexibility. Nature-inspired metaheuristic algorithms are analysed in terms of their key features, such as their diversity and adaptation, exploration and exploitation, and attraction and diffusion mechanisms. The success of and challenges concerning these algorithms are based on their parameter tuning and parameter control. A comparatively new algorithm motivated by the social hierarchy and hunting behavior of grey wolves is the Grey Wolf Optimizer (GWO), which has proved very successful at solving real mechanical and optical engineering problems. In the original GWO, half of the iterations are devoted to exploration and the other half to exploitation, overlooking the impact of the right balance between the two on guaranteeing an accurate approximation of the global optimum. To overcome this shortcoming, a modified GWO (mGWO) is proposed, which focuses on a proper balance between exploration and exploitation, leading to optimal performance of the algorithm. Simulations based on benchmark problems and the WSN clustering problem demonstrate the effectiveness, efficiency, and stability of mGWO compared with the basic GWO and some well-known algorithms.
Ant Colony Optimization (ACO) [21, 22] is inspired by the pheromone trail laying behavior of real ant colonies. A new evolutionary optimization algorithm, the Cuckoo Search (CS) Algorithm [23], is inspired by the lifestyle of cuckoo birds. The major algorithms include Ant Colony Optimization (ACO) [21, 22], Particle Swarm Optimization (PSO) [18], Artificial Bee Colony (ABC) Algorithm [24], Fish Swarm Algorithm (FSA) [25], Glowworm Swarm Optimization (GSO) [26], Grey Wolf Optimizer (GWO) [27], Fruit Fly Optimization Algorithm (FFOA) [28], Bat Algorithm (BA) [20], Novel Bat Algorithm (NBA) [29], Dragonfly Algorithm (DA) [30], Cat Swarm Optimization (CSO) [31], Cuckoo Search (CS) Algorithm [23], Cuckoo Optimization Algorithm (COA) [32], and Spider Monkey Optimization (SMO) Algorithm [33].

The biologically inspired algorithms comprise natural metaheuristics derived from living phenomena and the behavior of biological organisms. The intelligence derived with bioinspired algorithms is decentralized, distributed, self-organizing, and adaptive under uncertain environments. The major algorithms in this field include Artificial Immune Systems (AIS) [34], Bacterial Foraging Optimization (BFO) [35], and the Krill Herd Algorithm [36].

Because of their inherent advantages, such algorithms can be applied to various applications including power systems operations and control, job scheduling problems, clustering and routing problems, batch process scheduling, image processing, and pattern recognition problems.

GWO is a recently developed heuristic inspired by the leadership hierarchy and hunting mechanism of grey wolves in nature and has been successfully applied to solving economic dispatch problems [37], feature subset selection [38], optimal design of double-layer grids [39], time forecasting [40], the flow shop scheduling problem [41], the optimal power flow problem [42], and optimizing key values in cryptography algorithms [43]. A number of variants have also been proposed to improve the performance of the basic GWO, including binary GWO [44], a hybrid version of GWO with PSO [45], integration of DE with GWO [46], and parallelized GWO [47, 48].

Every optimization algorithm stated above needs to address the exploration and exploitation of a search space. In order to be successful, an optimization algorithm needs to establish a good ratio between exploration and exploitation. In this paper, a modified GWO (mGWO) is proposed to balance the exploration and exploitation trade-off in the original GWO algorithm. Different functions with diverse slopes are employed to tune the parameters of the GWO algorithm for varying exploration and exploitation combinations over the course of iterations. Increasing exploration relative to exploitation increases the convergence speed and avoids the local minima trapping effect.

The rest of the paper is organized as follows. Section 2 gives an overview of the original GWO. The proposed mGWO algorithm is explained in Section 3. The experimental results are demonstrated in Section 4. Section 5 solves the clustering problem in WSN for cluster head selection to demonstrate the applicability of the proposed algorithm. Finally, Section 6 concludes the paper.

2. Overview of Grey Wolf Optimizer Algorithm

Grey Wolf Optimizer (GWO) is a typical swarm-intelligence algorithm inspired by the leadership hierarchy and hunting mechanism of grey wolves in nature. Grey wolves are considered apex predators; they have an average group size of 5–12. In the hierarchy of GWO, the alpha (α) is considered the most dominating member of the group. The subordinates of α are the beta (β) and delta (δ), which help to control the majority of the wolves in the hierarchy, considered omegas (ω). The ω wolves have the lowest ranking in the hierarchy.

The mathematical model of the hunting mechanism of grey wolves consists of the following:

(i) Tracking, chasing, and approaching the prey.
(ii) Pursuing, encircling, and harassing the prey until it stops moving.
(iii) Attacking the prey.

2.1. Encircling Prey. Grey wolves encircle the prey during the hunt, which can be mathematically written as [27]

D⃗ = |C⃗ ⋅ X⃗ₚ(t) − X⃗(t)|,
X⃗(t + 1) = X⃗ₚ(t) − A⃗ ⋅ D⃗,   (1)

where t indicates the current iteration, A⃗ and C⃗ are coefficient vectors, X⃗ₚ is the position vector of the prey, and X⃗ indicates the position vector of a grey wolf.

The vectors A⃗ and C⃗ are calculated as follows:

A⃗ = 2a⃗ ⋅ r⃗₁ − a⃗,
C⃗ = 2 ⋅ r⃗₂,   (2)

where the components of a⃗ are linearly decreased from 2 to 0 over the course of iterations and r⃗₁ and r⃗₂ are random vectors in [0, 1].

2.2. Hunting. Hunting of prey is usually guided by α and β, and δ participates occasionally. The best candidate solutions, that is, α, β, and δ, have better knowledge about the potential location of the prey. The other search agents (ω) update their positions according to the positions of the three best search agents. The following formulas are proposed in this regard:

D⃗α = |C⃗₁ ⋅ X⃗α − X⃗|,
D⃗β = |C⃗₂ ⋅ X⃗β − X⃗|,   (3)
D⃗δ = |C⃗₃ ⋅ X⃗δ − X⃗|,
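Equations (1)–(3) above, together with the position update completed in (4) and (5) in the next subsection, define one GWO search step. The following NumPy sketch illustrates that step; it is not the authors' reference implementation, and the sphere objective, the population size, and the `exponential` switch (anticipating the mGWO decay schedule motivated in Section 3) are illustrative assumptions made here.

```python
import numpy as np

rng = np.random.default_rng(0)

def sphere(x):
    # Illustrative objective (benchmark F1): global minimum 0 at the origin.
    return float(np.sum(x ** 2))

def decay_a(t, T, exponential=False):
    # Linear decay a = 2(1 - t/T) as in GWO; the exponential variant
    # a = 2(1 - t^2/T^2) is the mGWO schedule of Section 3.
    return 2.0 * (1.0 - (t / T) ** 2) if exponential else 2.0 * (1.0 - t / T)

def gwo(obj, dim=5, n_wolves=30, max_iter=300, lb=-100.0, ub=100.0,
        exponential=False):
    X = rng.uniform(lb, ub, (n_wolves, dim))    # initial wolf positions
    best_x, best_f = None, np.inf
    for t in range(max_iter):
        fit = np.array([obj(x) for x in X])
        order = np.argsort(fit)
        if fit[order[0]] < best_f:               # remember the best alpha seen
            best_f, best_x = fit[order[0]], X[order[0]].copy()
        x_alpha, x_beta, x_delta = X[order[:3]]  # three leaders (Section 2.2)
        a = decay_a(t, max_iter, exponential)
        for i in range(n_wolves):
            x_new = np.zeros(dim)
            for leader in (x_alpha, x_beta, x_delta):
                A = 2.0 * a * rng.random(dim) - a   # eq. (2), components in [-a, a]
                C = 2.0 * rng.random(dim)           # eq. (2)
                D = np.abs(C * leader - X[i])       # eq. (3)
                x_new += leader - A * D             # eq. (4)
            X[i] = np.clip(x_new / 3.0, lb, ub)     # eq. (5), average of the three
    return best_x, best_f

best_lin = gwo(sphere)[1]                     # linear decay of a
best_exp = gwo(sphere, exponential=True)[1]   # exponential (mGWO-style) decay
```

Passing `exponential=True` swaps the linear decay of a for the slower early decay that Section 3 argues yields a longer exploration phase.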
Applied Computational Intelligence and Soft Computing 3
X⃗₁ = X⃗α − A⃗₁ ⋅ (D⃗α),
X⃗₂ = X⃗β − A⃗₂ ⋅ (D⃗β),   (4)
X⃗₃ = X⃗δ − A⃗₃ ⋅ (D⃗δ),

X⃗(t + 1) = (X⃗₁(t) + X⃗₂(t) + X⃗₃(t)) / 3.   (5)

2.3. Attacking Prey. In order to mathematically model approaching the prey, we decrease the value of a⃗. The fluctuation range of A⃗ is also decreased by a⃗: A⃗ is a random value in the interval [−a, a], where a is decreased linearly from 2 to 0 over the course of iterations. When the random values of A⃗ are in [−1, 1], the next position of a search agent can be anywhere between its current position and the position of the prey. The value |A| < 1 forces the wolves to attack the prey. After the attack, they again search for the prey in the next iteration, wherein they again find the next best solution α among all wolves. This process repeats until the termination criterion is fulfilled.

3. Modified GWO Algorithm

Finding the global minimum is a common, challenging task among all minimization methods. In population-based optimization methods, generally, the desirable way to converge towards the global minimum can be divided into two basic phases. In the early stages of the optimization, the individuals should be encouraged to scatter throughout the entire search space. In other words, they should try to explore the whole search space instead of clustering around local minima. In the latter stages, the individuals have to exploit the information gathered to converge on the global minimum. In GWO, by fine-adjusting the parameters a and A, we can balance these two phases in order to find the global minimum with fast convergence speed.

Although different improvements of individual-based algorithms promote local optima avoidance, the literature shows that population-based algorithms are better at handling this issue. Regardless of the differences between population-based algorithms, the common approach is the division of the optimization process into two conflicting milestones: exploration versus exploitation. Exploration encourages candidate solutions to change abruptly and stochastically. This mechanism improves the diversity of the solutions and causes high exploration of the search space. In contrast, exploitation aims at improving the quality of solutions by searching locally around the promising solutions obtained during exploration. In this milestone, candidate solutions are obliged to change less suddenly and search locally.

Exploration and exploitation are two conflicting milestones where promoting one results in degrading the other. A right balance between these two milestones can guarantee a very accurate approximation of the global optimum using population-based algorithms. On the one hand, mere exploration of the search space prevents an algorithm from finding an accurate approximation of the global optimum. On the other hand, mere exploitation results in local optima stagnation and again low quality of the approximated optimum.

In GWO, the transition between exploration and exploitation is generated by the adaptive values of a and A. Half of the iterations are devoted to exploration (|A| ≥ 1) and the other half are used for exploitation (|A| < 1), as shown in Figure 1(a). Generally, higher exploration of the search space results in a lower probability of local optima stagnation. There are various possibilities to enhance the exploration rate, as shown in Figure 1(b), in which exponential functions are used instead of a linear function to decrease the value of a over the course of iterations. Too much exploration is similar to too much randomness and will probably not give good optimization results, whereas too much exploitation is related to too little randomness. Therefore, there must be a balance between exploration and exploitation.

In GWO, the value of a decreases linearly from 2 to 0 using the following update equation:

a = 2 (1 − t/T),   (6)

where T indicates the maximum number of iterations and t is the current iteration. Our mGWO employs an exponential function for the decay of a over the course of iterations:

a = 2 (1 − t²/T²),   (7)

as shown in Figure 1(c). Using this exponential decay function, the numbers of iterations used for exploration and exploitation are 70% and 30%, respectively.

Figure 1: (a) Updating the value of a for GWO, (b) some samples of possible functions for updating a over the course of iterations, and (c) updating the value of a over the course of iterations for mGWO.

The pseudocode of mGWO is given in Algorithm 1.

4. Results and Discussion

This section investigates the effectiveness of mGWO in practice. It is common in this field to benchmark the performance of algorithms on a set of mathematical functions with known global optima. We also follow the same process and employ 27 benchmark functions for comparison. The test functions are divided into four groups: unimodal, multimodal, fixed-dimension multimodal, and composite benchmark functions. The unimodal functions (F1–F7) are suitable for benchmarking the exploitation of algorithms since they have one global optimum and no local optima. On the contrary, multimodal functions (F8–F13) have a large number of local optima and are helpful to examine exploration and local optima avoidance of algorithms.

The mathematical formulation of the employed test functions is presented in Tables 1–4. We consider 30 variables for the unimodal and multimodal test functions to further increase their difficulty.

Since heuristic algorithms are stochastic optimization techniques, they have to be run at least more than 10 times to generate meaningful statistical results. It is again a common strategy that an algorithm is run on a problem
m times and the average/standard deviation/median of the best solution obtained in the last iteration are calculated as the metrics of performance. We follow the same method and report the results over 30 independent runs. In order to verify the performance of the mGWO algorithm, the PSO, BA, CS, and GWO algorithms are chosen. Note that we utilized 30 search agents and 3000 iterations for each of the algorithms.

The convergence curves of the unimodal, multimodal, fixed-dimension multimodal, and composite benchmark functions for the competing optimization algorithms are given in Figures 2, 3, 4, and 5, respectively. As Table 5 shows, the mGWO algorithm provides the best results on 5 out of 7 unimodal benchmark test functions. The mGWO algorithm also provides very competitive results compared to CS on F5 and F6. As discussed above, unimodal functions are suitable for benchmarking the exploitation of algorithms. Therefore, these results evidence the high exploitation capability of the mGWO algorithm.

The statistical results of the algorithms on the multimodal test functions are presented in Table 6. It may be seen that the mGWO algorithm highly outperforms the other algorithms on F9, F10, F11, and F12. It should be noted that the mGWO algorithm outperforms the other algorithms on these multimodal test functions except PSO on F13. The results on the multimodal test functions strongly prove that the high exploration of the mGWO algorithm is a suitable mechanism for avoiding local solutions. Since the multimodal functions have an exponential number of local solutions, the results show that the mGWO algorithm is able to explore the search space extensively and find promising regions of the search space. In addition, the high local optima avoidance of this algorithm is another finding that can be inferred from these results.

The rest of the results, which belong to F14–F23 and F24–F27, are provided in Tables 7 and 8, respectively. The results are consistent with those of the other test functions, in which mGWO shows very competitive results compared to the other algorithms.

Table 1: Unimodal benchmark functions.

Function                                                        Dim   Range           f_min
F1(x) = Σ_{i=1}^{n} x_i²                                         30   [−100, 100]     0
F2(x) = Σ_{i=1}^{n} |x_i| + Π_{i=1}^{n} |x_i|                    30   [−10, 10]       0
F3(x) = Σ_{i=1}^{n} (Σ_{j=1}^{i} x_j)²                           30   [−100, 100]     0
F4(x) = max_i {|x_i|, 1 ≤ i ≤ n}                                 30   [−100, 100]     0
F5(x) = Σ_{i=1}^{n−1} [100 (x_{i+1} − x_i²)² + (x_i − 1)²]       30   [−30, 30]       0
F6(x) = Σ_{i=1}^{n} (⌊x_i + 0.5⌋)²                               30   [−100, 100]     0
F7(x) = Σ_{i=1}^{n} i x_i⁴ + random[0, 1)                        30   [−1.28, 1.28]   0

Table 2: Multimodal benchmark functions.

Function                                                        Dim   Range           f_min
F10(x) = −20 exp(−0.2 √((1/n) Σ_{i=1}^{n} x_i²))
         − exp((1/n) Σ_{i=1}^{n} cos(2πx_i)) + 20 + e            30   [−32, 32]       0
F11(x) = (1/4000) Σ_{i=1}^{n} x_i² − Π_{i=1}^{n} cos(x_i/√i) + 1 30   [−600, 600]     0
F12(x) = (π/n) {10 sin²(πy_1) + Σ_{i=1}^{n−1} (y_i − 1)² [1 + 10 sin²(πy_{i+1})] + (y_n − 1)²}
         + Σ_{i=1}^{n} u(x_i, 10, 100, 4),
         where y_i = 1 + (x_i + 1)/4 and
         u(x_i, a, k, m) = k (x_i − a)^m if x_i > a; 0 if −a < x_i < a;
         k (−x_i − a)^m if x_i < −a                              30   [−50, 50]       0
F13(x) = 0.1 {sin²(3πx_1) + Σ_{i=1}^{n} (x_i − 1)² [1 + sin²(3πx_i + 1)]
         + (x_n − 1)² [1 + sin²(2πx_n)]} + Σ_{i=1}^{n} u(x_i, 5, 100, 4)
                                                                 30   [−50, 50]       0

Table 3: Fixed-dimension multimodal benchmark functions.

Function                                                        Dim   Range           f_min
F18(x) = [1 + (x_1 + x_2 + 1)² (19 − 14x_1 + 3x_1² − 14x_2 + 6x_1x_2 + 3x_2²)]
         ⋅ [30 + (2x_1 − 3x_2)² (18 − 32x_1 + 12x_1² + 48x_2 − 36x_1x_2 + 27x_2²)]
                                                                  2   [−2, 2]         3
F19(x) = −Σ_{i=1}^{4} c_i exp(−Σ_{j=1}^{3} a_ij (x_j − p_ij)²)    3   [1, 3]          −3.86
F20(x) = −Σ_{i=1}^{4} c_i exp(−Σ_{j=1}^{6} a_ij (x_j − p_ij)²)    6   [0, 1]          −3.32
F21(x) = −Σ_{i=1}^{5} [(X − a_i)(X − a_i)ᵀ + c_i]⁻¹               4   [0, 10]         −10.1532
F22(x) = −Σ_{i=1}^{7} [(X − a_i)(X − a_i)ᵀ + c_i]⁻¹               4   [0, 10]         −10.4028
F23(x) = −Σ_{i=1}^{10} [(X − a_i)(X − a_i)ᵀ + c_i]⁻¹              4   [0, 10]         −10.5363

5. Cluster Head Selection in WSN Using mGWO

The cluster head (CH) selection problem is a well-known problem in the field of wireless sensor networks (WSNs), in which the energy consumption cost of the network should be minimized [49–53]. In this paper, this problem is solved using the mGWO algorithm and compared with GA, PSO, BA, CS, and GWO.

The main challenges in designing and planning the operations of WSNs are to optimize energy consumption and prolong network lifetime. Cluster-based routing techniques, such as the well-known low-energy adaptive clustering hierarchy (LEACH) [50], are used to achieve scalable solutions and extend the network lifetime until the last node dies (LND). In order to achieve prolonged network lifetime in cluster-based routing techniques, the lifetime of the CHs plays an important role. Improper cluster formation may cause some CHs to be overloaded. Such overload may cause high energy consumption at the CH and degrade the overall performance of the WSN. Therefore, proper CH selection is the most important issue for clustering sensor nodes. Designing an energy-efficient clustering algorithm is not an easy task. Therefore, nature-inspired optimization algorithms may be applied to tackle the cluster-based routing problem in WSN. Evolutionary algorithms (EAs) have been used in recent years as metaheuristics to address energy-aware routing challenges by designing intelligent models that collaborate to optimize an appropriate energy-aware objective function [52]. GWO is one of the powerful heuristics that can be applied for efficient load-balanced clustering. In this paper, the mGWO based clustering algorithm is used to solve the abovementioned load balancing problem. The algorithm forms clusters in such a way that the overall energy consumption of the network is minimized. The total energy consumption in the network is the sum of the total energy dissipated by the non-CHs to send information to their respective CHs and the total energy consumed by the CH nodes to aggregate the information and send it to the base station (BS).

Consider a WSN of n sensor nodes randomly deployed in the sensing field and organized into K clusters: C_1, C_2, ..., C_K. The fitness function for the energy consumption may be defined as

f = Σ_{i=1}^{K} (Σ_{s∈C_i} E_TX(s, CH_i) + E_RX + E_DA) + Σ_{i=1}^{K} E_TX(CH_i, BS),   (8)

where K is the total number of CHs, s ∈ C_i is a non-CH associated with the ith CH, and E_TX(node_1, node_2) is the energy dissipated in transmitting data from node_1 to node_2.

The radio energy transmission cost for a k-bit message over a transmitter-receiver separation distance d is given by

E_TX = k E_elec + k ε_friss_amp d²,     if d < d_0,
E_TX = k E_elec + k ε_two_ray_amp d⁴,   if d ≥ d_0.   (9)

The term E_elec denotes the per-bit energy dissipation during transmission. The per-bit amplification energy is proportional to d⁴ when the transmission distance exceeds the threshold d_0 (called the crossover distance) and is otherwise proportional to d². The parameters ε_friss_amp and ε_two_ray_amp denote the transmitter amplification parameters for the free-space and multipath fading models, respectively. The value of d_0 is given by

d_0 = √(ε_friss_amp / ε_two_ray_amp).   (10)

The reception energy of the k-bit data message can be expressed as

E_RX = k E_elec,   (11)

where E_elec denotes the per-bit energy dissipation during reception.

E_DA is the data aggregation energy expenditure and is set to E_DA = 5 nJ/bit. The values of the other parameters are set to E_elec = 50 nJ/bit, ε_friss_amp = 100 pJ/bit/m², and ε_two_ray_amp = 0.0013 pJ/bit/m⁴ [51].

For the simulation setup, 100 nodes are randomly deployed in a 100 m × 100 m sensing field. The BS is placed at the center of the field. The initial energy of all homogeneous nodes is set to E_0 = 1 J. During this analysis, three parameters, namely, first node dead (FND), half nodes dead (HND), and last node dead (LND), are employed to outline the network lifetime.
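The radio energy model and fitness function above can be sketched in code. This is an illustrative sketch under stated assumptions, not the authors' simulation: the nearest-CH membership rule, the 4000-bit message size, the 5-CH candidate set, and all identifiers are choices made here for demonstration only.

```python
import numpy as np

rng = np.random.default_rng(1)

# Radio parameters from Section 5 (energies in joules)
E_ELEC = 50e-9        # 50 nJ/bit, electronics energy per bit
EPS_FS = 100e-12      # 100 pJ/bit/m^2, free-space amplifier
EPS_MP = 0.0013e-12   # 0.0013 pJ/bit/m^4, multipath amplifier
E_DA = 5e-9           # 5 nJ/bit, data aggregation
D0 = (EPS_FS / EPS_MP) ** 0.5   # crossover distance, eq. (10)

def e_tx(k, d):
    # Transmission energy of a k-bit message over distance d, eq. (9).
    if d < D0:
        return k * E_ELEC + k * EPS_FS * d ** 2
    return k * E_ELEC + k * EPS_MP * d ** 4

def e_rx(k):
    # Reception energy of a k-bit message, eq. (11).
    return k * E_ELEC

def fitness(nodes, ch_idx, bs, k=4000):
    # One-round energy cost of a candidate CH selection, eq. (8).
    chs = nodes[ch_idx]
    total = 0.0
    for i, node in enumerate(nodes):
        if i in ch_idx:
            continue
        d = np.linalg.norm(node - chs, axis=1)
        j = int(np.argmin(d))          # each non-CH joins its nearest CH (assumed rule)
        total += e_tx(k, d[j]) + e_rx(k) + k * E_DA
    for ch in chs:
        total += e_tx(k, float(np.linalg.norm(ch - bs)))  # CH -> base station
    return total

nodes = rng.uniform(0.0, 100.0, (100, 2))  # 100 nodes in a 100 m x 100 m field
bs = np.array([50.0, 50.0])                # base station at the center
ch_idx = [int(i) for i in rng.choice(100, size=5, replace=False)]
cost = fitness(nodes, ch_idx, bs)          # the quantity mGWO would minimize
```

In the clustering application, each mGWO search agent would encode such a set of candidate CHs, and eq. (8) evaluated in this way serves as the fitness to be minimized.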
Figure 2: Convergence graph of unimodal benchmark functions (PSO, BA, CS, GWO, and mGWO).
Figure 3: Convergence graph of multimodal benchmark functions.
Figure 4: Convergence graph of fixed-dimension multimodal benchmark functions (PSO, BA, CS, GWO, and mGWO).
Figure 5: Convergence graph of composite benchmark functions.
Table 9: Comparison results of the CH selection problem in WSN.

Algorithm   Optimal cost of FF   FND      HND      LND
GA          39.798 mJ            2047.1   2678.2   3337.5
PSO         39.854 mJ            2034.8   2665.2   3324.6
BA          40.324 mJ            1996.7   2654.8   3298.9
CS          38.412 mJ            2079.2   2702.5   3372.4
GWO         38.560 mJ            2073.9   2696.3   3369.8
mGWO        38.209 mJ            2084.6   2714.4   3382.2

…space over the course of iterations. The results proved that the proposed algorithm benefits from higher exploration in comparison to the standard GWO. The paper also considered the clustering problem in WSN, in which the CH selection is performed using the proposed mGWO algorithm, a challenging and NP-hard problem. The results show that the proposed method is very effective for real-world applications due to its fast convergence and fewer chances of getting stuck at local minima. It can be concluded that the proposed algorithm is able to outperform current well-known and powerful algorithms in the literature. The results prove the competence and superiority of mGWO over existing metaheuristic algorithms, and it has the ability to become an effective tool for solving real-world optimization problems.

Competing Interests

The authors declare that they have no competing interests.
References

[1] D. E. Goldberg, Genetic Algorithms in Search, Optimization, and Machine Learning, Addison-Wesley, 1989.
[2] T. Back, F. Hoffmeister, and H. P. Schwefel, "A survey of evolution strategies," in Proceedings of the 4th International Conference on Genetic Algorithms, San Diego, Calif, USA, July 1991.
[3] J. R. Koza, Genetic Programming: On the Programming of Computers by Natural Selection, MIT Press, Cambridge, UK, 1992.
[4] R. Storn and K. Price, "Differential evolution—a simple and efficient heuristic for global optimization over continuous spaces," Journal of Global Optimization, vol. 11, no. 4, pp. 341–359, 1997.
[5] D. Simon, "Biogeography-based optimization," IEEE Transactions on Evolutionary Computation, vol. 12, no. 6, pp. 702–713, 2008.
[6] W. Gong, Z. Cai, C. X. Ling, and H. Li, "A real-coded biogeography-based optimization with mutation," Applied Mathematics and Computation, vol. 216, no. 9, pp. 2749–2758, 2010.
[7] W. Gong, Z. Cai, and C. X. Ling, "DE/BBO: a hybrid differential evolution with biogeography-based optimization for global numerical optimization," Soft Computing, vol. 15, no. 4, pp. 645–665, 2011.
[8] H. Ma and D. Simon, "Blended biogeography-based optimization for constrained optimization," Engineering Applications of Artificial Intelligence, vol. 24, no. 3, pp. 517–525, 2011.
[9] U. Singh and T. S. Kamal, "Design of non-uniform circular antenna arrays using biogeography-based optimisation," IET Microwaves, Antennas and Propagation, vol. 5, no. 11, pp. 1365–1370, 2011.
[10] S. Kirkpatrick, C. D. Gelatt, and M. P. Vecchi, "Optimization by simulated annealing," Science, vol. 220, no. 4598, pp. 671–680, 1983.
[11] P. Moscato, "On evolution, search, optimization, genetic algorithms and martial arts: towards Memetic Algorithms," Caltech Concurrent Computation Program Report 826, 1989.
[12] Z. W. Geem, J. H. Kim, and G. V. Loganathan, "A new heuristic optimization algorithm: harmony search," Simulation, vol. 76, no. 2, pp. 60–68, 2001.
[13] K. S. Lee and Z. W. Geem, "A new meta-heuristic algorithm for continuous engineering optimization: harmony search theory and practice," Computer Methods in Applied Mechanics and Engineering, vol. 194, no. 36–38, pp. 3902–3933, 2005.
[14] M. Eusuff, K. Lansey, and F. Pasha, "Shuffled frog-leaping algorithm: a memetic meta-heuristic for discrete optimization," Engineering Optimization, vol. 38, no. 2, pp. 129–154, 2006.
[15] E. Rashedi, H. Nezamabadi-Pour, and S. Saryazdi, "GSA: a gravitational search algorithm," Information Sciences, vol. 179, no. 13, pp. 2232–2248, 2009.
[16] S. Mirjalili, S. M. Mirjalili, and A. Hatamlou, "Multi-verse optimizer: a nature-inspired algorithm for global optimization," Neural Computing & Applications, vol. 27, no. 2, pp. 495–513, 2016.
[17] A. Y. S. Lam and V. O. K. Li, "Chemical-reaction-inspired metaheuristic for optimization," IEEE Transactions on Evolutionary Computation, vol. 14, no. 3, pp. 381–399, 2010.
[18] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, vol. 4, pp. 1942–1948, Perth, Australia, December 1995.
[19] X.-S. Yang, "Firefly algorithm, stochastic test functions and design optimization," International Journal of Bio-Inspired Computation, vol. 2, no. 2, pp. 78–84, 2010.
[20] X.-S. Yang and A. H. Gandomi, "Bat algorithm: a novel approach for global engineering optimization," Engineering Computations, vol. 29, no. 5, pp. 464–483, 2012.
[21] M. Dorigo, V. Maniezzo, and A. Colorni, "Ant system: optimization by a colony of cooperating agents," IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, vol. 26, no. 1, pp. 29–41, 1996.
[22] M. Dorigo and L. M. Gambardella, "Ant colony system: a cooperative learning approach to the traveling salesman problem," IEEE Transactions on Evolutionary Computation, vol. 1, no. 1, pp. 53–66, 1997.
[23] X.-S. Yang and S. Deb, "Engineering optimisation by cuckoo search," International Journal of Mathematical Modelling and Numerical Optimisation, vol. 1, no. 4, pp. 330–343, 2010.
[24] D. Karaboga and B. Akay, "A comparative study of artificial bee colony algorithm," Applied Mathematics and Computation, vol. 214, no. 1, pp. 108–132, 2009.
[25] X. Li, Z. Shao, and J. Qian, "An optimizing method based on autonomous animates: fish swarm algorithm," Systems Engineering—Theory & Practice, vol. 22, pp. 32–38, 2002.
[26] K. Krishnanand and D. Ghose, "Glowworm swarm optimisation: a new method for optimising multi-modal functions," International Journal of Computational Intelligence Studies, vol. 1, no. 1, pp. 93–119, 2009.
[27] S. Mirjalili, S. M. Mirjalili, and A. Lewis, "Grey wolf optimizer," Advances in Engineering Software, vol. 69, pp. 46–61, 2014.
[28] W.-T. Pan, "A new fruit fly optimization algorithm: taking the financial distress model as an example," Knowledge-Based Systems, vol. 26, pp. 69–74, 2012.
[29] X.-B. Meng, X. Z. Gao, Y. Liu, and H. Zhang, "A novel bat algorithm with habitat selection and Doppler effect in echoes for optimization," Expert Systems with Applications, vol. 42, no. 17-18, pp. 6350–6364, 2015.
[30] S. Mirjalili, "Dragonfly algorithm: a new meta-heuristic optimization technique for solving single-objective, discrete, and multi-objective problems," Neural Computing & Applications, 2015.
[31] S. C. Chu and P. W. Tsai, "Computational intelligence based on the behaviour of cats," International Journal of Innovative Computing, Information and Control, vol. 3, pp. 163–173, 2007.
[32] R. Rajabioun, "Cuckoo optimization algorithm," Applied Soft Computing Journal, vol. 11, no. 8, pp. 5508–5518, 2011.
[33] J. C. Bansal, H. Sharma, S. S. Jadon, and M. Clerc, "Spider Monkey Optimization algorithm for numerical optimization," Memetic Computing, vol. 6, no. 1, pp. 31–47, 2014.
[34] D. Dasgupta, Artificial Immune Systems and Their Applications, Springer, 1999.
[35] S. Das, A. Biswas, S. Dasgupta, and A. Abraham, "Bacterial foraging optimization algorithm: theoretical foundations, analysis, and applications," Studies in Computational Intelligence, vol. 203, pp. 23–55, 2009.
[36] A. H. Gandomi and A. H. Alavi, "Krill herd: a new bio-inspired optimization algorithm," Communications in Nonlinear Science and Numerical Simulation, vol. 17, no. 12, pp. 4831–4845, 2012.
[37] V. K. Kamboj, S. K. Bath, and J. S. Dhillon, "Solution of non-convex economic load dispatch problem using Grey Wolf Optimizer," Neural Computing and Applications, 2015.