Research Article
A Many-Objective Optimization Algorithm Based on Weight
Vector Adjustment
Received 29 June 2018; Revised 29 August 2018; Accepted 13 September 2018; Published 22 October 2018
Copyright © 2018 Yanjiao Wang and Xiaonan Sun. This is an open access article distributed under the Creative Commons
Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is
properly cited.
In order to improve the convergence and distribution of a many-objective evolutionary algorithm, this paper proposes an improved NSGA-III algorithm based on weight vector adjustment (called NSGA-III-WA). First, an adaptive weight vector adjustment strategy is proposed in which the objective space is decomposed into several subspaces and, according to the density of each subspace, the weight vectors are adjusted to be sparser or denser so as to ensure the uniformity of the weight vector distribution on the Pareto front surface. Secondly, an evolutionary model that combines a new differential evolution strategy with a genetic evolution strategy is proposed to generate new individuals and enhance the exploration ability around the weight vectors in each subspace. The proposed algorithm is tested on optimization problems with 3-15 objectives from the DTLZ standard test set and the WFG test instances, and it is compared with five well-performing algorithms. The Wilcoxon rank-sum test is used to assess the statistical significance of the results. The experimental results show that NSGA-III-WA performs well in terms of both convergence and distribution.
can enhance the selection pressure, but this relaxed strategy is limited to handling situations with a small number of objectives. Moreover, its parameters are hard to adjust for different optimization problems.

(2) Methods based on decomposition [12], which decompose the objective space into several subspaces without changing the dimension of the objectives, thus transforming the MAOP into single-objective or many-objective subproblems. In 2007, Zhang and Li proposed the first many-objective evolutionary algorithm based on decomposition [13] (MOEA/D), whose convergence is significantly better than that of MOGLS and NSGA-II. In 2016, Yuan et al. proposed a distance-based update strategy, MOEA/DD [14], to maintain the diversity of the algorithm during the evolutionary process by exploiting the vertical distance between solutions and weight vectors. In 2017, Segura and Miranda proposed the decomposition-based evolutionary algorithm MOEA/D-EVSD [15], with a steady-state form and a reference direction to guide the search. In 2017, Xiang et al. proposed the framework of the VAEA algorithm [16] based on angle decomposition; the algorithm does not require reference points, and the convergence and diversity in the many-objective space are well balanced. However, the self-adjusting characteristics of the algorithms mentioned above make them fall into local optima more easily, although the convergence speed is improved, and the distribution is not satisfactory.

(3) Reference point methods, which decompose the MAOPs into a group of many-objective optimization subproblems with simple frontier surfaces [17]. Unlike the decomposition methods, however, each subproblem is solved using a many-objective optimization method. In 2013, Wang et al. [18] proposed the PICEA-w-r algorithm, whose set of weight vectors evolves along with the population; the weight vectors adjust adaptively according to their own optimal solutions. In 2014, Qi et al. adopted an enhanced weight vector adjustment method in the MOEA/D-AWA algorithm [19]. In 2014, Deb and Jain proposed a nondominated-sorting many-objective optimization algorithm based on reference points [20] (NSGA-III), whose reference points are uniformly distributed throughout the objective space; in the same year, Liu et al. proposed the MOEA/D-M2M method, in which the entire PF is divided into multiple segments and solved separately by dividing the entire objective space into multiple subspaces. Each segment corresponds to a many-objective optimal subproblem [21], which improves the distribution of the solution sets. In 2016, Bi and Wang proposed an improved NSGA-III many-objective optimization algorithm [22] (NSGA-III-OSD) based on objective space decomposition, in which the uniformly distributed weight vectors are decomposed into several subspaces through a clustering approach. The weight vectors can then specify a unique subarea, and a smaller objective space helps overcome the invalidity of the many-objective Pareto dominance relationship [23]; however, the distribution and uniformity of the solution surface decrease because of the sparse solutions at the edges of the fixed subspaces. In 2016, Cheng et al. proposed an evolutionary algorithm based on reference vector guidance for solving MAOPs [24] (RVEA). The principle of its adaptive strategy is to adjust the weight vectors dynamically according to the form of the objective functions. The weight vectors generated by the methods above are uniformly distributed, but the uniformity of the reference points on the solution surface cannot be guaranteed, and there is also a possibility that convergence may be lost.

In order to further improve the convergence and distribution of many-objective algorithms, a many-objective optimization algorithm based on weight vector adjustment (NSGA-III-WA), built on NSGA-III, is proposed. First, in order to enhance the exploration ability of the solutions around the weight vectors, an evolutionary model in which a novel differential evolution strategy and a genetic evolution strategy are integrated is used to generate new individuals. Then, in order to ensure the uniform distribution of the weight vectors on the solution surface, the objective space is divided into several subspaces by clustering the objective vectors, and the spatial distribution of the objectives is improved by adjusting the weights of each subspace. We carry out simulation experiments on the DTLZ standard test set [25] and the WFG standard test set [26], comparing the proposed algorithm with five currently well-performing algorithms on optimization problems with 3 to 15 objectives, using GD, IGD, and HV as performance indicators. The experimental results show that NSGA-III-WA performs well in terms of convergence and distribution.

The rest of the paper is organized as follows. Section 2 introduces the original algorithm. Section 3 describes the proposed many-objective evolutionary algorithm. Section 4 compares the similarities and differences between this algorithm and similar algorithms. Section 5 gives the experimental parameters of each algorithm and comprehensive experiments and analysis. Finally, Section 6 summarizes the full text and points out the issues to be studied next.

2. NSGA-III

The NSGA-III algorithm is similar to the NSGA-II algorithm in that it selects individuals based on nondominated sorting; the difference lies in how individuals are chosen after the nondominated sorting. The NSGA-III algorithm is introduced as follows:

First, a population Pt of size N is set up and operated on by genetic operators (selection, recombination, and mutation) to obtain a population Qt of the same size; then populations Pt and Qt are merged to obtain a population Rt of size 2N.
The population Rt is subjected to nondominated sorting to obtain layers of individuals with different nondomination levels (F1, F2, and so on). Individuals are added level by level to the set St of the next generation until the size of St is greater than or equal to N; the nondomination level reached at this time is defined as the L-th layer. K individuals are picked from the L-th level so that the sum of K and the sizes of all previous levels equals N. Prior to this, the objective values are normalized using the ideal point and the extreme points. After normalization, the ideal point of St is the zero vector, and the provided reference points are located exactly on the normalized hyperplane. The perpendicular distance between each individual in St and each reference line (connecting the origin to a reference point) is calculated, and each individual in St is associated with the reference point of minimum perpendicular distance.

Finally, a niche operation is used to select members from Fl. A reference point may be associated with one or more objective vectors, or none of the objective vectors may be associated with it. The purpose of the niche operation is to select, from the Fl layer, the K individuals closest to the reference points into the next generation. First, the number of individuals associated with each reference point in the St/Fl population is calculated, and ρj denotes the number of individuals associated with the j-th reference point. The specific operation is as follows:

When the number of individuals associated with a reference point is zero, in other words, when ρj equals zero, the next operation depends on whether there are individuals related to that reference point in Fl. If one or more individuals are related to the reference point, the one with the smallest distance is extracted, added to the next generation, and ρj is set to ρj + 1. If no individual in Fl is associated with the reference point, the reference point is deleted for this generation. If ρj > 0, the individual nearest to the reference point is chosen; this continues until the population size reaches N.

3. The Proposed Algorithm

3.1. The Proposed Algorithm Framework. In order to further improve the convergence speed and distribution of the NSGA-III algorithm, a many-objective optimization algorithm based on weight vector adjustment (NSGA-III-WA) is proposed. Algorithm 1 is the framework of the NSGA-III-WA algorithm. The algorithm mainly improves two aspects: the evolution strategy and the weight vectors. This paper also adds a discriminating condition for enabling the weight vector adjustment, which speeds up the running of the algorithm without affecting its performance. First, we initialize the population Pt with population size N and the weight vector set W_unit. Secondly, we enter the algorithm iteration process, generate the population Qt by operating on the population Pt with the differential operator, and then obtain the population Rt of size 2N as the union of Pt and Qt. Rt is updated through the environmental selection strategy, and the next generation Pt+1 is obtained. Lastly, the weight vectors are adjusted, and it is determined whether the termination condition is satisfied; if so, the current result is output and the algorithm terminates; otherwise, the iteration continues.

3.2. Initialization. The initial population is randomly generated with the same size as the number of weight vectors in the space. This article uses Das and Dennis's systematic method [27] to set the weight vectors $W_{\text{unit}} = \{w_1, w_2, \ldots, w_N\}$. The total number of weight vectors equals $N = C^{M-1}_{H+M-1}$, where H represents the number of divisions on each objective and M is the number of objective functions. The initialized weight vectors (reference points) are uniformly distributed in the objective space, and each weight vector satisfies $w_i \geq 0$ for $i = 1, \ldots, M$ and $\sum^{M}_{i=1} w_i = 1$ (a sketch of this construction is given after Section 3.3.1 below).

3.3. Evolutionary Strategy. The evolutionary strategy is essential to the convergence speed and accuracy of the solutions because it directly determines the quality of the new solutions to the subproblems during the evolutionary process. In order to improve the convergence speed, this paper proposes a new differential evolution strategy to replace the original strategy. The pseudocode is shown in Algorithm 2. Every individual performs the same operations, as follows.

3.3.1. Variation. It is mainly divided into two parts:

(a) Select three individuals xr1, xr2, and xr3 randomly from the population. A new individual xv is obtained using (1) for parental vector variation to maintain population diversity. At the beginning of the algorithm, the mutation rate should be relatively large to make the individuals differ from each other; this not only improves the search ability of the algorithm but also prevents individuals from falling into local optima. The mutation rate should decrease as the number of iterations increases and the solutions approach the Pareto optimal front surface (PFs) [28]; at this stage, a smaller mutation rate accelerates the convergence of the algorithm to the optimum, which not only improves the convergence speed but also reduces the complexity of the algorithm. Based on the above analysis of the needs of the algorithm, this paper proposes the adaptive mutation factor $F = 0.5 + 0.5\cos(\pi \cdot \text{gen}/\text{gen\_max})$. It can be seen that as the number of iterations increases, the mutation rate decreases. At the beginning of the algorithm, this enhances the individuals' ability to jump out of local optima and find superior individuals; as the mutation rate becomes smaller, the algorithm tends to be stable. To maintain the diversity of the population, individuals xr1 and xr2 are selected to generate a new individual (line 3 in Algorithm 2) by the simulated binary recombination operator of (2) and (3). In (4), u is a random number in [0, 1] and η is a constant fixed to 20 (see the sketches after this subsection).
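The adaptive mutation factor and the variation step of Section 3.3.1 can be sketched as follows. Equations (1)-(4) and Algorithm 2 are not reproduced in this extract, so the DE/rand/1-style mutation and the exact wiring of the two parts are assumptions; only F = 0.5 + 0.5 cos(π · gen/gen_max) and η = 20 are taken from the text.

```python
import numpy as np

rng = np.random.default_rng(0)

def adaptive_F(gen, gen_max):
    # Adaptive mutation factor of Section 3.3.1: starts at 1.0 and decays
    # smoothly toward 0.0 as the iteration counter approaches gen_max.
    return 0.5 + 0.5 * np.cos(np.pi * gen / gen_max)

def de_variation(pop, i, gen, gen_max, eta=20.0):
    """One variation step for individual i; pop is an (N, D) array."""
    N, D = pop.shape
    # Three mutually distinct parents, also distinct from individual i.
    r1, r2, r3 = rng.choice([j for j in range(N) if j != i], 3, replace=False)
    # Mutant vector; the paper's equation (1) is assumed to be DE/rand/1.
    xv = pop[r1] + adaptive_F(gen, gen_max) * (pop[r2] - pop[r3])
    # Simulated binary recombination of x_r1 and x_r2 for diversity
    # (equations (2)-(4); u ~ U[0, 1], eta fixed to 20).
    u = rng.random(D)
    beta = np.where(u <= 0.5,
                    (2.0 * u) ** (1.0 / (eta + 1.0)),
                    (1.0 / (2.0 * (1.0 - u))) ** (1.0 / (eta + 1.0)))
    xs = 0.5 * ((1.0 + beta) * pop[r1] + (1.0 - beta) * pop[r2])
    return xv, xs

pop = rng.random((20, 12))
xv, xs = de_variation(pop, i=0, gen=10, gen_max=100)
```

How xv and xs are combined into the final offspring follows Algorithm 2 and is not shown here.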
3.3.2. Crossover. This article selects the single-point crossover method, located in lines 13-20 of the pseudocode in Algorithm 2; in this way, individual selectivity is enhanced. A random number in [0, 1] is generated; if it is less than or equal to the crossover operator CR, a dimension of the individual is selected randomly and the crossover operation is executed at the selected point. A large number of experiments have confirmed that a value of CR of 0.4 works well. The generated individuals and their fitness values then replace the original ones.
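A sketch of the single-point crossing of Section 3.3.2, with CR = 0.4 as recommended. Whether the operation copies a single gene or swaps the tail from the chosen point onward is not fully specified in the prose; the tail swap of classic single-point crossover is assumed here.

```python
import numpy as np

rng = np.random.default_rng(1)

def single_point_crossover(x, xv, CR=0.4):
    """Section 3.3.2: with probability CR (0.4 per the paper's tuning),
    choose a random point and cross the two parents there."""
    child = x.copy()
    if rng.random() <= CR:
        d = rng.integers(1, len(x))   # crossover point, assumed interior
        child[d:] = xv[d:]            # take the tail from the donor xv
    return child

print(single_point_crossover(np.zeros(5), np.ones(5)))  # e.g. [0. 0. 1. 1. 1.]
```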
3.4. Environmental Selection. The purpose of the environmental selection operation is to select the next generation of individuals. The framework of Algorithm 3 includes the following steps: (1) Nondominated sorting is conducted on the population Rt, and the individuals with ranks 1, 2, 3, . . . after the nondominated sorting are added to the offspring collection St in order. (2) When the size of St is greater than or equal to N, the nondomination level Fl reached at this time is noted, and it is determined whether to terminate the operation (|St| = N) or enter the next step (|St| > N). (3) Individuals in St/Fl are selected and enter Pt+1 until its size is N. The specific operations are discussed below.

ALGORITHM 3: Environmental_selection.

3.4.1. Normalize Objective. Since the magnitudes of the respective objective values differ, it is necessary to normalize the objective values for the sake of fairness. First, the minimum value $z^{\min}_i$ of each dimension is calculated for every objective function; the set of $z^{\min}_i$ constitutes the ideal point. All individuals are then normalized according to (5), where $a_i$ is the intercept of each dimension, which can be calculated from the achievement scalarizing function (ASF) shown in (6):

$$f^n_i(x) = \frac{f'_i(x)}{a_i - z^{\min}_i} = \frac{f_i(x) - z^{\min}_i}{a_i - z^{\min}_i}, \quad \text{for } i = 1, 2, \ldots, M, \tag{5}$$

$$\mathrm{ASF}(x, w) = \max^{M}_{i=1} \frac{f_i(x) - z^{\min}_i}{w_i}, \quad \text{for } x \in S_t. \tag{6}$$

3.4.2. Associate Each Member of St with a Reference Point. In order to associate each individual in St with a reference point after normalization, a reference line is defined for each reference point on the hypersurface. In the normalized objective space, the reference point whose reference line has the shortest distance to a population member is considered to be associated with that member.
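The normalization of Section 3.4.1 and the association of Section 3.4.2 can be sketched as below. Equation (5) and the ASF of equation (6) are taken from the text; the way the intercepts a_i are obtained from the extreme points follows common NSGA-III practice and is an assumption here.

```python
import numpy as np

def normalize(F):
    """Section 3.4.1 (equations (5) and (6)); F is an (n, M) array of the
    objective values of the members of S_t."""
    n, M = F.shape
    z_min = F.min(axis=0)          # ideal point
    Fp = F - z_min                 # translated objectives f'_i(x)
    # Extreme point for axis i: the member minimizing the achievement
    # scalarizing function with the axis direction as weight vector
    # (zeros replaced by a small epsilon).
    extremes = np.empty((M, M))
    for i in range(M):
        w = np.full(M, 1e-6); w[i] = 1.0
        asf = (Fp / w).max(axis=1)
        extremes[i] = Fp[asf.argmin()]
    # Intercepts of the hyperplane through the extreme points: solve
    # extremes @ b = 1, then a_i = 1 / b_i (in translated coordinates,
    # so a already equals a_i - z_min_i of equation (5)).
    try:
        a = 1.0 / np.linalg.solve(extremes, np.ones(M))
    except np.linalg.LinAlgError:  # degenerate case: fall back to the max
        a = Fp.max(axis=0)
    return Fp / a

def associate(Fn, W):
    """Section 3.4.2: attach each member to the reference line (origin to
    reference point) with the smallest perpendicular distance."""
    Wn = W / np.linalg.norm(W, axis=1, keepdims=True)
    proj = Fn @ Wn.T                                   # scalar projections
    d2 = (Fn ** 2).sum(axis=1, keepdims=True) - proj ** 2
    dist = np.sqrt(np.maximum(d2, 0.0))
    pi = dist.argmin(axis=1)                           # associated point
    return pi, dist[np.arange(len(Fn)), pi]
```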
3.4.3. Compute the Niche Count of the Reference Points. Traversing every individual in the population, the distances between the individual and all reference points are calculated, and the number of individuals associated with each reference point is recorded as ρj, the niche count of the j-th reference point.

3.4.4. Niche Preservation Operation. If the number of population members associated with a reference point is zero, but there is an individual associated with that reference point in Fl, the individual with the smallest distance is extracted from Fl to join the selected next generation population, and the associated count is increased by one. If no individual in Fl is associated with the reference point, the reference point is deleted. If the number of associated members is not zero, the individual nearest to the reference point is selected; this continues until the population size is N.

3.5. Weight Adjustment. Although the weight vectors are distributed uniformly in the space, the uniformity of the solution surface cannot be achieved when the algorithm reaches a certain stable state. This is because of the complexity caused by the irregular shapes of the PFs of the objective functions. The distribution of the weight vectors is particularly important when all individuals are indistinguishable from each other and located on the first dominance level. Therefore, in order to improve the distribution of many-objective algorithms, a weight vector adjustment strategy, whose framework is shown in Algorithm 4, is proposed. The distribution of the weight vectors is appropriately adjusted according to the shape of the nondominated frontier. In order to prevent the weight vector adjustment in the high-dimensional space from concentrating on a certain objective, the K-means clustering method is used to divide the weight vectors into different subspaces. The specific operations are described below.

ALGORITHM 4: Weight_Adjustment.

First, each weight vector is associated with a population member (line 1 in Algorithm 4). Secondly, the solution space is decomposed into many subspaces using the K-means clustering method (lines 2-5 in Algorithm 4), as shown in Figure 1. To prevent errors caused by excessive differences within the solution set in a space, the subspaces should be neither too large nor too small, and the space is divided into C subspaces according to the size of the population. A large number of experiments confirmed that C = 13 usually gives better results. The solution set is decomposed into [N/C] cluster spaces, and the weight vectors are adjusted by comparing the density of the entire objective space with that of the subspaces (lines 6-17 in Algorithm 4). As shown in Figure 2, w2 should move away from w1 and approach w3. Finally, the number of weight vectors is adjusted to match the original number: if the number is greater than N, a weight vector is deleted at the densest position of the entire objective space; if the number is less than N, a weight vector is added at a sparse position (lines 18-20 in Algorithm 4).

The spatial density is defined by averaging the distances of similar individuals in the population. The minimum spatial density is defined as h1ρo and the maximum spatial density as h2ρo; under normal circumstances, h1 = 0.2 and h2 = 1.3 give relatively good results. Here ρo is the density of the overall objective space and ρi is the density of a subspace. The adjustment process is divided into the two situations described below:
(1) When the subspace density is less than the objective space density, determine whether the subspace density is too small. If the density of the subspace is less than the minimum space density h1ρo, the subspace density is considered too small. In this case, the weight vectors should be spread out: the two nearest neighboring weight vectors are taken, their sum vector is added to the set of weight vectors, and the two parent vectors are deleted. Otherwise, the weight vectors should be fine-tuned to achieve uniformity across the objective plane: according to the density difference, the two nearest weight vectors in the subspace are adjusted according to (7) and (8), where W_unit(k) and W_unit(l) are the closest weight vectors, mt is the minimum distance, ρi is the density value, and W_unit(gwk) and W_unit(gwl) are the neighbor weights of the respective weights:

$$W_{\text{unit}}(k) = W_{\text{unit}}(k) + (\rho_i - m_t) \times W_{\text{unit}}(gw_k), \tag{7}$$

$$W_{\text{unit}}(l) = W_{\text{unit}}(l) + (\rho_i - m_t) \times W_{\text{unit}}(gw_l). \tag{8}$$

(2) When the subspace density is greater than the objective space density, determine whether the subspace density is too large. If the density of the subspace is greater than the maximum space density h2ρo, the subspace density is considered too large. In this case, the weight vectors should be aggregated: the two furthest neighboring weight vectors are taken and their sum vector is added to the set of weight vectors. Otherwise, according to the density difference, the weight vectors are adjusted according to (9) and (10), as sketched below, where W_unit(k) and W_unit(l) are the furthest weight vectors, mx is the maximum distance, and ρi is the density value:

$$W_{\text{unit}}(k) = W_{\text{unit}}(k) + \frac{m_x - \rho_i}{2} \times W_{\text{unit}}(l), \tag{9}$$

$$W_{\text{unit}}(l) = W_{\text{unit}}(l) + \frac{m_x - \rho_i}{2} \times W_{\text{unit}}(k). \tag{10}$$
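A sketch of one density-driven adjustment pass over a subspace follows. The prose leaves several details open (the exact density measure, which neighbors gwk and gwl are meant, and whether case (2) also deletes the parents after merging), so the choices below are labelled assumptions in the comments; Algorithm 4 itself is not reproduced in this extract.

```python
import numpy as np

def subspace_density(points):
    # Density read as the average pairwise distance between the subspace
    # members (Section 3.5 only says "averaging the distances"): assumption.
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    iu = np.triu_indices(len(points), k=1)
    return d[iu].mean()

def adjust_subspace(W, rho_i, rho_o, h1=0.2, h2=1.3):
    """One subspace pass of the weight adjustment of Section 3.5."""
    W = np.asarray(W, dtype=float).copy()
    D = np.linalg.norm(W[:, None] - W[None, :], axis=-1)
    np.fill_diagonal(D, np.inf)
    if rho_i < rho_o:                                    # case (1)
        k, l = np.unravel_index(D.argmin(), D.shape)
        if rho_i < h1 * rho_o:
            # Too sparse: merge the two nearest weights into their sum.
            W = np.vstack([np.delete(W, [k, l], axis=0), W[k] + W[l]])
        else:
            # Fine-tune via (7) and (8): move each of the two nearest
            # weights along its own nearest neighbor gw_k / gw_l.
            m_t = D[k, l]
            gwk, gwl = D[k].argmin(), D[l].argmin()
            wk, wl = W[k].copy(), W[l].copy()
            W[k] = wk + (rho_i - m_t) * W[gwk]
            W[l] = wl + (rho_i - m_t) * W[gwl]
    else:                                                # case (2)
        Df = np.where(np.isinf(D), -np.inf, D)
        k, l = np.unravel_index(Df.argmax(), Df.shape)
        if rho_i > h2 * rho_o:
            # Too dense: merge the two furthest weights (deleting the
            # parents here mirrors case (1) and is an assumption).
            W = np.vstack([np.delete(W, [k, l], axis=0), W[k] + W[l]])
        else:
            # Equations (9) and (10), applied with pre-update copies.
            m_x = Df.max()
            wk, wl = W[k].copy(), W[l].copy()
            W[k] = wk + 0.5 * (m_x - rho_i) * wl
            W[l] = wl + 0.5 * (m_x - rho_i) * wk
    # The paper keeps the edge vectors immovable; a full implementation
    # would skip them and renormalize the rest onto the unit simplex.
    return W / W.sum(axis=1, keepdims=True)
```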
Figure 1: Many subspaces obtained by using K-means clustering.

Figure 2: Weight adjustment.
Table 1: The population size (N) for different numbers of objectives.

Number of objectives M   Segment parameter p   Population size N
3                        12                    92
5                        6                     212
8                        p1 = 3, p2 = 2        156
10                       p1 = 2, p2 = 2        112
15                       p1 = 2, p2 = 1        136

Table 2: MFE times for different numbers of objectives.

Test instance   M = 3    M = 5     M = 8     M = 10    M = 15
DTLZ1           36,800   127,200   117,000   112,000   204,000
DTLZ2           23,000   74,200    78,000    84,000    136,000
DTLZ3           92,000   212,000   156,000   168,000   272,000
DTLZ4           55,200   212,000   195,000   224,000   408,000
DTLZ5           55,200   212,000   187,200   168,000   272,000
DTLZ6           36,800   74,200    117,000   224,000   272,000
WFG1∼4          36,800   127,200   195,000   224,000   408,000

Table 3: Parameter values used in NSGA-III and MOEA/D.

Parameters                        NSGA-III   MOEA/D
Crossover probability pc          1          1
Mutation probability pm           1/V        1/V
Crossover distribution index ηc   30         20
Mutation distribution index ηm    20         20

It is worth emphasizing that the edge weight vectors are immovable; otherwise, the search range of the algorithm would be affected. Half of the maximum number of iterations is selected as the enabling condition for the weight vector adjustment strategy, and the weight vectors are adjusted every four generations in this paper. By this time, the objective vectors have approached the PFs, so the guidance of the weight vectors is relatively accurate and the population update is relatively stable (i.e., it is close to the PF).
Table 4: The GD average and standard deviation of NSGA-III-WA and other five algorithms on DTLZ1-6 testing problems.

Problem M   NSGA-III-WA   NSGA-III   VAEA   RVEA   MOEA/D   MOEA/D-M2M
DTLZ1 3   2.003e−06 (1.017e−05)   9.210e−04 + (2.322e−04)   1.776e−04 + (8.957e−05)   1.091e−03 + (2.154e−03)   1.799e−04 + (1.533e−04)   6.851e−03 + (8.316e−03)
DTLZ1 5   7.400e−08 (3.413e−08)   1.543e−04 + (2.730e−04)   1.077e−04 + (1.656e−04)   4.839e−04 + (3.785e−05)   4.745e−05 + (3.232e−05)   3.885e−03 + (3.274e−03)
DTLZ1 8   7.121e−06 (4.504e−06)   2.045e−03 + (1.882e−03)   1.722e−03 + (2.746e−03)   1.225e−04 + (5.246e−05)   1.524e−04 + (9.587e−05)   7.884e−03 + (6.720e−03)
DTLZ1 10  3.065e−06 (4.042e−06)   4.618e−03 + (3.393e−03)   2.259e−03 + (9.641e−04)   2.637e−03 + (1.233e−04)   2.187e−04 + (2.477e−04)   2.199e−02 + (6.495e−03)
DTLZ1 15  1.639e−07 (2.239e−07)   7.531e−02 + (8.504e−03)   3.695e−03 + (1.214e−03)   1.920e−04 + (1.682e−04)   1.386e−04 + (9.843e−05)   4.632e−02 + (1.382e−03)
DTLZ2 3   1.715e−06 (6.988e−07)   1.269e−04 + (1.679e−05)   2.657e−04 + (6.309e−05)   4.557e−04 + (7.907e−05)   3.918e−03 + (1.464e−03)   2.926e−04 + (3.825e−05)
DTLZ2 5   6.131e−05 (1.146e−05)   2.524e−04 + (2.555e−05)   4.729e−04 + (5.229e−05)   3.822e−04 + (4.425e−05)   9.614e−02 + (1.702e−03)   2.736e−03 + (3.057e−03)
DTLZ2 8   2.972e−04 (3.302e−05)   6.529e−04 + (8.548e−05)   6.463e−04 + (1.351e−04)   5.385e−04 + (9.497e−05)   3.303e−03 + (5.187e−05)   1.778e−03 + (3.743e−04)
DTLZ2 10  6.169e−04 (1.346e−04)   1.139e−03 + (2.172e−04)   7.390e−04 + (2.642e−04)   9.629e−04 + (1.761e−04)   6.532e−03 + (8.470e−04)   2.556e−03 + (4.978e−04)
DTLZ2 15  2.192e−04 (1.070e−04)   5.657e−04 + (9.716e−05)   7.725e−04 + (1.826e−04)   4.202e−04 + (2.084e−04)   9.719e−02 + (2.696e−03)   1.201e−02 + (3.749e−04)
DTLZ3 3   2.224e−10 (2.411e−06)   5.619e−04 + (1.986e−04)   1.827e−04 + (5.711e−04)   3.788e−03 + (1.353e−03)   3.971e−03 + (2.979e−05)   1.157e−03 + (6.428e−04)
DTLZ3 5   2.942e−06 (3.373e−06)   8.990e−04 + (4.012e−04)   2.232e−03 + (1.227e−03)   1.262e−03 + (2.190e−04)   7.459e−03 + (1.364e−04)   7.998e−03 + (3.265e−04)
DTLZ3 8   1.153e−04 (3.104e−05)   4.541e−03 + (3.786e−04)   3.961e−03 + (9.325e−04)   1.901e−03 + (7.506e−04)   3.127e−02 + (9.883e−04)   3.481e−02 + (2.819e−03)
DTLZ3 10  2.368e−04 (9.563e−05)   5.926e−03 + (2.259e−03)   4.713e−03 + (2.284e−03)   9.694e−03 + (2.543e−03)   5.910e−02 + (2.794e−03)   3.009e−02 + (2.295e−03)
DTLZ3 15  2.508e−04 (2.785e−04)   4.179e−03 + (2.756e−03)   7.510e−03 + (4.256e−04)   8.655e−04 + (1.074e−03)   9.834e−02 + (3.003e−03)   5.392e−02 + (2.949e−03)
DTLZ4 3   4.345e−09 (4.389e−09)   4.611e−04 + (2.188e−04)   2.790e−04 + (1.033e−04)   2.852e−04 + (1.749e−04)   4.547e−03 + (1.739e−03)   1.061e−04 + (3.147e−05)
DTLZ4 5   7.963e−15 (3.060e−14)   1.905e−04 + (5.277e−05)   5.008e−04 + (1.074e−04)   2.794e−04 + (4.251e−05)   3.596e−04 + (1.854e−05)   5.092e−04 + (2.427e−04)
DTLZ4 8   3.605e−12 (1.607e−11)   4.234e−04 + (7.418e−05)   7.629e−04 + (1.203e−04)   6.762e−04 + (1.645e−04)   7.233e−03 + (2.509e−04)   3.038e−03 + (3.791e−04)
DTLZ4 10  3.272e−16 (9.275e−16)   5.561e−04 + (9.294e−05)   8.086e−04 + (2.006e−04)   1.084e−03 + (2.514e−04)   1.060e−03 + (2.939e−05)   2.981e−03 + (5.370e−04)
DTLZ4 15  4.437e−15 (2.404e−14)   3.722e−04 + (8.070e−05)   7.542e−04 + (1.518e−04)   2.257e−04 + (1.222e−04)   1.319e−03 + (4.983e−05)   3.659e−03 + (2.773e−04)
DTLZ5 3   1.400e−01 (7.900e−03)   1.344e−01 + (9.360e−03)   1.962e−01 + (1.695e−02)   2.065e−01 + (1.245e−02)   5.394e−01 + (1.339e−03)   5.097e−02 − (2.371e−03)
DTLZ5 5   7.633e−02 (4.638e−03)   1.659e−01 + (1.039e−03)   1.559e−01 + (8.369e−04)   3.419e−01 + (5.779e−03)   4.709e−02 = (3.682e−03)   4.543e−02 − (5.821e−04)
DTLZ5 8   7.580e−02 (6.965e−03)   1.885e−01 + (4.210e−03)   2.219e−01 + (1.306e−03)   5.373e−01 + (2.058e−02)   8.841e−02 + (1.342e−03)   8.630e−02 = (3.746e−03)
DTLZ5 10  7.731e−02 (6.006e−03)   2.259e−01 + (4.640e−03)   2.725e−01 + (8.818e−04)   3.896e−01 + (3.558e−03)   1.241e−01 + (1.379e−03)   9.947e−02 + (1.385e−03)
DTLZ5 15  7.714e−02 (7.846e−03)   2.352e−01 + (2.731e−03)   2.503e−01 + (7.531e−04)   1.196e−01 + (4.752e−02)   1.468e−01 + (8.034e−04)   9.665e−02 + (4.543e−03)
DTLZ6 3   9.107e−02 (1.067e−02)   1.605e−01 + (1.225e−02)   1.647e−01 + (1.689e−02)   2.299e−01 + (4.912e−03)   3.661e−01 + (3.201e−03)   2.010e−01 + (3.569e−03)
DTLZ6 5   1.714e−02 (2.046e−03)   1.279e−02 − (2.679e−03)   3.713e−02 + (2.098e−03)   3.492e−02 + (2.748e−03)   1.346e−01 + (1.267e−04)   6.235e−02 + (8.714e−04)
DTLZ6 8   1.482e−02 (3.358e−03)   4.119e−03 − (4.968e−04)   7.671e−02 + (6.004e−03)   1.448e−02 = (8.183e−03)   1.034e−01 + (6.880e−03)   2.821e−02 + (3.728e−03)
DTLZ6 10  1.868e−02 (4.510e−03)   1.331e−03 − (3.501e−04)   7.597e−03 − (2.829e−04)   1.891e−02 = (8.185e−03)   5.831e−02 + (2.794e−03)   1.495e−02 − (3.746e−03)
DTLZ6 15  5.735e−03 (4.757e−04)   3.154e−03 − (1.438e−04)   4.743e−03 − (1.742e−04)   1.701e−02 + (9.023e−03)   1.361e−02 + (1.011e−04)   3.109e−02 + (7.158e−02)
# +/=/−   —   26/0/4   28/0/2   28/2/0   29/1/0   26/1/3
Table 5: The IGD average and standard deviation of NSGA-III-WA and other five algorithms on DTLZ1-6 testing problems.

Problem M   NSGA-III-WA   NSGA-III   VAEA   RVEA   MOEA/D   MOEA/D-M2M
DTLZ1 3   3.148e−02 (6.732e−04)   2.096e−02 = (6.245e−04)   7.776e−02 + (8.086e−04)   6.202e−02 + (2.796e−03)   4.086e−02 + (7.159e−03)   4.315e−02 + (5.569e−03)
DTLZ1 5   4.781e−02 (1.445e−04)   6.547e−02 + (1.645e−04)   5.203e−02 + (2.858e−04)   4.840e−02 + (2.853e−04)   7.737e−02 + (3.165e−04)   1.086e−01 + (1.434e−03)
DTLZ1 8   8.196e−02 (3.003e−03)   9.294e−02 + (1.489e−03)   9.351e−01 + (4.178e−03)   7.720e−02 − (5.637e−03)   1.149e−01 + (9.790e−03)   1.489e−01 + (5.276e−03)
DTLZ1 10  9.134e−02 (6.039e−03)   1.309e−01 + (4.313e−03)   1.119e−01 + (1.114e−03)   1.142e−01 + (2.156e−03)   1.022e−01 + (1.435e−04)   2.464e−01 + (3.592e−03)
DTLZ1 15  9.923e−02 (1.501e−03)   1.324e−01 + (2.857e−03)   1.136e−01 + (4.151e−03)   1.188e−01 + (2.306e−03)   1.132e−01 + (4.539e−03)   1.382e−01 + (1.736e−03)
DTLZ2 3   5.474e−02 (2.074e−04)   5.452e−02 = (5.829e−04)   5.637e−02 + (4.368e−04)   5.490e−02 = (1.888e−04)   6.392e−02 + (7.698e−04)   9.412e−02 + (2.835e−03)
DTLZ2 5   1.527e−01 (8.099e−04)   1.612e−01 + (3.528e−04)   1.553e−01 + (4.447e−03)   1.519e−01 − (8.868e−04)   3.486e−01 + (1.318e−03)   2.095e−01 + (5.578e−03)
DTLZ2 8   2.598e−01 (3.285e−03)   2.675e−01 + (7.748e−04)   2.979e−01 + (5.934e−03)   2.617e−01 = (4.335e−03)   3.500e−01 + (2.287e−03)   4.494e−01 + (5.147e−03)
DTLZ2 10  3.286e−01 (1.116e−03)   3.570e−01 + (9.137e−03)   3.574e−01 + (1.895e−03)   4.600e−01 + (1.013e−03)   4.009e−01 + (6.283e−04)   4.603e−01 + (5.291e−03)
DTLZ2 15  3.188e−01 (3.626e−04)   3.580e−01 + (8.035e−04)   4.547e−01 + (2.230e−03)   3.592e−01 + (1.777e−03)   4.596e−01 + (7.071e−03)   4.583e−01 + (3.264e−03)
DTLZ3 3   5.893e−02 (7.098e−04)   9.937e−02 + (8.864e−04)   5.593e−02 − (1.972e−03)   6.608e−02 + (4.416e−03)   6.385e−02 + (1.490e−03)   9.495e−02 + (1.291e−03)
DTLZ3 5   1.671e−01 (3.529e−03)   1.598e−01 − (4.145e−03)   1.650e−01 + (9.129e−03)   1.583e−01 = (4.052e−03)   5.327e−01 + (1.052e−03)   5.158e−01 + (4.742e−03)
DTLZ3 8   2.857e−01 (2.753e−02)   4.185e−01 + (1.043e−01)   3.706e−01 + (5.212e−02)   3.117e−01 + (2.923e−02)   4.196e−01 + (4.810e−02)   4.032e−01 + (8.360e−02)
DTLZ3 10  3.252e−01 (1.778e−02)   4.751e−01 + (1.321e−02)   6.767e−01 + (4.540e−02)   3.835e−01 + (1.495e−02)   4.401e−01 + (4.446e−03)   7.313e−01 + (5.746e−02)
DTLZ3 15  3.225e−01 (5.028e−03)   5.076e−01 + (4.409e−02)   5.469e−01 + (2.917e−02)   3.636e−01 + (7.496e−03)   4.414e−01 + (2.664e−02)   5.743e−01 + (5.692e−02)
DTLZ4 3   2.998e−03 (1.402e−04)   3.685e−03 + (7.272e−04)   5.537e−02 + (1.937e−01)   3.359e−03 + (2.443e−04)   6.434e−02 + (1.009e−01)   7.938e−02 + (3.162e−02)
DTLZ4 5   9.586e−03 (6.178e−04)   1.173e−02 + (2.162e−03)   1.704e−01 + (1.054e−03)   1.039e−02 + (3.622e−04)   4.485e−02 + (2.562e−03)   1.419e−02 + (2.946e−03)
DTLZ4 8   2.820e−02 (1.099e−03)   3.257e−02 + (2.645e−03)   4.432e−01 + (3.122e−03)   3.082e−02 + (5.237e−04)   2.741e−01 + (2.447e−03)   4.622e−01 + (4.797e−03)
DTLZ4 10  5.562e−02 (3.895e−03)   5.043e−02 − (1.118e−03)   7.208e−01 + (3.446e−03)   5.556e−02 = (4.444e−03)   1.411e−01 + (8.020e−03)   9.292e−02 + (5.714e−03)
DTLZ4 15  9.265e−02 (2.780e−02)   8.040e−02 − (2.634e−03)   1.183e−01 + (5.087e−03)   1.002e−01 + (1.266e−02)   1.303e−01 + (9.904e−03)   1.033e−01 + (3.758e−03)
DTLZ5 3   1.281e−01 (1.585e−02)   1.143e−01 − (8.659e−03)   1.674e−01 + (5.705e−02)   2.057e−01 + (3.254e−03)   4.196e−01 + (2.332e−03)   4.329e−02 − (8.859e−03)
DTLZ5 5   3.854e−01 (6.196e−02)   1.137e+00 + (1.129e−01)   5.398e−01 + (1.447e−01)   5.198e−01 + (8.029e−02)   6.048e−01 + (1.373e−03)   4.785e−01 + (4.553e−02)
DTLZ5 8   3.702e−01 (5.821e−02)   6.228e−01 + (1.042e−02)   7.637e−01 + (3.741e−02)   4.112e−01 + (4.772e−02)   4.647e−01 + (5.300e−02)   4.697e−01 + (3.117e−02)
DTLZ5 10  3.853e−01 (6.602e−02)   7.052e−01 + (5.501e−02)   6.375e−01 + (6.288e−02)   3.821e−01 − (1.585e−02)   7.844e−01 + (2.872e−02)   5.840e−01 + (3.769e−02)
DTLZ5 15  5.905e−01 (6.090e−02)   8.251e−01 + (2.694e−02)   9.463e−01 + (1.180e−02)   6.022e−01 = (4.108e−02)   8.558e−01 + (2.129e−02)   6.296e−01 + (5.632e−02)
DTLZ6 3   9.766e−01 (2.520e−02)   1.516e+00 + (9.127e−02)   1.656e+00 + (5.092e−02)   1.303e+00 + (2.028e−02)   1.515e+00 + (7.586e−03)   1.826e+00 + (3.657e−03)
DTLZ6 5   5.673e−01 (1.851e−02)   6.385e−01 + (3.418e−02)   6.251e−01 + (9.602e−03)   7.416e−01 + (6.407e−03)   7.880e−01 + (9.136e−03)   9.257e−01 + (5.256e−03)
DTLZ6 8   5.141e−01 (4.012e−02)   5.233e−01 + (3.393e−02)   5.460e−01 + (1.476e−02)   5.383e−01 + (3.976e−02)   7.703e−01 + (8.253e−03)   7.437e−01 + (5.732e−03)
DTLZ6 10  4.425e−01 (2.195e−02)   3.994e−01 − (2.948e−02)   5.030e−01 + (2.205e−02)   6.184e−01 + (1.084e−02)   7.130e−01 + (4.135e−03)   6.718e−01 + (5.169e−02)
DTLZ6 15  3.147e−01 (2.727e−02)   3.558e−01 + (1.079e−02)   3.566e−01 + (2.154e−02)   5.502e−01 + (8.639e−02)   4.214e−01 + (3.918e−02)   6.972e−01 + (4.715e−02)
# +/=/−   —   23/2/5   29/0/1   22/5/3   30/0/0   29/0/1
Table 6: The HV average and standard deviation of NSGA-III-WA and other five algorithms on DTLZ1-6 testing problems.

Problem M   NSGA-III-WA   NSGA-III   VAEA   RVEA   MOEA/D   MOEA/D-M2M
DTLZ1 3   9.727e−01 (1.679e−03)   9.661e−01 = (3.208e−03)   6.745e−01 + (8.305e−03)   9.379e−01 + (9.716e−03)   6.232e−01 + (1.434e−03)   9.595e−01 + (5.082e−03)
DTLZ1 5   9.987e−01 (4.645e−04)   9.941e−01 + (5.491e−03)   9.936e−01 + (8.779e−04)   9.990e−01 − (3.939e−04)   8.516e−01 + (5.741e−03)   8.409e−01 + (1.297e−03)
DTLZ1 8   9.986e−01 (1.009e−03)   9.910e−01 + (8.001e−03)   8.763e−01 + (3.096e−03)   9.724e−01 + (3.224e−03)   8.396e−01 + (4.159e−03)   8.682e−01 + (5.529e−03)
DTLZ1 10  9.983e−01 (5.898e−04)   9.858e−01 + (1.008e−03)   9.082e−01 + (1.463e−03)   9.972e−01 = (3.197e−03)   8.974e−01 + (3.229e−03)   8.754e−01 + (2.519e−03)
DTLZ1 15  9.998e−01 (1.572e−04)   9.980e−01 + (7.871e−04)   9.960e−01 + (1.004e−03)   9.995e−01 + (5.718e−04)   8.830e−01 + (8.141e−03)   7.307e−01 + (2.841e−03)
DTLZ2 3   9.244e−01 (2.256e−03)   9.250e−01 = (2.214e−03)   9.231e−01 + (1.889e−03)   9.251e−01 − (3.107e−03)   7.737e−01 + (1.329e−03)   8.968e−01 + (2.764e−03)
DTLZ2 5   9.909e−01 (1.504e−03)   9.890e−01 + (5.152e−04)   9.899e−01 + (9.500e−04)   9.379e−01 + (7.638e−03)   7.323e−01 + (4.672e−03)   9.760e−01 + (2.302e−03)
DTLZ2 8   9.992e−01 (1.119e−04)   9.984e−01 + (2.264e−04)   9.885e−01 + (1.367e−03)   9.985e−01 + (2.970e−03)   7.386e−01 + (4.697e−03)   8.922e−01 + (6.281e−03)
DTLZ2 10  9.989e−01 (4.487e−04)   9.942e−01 + (4.741e−03)   9.969e−01 + (2.531e−03)   9.973e−01 + (3.720e−03)   7.020e−01 + (5.068e−03)   8.815e−01 + (5.413e−03)
DTLZ2 15  9.999e−01 (2.973e−04)   1.000e+00 = (2.199e−03)   9.998e−01 + (1.731e−03)   9.999e−01 + (1.017e−03)   8.788e−01 + (7.351e−03)   9.082e−01 + (4.352e−03)
DTLZ3 3   9.261e−01 (2.199e−03)   9.202e−01 = (1.894e−03)   9.235e−01 + (1.788e−03)   9.197e−01 + (4.206e−03)   8.482e−01 + (1.881e−03)   9.062e−01 + (7.041e−03)
DTLZ3 5   9.899e−01 (1.063e−03)   9.892e−01 = (9.066e−04)   9.865e−01 = (1.690e−03)   8.921e−01 + (2.124e−03)   6.265e−01 + (7.156e−03)   4.653e−01 + (2.740e−03)
DTLZ3 8   9.831e−01 (1.132e−03)   9.984e−01 − (2.184e−04)   8.619e−01 + (5.486e−03)   9.981e−01 − (7.039e−04)   7.947e−01 + (4.331e−03)   5.237e−01 + (2.762e−03)
DTLZ3 10  9.975e−01 (4.712e−03)   9.789e−01 + (5.968e−03)   8.869e−01 + (1.820e−02)   9.916e−01 + (3.154e−03)   7.193e−01 + (5.231e−02)   3.010e−01 + (7.925e−02)
DTLZ3 15  1.000e+00 (5.351e−04)   9.998e−01 + (1.778e−03)   9.737e−01 + (4.021e−03)   9.999e−01 + (3.956e−04)   8.390e−01 + (2.721e−03)   3.417e−01 + (8.294e−03)
DTLZ4 3   9.252e−01 (2.831e−03)   8.762e−01 = (6.284e−03)   8.950e−01 + (1.073e−03)   9.261e−01 = (2.322e−03)   7.646e−01 + (1.483e−03)   9.097e−01 = (6.935e−03)
DTLZ4 5   9.867e−01 (1.377e−03)   9.887e−01 − (8.484e−04)   9.853e−01 = (9.799e−04)   9.831e−01 + (1.641e−03)   6.276e−01 + (2.529e−03)   9.861e−01 = (1.544e−03)
DTLZ4 8   9.987e−01 (5.391e−04)   9.987e−01 = (2.606e−04)   9.957e−01 + (4.359e−03)   9.989e−01 − (3.131e−04)   7.757e−01 + (5.391e−04)   9.943e−01 + (1.414e−04)
DTLZ4 10  9.998e−01 (1.117e−04)   9.998e−01 = (9.643e−04)   9.996e−01 + (5.604e−04)   9.995e−01 + (5.008e−04)   7.350e−01 + (5.951e−04)   9.964e−01 + (2.945e−03)
DTLZ4 15  1.000e+00 (9.863e−04)   9.999e−01 + (2.858e−04)   9.995e−01 + (1.863e−03)   9.998e−01 + (8.315e−04)   8.456e−01 + (4.485e−03)   9.927e−01 + (4.372e−03)
DTLZ5 3   8.370e−01 (1.361e−03)   8.128e−01 + (5.771e−03)   8.049e−01 + (2.157e−03)   8.689e−01 − (1.159e−03)   8.094e−01 + (1.693e−03)   7.330e−01 − (2.916e−02)
DTLZ5 5   8.622e−01 (3.332e−02)   3.775e−01 + (1.310e−02)   6.676e−01 + (7.426e−02)   8.057e−01 + (2.957e−02)   4.525e−01 + (9.054e−03)   8.099e−01 + (3.771e−03)
DTLZ5 8   8.099e−01 (4.163e−02)   6.056e−01 + (3.357e−02)   5.373e−01 + (4.556e−02)   7.143e−01 + (3.091e−02)   7.205e−01 + (6.349e−02)   6.365e−01 + (5.982e−02)
DTLZ5 10  7.885e−01 (3.761e−02)   7.052e−01 + (5.501e−02)   6.375e−01 + (6.294e−02)   6.399e−01 + (9.123e−02)   6.859e−01 + (5.109e−02)   3.786e−01 + (2.764e−02)
DTLZ5 15  6.892e−01 (7.425e−02)   5.436e−01 + (9.687e−02)   7.714e−01 − (9.537e−02)   5.691e−01 + (9.341e−02)   5.864e−01 + (4.891e−02)   5.144e−01 + (2.467e−02)
DTLZ6 3   1.079e+00 (1.335e−03)   1.043e+00 + (4.541e−03)   1.056e+00 + (4.296e−03)   9.258e−01 + (9.737e−03)   9.493e−01 + (8.997e−03)   1.041e+00 = (3.657e−03)
DTLZ6 5   1.429e+00 (1.647e−02)   1.384e+00 + (5.443e−02)   1.166e+00 + (2.272e−02)   1.402e+00 + (3.644e−02)   1.248e+00 + (2.181e−03)   1.275e+00 + (4.681e−03)
DTLZ6 8   1.468e+00 (5.101e−02)   1.416e+00 + (2.891e−02)   1.213e+00 + (8.664e−02)   1.176e+00 + (9.943e−02)   1.095e+00 + (2.064e−02)   8.201e−01 + (7.193e−02)
DTLZ6 10  1.123e+00 (9.184e−02)   1.127e+00 = (5.759e−02)   9.897e−01 + (2.878e−02)   9.816e−01 + (1.076e−02)   8.064e−01 + (4.445e−02)   9.322e−01 + (4.926e−02)
DTLZ6 15  9.319e−01 (2.287e−02)   9.107e−01 + (7.325e−03)   9.461e−01 = (4.023e−02)   7.651e−01 + (9.930e−02)   7.378e−01 + (3.547e−02)   7.194e−01 + (3.271e−02)
# +/=/−   —   19/9/2   26/3/1   23/2/5   30/0/0   26/3/1
4. Discussion

The NSGA-III-WA algorithm divides the objective space into several subspaces and adjusts the weight vectors according to the individual density of the objective space. This method can better ensure the uniformity of the weight vectors on the objective surface, thus ensuring the uniformity of the solution set.
Figure 3: Boxplots of GD, IGD, and HV index by the four algorithms with 5 objectives on DTLZ1 problem.
Figure 4: Boxplots of GD, IGD, and HV index by the four algorithms with 5 objectives on DTLZ2 problem.
Figure 5: Boxplots of GD, IGD, and HV index by the four algorithms with 5 objectives on DTLZ3 problem.
Figure 6: Boxplots of GD, IGD, and HV index by the four algorithms with 5 objectives on DTLZ4 problem.
Figure 7: Boxplots of GD, IGD, and HV index by the four algorithms with 5 objectives on DTLZ5 problem.
Figure 8: Boxplots of GD, IGD, and HV index by the four algorithms with 5 objectives on DTLZ6 problem.
corresponding parameter settings for each algorithm, and second, to explain the experimental results and to compare and analyze them.

5.1. General Experimental Settings. The number of decision variables for all DTLZ test functions is V = M + k − 1, where M is the number of objective functions, k = 5 for DTLZ1, and k = 10 for DTLZ2-6. The number of decision variables for all WFG test functions is V = k + l, where the position variable is k = M − 1, the objective dimension is M, and the distance variable is l = 10. The population sizes of NSGA-III and RVEA are related to the uniformly distributed weight scale and are determined by the number of objectives M and the number of divisions p on each objective; the double-layer distribution method in [13] is adopted in order to tackle the problem. The specific parameter settings are given in Table 1. For fair comparison, the population size is the same as for the other three algorithms.
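The population sizes of Table 1 can be reproduced, assuming the usual NSGA-III convention (not spelled out in this extract) of summing the boundary-layer and inside-layer reference point counts and rounding up to a multiple of four:

```python
from math import comb

def nsga3_popsize(M, p_boundary, p_inside=None):
    # Number of Das-and-Dennis reference points for a (two-layer) lattice,
    # rounded up to the next multiple of 4 as is customary for NSGA-III.
    n = comb(M - 1 + p_boundary, p_boundary)
    if p_inside is not None:
        n += comb(M - 1 + p_inside, p_inside)
    return n + (-n) % 4

print(nsga3_popsize(3, 12))     # 92  (Table 1, M = 3)
print(nsga3_popsize(8, 3, 2))   # 156 (Table 1, M = 8)
print(nsga3_popsize(15, 2, 1))  # 136 (Table 1, M = 15)
```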
Figure 9: Boxplots of GD, IGD, and HV index by the four algorithms with 15 objectives on DTLZ1 problem.
Figure 10: Boxplots of GD, IGD, and HV index by the four algorithms with 15 objectives on DTLZ2 problem.
Figure 11: Boxplots of GD, IGD, and HV index by the four algorithms with 15 objectives on DTLZ3 problem.
Each algorithm is run independently 30 times on each test function, using the maximum number of function evaluations (MFE) as the termination condition for each run. Because the objective dimensions of the problems differ, the MFE also differs; following [16], the specific settings are shown in Table 2. The maximum number of iterations is calculated as gen_max = MFE/N. The parameter settings of NSGA-III and MOEA/D are shown in Table 3. In addition, the number of MOEA/D neighbor weights is T = 10, the RVEA penalty factor change rate is α = 2, and the VAEA angle threshold is δ = (π/2)/(N + 1).

5.2. Results and Analysis. In order to verify the performance of the proposed NSGA-III-WA algorithm on many-objective optimization problems, the general performance evaluation indicators GD, IGD, and HV were used. NSGA-III-WA is compared with five well-performing, representative algorithms, MOEA/D, NSGA-III, VAEA, RVEA, and MOEA/D-M2M, with 3, 5, 8, 10, and 15 objectives on the DTLZ1-6 test functions and the WFG1-4 test instances.
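For reference, the two distance-based indicators used below can be computed as in the following sketch (plain arithmetic-mean variants; the exact normalization used by the authors is not stated in this extract):

```python
import numpy as np

def gd(A, P):
    """Generational distance: mean Euclidean distance from each obtained
    solution in A (n, M) to its nearest point of the sampled true front
    P (m, M); a small GD indicates good convergence."""
    d = np.linalg.norm(A[:, None, :] - P[None, :, :], axis=-1)
    return d.min(axis=1).mean()

def igd(A, P):
    """Inverted generational distance: mean distance from each sampled
    front point to its nearest obtained solution; a small IGD requires
    convergence and a well-spread solution set at the same time."""
    return gd(P, A)
```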
Figure 12: Boxplots of GD, IGD, and HV index by the four algorithms with 15 objectives on DTLZ4 problem.
Figure 13: Boxplots of GD, IGD, and HV index by the four algorithms with 15 objectives on DTLZ5 problem.
Figure 14: Boxplots of GD, IGD, and HV index by the four algorithms with 15 objectives on DTLZ6 problem.
5.2.1. Testing and Analysis of DTLZ Series Functions. This section shows the results and analysis of the GD, IGD, and HV performance data on the DTLZ1-6 test functions. The experimental results are shown in Tables 4-6. They are the average values and standard deviations of 30 independent runs. The best results are shown in bold, the values in parentheses indicate the standard deviations, and the number in square brackets is the algorithm performance ranking, which is based on the Wilcoxon rank-sum test [33]. To investigate whether NSGA-III-WA is statistically superior to the other algorithms, Wilcoxon's rank-sum test is performed at a 0.05 significance level between NSGA-III-WA and each competing algorithm on each test case. The test result is given at the end of each cell, represented by the symbol "+," "=," or "−," which indicates that the performance of NSGA-III-WA is better than, equal to, or worse than that of the algorithm in the corresponding column. At the same time, the last row of Tables 4-6 summarizes the numbers of test instances on which NSGA-III-WA is significantly better than, equal to, and worse than its competitors. Tables 7-9 show the results of the comparison of the NSGA-III-WA algorithm with the other five algorithms under different objective numbers (a sketch of the test follows).
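The per-cell symbols can be generated as in the sketch below, which uses SciPy's rank-sum test; the median-based direction check is our convention, since the extract does not state how ties in direction are decided:

```python
import numpy as np
from scipy.stats import ranksums

def significance_symbol(wa_runs, rival_runs, alpha=0.05, smaller_is_better=True):
    """Rank-sum comparison at the 0.05 level as used for Tables 4-6:
    '+' if NSGA-III-WA is significantly better than the rival over the
    30 runs, '-' if significantly worse, '=' if no significant difference."""
    stat, p = ranksums(wa_runs, rival_runs)
    if p >= alpha:
        return "="
    wa_better = (np.median(wa_runs) < np.median(rival_runs)) == smaller_is_better
    return "+" if wa_better else "-"

rng = np.random.default_rng(7)
print(significance_symbol(rng.normal(0.01, 0.002, 30),
                          rng.normal(0.02, 0.002, 30)))  # '+'
```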
Figure 15: Parallel graph of the final solution set of each algorithm on the 15-objective DTLZ2 test problem. (a) NSGA-III-WA. (b) NSGA-III. (c) MOEA/D. (d) RVEA. (e) VAEA. (f) MOEA/D-M2M.
It can be seen from the experimental results in Table 4 that the GD values of NSGA-III-WA on DTLZ1-4 are superior to those of the other five algorithms; on DTLZ5, it is superior only in the 8-, 10-, and 15-objective instances, and on DTLZ6, NSGA-III obtains the best results. This shows that, in solving many-objective problems, the convergence of NSGA-III-WA is more effective than that of the NSGA-III algorithm and better than that of the other algorithms.
Table 10: The IGD average and standard deviation of NSGA-III-WA and other five algorithms on WFG1-4 testing problems.

Problem M   NSGA-III-WA   NSGA-III   VAEA   RVEA   MOEA/D   MOEA/D-M2M
WFG1 3   1.171e+00 (2.727e−01)   1.370e+00 + (3.356e−01)   1.324e+00 + (2.315e−01)   1.047e+00 − (2.417e−01)   1.216e+00 + (2.173e−01)   1.211e+00 + (3.725e−01)
WFG1 5   2.828e+00 (1.057e−01)   2.927e+00 + (3.726e−01)   3.203e+00 + (2.941e−01)   3.171e+00 + (3.173e−01)   3.701e+00 + (2.053e−01)   1.953e+00 − (3.278e−01)
WFG1 8   5.721e+00 (1.714e−01)   5.230e+00 − (1.572e−01)   5.139e+00 − (1.437e−01)   5.520e+00 − (1.635e−01)   6.623e+00 + (1.052e−01)   5.769e+00 = (1.678e−01)
WFG1 10  7.146e+00 (3.182e−02)   7.071e+00 − (5.709e−02)   7.238e+00 + (4.083e−02)   7.182e+00 + (4.281e−02)   9.541e+00 + (3.219e−01)   7.816e+00 + (9.023e−02)
WFG1 15  8.942e+00 (1.284e−01)   9.079e+00 + (1.673e−01)   9.057e+00 + (3.073e−01)   9.149e+00 + (3.726e−01)   1.183e+01 + (2.961e−01)   1.235e+00 + (3.618e−01)
WFG2 3   2.149e−01 (6.137e−02)   2.839e−01 + (1.040e−01)   3.218e−01 + (8.931e−02)   3.157e−01 + (4.366e−02)   1.317e+00 + (7.013e−02)   3.714e−01 + (4.827e−02)
WFG2 5   5.237e−01 (9.814e−02)   6.125e−01 + (1.375e−01)   9.052e−01 + (2.781e−01)   7.026e−01 + (1.437e−01)   3.971e+00 + (2.739e−01)   1.411e+00 + (3.894e−01)
WFG2 8   2.316e+00 (1.835e−01)   3.146e+00 + (1.379e−01)   2.007e+00 − (2.371e−01)   2.572e+00 + (1.638e−01)   8.837e+00 + (5.462e−01)   2.885e+00 + (4.732e−01)
WFG2 10  2.037e+00 (2.171e−01)   2.923e+00 + (4.518e−01)   3.592e+00 + (4.178e−01)   2.964e+00 + (3.926e−01)   1.027e+01 + (1.001e+00)   2.1416e+00 = (9.287e−01)
WFG2 15  5.187e+00 (1.373e−01)   6.223e+00 + (2.381e−01)   5.250e+00 = (5.936e−01)   4.945e+00 = (3.826e−01)   1.346e+01 + (2.964e−01)   4.014e+00 − (5.382e−01)
WFG3 3   2.163e−01 (2.647e−02)   3.791e−01 + (8.167e−02)   1.489e−01 − (6.916e−03)   1.977e−01 − (3.283e−02)   1.793e−01 − (3.034e−02)   2.361e−01 + (4.624e−02)
WFG3 5   4.746e−01 (6.437e−03)   5.274e−01 + (7.136e−03)   4.793e−01 + (6.374e−03)   4.827e−01 + (5.378e−03)   5.418e−01 + (2.542e−02)   7.633e−01 + (1.091e−01)
WFG3 8   1.308e+00 (2.748e−02)   1.709e+00 + (1.061e−01)   1.427e+00 + (2.135e−02)   1.604e+00 + (3.375e−02)   1.829e+00 + (4.873e−02)   2.487e+00 + (3.823e−02)
WFG3 10  1.864e+00 (2.073e−02)   2.176e+00 + (2.874e−01)   1.725e+00 − (3.271e−02)   1.845e+00 = (4.273e−02)   2.966e+00 + (7.627e−02)   3.369e+00 + (6.379e−02)
WFG3 15  2.815e+00 (2.733e−01)   4.206e+00 + (1.537e−01)   2.963e+00 + (2.736e−01)   3.028e+00 + (1.893e−01)   5.265e+00 + (8.733e−02)   6.738e+00 + (1.284e−01)
WFG4 3   2.043e−01 (2.274e−03)   2.147e−01 + (3.859e−04)   2.317e−01 + (7.352e−03)   2.272e−01 + (3.72e−03)   2.475e−01 + (3.758e−03)   3.581e−01 + (3.146e−03)
WFG4 5   9.635e−01 (3.762e−03)   9.865e−01 + (4.873e−03)   9.535e−01 + (5.378e−03)   9.526e−01 − (3.288e−03)   1.284e+00 + (3.725e−02)   1.676e+00 + (1.003e−02)
WFG4 8   3.021e+00 (4.887e−03)   3.262e+00 + (6.256e−03)   3.023e+00 = (1.567e−02)   3.114e+00 + (6.331e−03)   6.642e+00 + (1.526e−02)   4.6209e+00 + (2.462e−02)
WFG4 10  4.063e+00 (1.879e−02)   4.621e+00 + (2.834e−02)   3.982e+00 = (3.274e−02)   3.870e+00 − (2.716e−01)   9.826e+00 + (1.873e−01)   6.698e+00 + (2.706e−01)
WFG4 15  8.926e+00 (2.768e−01)   9.732e+00 + (3.381e−02)   8.541e+00 − (2.834e−01)   8.737e+00 − (2.762e−01)   1.496e+01 + (9.276e−03)   1.103e+01 + (1.241e−01)
# +/=/−   —   18/0/2   12/3/5   12/2/6   19/0/1   16/2/2
Table 7 shows a summary of the statistical test results from Table 4, in which NSGA-III-WA is compared with the five other advanced many-objective algorithms and the numbers of wins (+), ties (=), and losses (−) are counted. As can be seen from the table, NSGA-III-WA is clearly superior to the five selected state-of-the-art designs.

From Table 5, NSGA-III-WA obtains the best results especially for objectives 5, 10, and 15 on DTLZ1; 8, 10, and 15 on DTLZ2 and DTLZ3; 3, 5, and 8 on DTLZ4; and 5, 8, and 15 on DTLZ5. Moreover, objectives 3 and 8 on DTLZ1, objectives 3 and 5 on DTLZ2, objective 3 on DTLZ3, and objective 10 on DTLZ5 achieve the second best results. Nevertheless, on DTLZ5 and DTLZ6 the results of NSGA-III-WA are not significant, because DTLZ5 and DTLZ6 test the ability to converge to a curve. Since NSGA-III-WA needs to build a hypersurface, the M extreme points cannot be found in the later stage of the algorithm to construct the hypersurface, so it cannot converge to a curve well. In addition, NSGA-III-WA has noticeable effects on the other test functions and is a stable and relatively comprehensive algorithm. Table 8 shows a summary of the statistical test results from Table 5; it can be seen from the table that NSGA-III-WA outperforms the other five algorithms.

From Table 6, it can be seen that NSGA-III-WA can effectively handle most test problems. It obtains the best results especially for objectives 3, 8, 10, and 15 on DTLZ1; 5 and 10 on DTLZ2; 3, 5, 10, and 15 on DTLZ3; 10 and 15 on DTLZ4; 5, 8, and 10 on DTLZ5; and 3, 5, and 8 on DTLZ6. Moreover, objective 5 on DTLZ1, objective 15 on DTLZ2, objectives 3 and 5 on DTLZ4, objectives 3 and 15 on DTLZ5, and objectives 10 and 15 on DTLZ6 achieve the second best results. However, the performance for objective 3 on DTLZ2 and objective 8 on DTLZ3 is poor. Although NSGA-III, VAEA, RVEA, and MOEA/D can obtain optimal values for particular dimensions of individual functions, NSGA-III-WA has the best overall performance considering the results over all objective dimensions. Table 9 shows a summary of the statistical test results from Table 6; as can be seen from the table, the hypervolume performance of NSGA-III-WA is better than that of the other five algorithms.

In order to express the effect of the algorithms more intuitively, their performance is presented in the form of box plots. Due to space limitations, only the analysis of the box plots under five objectives and fifteen objectives is given here. Figures 3-8 show the performance box plots under five objectives, and Figures 9-14 show the performance box plots under fifteen objectives. Each box plot is calculated from 30 independent runs and reflects the median, maximum, minimum, upper quartile, lower quartile, and outliers of the five algorithms on the indicators GD, IGD, and HV.

From Figures 3-8, it can be seen that NSGA-III-WA achieves better results on most test problems; its convergence and breadth are significantly better than those of the other algorithms. The overall performance indicators achieve the best results on DTLZ1 and DTLZ4 and the second best results on DTLZ2 and DTLZ3. Although the minimum value is obtained on DTLZ5 and DTLZ6, outliers exist, indicating that the algorithm is relatively unstable there. This is because DTLZ5 and DTLZ6 test the ability of the algorithm to converge to a curve, while NSGA-III-WA needs to build a hypersurface and therefore cannot converge to a curve well. However, the overall robustness of NSGA-III-WA is relatively good across all test function results.

From Figures 9-14, it can be seen that NSGA-III-WA can handle most problems under 15 objectives. Its convergence on DTLZ1-DTLZ5 is significantly better than that of the other algorithms, and the overall performance obtains the best results on DTLZ1-DTLZ3, DTLZ5, and DTLZ6. Under 15 objectives, NSGA-III-WA obtains the minimum on DTLZ5 and DTLZ6 but is relatively unstable there. The breadth achieves the best results on DTLZ1, DTLZ3, and DTLZ4 and the second best results on DTLZ2, DTLZ5, and DTLZ6, with some outliers. With 15 objectives, it is evident that the outliers of every algorithm increase; this is explained by the fact that the stability of the algorithms declines in the high-dimensional space due to the increase of the spatial breadth. Judging from the results of all test functions, NSGA-III-WA has better stability.

In order to visually reflect the distribution of the solution sets in the high-dimensional objective space, parallel coordinates are used to visualize the high-dimensional data, as shown in Figure 15.

From Figure 15, it can be seen that the final solution sets found by NSGA-III-WA and RVEA on this problem are similar in convergence and distribution. In contrast, MOEA/D-M2M and NSGA-III are slightly worse distributed than the above algorithms, and the distribution of the solutions found by VAEA is poor. MOEA/D produces concentrated solutions, loses the extreme solutions around objective 12, and its distribution is seriously deficient.
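A parallel-coordinate view like Figure 15 can be reproduced with a few lines of matplotlib; the sketch below assumes an already normalized (n, 15) array of objective values and is illustrative only.

```python
import numpy as np
import matplotlib.pyplot as plt

def parallel_plot(F, title, ax=None):
    """Parallel-coordinate view of a final solution set F (n, M): one
    polyline per solution, objective index on the x-axis (cf. Figure 15)."""
    if ax is None:
        ax = plt.gca()
    M = F.shape[1]
    for row in F:
        ax.plot(range(1, M + 1), row, color="steelblue", alpha=0.3, lw=0.7)
    ax.set_xlabel("Objective number")
    ax.set_ylabel("Objective value")
    ax.set_title(title)

# Example with a random normalized 15-objective solution set.
F = np.random.default_rng(3).random((100, 15))
parallel_plot(F / F.sum(axis=1, keepdims=True), "toy solution set")
plt.show()
```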
Table 12: The HV average and standard deviation of NSGA-III-WA and other five algorithms on WFG1-4 testing problems.

Problem M   NSGA-III-WA   NSGA-III   VAEA   RVEA   MOEA/D   MOEA/D-M2M
WFG1 3   5.114e−01 (2.425e−02)   5.013e−01 + (2.279e−02)   5.217e−01 − (2.725e−02)   4.963e−01 + (2.673e−02)   4.927e−01 + (9.245e−03)   4.824e−01 + (5.245e−02)
WFG1 5   4.725e−01 (4.269e−03)   4.632e−01 + (5.352e−03)   5.172e−01 − (4.281e−03)   4.824e−01 = (3.729e−03)   5.793e−01 − (4.736e−03)   4.875e−01 = (4.237e−03)
WFG1 8   4.481e−01 (2.724e−03)   4.116e−01 + (3.861e−03)   4.480e−01 = (3.783e−03)   4.376e−01 = (2.751e−02)   4.472e−01 = (2.747e−02)   4.324e−01 = (3.783e−02)
WFG1 10  6.063e−01 (3.217e−02)   5.937e−01 = (6.273e−02)   5.997e−01 = (4.238e−02)   6.218e−01 − (3.628e−02)   4.926e−01 + (1.026e−01)   5.382e−01 + (7.375e−02)
WFG1 15  6.279e−01 (2.781e−02)   6.163e−01 + (3.273e−02)   6.181e−01 + (2.672e−02)   6.197e−01 + (2.418e−02)   3.472e−01 + (2.163e−01)   4.781e−01 + (1.784e−01)
WFG2 3   8.373e−01 (4.263e−02)   8.524e−01 − (3.279e−02)   8.393e−01 = (3.789e−02)   8.334e−01 = (2.274e−02)   7.251e−01 + (3.194e−02)   8.425e−01 − (2.785e−02)
WFG2 5   9.602e−01 (3.793e−02)   9.547e−01 + (3.926e−02)   9.482e−01 + (4.821e−02)   9.376e−01 + (4.245e−02)   9.172e−01 + (8.278e−02)   9.318e−01 + (6.351e−02)
WFG2 8   9.223e−01 (2.891e−02)   9.502e−01 − (3.268e−02)   9.172e−01 + (2.753e−02)   9.514e−01 − (2.724e−02)   8.702e−01 + (6.932e−02)   8.945e−01 + (4.837e−02)
WFG2 10  9.492e−01 (2.893e−02)   9.471e−01 + (2.062e−02)   9.261e−01 + (3.625e−02)   9.372e−01 + (1.341e−01)   8.981e−01 + (1.826e−01)   9.148e−01 + (4.782e−01)
WFG2 15  9.715e−01 (2.715e−02)   9.678e−01 + (2.361e−02)   9.483e−01 + (2.936e−02)   9.572e−01 + (3.798e−02)   7.815e−01 + (3.781e−01)   8.147e−01 + (3.461e−01)
WFG3 3   8.068e−01 (3.278e−03)   8.136e−01 − (2.267e−03)   7.978e−01 + (4.624e−03)   5.749e−01 + (2.411e−02)   7.371e−01 + (4.267e−02)   5.361e−01 + (3.283e−02)
WFG3 5   8.723e−01 (2.798e−03)   8.825e−01 − (3.142e−03)   8.734e−01 = (3.682e−03)   5.921e−01 + (5.274e−02)   7.702e−01 + (9.257e−02)   5.032e−01 + (3.863e−02)
WFG3 8   9.267e−01 (3.278e−03)   9.241e−01 + (5.261e−03)   9.257e−01 + (4.258e−02)   7.026e−01 + (2.794e−02)   7.315e−01 + (8.903e−02)   8.461e−01 + (6.352e−02)
WFG3 10  9.349e−01 (1.392e−03)   9.352e−01 = (3.267e−03)   9.392e−01 − (2.674e−03)   5.252e−01 + (2.493e−02)   4.315e−01 + (1.367e−01)   7.947e−01 + (6.375e−01)
WFG3 15  9.318e−01 (2.717e−03)   9.264e−01 + (3.257e−03)   9.381e−01 − (2.916e−03)   6.735e−01 + (4.784e−02)   7.106e−01 + (8.628e−02)   5.375e−01 + (7.783e−02)
WFG4 3   6.997e−01 (3.278e−03)   6.805e−01 + (2.581e−03)   6.885e−01 = (3.528e−03)   7.293e−01 − (3.271e−03)   6.697e−01 + (3.275e−02)   6.019e−01 + (4.251e−02)
WFG4 5   8.674e−01 (2.678e−03)   8.640e−01 = (4.216e−03)   8.601e−01 + (3.728e−03)   8.756e−01 − (3.782e−02)   8.602e−01 = (5.272e−03)   8.327e−01 + (6.429e−03)
WFG4 8   9.147e−01 (2.791e−03)   9.020e−01 + (3.736e−03)   9.103e−01 = (3.827e−03)   9.128e−01 = (3.782e−03)   7.502e−01 + (2.861e−02)   8.462e−01 + (4.571e−02)
WFG4 10  8.573e−01 (3.728e−03)   8.517e−01 + (4.286e−03)   8.237e−01 + (3.183e−03)   8.603e−01 = (4.247e−03)   7.136e−01 + (2.271e−02)   8.354e−01 + (1.180e−02)
WFG4 15  9.114e−01 (1.273e−03)   9.077e−01 + (7.263e−03)   9.105e−01 = (3.726e−03)   8.982e−01 + (4.251e−03)   4.525e−01 + (1.597e−01)   7.239e−01 + (2.743e−01)
# +/=/−   —   13/3/4   9/7/4   11/5/4   17/2/1   17/2/1
19
20 Computational Intelligence and Neuroscience
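The "+", "≈", and "−" marks in Tables 10–13 summarize pairwise Wilcoxon rank-sum comparisons of the per-run indicator values. A minimal sketch of one such comparison with SciPy follows; the run counts, sample values, and the 0.05 level are illustrative assumptions, not restated from the paper.

    # Sketch: rank-sum comparison of two algorithms' HV samples (higher HV is better).
    import numpy as np
    from scipy.stats import ranksums

    rng = np.random.default_rng(2)
    hv_wa    = rng.normal(0.947, 0.003, 20)   # hypothetical NSGA-III-WA runs
    hv_other = rng.normal(0.937, 0.013, 20)   # hypothetical competitor runs

    stat, p = ranksums(hv_wa, hv_other)
    if p >= 0.05:
        mark = "~"                            # no significant difference
    else:
        mark = "+" if hv_wa.mean() > hv_other.mean() else "-"
    print(mark, p)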
From the results in Table 10, it can be seen that NSGA-III-WA can handle most of the considered examples well. In particular, it achieves the best overall performance on objectives 3, 5, and 10 on the WFG2 instances and on objectives 5, 8, and 15 on the WFG3 instances. In addition, it achieves the best performance on objective 15 on WFG8 and on objectives 3 and 8 on WFG9. VAEA performed well on objective 8 on the WFG1 and WFG2 test instances and also achieved good results on objectives 3, 4, and 10 on WFG3. RVEA obtains the best IGD value on objective 3 on WFG1 and on objectives 5 and 10 on WFG4. It is worth noting that RVEA performs poorly on the WFG2 and WFG3 instances, although it still performs relatively well compared with the NSGA-III and MOEA/D algorithms. NSGA-III and MOEA/D-M2M typically have moderate performance on most WFG problems and achieve good results only on specific WFG test instances. MOEA/D does not produce satisfactory results on any WFG test instance, and as the number of objectives increases, its results gradually deteriorate. Table 11 shows a summary of the statistical test results from Table 10. It can be seen from the table that the performance of NSGA-III-WA is significantly better than that of the other five algorithms.
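As a reminder of what Table 10 compares: IGD averages, over a set of points sampled from the true Pareto front, the distance from each reference point to the nearest obtained solution, so lower values indicate better convergence and coverage. A minimal sketch follows; the arrays are illustrative stand-ins, not the paper's data.

    # Sketch: inverted generational distance (IGD).
    import numpy as np

    def igd(ref_front, solutions):
        # For every reference point, the distance to the closest obtained solution.
        d = np.linalg.norm(ref_front[:, None, :] - solutions[None, :, :], axis=2)
        return d.min(axis=1).mean()

    rng = np.random.default_rng(3)
    ref = rng.random((500, 3))     # hypothetical sample of the true front
    sols = rng.random((100, 3))    # hypothetical obtained solution set
    print(igd(ref, sols))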
From the results in Table 12, it can be seen that NSGA-III-WA obtains the best performance on most of the high-dimensional objective problems. NSGA-III works well on the WFG1 and WFG2 test instances, and VAEA also gets good results on objectives 10 and 15 on the WFG3 test instances. RVEA obtains the best HV value on objectives 3, 5, and 10 on WFG4. MOEA/D and MOEA/D-M2M are not very effective on these instances. Table 13 shows a summary of the statistical test results from Table 12. The performance of NSGA-III-WA on the three-objective problems is not very prominent, and on the eight-objective problems its performance is on a par with that of RVEA, but NSGA-III-WA achieves better performance on the high-dimensional objective problems. In general, the NSGA-III-WA algorithm outperforms the other five algorithms on this indicator.
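HV, the indicator reported in Table 12, is the volume of objective space dominated by the solution set and bounded by a reference point, so larger values are better. For many objectives it is commonly estimated by Monte Carlo sampling; a minimal sketch for a minimization problem follows, with data and reference point as illustrative assumptions.

    # Sketch: Monte Carlo estimate of the hypervolume indicator (minimization).
    import numpy as np

    def hv_estimate(solutions, ref_point, n_samples=100_000, seed=4):
        rng = np.random.default_rng(seed)
        # Sample uniformly in the box [0, ref_point]; a sample is dominated
        # if some solution is componentwise no worse than it.
        samples = rng.random((n_samples, solutions.shape[1])) * ref_point
        dominated = (solutions[None, :, :] <= samples[:, None, :]).all(axis=2).any(axis=1)
        return dominated.mean() * np.prod(ref_point)

    rng = np.random.default_rng(5)
    sols = rng.random((100, 5))            # hypothetical 5-objective solution set
    print(hv_estimate(sols, ref_point=np.ones(5) * 1.1))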
In summary, after comparing the test results on the GD, IGD, and HV performance indicators, the NSGA-III-WA algorithm is superior.
6. Conclusion

This paper proposes a many-objective optimization algorithm based on weight vector adjustment, which increases the individuals' ability to evolve through new differential evolution strategies and, at the same time, dynamically adjusts the weight vectors by means of K-means clustering to make them as evenly distributed as possible on the objective surface. The NSGA-III-WA algorithm has good convergence ability and good distribution. To prove its effectiveness, NSGA-III-WA is experimentally compared with five other state-of-the-art algorithms on the DTLZ test set and the WFG test instances. The experimental results show that the proposed NSGA-III-WA performs well on the DTLZ and WFG test instances we studied, and the obtained solution set has good convergence and distribution. However, the proposed algorithm has high complexity, and it only alleviates the problem of sensitive frontiers. Further research will be conducted on these problems.
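As a schematic illustration of the K-means-based adjustment idea summarized above (a rough reading of the strategy, not the authors' exact procedure), the sketch below re-clusters the population's normalized objective vectors and reuses the re-normalized cluster centers as weight vectors, so that vectors become denser where solutions crowd. The population, cluster count, and normalization are illustrative assumptions; 91 is only the customary three-objective reference-vector count.

    # Sketch: K-means-based weight-vector adjustment (schematic, not the
    # authors' exact procedure). The population objectives are illustrative.
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(6)
    objs = rng.random((200, 3))                      # hypothetical population objectives

    # Project objective vectors onto the unit simplex, then cluster them.
    norm = objs / objs.sum(axis=1, keepdims=True)
    km = KMeans(n_clusters=91, n_init=10, random_state=0).fit(norm)

    # Re-normalized cluster centers serve as the adjusted weight vectors.
    weights = np.clip(km.cluster_centers_, 1e-6, None)
    weights /= weights.sum(axis=1, keepdims=True)
    print(weights.shape)                             # (91, 3), one vector per subspace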
Data Availability

The data used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this paper.

Acknowledgments

This work was supported in part by the National Natural Science Foundation of China under Grant nos. 61501107 and 61501106, the Education Department of Jilin Province Science and Technology Research Project of "13th Five-Year" under Grant no. 95, and the Project of Scientific and Technological Innovation Development of Jilin under Grant nos. 201750219 and 201750227.