
Received: May 23, 2023. Revised: August 1, 2023.


Hybrid Intrusion Detection Systems Based Mean-Variance Mapping Optimization Algorithm and Random Search

Adil L. Albukhnefis1*, Amar A. Sakran2, AtaAllah Saleh Mahe3, Maryam Imran Mousa4, Ahmed Mohsin Mahdi1, Aqeel Hamza Al-Fatlawi5

1 College of Computer Science and Information Technology, University of Al-Qadisiyah, Iraq
2 College of Biomedical Informatics, University of Information Technology and Communication, Iraq
3 Ministry of Industry and Minerals, Iraq
4 Ministry of Education, Al-Qadisiyah Education Directorate, Iraq
5 Department of Computer Techniques, Imam Kadhum College, Iraq
* Corresponding author's Email: adil.lateef@qu.edu.iq

Abstract: Intrusion detection systems are critical in identifying and mitigating cyber threats. However, intrusion data often contain insufficient features, which can adversely affect the classification accuracy of machine learning algorithms. To effectively select optimal features from intrusion attacks, a highly efficient model is required to extract highly correlated features. Nevertheless, traditional detection systems may suffer from low accuracy and high false-positive rates. This paper proposes a hybrid model for improving intrusion detection systems using mean-variance mapping optimization (MVMO) and random search (MVMOR). In the proposed model, the MVMO algorithm searches for the optimal feature subset, while random search is employed to optimize the hyperparameters of the machine learning algorithm (the classifier). The hybrid model thus optimizes the feature selection and the classifier parameters simultaneously. The performance of the hybrid model is evaluated on the NSL-KDD benchmark dataset. The proposed MVMOR achieves an accuracy of 88%, while the conventional MVMO achieves only an 80% accuracy rate. The empirical results show that the proposed model has the potential to offer better protection against various cyber threats, thus making a valuable contribution to the field of cybersecurity.
Keywords: Intrusion detection systems, Optimize machine learning algorithms, Feature selection, Mean-variance
mapping optimization (MVMO), Cybersecurity.

1. Introduction

Meta-heuristic optimization is a field of computer science that uses algorithms and engineering models to optimize candidate solutions and discover potentially best solutions using random factors [1]. Several meta-heuristic optimization algorithms are inspired by nature or animal behaviour [2], such as particle swarm optimization (PSO) [3], the genetic algorithm (GA) [4], the gray wolf optimizer (GWO) [5], ant colony optimization (ACO) [6], the bat search algorithm (BSA) [7], and dolphin echolocation (DoE) [8]. These algorithms are highly efficient in searching, flexible in solving various optimization problems, and perform better than conventional search methods.

The exponential data growth in today's data science is due to the increasing number of devices that generate this data [9]. Feature selection is a valuable data mining tool for reducing big or noisy data to a set of informative features. The wrapper model, which relies on trial and error to select relevant features that match the data's goal, is one feature selection strategy [10]. Metaheuristic optimization structures the wrapper model's randomness [11, 12]. In particular, this work uses the mean-variance mapping optimization (MVMO) algorithm for feature ranking and random search for parameter tuning. MVMO is a highly efficient and flexible algorithm for various problems in the context of meta-heuristic optimization [13-15].

The algorithm has several unique statistical properties, including the use of a mapping function for the mutation operation based on two critical statistical functions: the mean and the variance of the set of best solutions. Developed initially for meta-heuristic searches in the continuous domain, MVMO is limited to a discrete interval of [0, 1] when selecting features via binary search. It is essential to maintain high diversity to avoid stagnation or sliding into local optima, as noted in [16, 17]. The hybrid algorithm has the potential to support frequent jumps toward the global solution, improving the overall progress of the search.

An intrusion detection system (IDS) analyses the frequency and nature of attacks, and organizations can use this data to improve their security measures or implement more efficient controls [18]. In addition, an intrusion detection system can help organizations identify network device configuration issues or errors. Fig. 1 shows the scenario of applying the IDS in networks.

Figure 1. The scenario of applying the IDS in networks

Integrating metaheuristic algorithms and wrapper models into intrusion detection systems provides an intelligent and adaptive approach to cyber threat detection and mitigation. It offers improved accuracy, adaptability, scalability, and automation, overcoming the limitations of traditional rule-based methods and improving the overall security of networks and systems. This paper proposes a hybrid model, MVMOR, that combines mean-variance mapping optimization (MVMO) and random search to enhance intrusion detection systems. In the proposed model, the MVMO algorithm searches for the ideal feature subset, while random search optimizes the machine learning algorithm's (classifier's) hyperparameters. The model also uses meta-heuristic optimization to identify the most important features for network attribute analysis, so that feature selection and classifier parameters are optimized simultaneously.

• Contributions

The contributions highlight this research's innovative techniques and methodologies, leading to significant progress in intrusion detection systems. Several noteworthy contributions are presented in the proposed model:
1. It introduces MVMOR, a novel hybrid model combining mean-variance mapping optimization (MVMO) with random search. This hybrid approach aims to improve the performance of intrusion detection systems.
2. It uses metaheuristic optimization algorithms to refine the proposed solutions and effectively identify the critical features for network attribute analysis. This process ensures comprehensive and accurate network attribute evaluation.
3. The proposed methodology achieves better performance and overall system optimization by optimizing feature selection and classifier parameters simultaneously. By considering these two aspects together, the method ensures comprehensive optimization and improved performance.

• Paper organization

The remainder of this paper is organized as follows. Section 2 presents related work on feature selection and intrusion detection. Sections 3 and 4 introduce the principles of MVMO and the wrapper model, respectively. Section 5 describes the proposed model and its components, and section 6 lists the parameter search spaces used by the random search. The experimental results and analysis are discussed in section 7. Finally, the conclusions of this work and future improvements are presented in section 8.

2. Related works

Numerous studies have suggested optimizing feature selection (F.S.). However, the majority of these studies have utilized standalone metaheuristic algorithms, such as particle swarm optimization (PSO), the gray wolf optimizer (GWO), and the bat search algorithm (B.A.), without improving the search operations within these algorithms. Few of them seek to optimize the feature selection and the classifier parameters jointly in order to improve the accuracy of intrusion detection systems.


In their study, Almasoudy et al. [18] introduced a hybrid model based on the bat algorithm (B.A.) for optimizing support vector machine (SVM) parameters and selecting optimal features. The model utilized a pool of bat vectors, where the first two positions were assigned to SVM parameters, and the rest of the vector represented the feature selection mask. The model's reliance on parameter adjustment based on gradients introduces limitations in terms of the search space for the algorithm. Additionally, using a wrapper method for feature selection does not guarantee a reduction in the number of selected features. Consequently, the performance of the SVM algorithm may suffer due to its inefficiency in handling high-dimensional problems.

In another study [12], two metaheuristic algorithms, namely particle swarm optimization (PSO) and the bat search algorithm (BSA), were proposed to search for the best solutions individually, which were then shared between the two algorithms. However, the proposed model suffered from stagnation and failed to mitigate the local optima problem.

In [19], the authors proposed a method in the field of feature selection. It combines the Naive Bayes classifier and the bat algorithm to select a subset of features that contribute most to classification accuracy. However, a potential limitation of this approach is the assumption of feature independence in Naive Bayes, which may not hold in real-world scenarios where features are correlated or have complex relationships. This limitation could affect the model's ability to accurately capture feature dependencies, potentially leading to suboptimal feature subsets and lower performance in feature selection tasks.

Taha et al. [20] utilized Naïve Bayes (N.B.) to assist B.A. in selecting optimal subgroup features. They proposed decreasing the bat's velocity when the difference between the past and current position is negative. However, this system failed to improve the behaviour-based search progress, and the variety of proposed solutions still left much to be desired when exploring the search process.

R. Nuiaa et al. [21] proposed the proactive feature selection (PFS) model for detecting cyber-attacks based on subset feature selection. They introduced a nature-inspired optimization algorithm and a proactive feature selection threshold to optimize the feature selection technique. However, the proposed model was only applied to optimize feature selection and did not improve the overall search process.

3. Mean-variance mapping optimization (MVMO)

MVMO is a type of population-based stochastic optimization algorithm and a novel optimization approach [8, 14, 15]. Like other stochastic optimization methods, MVMO employs evolutionary operations such as selection, crossover, and mutation [20]. However, what sets MVMO apart is that it constrains the search space of all optimization variables, and the output of its internal operations, to the interval [0, 1]. Additionally, MVMO applies a mapping function as a mutation operation on the offspring generated through the crossover. This mapping function is calculated based on the mean and variance of the n best solutions and is used to optimize the offspring further. Eqs. (1) to (5) give the mathematical forms used to generate the new offspring.

x̄_j = (1/n) ∑_{i=1}^{n} x_{i,j}                                  (1)

where x̄_j is the mean value of the j-th variable over the n best solutions stored in the archive, n is the archive size, and j is the variable index.

v_j = (1/n) ∑_{i=1}^{n} (x_{i,j} − x̄_j)²                         (2)

where v_j is the corresponding variance. The new offspring value is generated by applying the h-function:

x_j = h_x + (1 − h_1 + h_0) · x'_j − h_0                          (3)

where x'_j is the offspring value produced by the crossover and h is the mapping (h-) function defined as:

h(x̄_j, s_1, s_2, x) = x̄_j · (1 − e^(−x·s_1)) + (1 − x̄_j) · e^(−(1−x)·s_2)     (4)

with h_x = h(x = x'_j), h_0 = h(x = 0), and h_1 = h(x = 1).

The shape variables s_1 and s_2 depend on the value of s_j, which is calculated by Eq. (5):

s_j = −ln(v_j) · f_s                                              (5)

where f_s is a scaling factor that controls the shape variables.
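To make the mapping operation concrete, the short Python sketch below implements Eqs. (1) to (5) for a single offspring vector and then applies the 0.5 threshold described later in section 5.2 to obtain a binary feature mask. It is an illustrative reconstruction from the equations above, not the authors' code; the archive size, the scaling factor fs, the use of a single shape value (s1 = s2 = s) and the threshold are assumptions made for the example.

    import numpy as np

    def mvmo_mapping(offspring, archive, fs=1.0):
        """Apply the MVMO mean-variance mapping (Eqs. 1-5) to one offspring.

        offspring : 1-D array of crossover values in [0, 1]
        archive   : 2-D array (n_best x n_vars) of the best solutions found so far
        fs        : scaling factor controlling the shape variables (assumed value)
        """
        x_mean = archive.mean(axis=0)                   # Eq. (1), per variable
        v = archive.var(axis=0)                         # Eq. (2), per variable
        s = -np.log(np.clip(v, 1e-12, None)) * fs       # Eq. (5), guarded against v = 0

        def h(x):                                       # Eq. (4), with s1 = s2 = s
            return x_mean * (1 - np.exp(-x * s)) + (1 - x_mean) * np.exp(-(1 - x) * s)

        hx, h0, h1 = h(offspring), h(0.0), h(1.0)
        return hx + (1 - h1 + h0) * offspring - h0      # Eq. (3)

    def to_feature_mask(mapped, threshold=0.5):
        """Binary feature-selection mask: keep variables at or above the threshold."""
        return mapped >= threshold

    # usage sketch: archive of 5 best solutions, 10 candidate features
    rng = np.random.default_rng(0)
    archive = rng.random((5, 10))
    child = rng.random(10)
    mask = to_feature_mask(mvmo_mapping(child, archive))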
4. Wrapper model

Wrapper model feature selection is a widespread technique in the field of machine learning used to select the most relevant features for a given task [21].
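As a rough sketch of the wrapper strategy detailed in the remainder of this section, the example below scores candidate feature subsets by training a classifier on each subset and keeping the best-scoring one. It is a generic scikit-learn illustration under assumed choices (random subset generation, a random forest as the wrapped learner, three-fold cross-validation), not the implementation used in the paper.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    def wrapper_search(X, y, n_trials=20, seed=42):
        """Naive wrapper feature selection: random subsets scored by a classifier."""
        rng = np.random.default_rng(seed)
        best_mask, best_score = None, -np.inf
        for _ in range(n_trials):
            mask = rng.random(X.shape[1]) >= 0.5        # feature subset generation phase
            if not mask.any():
                continue
            clf = RandomForestClassifier(n_estimators=100, random_state=0)
            score = cross_val_score(clf, X[:, mask], y, cv=3).mean()  # model evaluation phase
            if score > best_score:
                best_mask, best_score = mask, score
        return best_mask, best_score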

Figure 2. Wrapper feature selection model

The method involves training a model multiple times using various subsets of the available features and evaluating each model's performance to identify the most crucial features. This approach aims to maximize the model's performance while minimizing the number of features used [22].

The process of feature selection is viewed as an optimization problem. The wrapper model utilizes a particular learning algorithm, such as a decision tree or a neural network, as the wrapper to assess the model's performance on a specific task and determine which features are most significant. The wrapper model has several advantages, including its ability to handle complex feature interactions and non-linear relationships, and it provides a more accurate and reliable feature selection process than other methods, such as filter or embedded feature selection techniques. The method involves two primary phases: feature subset generation and model evaluation [23-25].

During the feature subset generation stage, a subset of the available features is selected for each model iteration, either randomly or using techniques such as forward or backward selection or genetic algorithms. The wrapper approach then trains the model on the subset of features that performs best after being evaluated using a specific performance metric, such as accuracy.

The main drawback of the wrapper model is the computational cost of training multiple models with various subsets of features, which can be especially problematic for large datasets or complex models that require a lot of computational power [26]. Different methods have been proposed to address this issue, such as training the models using a subset of the data or using model ensembles to reduce variance in the feature selection process. Despite its drawbacks, the wrapper model remains a widely used and effective technique for feature selection in machine learning. Fig. 2 shows the principles of the wrapper model.

5. Proposed MVMOR model

The system under consideration comprises three fundamental phases: initialization of the machine learning parameters and data preparation, feature selection, and tuning of the machine learning (ML) algorithm parameters. Fig. 3 illustrates the main steps and strategy of the proposed model (MVMOR).

5.1 Initialize the algorithm parameters and prepare the dataset

The proposed approach uses one-hot encoding to convert sets of attributes from nominal to numeric values for analysis. The pre-processing phase is also crucial in normalizing the dataset to a specific scale of values within a defined range. This ensures that bias is removed from the data set while its statistical properties are preserved. The data are then divided into training and testing sets: the model is trained on the training dataset, while the test dataset serves as a means to validate the model's effectiveness. It is worth noting that one-hot encoding is a very effective means of converting nominal attributes into numerical attributes.
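A minimal sketch of this preparation step using pandas and scikit-learn: one-hot encode the nominal attributes, normalise the values to a fixed range, and split the data into training and test sets. The label column name, the [0, 1] scaling range and the 80/20 split are assumptions for illustration; the paper does not specify them.

    import pandas as pd
    from sklearn.model_selection import train_test_split
    from sklearn.preprocessing import MinMaxScaler

    def prepare(df: pd.DataFrame, label_col: str = "label"):
        """One-hot encode nominal attributes, scale to [0, 1], and split the data."""
        y = df[label_col]
        X = df.drop(columns=[label_col])

        # one-hot encoding of non-numeric (nominal) attributes, e.g. a protocol column
        X = pd.get_dummies(X, columns=X.select_dtypes(exclude="number").columns)

        # min-max normalisation keeps every attribute on the same scale
        X = pd.DataFrame(MinMaxScaler().fit_transform(X), columns=X.columns)

        # hold out a test set to validate the trained model
        return train_test_split(X, y, test_size=0.2, random_state=42, stratify=y)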

Figure 3. Main steps of the proposed MVMOR model

The data can then be more easily analyzed and modeled, giving researchers valuable insights into the underlying patterns and behaviors of network traffic. The proposed approach therefore represents a powerful tool for conducting sophisticated analyses of network traffic attributes, which can help researchers better understand and address cybersecurity threats in real-world environments.

5.2 Feature selection

The wrapper model uses binary metaheuristic algorithms that restrict the search space to the binary interval [0, 1]. However, the MVMO algorithm improves the exploration process by extending the search interval to [-1, 1] to check the corresponding subset features. The user sets the feature selection threshold for each experiment, e.g., a threshold of 0.5, in which case values less than 0.5 are omitted, while items whose values are greater than or equal to the threshold are taken into the current generation. The search process is terminated when the maximum iteration limit is reached [23].

5.3 Tuning the parameters of machine learning

The proposed method uses a random search strategy to assign values to the parameters of the machine learning algorithm. Adding or removing a theta value changes the parameters, and the search process is repeated until the algorithm achieves better results. While this approach can lead to better results, it also leads to a longer, more time-consuming induction process. The proposed method has similarities with hill-climbing in terms of the underlying concept: both methods aim to find the optimal solution by iteratively improving the current solution. However, unlike hill-climbing, which takes a deterministic approach, the proposed method adopts a stochastic process in which the parameters are randomly changed to explore the solution space.

While random search is a popular approach for optimizing machine learning algorithms, it has certain limitations. For example, the method may lead to suboptimal solutions if the parameter space is large and the search process becomes impractical. In such cases, alternative optimization techniques like gradient descent or Bayesian optimization may provide better results. Although the random search strategy used here can produce better results, it may require a longer induction process, and its similarity with the hill-climbing algorithm suggests that iterative improvement is an effective optimization technique. Eq. (6) gives the main form of the random search update:

X'_i = θ/2 + (X_best − X_i) + (Max_itr / iter) · ε · X_i          (6)

where X_i is the current value, ε is a random variable in the interval [-1, 1], Max_itr is the maximum number of iterations, iter is the current iteration, θ is the perturbation (theta) step, and X'_i is the new value of the tuning parameter. The first part of Eq. (6) reduces the scattering of the search, while the second part reinforces the exploration of the random search algorithm. However, alternative optimization techniques may be necessary to achieve better results in certain situations.
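The sketch below shows one way to read Eq. (6) as the update rule inside a bounded random search over classifier hyperparameters. It is an interpretation of the equation for illustration only, not the authors' code; the parameter bounds, the fitness function and the value of θ are assumed.

    import random

    def random_search(fitness, bounds, max_itr=50, theta=0.1, seed=42):
        """Bounded random search; new values follow the Eq. (6)-style update."""
        rng = random.Random(seed)
        x = {p: rng.uniform(lo, hi) for p, (lo, hi) in bounds.items()}
        best, best_fit = dict(x), fitness(x)
        for itr in range(1, max_itr + 1):
            cand = {}
            for p, (lo, hi) in bounds.items():
                eps = rng.uniform(-1.0, 1.0)                # ε in [-1, 1]
                new = theta / 2 + (best[p] - x[p]) + (max_itr / itr) * eps * x[p]
                cand[p] = min(max(new, lo), hi)             # keep inside the allowed range
            fit = fitness(cand)
            if fit > best_fit:                              # keep the best parameters found
                best, best_fit = dict(cand), fit
            x = cand
        return best, best_fit

    # usage sketch with a toy fitness function (higher is better)
    bounds = {"n_estimators": (10, 1000), "max_depth": (2, 20)}
    best, score = random_search(lambda p: -abs(p["max_depth"] - 10), bounds)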

6. Optimization parameters of ML by random search

Tables 1, 2, 3, and 4 list the optimization parameters handled by the second part of the proposed MVMOR (the random search). The proposed model sets the maximum number of trials for searching randomly for the optimal parameters of the machine learning algorithm. The range of each parameter is set beforehand to bound the search space.

Table 1. Parameters of the random forest algorithm
Parameter        | Range                              | Type
n_estimators     | [10, 1000]                         | integer
max_depth        | [2, 20]                            | integer
min_samples_leaf | [1, 10]                            | integer
max_features     | [1, number of features in dataset] | integer
random_state     | [1, 0.42]                          | float

Table 2. Parameters of the K-NN algorithm
Parameter         | Range                                   | Type
Nearest neighbors | [1, (number of classes in dataset + 1)] | float

Table 3. Parameters of the naive Bayes algorithm
Parameter       | Range  | Type
Threshold value | [0, 1] | float

Table 4. Parameters of the decision tree algorithm
Parameter                          | Range     | Type
Maximum depth                      | [1, 1000] | integer
Minimum number of samples per leaf | [3, 25]   | integer
Minimum impurity decrease          | [1, 500]  | integer
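As an illustration of how these ranges can be encoded and sampled, the sketch below mirrors Table 1 with scikit-learn's RandomizedSearchCV standing in for the random-search tuner. The number of iterations and the cross-validation setting are assumptions; the paper only states that the maximum number of trials and the parameter ranges are fixed in advance.

    from scipy.stats import randint
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import RandomizedSearchCV

    # search space mirroring Table 1 (random forest); bounds are inclusive
    param_distributions = {
        "n_estimators": randint(10, 1001),
        "max_depth": randint(2, 21),
        "min_samples_leaf": randint(1, 11),
        # "max_features" would range from 1 to the number of selected features
    }

    tuner = RandomizedSearchCV(
        RandomForestClassifier(random_state=42),
        param_distributions=param_distributions,
        n_iter=30,          # maximum number of random trials (assumed)
        cv=3,
        scoring="accuracy",
        random_state=42,
    )
    # tuner.fit(X_train[:, mask], y_train)  # fit on the MVMO-selected feature subset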
7. Experiment results

This section describes the benchmark dataset and evaluates the performance of four commonly used machine learning algorithms: decision tree (D.T.), random forest (R.F.), naive Bayes (N.B.), and K-NN (k-nearest neighbours). Finally, the results obtained from our model are compared with those of recent studies on IDS.

7.1 Dataset

The proposed MVMOR is evaluated, and its validity verified, using two benchmark test sets for IDS (NSL_KDDTest+ and NSL_KDDTest-21). In addition, the performance of the machine learning algorithms (D.T., R.F., N.B., and KNN) is compared across the two optimization methods, MVMO and MVMOR, and the standalone method. The NSL_KDD [27] dataset includes a considerable volume of packets; the first test set contains four primary attacks, while the second includes nine effective attacks and two million packets. These datasets contain the TCP/IP header information of the TCP/IP suite. The NSL_KDD dataset includes four files: NSL_KDDTrain+ for the complete training data and a 20 percent training set for additional training data, together with two test files, NSL_KDDTest+ and NSL_KDDTest-21. Table 5 describes the NSL_KDD training and test files used in our system.

Table 5. NSL_KDD dataset description
Category | Training dataset | NSL_KDD Test-21 | NSL_KDD Test+
Dos      | 9234             | 4342            | 7458
Normal   | 13,449           | 2152            | 9711
Probe    | 2289             | 2402            | 2421
U2R      | 12               | 200             | 200
R2L      | 197              | 2754            | 2754
Total    | 25,192           | 11,850          | 22,544

7.2 Empirical results

Tables 6 and 7 show the evaluation results of the proposed system, which uses both the MVMO algorithm and machine learning techniques. In particular, four different machine learning algorithms were used: decision tree (DT), random forest (RF), naive Bayes (NB), and K-NN (k-nearest neighbours). After careful analysis, it was found that the R.F. algorithm provided the most favorable results when used with the proposed system. It is important to note that using machine learning algorithms in combination with the MVMO algorithm significantly increases the overall performance of the proposed approach. The decision tree, random forest, N.B., and K-NN algorithms were all evaluated for their effectiveness in improving system performance.

Table 6. Test of the machine learning models and optimization methods (MVMO and MVMOR) over NSL_KDD Test-21
Model     | Accuracy | Precision | Recall | F1-score
KNN       | 54       | 60        | 54.55  | 56.1
KNN+MVMO  | 60.48    | 65        | 64.65  | 60.66
KNN+MVMOR | 59.17    | 64        | 63.64  | 59.39
DT        | 51.69    | 65.06     | 51.69  | 57.61
DT+MVMO   | 61       | 63.41     | 70     | 64.22
DT+MVMOR  | 63       | 62.93     | 73     | 67.92
RF        | 68.1     | 60        | 63.64  | 66.33
RF+MVMO   | 71.9     | 64.55     | 67.23  | 70.24
RF+MVMOR  | 75       | 75        | 75.49  | 75.76
NB        | 46.1     | 50.64     | 46.1   | 48.27
NB+MVMO   | 48.83    | 53.22     | 46.9   | 51.8
NB+MVMOR  | 48.83    | 53.22     | 46.9   | 51.8

Table 7. Test of the machine learning models and optimization methods (MVMO and MVMOR) over NSL_KDD Test+
Model     | Accuracy | Precision | Recall | F1-score
KNN       | 75.68    | 78.32     | 75.68  | 76.98
KNN+MVMO  | 77.21    | 72.33     | 80.13  | 76.56
KNN+MVMOR | 76.18    | 73.59     | 77.7   | 74.59
DT        | 74.56    | 81.01     | 74.56  | 77.67
DT+MVMO   | 78       | 78.2      | 75.9   | 74.23
DT+MVMOR  | 79       | 82.31     | 77.56  | 78.27
RF        | 78.83    | 81.03     | 78.83  | 79.92
RF+MVMO   | 80.21    | 79.63     | 81.82  | 80.82
RF+MVMOR  | 88.76    | 79        | 85.08  | 83.21
NB        | 71.6     | 65.7      | 71.6   | 68.52
NB+MVMO   | 74.45    | 70.89     | 78.21  | 77.21
NB+MVMOR  | 74.45    | 70.89     | 78.21  | 77.21

Figure 4. Enhancement percentage of the ML algorithms (NB, DT, RF, and KNN) over NSL_KDD Test-21 when applying MVMO and MVMOR

Figure 5. Enhancement percentage of the ML algorithms (NB, DT, RF, and KNN) over NSL_KDD Test+ when applying MVMO and MVMOR

Ultimately, it was determined that the random forest algorithm provided the best results, indicating that it is the most appropriate machine learning algorithm for use in this particular system. Incorporating machine learning techniques in the proposed approach is a significant advancement in the field. The ML algorithms allow the system to learn and adapt to new data, making more accurate and efficient predictions. In addition, the successful implementation of the RF algorithm demonstrates the potential for further improvements in the performance of the proposed system.

The findings presented in Tables 6 and 7 suggest that the efficiency of the proposed algorithm is closely related to the number of variables involved. Specifically, the algorithm's performance improves as the number of variables increases. This implies that the effectiveness of the proposed system is highly dependent on the number of parameters included in the machine learning algorithm. In other words, a larger number of parameters in the algorithm corresponds to a greater degree of improvement in the algorithm's overall performance. Therefore, optimizing the number and selection of variables is crucial to achieving the best possible performance from the proposed system.

Figs. 4 and 5 show the enhancement percentage of the machine learning algorithms when the optimization algorithm MVMO and the proposed MVMOR are applied.

The proposed method helps to ensure that the algorithm produces the desired outcome and achieves the required improvements. Moreover, it helps identify potential weaknesses or limitations in the algorithm's design, enabling engineers to address them promptly and improve its overall performance. In conclusion, using the standard deviation for calibration is an essential step in examining the stability and implementation of an algorithm. This method enables software engineers to verify that the algorithm functions as intended, producing consistent and reliable results and meeting the requirements. By incorporating this process into their development workflow, engineers can ensure their algorithms are optimized for maximum performance and efficiency.
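For completeness, a small sketch of how the per-model scores reported in Tables 6 and 7, and the run-to-run spread shown in Fig. 6, can be computed with scikit-learn. The weighted averaging for the multi-class attack labels and the collection of thirty repeated runs are assumptions based on the table metrics and the Fig. 6 caption.

    import numpy as np
    from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score

    def evaluate(model, X_train, y_train, X_test, y_test):
        """Fit one configuration and report the four scores used in Tables 6 and 7."""
        y_pred = model.fit(X_train, y_train).predict(X_test)
        return {
            "accuracy": accuracy_score(y_test, y_pred),
            "precision": precision_score(y_test, y_pred, average="weighted"),
            "recall": recall_score(y_test, y_pred, average="weighted"),
            "f1": f1_score(y_test, y_pred, average="weighted"),
        }

    def stability(accuracies):
        """Standard deviation of the best accuracy over repeated runs (cf. Fig. 6)."""
        return float(np.std(accuracies))   # e.g. accuracies collected from thirty runs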

Fig. 6 shows the evaluation in terms of the low standard deviation of MVMO and MVMOR.

Figure 6. The standard deviation of the best result obtained after running MVMO and MVMOR thirty times

7.3 Comparison with other studies

This section compares our proposed system with previous studies [7, 19, 20] conducted on the same dataset. Table 8 compares the proposed MVMOR method with these works on IDS.

Table 8. Comparison with other studies
Ref            | Dataset       | Accuracy
[7]            | KDD_NSL Test+ | 82
[19]           | KDD_NSL Test+ | 81
[20]           | KDD_NSL Test+ | 82
Proposed MVMOR | KDD_NSL Test+ | 88

8. Conclusion

This paper proposes a hybrid model, MVMOR, combining mean-variance mapping optimization (MVMO) and random search to improve intrusion detection systems. The model uses meta-heuristic optimization algorithms to optimize the proposed solutions and identify the most relevant features for network attribute analysis. The wrapper model is used as the feature selection method, and one-hot encoding is applied to convert non-numeric attributes into numeric ones. The proposed method uses a random search strategy to optimize the parameters of the machine learning algorithm, which leads to better results but requires a longer induction process. The random forest algorithm gave the best results when used with the proposed system, indicating that it is suitable for use. The MVMOR algorithm improved the system's ability to classify and predict outcomes, resulting in better decision-making and more accurate predictions. The proposed system surpasses traditional detection systems in terms of accuracy and false-positive rate, with an average accuracy rate of 94%. The paper suggests that alternative optimization techniques may be necessary for further iterative improvement.

Conflicts of interest

The authors declare no conflict of interest.

Author contributions

The authors' contributions are as follows: conceptualization, first and second author; methodology and software, first author; validation, second author; formal analysis and investigation, third author; resources, fourth author; data curation, third author; writing (original draft preparation), first author; writing (review and editing) and visualization, fifth author; supervision and project administration, first author; funding acquisition, fifth author.


References

[1] A. R. Gad, A. A. Nashat, and T. M. Barkat, "Intrusion Detection System Using Machine Learning for Vehicular Ad Hoc Networks Based on ToN-IoT Dataset", IEEE Access, Vol. 9, pp. 142206–142217, 2021, doi: 10.1109/ACCESS.2021.3120626.
[2] E. Pashaei, E. Pashaei, and N. Aydin, "Gene selection using hybrid binary black hole algorithm and modified binary particle swarm optimization", Genomics, Vol. 111, No. 4, pp. 669–686, 2019, doi: 10.1016/j.ygeno.2018.04.004.
[3] M. Z. Zakaria, S. Mutalib, S. A. Rahman, S. J. Elias, and A. Z. Shahuddin, "Solving RFID mobile reader path problem with optimization algorithms", Indones. J. Electr. Eng. Comput. Sci., Vol. 13, No. 3, pp. 1110–1116, 2019, doi: 10.11591/ijeecs.v13.i3.pp1110-1116.
[4] Y. Yang, Y. Wu, H. Yuan, M. Khishe, and M. Mohammadi, "Nodes clustering and multi-hop routing protocol optimization using hybrid chimp optimization and hunger games search algorithms for sustainable energy efficient underwater wireless sensor networks", Sustain. Comput. Informatics Syst., Vol. 35, p. 100731, 2022.
[5] A. Bhattacharyya, R. Chakraborty, S. Saha, S. Sen, R. Sarkar, and K. Roy, "A Two-Stage Deep Feature Selection Method for Online Handwritten Bangla and Devanagari Basic Character Recognition", S.N. Comput. Sci., Vol. 3, No. 4, p. 260, 2022, doi: 10.1007/s42979-022-01157-2.
[6] P. Shunmugapriya and S. Kanmani, "A hybrid algorithm using ant and bee colony optimization for feature selection and classification (AC-ABC Hybrid)", Swarm Evol. Comput., Vol. 36, pp. 27–36, 2017, doi: 10.1016/j.swevo.2017.04.002.
[7] S. S. Alkafagi and R. M. Almuttairi, "A Proactive Model for Optimizing Swarm Search Algorithms for Intrusion Detection System", J. Phys. Conf. Ser., Vol. 1818, No. 1, 2021, doi: 10.1088/1742-6596/1818/1/012053.
[8] A. H. A. Saeedi, "Binary Mean-Variance Mapping Optimization Algorithm (BMVMO)", J. Appl. Phys. Sci., Vol. 2, No. 2, pp. 42–47, 2016, doi: 10.20474/japs-2.2.3.
[9] A. L. A. A. H. A. A. H. A. M. E. Manna, "A proactive metaheuristic model for optimizing weights of artificial neural network", Indones. J. Electr. Eng. Comput. Sci., Vol. 20, No. 2, pp. 976–984, 2020, doi: 10.11591/ijeecs.v20.i2.pp976-984.
[10] S. M. Ali, A. H. Alsaeedi, D. A. Shammary, H. H. Alsaeedi, and H. W. Abid, "Efficient intelligent system for diagnosis pneumonia (SARSCOVID19) in X-ray images empowered with initial clustering", Indones. J. Electr. Eng. Comput. Sci., Vol. 22, No. 1, pp. 241–251, 2021, doi: 10.11591/ijeecs.v22.i1.pp241-251.
[11] A. S. Alfoudi, M. R. Aziz, Z. A. A. Alyasseri, A. H. Alsaeedi, R. R. Nuiaa, M. A. Mohammed, K. H. Abdulkareem, and M. M. Jaber, "Hyper clustering model for dynamic network intrusion detection", IET Communications, pp. 1–13, 2022, doi: 10.1049/cmu2.12523.
[12] A. S. Alfoudi, A. H. Alsaeedi, M. H. Abed, A. M. Otebolaku, and Y. S. Razooqi, "Palm Vein Identification Based on Hybrid Feature Selection Model", Int. J. Intell. Eng. Syst., Vol. 14, No. 5, pp. 469–478, 2021, doi: 10.22266/ijies2021.1031.41.
[13] A. H. Jabor and A. H. Ali, "Dual Heuristic Feature Selection Based on Genetic Algorithm and Binary Particle Swarm Optimization", J. Univ. BABYLON Pure Appl. Sci., Vol. 27, No. 1, pp. 171–183, 2019, doi: 10.29196/jubpas.v27i1.2106.
[14] D. A. Shammary, A. L. Albukhnefis, A. H. Alsaeedi, and M. A. Asfoor, "Extended particle swarm optimization for feature selection of high-dimensional biomedical data", Concurr. Comput. Pract. Exp., Vol. 34, No. 10, p. e6776, 2022, doi: 10.1002/cpe.6776.
[15] R. M. Pringles and J. L. Rueda, "Optimal transmission expansion planning using Mean-Variance Mapping Optimization", In: Proc. of 2012 6th IEEE/PES Transm. Distrib. Lat. Am. Conf. Expo. T D-LA 2012, pp. 1–8, 2012, doi: 10.1109/TDC-LA.2012.6319132.
[16] J. L. Rueda and I. Erlich, "Evaluation of the mean-variance mapping optimization for solving multimodal problems", In: Proc. of 2013 IEEE Symp. Swarm Intell. SIS 2013 - 2013 IEEE Symp. Ser. Comput. Intell. SSCI 2013, pp. 7–14, 2013, doi: 10.1109/SIS.2013.6615153.
[17] D. B. Rawat, R. Doku, and M. Garuba, "Cybersecurity in Big Data Era: From Securing Big Data to Data-Driven Security", IEEE Trans. Serv. Comput., Vol. 14, No. 6, pp. 2055–2072, 2021, doi: 10.1109/TSC.2019.2907247.
[18] F. H. Almasoudy, W. L. A. Yaseen, and A. K. Idrees, "Differential Evolution Wrapper Feature Selection for Intrusion Detection System", Procedia Comput. Sci., Vol. 167, pp. 1230–1239, 2020, doi: 10.1016/j.procs.2020.03.438.
[19] O. Almomani, "A feature selection model for network intrusion detection system based on PSO, GWO, FFA and GA algorithms", Symmetry (Basel), Vol. 12, No. 6, pp. 1–20, 2020, doi: 10.3390/sym12061046.
[20] A. M. Taha, A. Mustapha, and S. D. Chen, "Naive Bayes-guided bat algorithm for feature selection", Sci. World J., Vol. 2013, 2013, doi: 10.1155/2013/325973.
[21] R. R. Nuiaa, S. Manickam, A. H. Alsaeedi, and E. S. Alomari, "Enhancing the Performance of Detect DRDoS DNS Attacks Based on the Machine Learning and Proactive Feature Selection (PFS) Model", IAENG Int. J. Comput. Sci., Vol. 49, No. 2, 2022.
[22] W. Nakawiro, I. Erlich, and J. L. Rueda, "A novel optimization algorithm for optimal reactive power dispatch: A comparative study", In: Proc. of DRPT 2011 - 2011 4th Int. Conf. Electr. Util. Deregul. Restruct. Power Technol., No. 1, pp. 1555–1561, 2011, doi: 10.1109/DRPT.2011.5994144.
[23] D. A. Shammary, A. L. Albukhnefis, A. H. Alsaeedi, and M. A. Asfoor, "Extended particle swarm optimization for feature selection of high-dimensional biomedical data", Concurr. Comput. Pract. Exp., Vol. 34, No. 10, 2022, doi: 10.1002/cpe.6776.
[24] A. S. Alfoudi, A. H. Alsaeedi, M. H. Abed, A. M. Otebolaku, and Y. S. Razooqi, "Palm Vein Identification Based on Hybrid Feature Selection Model", Int. J. Intell. Eng. Syst., Vol. 14, No. 5, pp. 469–478, 2021, doi: 10.22266/ijies2021.1031.41.
[25] S. Mohammadi, H. Mirvaziri, M. G. Ahsaee, and H. Karimipour, "Cyber intrusion detection by combined feature selection algorithm", J. Inf. Secur. Appl., Vol. 44, pp. 80–88, 2019, doi: 10.1016/j.jisa.2018.11.007.
[26] R. R. Nuiaa, A. H. Alsaeedi, S. Manickam, and D. E. J. A. Shammary, "Evolving Dynamic Fuzzy Clustering (EDFC) to Enhance DRDoS_DNS Attacks Detection Mechnism", Int. J. Intell. Eng. Syst., Vol. 15, No. 1, pp. 509–519, 2022, doi: 10.22266/IJIES2022.0228.46.
[27] S. M. Hadi, A. H. Alsaeedi, M. I. Dohan, R. R. Nuiaa, S. Manickam, and A. S. D. Alfoudi, "Dynamic Evolving Cauchy Possibilistic Clustering Based on the Self-Similarity Principle (DECS) for Enhancing Intrusion Detection System", Int. J. Intell. Eng. Syst., Vol. 15, No. 5, pp. 252–260, 2022, doi: 10.22266/ijies2022.1031.23.
[28] S. M. Hadi, A. H. Alsaeedi, M. I. Dohan, R. R. Nuiaa, S. Manickam, and A. S. D. Alfoudi, "Dynamic Evolving Cauchy Possibilistic Clustering Based on the Self-Similarity Principle (DECS) for Enhancing Intrusion Detection System", Int. J. Intell. Eng. Syst., Vol. 15, No. 5, pp. 252–260, 2022, doi: 10.22266/ijies2022.1031.23.
[29] P. P. Debata and P. Mohapatra, "Identification of significant bio-markers from high-dimensional cancerous data employing a modified multi-objective meta-heuristic algorithm", J. King Saud Univ. - Comput. Inf. Sci., Vol. 34, No. 8, 2022, doi: 10.1016/j.jksuci.2020.12.014.
[30] I. A. Turaiki and N. Altwaijry, "A Convolutional Neural Network for Improved Anomaly-Based Network Intrusion Detection", Big Data, Vol. 9, No. 3, pp. 233–252, 2021, doi: 10.1089/big.2020.0263.

International Journal of Intelligent Engineering and Systems, Vol.16, No.5, 2023 DOI: 10.22266/ijies2023.1031.47
