Abstract
Metaheuristics have been successfully applied to solve complex real-world problems in many application domains, and their performance strongly depends on the values of their parameters. Many tuning algorithms have been proposed to find a set of suitable values; however, the computational time required to obtain these values is usually high. Our objective is to propose a collaborative strategy to (1) improve the quality of the configurations obtained by tuning algorithms and (2) reduce the time consumed by the tuning process. Here, we introduce a novel opposite scoring (OS) strategy that learns from configurations that produce positive and negative effects on the target algorithm. Unlike conventional tuners, OS guides its trajectory by deliberately choosing parameter configurations that decrease the performance of the target algorithm. During this learning process, OS stores the quality of all evaluated configurations and computes a score for each value appearing in the visited parameter configurations. OS then generates the initial set of configurations for a tuner, where values with better scores have a higher probability of being part of this set. We evaluate our proposal using the well-known Evolutionary Calibrator (Evoca) and tune three different algorithms: an Ant Colony Optimization algorithm for solving the Multidimensional Knapsack Problem, a Genetic Algorithm for solving landscapes that follow the NK model (N components with interaction degree K), and a Particle Swarm Optimization algorithm for solving continuous optimization problems. Results show that OS-Evoca obtains better quality configurations than Evoca while consuming fewer computational resources.
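To make the scoring-and-sampling idea concrete, the following minimal Python sketch shows how an OS-style initializer might accumulate a score per parameter value from previously evaluated configurations and then sample a score-biased initial set for a tuner such as Evoca. The parameter space, the additive score update, and the smoothed weighted sampling are illustrative assumptions, not the authors' exact OS formulation.

```python
import random
from collections import defaultdict

# Illustrative sketch only: the parameter space, the additive score update and
# the smoothed weighted sampling are assumptions, not the exact OS formulation.

# Hypothetical discrete parameter space of a target metaheuristic.
PARAM_SPACE = {
    "population_size": [20, 50, 100],
    "mutation_rate": [0.01, 0.05, 0.10],
    "crossover_rate": [0.6, 0.8, 0.9],
}

# score[param][value] accumulates evidence from evaluated configurations.
score = {p: defaultdict(float) for p in PARAM_SPACE}

def record_evaluation(config, quality):
    """Store the observed quality of a configuration by crediting every
    parameter value it contains (higher quality -> higher score)."""
    for param, value in config.items():
        score[param][value] += quality

def sample_initial_set(n_configs):
    """Build the tuner's initial set: values with a better accumulated score
    get a higher probability of being selected."""
    configs = []
    for _ in range(n_configs):
        config = {}
        for param, values in PARAM_SPACE.items():
            # +1 smoothing keeps a nonzero chance for values never seen so far.
            weights = [1.0 + score[param][v] for v in values]
            config[param] = random.choices(values, weights=weights, k=1)[0]
        configs.append(config)
    return configs

# Usage: after an exploratory phase that has evaluated some configurations...
record_evaluation({"population_size": 50, "mutation_rate": 0.05, "crossover_rate": 0.8}, quality=0.9)
record_evaluation({"population_size": 20, "mutation_rate": 0.10, "crossover_rate": 0.6}, quality=0.2)
initial_set = sample_initial_set(n_configs=5)  # seed the tuner (e.g. Evoca) with this set
print(initial_set)
```

In the paper's setting, the recorded evaluations would come from the exploratory phase that also visits configurations worsening the target algorithm's performance, before the resulting scores are used to seed the tuner's initial population.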







Data availability
The code of opposite scoring and the datasets used and/or analyzed during the current study are available from the corresponding author on reasonable request.
Notes
The Evoca implementation is available at http://emontero.pag.alumnos.inf.utfsm.cl/EVOCA/index.html.
The code of PSO-X is available at http://iridia.ulb.ac.be/supp/IridiaSupp2021-001/PSO-X.zip.
References
Wang X, Li X, Chen X, Cao C (2020) Diagnosis model of pancreatic cancer based on fusion of distribution estimation algorithm and genetic algorithm. Neural Comput Appl 32(10):5425–5434
Behera R, Naik D, Rath S, Dharavath R (2020) Genetic algorithm-based community detection in large-scale social networks. Neural Comput Appl 32(13):9649–9665
García-Álvarez J, González M, Vela C (2018) Metaheuristics for solving a real-world electric vehicle charging scheduling problem. Appl Soft Comput 65:292–306
Khan S, Mahmood A (2019) Fuzzy goal programming-based ant colony optimization algorithm for multi-objective topology design of distributed local area networks. Neural Comput Appl 31(7):2329–2347
Doerr B, Doerr C (2018) Theory of parameter control for discrete black-box optimization: provable performance gains through dynamic parameter choices. Comput Res Repos arXiv:1804.05650
Montero E, Riff M-C, Neveu B (2014) A beginner’s guide to tuning methods. Appl Soft Comput 17:39–51. https://doi.org/10.1016/j.asoc.2013.12.017
Eiben AE, Michalewicz Z, Schoenauer M, Smith JE (2007) Parameter control in evolutionary algorithms. In: Lobo FG, Lima CF, Michalewicz Z (eds) Parameter setting in evolutionary algorithms, vol 54. Studies in computational intelligence. Springer, Berlin, pp 19–46
Huang C, Li Y, Yao X (2020) A survey of automatic parameter tuning methods for metaheuristics. IEEE Trans Evol Comput 24(2):201–216. https://doi.org/10.1109/TEVC.2019.2921598
Karafotias G, Hoogendoorn M, Eiben E (2015) Parameter control in evolutionary algorithms: trends and challenges. IEEE Trans Evol Comput 19(2):167–187. https://doi.org/10.1109/TEVC.2014.2308294
Nannen V, Eiben AE (2007) Efficient relevance estimation and value calibration of evolutionary algorithm parameters. In: Proceedings of the IEEE congress on evolutionary computation, CEC 2007, 25–28 September 2007. IEEE, pp 103–110. https://doi.org/10.1109/CEC.2007.4424460
Hutter F, Stützle T, Leyton-Brown K, Hoos H (2009) ParamILS: an automatic algorithm configuration framework. J Artif Intell Res 36:267–306
Hutter F, Hoos H, Leyton-Brown K (2011) Sequential model-based optimization for general algorithm configuration. In: Coello CAC (ed) Learning and intelligent optimization—5th international conference, LION 5, Rome, Italy, January 17–21, 2011. Selected papers. Lecture notes in computer science, vol 6683. Springer, Berlin, pp 507–523. https://doi.org/10.1007/978-3-642-25566-3_40
Riff M-C, Montero E (2013) A new algorithm for reducing metaheuristic design effort. In: Proceedings of the IEEE congress on evolutionary computation, CEC 2013, June 20–23, 2013. IEEE, pp 3283–3290. https://doi.org/10.1109/CEC.2013.6557972
López-Ibáñez M, Dubois-Lacoste J, Pérez Cáceres L, Stützle T, Birattari M (2016) The irace package: iterated racing for automatic algorithm configuration. Oper Res Perspect 3:43–58
Montero E, Riff M-C, Rojas-Morales N (2018) Tuners review: how crucial are set-up values to find effective parameter values? Eng Appl Artif Intell 76:108–118. https://doi.org/10.1016/j.engappai.2018.09.001
Rojas-Morales N, Riff M-C, Montero E (2017) A survey and classification of opposition-based metaheuristics. Comput Ind Eng 110:424–435. https://doi.org/10.1016/j.cie.2017.06.028
Malisia AR (2008) Improving the exploration ability of ant-based algorithms. In: Tizhoosh HR, Ventresca M (eds) Oppositional concepts in computational intelligence, vol 155. Studies in computational intelligence. Springer, Berlin, pp 121–142. https://doi.org/10.1007/978-3-540-70829-2_7
Rojas-Morales N (2018) Opposite learning strategies for improving the search process of ant-based algorithms. PhD thesis, Universidad Técnica Federico Santa María
Rojas-Morales N, Riff MC, Neveu B (2021) Learning and focusing strategies to improve ACO that solves CSP. Eng Appl Artif Intell 105:104408. https://doi.org/10.1016/j.engappai.2021.104408
Rojas-Morales N, Riff MC, Montero E (2021) Opposition-inspired synergy in sub-colonies of ants: the case of focused ant solver. Knowl Based Syst 229:107341. https://doi.org/10.1016/j.knosys.2021.107341
Leung SW, Zhang X, Yuen SY (2012) Multiobjective differential evolution algorithm with opposition-based parameter control. In: Proceedings of the IEEE congress on evolutionary computation, CEC2012, June 10–15, 2012. IEEE, pp 1–8. https://doi.org/10.1109/CEC.2012.6256612
Liu H, Wu Z, Wang H, Rahnamayan S, Deng C (2014) Improved differential evolution with adaptive opposition strategy. In: Proceedings of the IEEE congress on evolutionary computation, CEC 2014, July 6–11, 2014. IEEE, pp 1776–1783. https://doi.org/10.1109/CEC.2014.6900298
Rojas-Morales N, Riff MC (2020) A practical tuner based on opposite information. In: IEEE congress on evolutionary computation, CEC 2020, July 19–24, 2020. IEEE, pp 1–8. https://doi.org/10.1109/CEC48606.2020.9185746
Alaya I, Solnon C, Ghedira K (2004) Ant algorithm for the multi-dimensional knapsack problem. In: International conference on bioinspired optimization methods and their applications (BIOMA 2004), pp 63–72
Pelikan M (2008) Analysis of estimation of distribution algorithms and genetic algorithms on NK landscapes. In: Ryan C, Keijzer M (eds) Genetic and evolutionary computation conference, GECCO 2008, proceedings, July 12–16, 2008. ACM, pp 1033–1040. https://doi.org/10.1145/1389095.1389287
Camacho-Villalón C, Dorigo M, Stützle T (2022) PSO-X: a component-based framework for the automatic design of particle swarm optimization algorithms. IEEE Trans Evol Comput 26(3):402–416. https://doi.org/10.1109/TEVC.2021.3102863
Rojas-Morales N, Riff MC (2021) Reducing the effort of evolutionary calibrator using opposite information. In: IEEE Latin American conference on computational intelligence, LA-CCI 2021, Temuco, Chile, November 2–4, 2021. IEEE, pp 1–6. https://doi.org/10.1109/LA-CCI48322.2021.9769793
Duraipandian M (2020) Long term evolution-self organizing network for minimization of sudden call termination in mobile radio access networks. J Trends Comput Sci Smart Technol (TCSST) 2(02):89–97
Sun J, Garibaldi J, Hodgman C (2012) Parameter estimation using metaheuristics in systems biology: a comprehensive review. IEEE ACM Trans Comput Biol Bioinform 9(1):185–202
Kim S, Hooker AC, Shi Y, Kim GH, Wong WK (2021) Metaheuristics for pharmacometrics. CPT Pharmacomet Syst Pharmacol 10(11):1297–1309
Ghawi R, Pfeffer J (2019) Efficient hyperparameter tuning with grid search for text categorization using kNN approach with BM25 similarity. Open Comput Sci 9(1):160–180. https://doi.org/10.1515/comp-2019-0011
Dhilsath F, Samuel SJ (2021) Hyperparameter tuning of ensemble classifiers using grid search and random search for prediction of heart disease. Comput Intell Healthc Inform 139–158
Feurer M, Hutter F (2019) Hyperparameter optimization. In: Hutter F, Kotthoff L, Vanschoren J (eds) Automated machine learning—methods, systems, challenges. The Springer series on challenges in machine learning. Springer, Berlin, pp 3–33. https://doi.org/10.1007/978-3-030-05318-5_1
Feurer M, Springenberg JT, Hutter F (2015) Initializing Bayesian hyperparameter optimization via meta-learning. In: Bonet B, Koenig S (eds) Proceedings of the twenty-ninth AAAI conference on artificial intelligence, January 25–30, 2015, Austin, Texas, USA. AAAI Press, pp 1128–1135
Osaba E, Villar-Rodriguez E, Del Ser J, Nebro A, Molina D, LaTorre A, Suganthan P, Coello Coello C, Herrera F (2021) A tutorial on the design, experimentation and application of metaheuristic algorithms to real-world optimization problems. Swarm Evolut Comput 64:100888. https://doi.org/10.1016/j.swevo.2021.100888
Treimun-Costa G, Montero E, Ochoa G, Rojas-Morales N (2020) Modelling parameter configuration spaces with local optima networks. In: Coello Coello CA (ed) GECCO’20: genetic and evolutionary computation conference. ACM, pp 751–759. https://doi.org/10.1145/3377930.3390199
Cleghorn C, Ochoa G (2021) Understanding parameter spaces using local optima networks: a case study on particle swarm optimization. In: Krawiec K (ed) GECCO’21: genetic and evolutionary computation conference, companion, July 10–14, 2021. ACM, pp 1657–1664
Stützle T, Hoos H (2000) MAX–MIN ant system. Future Gener Comput Syst 16(8):889–914. https://doi.org/10.1016/S0167-739X(00)00043-1
Kauffman SA (1993) The origins of order: self-organization and selection in evolution. Oxford University Press, USA
Martins M, El Yafrani M, Delgado M, Lüders R, Santana R, Siqueira H, Akcay H, Ahiod B (2021) Analysis of Bayesian network learning techniques for a hybrid multi-objective Bayesian estimation of distribution algorithm: a case study on MNK landscape. J Heuristics 27(4):549–573. https://doi.org/10.1007/s10732-021-09469-x
Suganthan PN, Hansen N, Liang JJ, Deb K, Chen Y-P, Auger A, Tiwari S (2005) Problem definitions and evaluation criteria for the CEC’ 2005 special session on real-parameter optimization. KanGAL Report Number 2005005
Liang JJ, Qu BY, Suganthan PN (2013) Problem definitions and evaluation criteria for the CEC 2014 special session and competition on single objective real-parameter numerical optimization. Computational Intelligence Laboratory, Zhengzhou University, Zhengzhou China and Technical Report N. 201311, Nanyang Technological University, Singapore 635, 490
Lozano M, Molina D, Herrera F (2011) Editorial scalability of evolutionary algorithms and other metaheuristics for large-scale continuous optimization problems. Soft Comput 15(11):2085–2087
Acknowledgements
The authors thank Christian Camacho for sending us the code of PSO-X, and Leslie Perez and Elizabeth Montero for helping us define the PSO-X tuning scenarios. This work is supported by the Fondo Nacional de Desarrollo Científico y Tecnológico (FONDECYT) under Project No. 1200126 and UTFSM DGIIE Funding Project N. PI_LII_2022_03.
Ethics declarations
Conflict of interest
The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Appendix A: Details of functions
The problem instances used in the tuning and testing processes of PSO-X are presented in Table 10.
Rights and permissions
Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
About this article
Cite this article
Rojas-Morales, N., Riff, MC. Opposite scoring: focusing the tuning process of evolutionary calibrator. Neural Comput & Applic 35, 9269–9283 (2023). https://doi.org/10.1007/s00521-023-08203-x