HRA: A Multi-Criteria Framework for Ranking Metaheuristic Optimization Algorithms

EasyChair Preprint 14942 • 8 pages • Date: September 18, 2024

Abstract

Metaheuristic algorithms are essential for solving complex optimization problems across many fields. However, comparing and ranking these algorithms remains difficult because of the wide range of performance metrics and problem dimensions typically involved. Moreover, nonparametric statistical methods and post hoc tests are time-consuming, especially when the goal is only to identify the top performers among many algorithms. The Hierarchical Rank Aggregation (HRA) algorithm aims to efficiently rank metaheuristic algorithms based on their performance across many criteria and dimensions. HRA employs a hierarchical framework that begins with collecting performance metrics on various benchmark functions and dimensions. Rank-based normalization is applied to each performance measure to ensure comparability, and robust TOPSIS aggregation combines these rankings at several hierarchical levels, yielding a comprehensive ranking of the algorithms. Our study uses data from the CEC 2017 competition to demonstrate the robustness and efficacy of the HRA framework: it examines 30 benchmark functions and evaluates the performance of 13 metaheuristic algorithms across five performance indicators in four distinct dimensions. This work highlights the potential of HRA to improve the interpretation of the comparative advantages and disadvantages of various algorithms, simplifying practitioners' choice of the most appropriate algorithm for a given optimization problem.

Keyphrases: CEC2017, Evolutionary Algorithms, Hierarchical Rank Aggregation, Metaheuristic algorithms, Multi-Criteria Decision Making (MCDM), Rank-Based Normalization, Robust TOPSIS, algorithm performance, competition dataset, hierarchical structure, multiple performance metrics
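The normalization-plus-aggregation pipeline the abstract describes can be sketched as follows. This is a minimal illustration under assumptions, not the paper's implementation: the function names, the equal criterion weights, the lower-is-better convention for raw metrics, and the use of classic TOPSIS (rather than the paper's robust variant) are all assumptions for the sake of the example.

```python
import numpy as np

def rank_normalize(scores):
    # Convert raw per-criterion scores (rows = algorithms, columns = criteria,
    # lower raw value = better) into ranks 1..n per column.
    # Ties are broken arbitrarily in this sketch.
    return np.argsort(np.argsort(scores, axis=0), axis=0) + 1.0

def topsis(decision_matrix, weights):
    # Classic TOPSIS on a rank matrix where a LOWER rank is better.
    # Returns a closeness coefficient per algorithm (higher = better).
    m = decision_matrix / np.linalg.norm(decision_matrix, axis=0)
    v = m * weights
    ideal = v.min(axis=0)  # best (lowest) weighted rank per criterion
    anti = v.max(axis=0)   # worst weighted rank per criterion
    d_pos = np.linalg.norm(v - ideal, axis=1)
    d_neg = np.linalg.norm(v - anti, axis=1)
    return d_neg / (d_pos + d_neg)

# Toy example: 3 algorithms evaluated on 3 criteria (e.g. error metrics).
raw = np.array([[1.0, 1.0, 1.0],
                [2.0, 3.0, 2.0],
                [3.0, 2.0, 3.0]])
ranks = rank_normalize(raw)
closeness = topsis(ranks, np.full(3, 1.0 / 3.0))
best = int(np.argmax(closeness))  # index of the top-ranked algorithm
```

In the hierarchical setting the abstract outlines, such closeness scores would themselves be re-ranked and aggregated again at the next level (e.g. per dimension, then overall), which is the step that distinguishes HRA from a single flat TOPSIS run.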