Journal of Computing and Security
Research Article, 2024, Volume 11, Number 1 (pp. 1–18)
http://www.jcomsec.org
A Surrogate Model-based Aquila Optimizer for Solving High-dimensional Computationally Expensive Problems
Alireza Rouhi a, Einollah Pira a
a Faculty of Information Technology and Computer Engineering, Azarbaijan Shahid Madani University, Tabriz, Iran.
ARTICLE INFO.
Article history:
Received: 08 September 2023
Revised: 23 December 2023
Accepted: 07 January 2024
Published Online: 16 February 2024
Keywords:
Aquila optimizer,
High-dimensional expensive
problems, Radial basis function
(RBF) model, Surrogate models.
ABSTRACT
This paper introduces a variant of the Aquila Optimizer (AO) for efficiently solving high-dimensional computationally expensive problems. Traditional optimization techniques struggle with problems characterized by expensive objective functions and a large number of variables. To address this challenge, this paper proposes a Surrogate Model-based Aquila Optimizer (SMAO) that leverages machine learning techniques to approximate the objective function. SMAO utilizes Radial Basis Functions (RBFs) to build an accurate and efficient surrogate model. By iteratively optimizing the surrogate model, the search process is directed toward the global optimum while significantly reducing the computational cost compared to traditional optimization methods. To evaluate and compare the performance of SMAO with the surrogate model-based versions of the Gazelle Optimization Algorithm (GOA), Reptile Search Algorithm (RSA), Prairie Dog Optimization (PDO), and Fick's Law Optimization Algorithm (FLA), they are analyzed on a set of benchmark test functions with dimensions varying from 30 to 200. According to the reported results, SMAO outperforms the others in terms of achieving the solutions nearest to the optimum, early convergence, and accuracy.
1 Introduction
High-dimensional optimization problems with computationally expensive objective functions pose significant challenges across a multitude of fields, ranging from engineering design [1] to finance [2], and from machine learning [3] to scientific research [4]. In these problems, the search for optimal solutions is hindered by two fundamental factors: (i) the need for a large number of function evaluations to assess the quality of potential solutions and (ii) the high dimensionality of the parameter space. The convergence of these two intricacies frequently creates a computational bottleneck that not only hampers prompt decision-making but also obstructs progress in various domains [5].

Corresponding author.
Email addresses: rouhi@azaruniv.ac.ir (A. Rouhi), pira@azaruniv.ac.ir (E. Pira)
https://dx.doi.org/10.22108/JCS.2024.139051.1132
ISSN: 2322-4460
Traditional optimization techniques, including gradient-based methods [6] and evolutionary algorithms [7], struggle to cope with the inherent computational demands of such problems. Their reliance on direct evaluations of the objective function, which can be prohibitively expensive, renders them impractical when dealing with high-dimensional spaces. Furthermore, as the dimensionality increases, the exploration of the solution space becomes increasingly challenging, making it difficult to locate the global optimum within reasonable time frames.
In response to these challenges, the Aquila Optimizer (AO) [8] has emerged as a promising optimization algorithm designed to address high-dimensional optimization problems. AO exhibits noteworthy capabilities in efficiently exploring vast solution spaces; however, its effectiveness becomes constrained when applied to problems characterized by computationally expensive objective functions [9]. In such cases, AO's performance may suffer from extended computation times, rendering it less practical for real-world applications.
Recent research shows that surrogate models can play a crucial role in optimizing complex and computationally expensive functions by approximating their behavior with a simpler, more efficient model [10]. In the context of AO, integrating surrogate models offers a powerful strategy to enhance optimization efficiency. By using surrogate models, which mimic the underlying objective function, AO can intelligently explore the search space, reducing the number of expensive function evaluations and accelerating convergence toward optimal solutions. This incorporation of surrogate models not only enhances the optimizer's performance but also makes it adaptable to real-world scenarios with resource-intensive objectives.
This paper introduces the Surrogate Model-based Aquila Optimizer (SMAO) [10]. SMAO represents a novel approach that seamlessly integrates the power of surrogate modeling with the agility of AO [8]. This hybridization aims to overcome the computational hurdles encountered when solving high-dimensional optimization problems. The core idea behind SMAO lies in the utilization of surrogate models, which are data-driven approximations of the costly objective function. Instead of directly evaluating the objective function, SMAO iteratively optimizes the surrogate model, allowing for significantly faster computations while guiding the search toward the global optimum. At the heart of SMAO's surrogate modeling is the use of RBFs [11], a mathematical tool renowned for its capacity to approximate complex and expensive functions with remarkable accuracy.
This paper presents an in-depth exploration of SMAO, shedding light on its conceptual framework, optimization procedure, and the underlying rationale for employing RBFs as the surrogate modeling technique. Subsequently, an extensive set of experiments is conducted to evaluate and compare SMAO's performance with the surrogate model-based versions of other state-of-the-art optimization algorithms: GOA [12], RSA [13], PDO [14], and FLA [15]. Our evaluation is based on a comprehensive set of benchmark test functions with dimensions varying from 30 to 200, all known for their computational expense and multimodality [16].
The empirical results reveal that SMAO surpasses its counterparts in terms of convergence speed and hit rate, thus establishing its efficacy in efficiently addressing high-dimensional optimization problems characterized by computationally expensive objective functions. This paper serves as a comprehensive introduction to SMAO, setting the stage for a detailed exploration of its components, optimization process, experimental methodology, results, and implications in the subsequent sections. This research aims to provide a compelling solution to a pervasive challenge, facilitating the advancement of applications across diverse domains that hinge on the efficient resolution of high-dimensional optimization problems with computationally expensive objectives.
The remainder of this paper is organized as follows: Section 2 presents a brief review of research that has used surrogate model-based optimization for solving high-dimensional computationally expensive problems, as well as a brief review of improved versions of AO in the literature. Section 3 presents a brief introduction to the high-dimensional optimization challenges, the basics of the AO algorithm, and the compared algorithms. Section 4 provides a detailed description of the proposed SMAO's optimization procedure, elucidating the steps involved in leveraging surrogate models for efficient optimization. Following this, Section 5 presents the experimental methodology used to evaluate SMAO's performance and compare it to the surrogate model-based versions of other state-of-the-art algorithms, including GOA, RSA, PDO, and FLA, along with the empirical results of our experiments, showcasing SMAO's superior performance in terms of convergence speed and hit rate across various dimensions. Finally, Section 6 concludes the paper, summarizing the key findings and their implications, while also discussing avenues for future research.
2 Literature Review
Here, for the sake of completeness, a brief literature review is presented, highlighting the studies that have used surrogate model-based optimization for solving high-dimensional computationally expensive problems [10]. A review of work on improving the AO algorithm is also presented.
Müller [17] introduced SOCEMO, a surrogate model algorithm tailored for computationally expensive multi-objective optimization problems. By leveraging surrogate models and innovative sampling strategies, SOCEMO outperforms traditional methods, providing efficient solutions for challenging optimization tasks.
García-García et al. [18] introduced a surrogate-based cooperative optimization framework designed for computationally expensive black-box problems. The approach utilizes surrogate models to approximate the expensive objective function, enabling efficient optimization through collaboration among multiple surrogate models. This method aims to enhance the optimization process for problems where evaluating the objective function is resource-intensive.
Yi et al. [19] introduced a multi-fidelity Radial Basis Function (RBF) surrogate-based optimization framework designed for computationally expensive multi-modal problems. The framework is applied to address capacity planning in manufacturing systems, demonstrating its effectiveness in optimizing complex systems with varying computational costs and multiple solution modes.
Liu et al. [20] introduced the Historical Surrogate Model Ensemble Assisted Bayesian Evolutionary Optimization Algorithm, a novel approach designed to address expensive multi-objective optimization problems. By leveraging a surrogate model ensemble with historical information, the algorithm enhances the efficiency of Bayesian optimization. This methodology proves effective in optimizing complex problems with multiple objectives while minimizing computational costs.
Zhang et al. [21] introduced a dynamic surrogate model-based optimization (DSMO) approach for maximum power point tracking (MPPT) in centralized thermoelectric generation systems under heterogeneous temperature differences (HeTD). DSMO employs a dynamic surrogate model and a guided search to efficiently locate the global maximum power point, outperforming traditional methods and meta-heuristic algorithms.
Shi et al. [22] introduced a novel multi-fidelity surrogate model called Co_SVR, based on support vector regression, for efficient engineering design and optimization. Co_SVR leverages high-fidelity (HF) and low-fidelity (LF) samples by mapping them into a high-dimensional space and employing a linear model to establish input-output relationships. The study demonstrates Co_SVR's competitive performance through numerical experiments and a practical pressure relief valve design problem, distinguishing it from previous SVR-based multi-fidelity models.
Tabar et al. [23] presented a novel neural network-based surrogate model method for optimizing spot welding sequences in individual sheet metal assemblies to minimize geometrical deviations. This approach significantly reduces computational time, by up to 90% compared to genetic algorithms, while achieving close-to-optimal geometrical outcomes, addressing a critical challenge in the automotive industry's Body-In-White assembly process.
Bamdad et al. [24] introduced an innovative active sampling strategy using a committee of surrogate models to enhance building energy optimization methods. This strategy targets regions of the parameter space where surrogate model predictions are uncertain yet have low energy use, improving optimization performance. Comparative analysis reveals that surrogate model-based optimization outperforms simulation-based methods initially, with Ant Colony Optimization excelling later.
Pedrozo et al. [25] introduced an algorithm for optimizing ethylene production from shale gas using iterative refinement of piecewise linear surrogate models within a superstructure framework. A more cost-effective chemical looping oxidative dehydrogenation technology has been identified here, yielding a 12% higher net present value (NPV) and a 15% reduction in ethylene production costs compared to conventional steam cracking.
Akyol [26] introduced a novel hybrid optimization approach by combining the Aquila optimizer and the tangent search algorithm. This method is designed for global optimization and is presented in the context of ambient intelligence and humanized computing, demonstrating improved performance in solving complex optimization problems.
Demiral et al. [27] presented a multi-fidelity surrogate-based approach for aerodynamic shape optimization, specifically designed for computationally expensive problems. The method aims to efficiently optimize aerodynamic shapes by utilizing surrogate models at various levels of fidelity, reducing the computational cost associated with the optimization process.
Nadimi-Shahraki et al. [28] introduced the Binary Aquila Optimizer as a feature selection method for medical data analysis, with a focus on a COVID-19 case study. The proposed optimizer aims to identify effective features in medical datasets, particularly in the context of COVID-19, using a binary optimization approach.
Yu et al. [29] introduced an Enhanced Aquila optimizer algorithm, designed for global optimization and addressing constrained engineering problems. The presented algorithm's efficacy is explored in mathematical biosciences and engineering applications, presenting results in the form of enhanced performance and effectiveness in solving complex optimization challenges. This algorithm is proposed as a valuable tool for improving optimization solutions in diverse fields.
Xiao et al. [30] introduced IHAOAVOA, a novel optimization algorithm that combines the Aquila optimizer and the African Vultures Optimization Algorithm (AVOA) to enhance global optimization performance. The proposed hybrid algorithm aims to achieve improved convergence and exploration capabilities for solving optimization problems, as demonstrated in mathematical biosciences and engineering applications.
Li et al. [31] introduced a novel optimization method for addressing expensive problems, employing multiple surrogate models. The approach leverages the Multimodal Expected Improvement Criterion to enhance efficiency in optimizing complex and resource-intensive problems, offering a promising solution for real-world applications.
Liu et al. [32] introduced a novel approach for global optimization by combining a reinforcement learning-based hybrid Aquila Optimizer with an enhanced Arithmetic Optimization Algorithm. The proposed method aims to improve the efficiency of global optimization tasks, leveraging both the reinforcement learning capabilities of the hybrid Aquila Optimizer and the enhanced Arithmetic Optimization Algorithm.
Wu et al. [33] propose an Adaptive Multi-Surrogate and Module-based Optimization Algorithm designed to address high-dimensional and computationally expensive problems. The algorithm employs a combination of adaptive surrogate modeling and modular optimization strategies, enhancing its efficiency in handling complex optimization tasks. This approach aims to improve the convergence and solution accuracy for challenging problems in various domains.
Baş [34] introduced a novel optimization algorithm, the Binary Aquila Optimizer, designed specifically for solving 0–1 knapsack problems. The proposed method demonstrates effectiveness in efficiently tackling combinatorial optimization challenges, as evidenced by its application in various engineering scenarios. This study highlights the algorithm's promising performance in addressing NP-hard problems associated with binary decision variables.
El-Mageed et al. [35] propose an enhanced feature selection strategy for supervised classification using an improved Binary Aquila Optimization Algorithm. The algorithm efficiently identifies and selects relevant features, optimizing classification performance. This approach demonstrates superior effectiveness in comparison to existing methods, highlighting its potential for enhancing the accuracy and efficiency of supervised classification tasks.
Deng [36] introduced the AO for global optimization but addressed its susceptibility to local optima by incorporating a Dynamic Grouping Strategy (DGS). The proposed DGSAO algorithm selectively evolves individuals with the worst fitness in each group, enhancing population diversity. Experimental results on 23 benchmark functions and two engineering design problems demonstrate that DGSAO is effective in solving global optimization tasks, outperforming standard AO and other algorithms in both low and high dimensions.
Ahmadipour et al. [37] propose an optimal power flow solution employing a hybrid algorithm that combines arithmetic optimization and the Aquila optimizer. This approach aims to enhance the efficiency of power system operations by optimizing the distribution of power while minimizing costs. The hybridization of these algorithms demonstrates improved performance in solving the optimal power flow problem.
Abualigah and Almotairi [38] propose a dynamic evolutionary approach for clustering data and text documents. Here, an enhanced Aquila optimizer has been introduced using an arithmetic optimization algorithm and differential evolution. This method aims to improve the efficiency and accuracy of document clustering in dynamic environments, providing a robust solution for evolving data sets.
Zeng et al. [39] introduced the Spiral Aquila optimizer, incorporating dynamic Gaussian mutation, and explored its applications in global optimization and engineering. The proposed optimizer aims to enhance convergence and solution accuracy. This study highlights its efficacy through experiments, showcasing its potential in addressing optimization challenges in diverse engineering domains.
Table 1 compares the key features of the proposed method with existing studies from different points of view.
3 Background
High-dimensional optimization problems with computationally expensive objective functions are prevalent in various domains, including engineering, finance, and scientific research [14]. These problems demand efficient optimization algorithms that can navigate complex solution spaces while minimizing the number of costly objective function evaluations [11]. This section explores the background of high-dimensional optimization, with a primary focus on the AO, and provides a summary of the compared optimization algorithms.
Table 1. Summary of the Related Works and the Proposed Algorithm (SMAO) Focusing on their Key Features.
Reference | Application Area | Optimization Technique | Main Contribution
Müller [17] | Multi-objective optimization | Surrogate optimization | Surrogate optimization for expensive multi-objective problems
García-García et al. [18] | Black-box optimization | Cooperative optimization | Cooperative optimization framework for expensive black-box problems
Yi et al. [19] | Multi-modal optimization, capacity planning | RBF surrogate | Framework for optimization of multi-modal problems with capacity planning application
Liu et al. [20] | Multi-objective optimization | Bayesian evolutionary optimization | Historical surrogate model ensemble for expensive many-objective problems
Zhang et al. [21] | MPPT, thermoelectric generation | Dynamic surrogate model | Dynamic surrogate model for MPPT in thermoelectric generation systems
Shi et al. [22] | Global optimization | Support vector regression | Multi-fidelity surrogate model based on SVR
Tabar et al. [23] | Spot welding sequence optimization | Surrogate model-based method | Surrogate model-based method for spot welding sequence optimization
Bamdad et al. [24] | Building energy optimization | Active sampling | Building energy optimization using surrogate models and active sampling
Pedrozo et al. [25] | Ethylene production, MILP | Surrogate-model-based MILP | Surrogate-model-based MILP for optimal ethylene production design
Akyol [26] | Global optimization | Hybrid method | Hybrid method for global optimization
Demiral et al. [27] | Aerodynamic shape optimization | Multi-fidelity surrogate-based approach | Surrogate-based approach for aerodynamic shape optimization
Nadimi-Shahraki et al. [28] | Feature selection, COVID-19 | Binary Aquila optimizer | Feature selection using the Binary Aquila optimizer for medical data
Yu et al. [29] | Global optimization, constrained engineering problems | Enhanced Aquila optimizer | Improved Aquila optimizer for global optimization and constrained engineering problems
Xiao et al. [30] | Global optimization | Improved hybrid Aquila optimizer, African vultures optimization | Improved hybrid optimizer for global optimization problems
Li et al. [31] | Global optimization | Surrogate-model-based optimization | Optimization method using the multimodal expected improvement criterion
Liu et al. [32] | Global optimization | Reinforcement learning-based hybrid Aquila optimizer | Reinforcement learning-based hybrid optimizer for global optimization
Wu et al. [33] | High-dimensional optimization | Adaptive multi-surrogate, module-based optimization | Adaptive multi-surrogate and module-based optimization algorithm
Baş [34] | 0–1 knapsack problems | Binary Aquila optimizer | Binary Aquila optimizer for 0–1 knapsack problems
El-Mageed et al. [35] | Feature selection, supervised classification | Improved binary Aquila optimization | Feature selection strategy for classification using the binary Aquila optimizer
Deng [36] | Global optimization | The improved Aquila optimizer | Improved second stage of AO, addition of a stage, dynamic grouping strategy
Ahmadipour et al. [37] | Optimal power flow | Hybridization of arithmetic optimization and the Aquila optimizer | Optimal power flow using a hybrid algorithm with arithmetic optimization and the Aquila optimizer
Abualigah and Almotairi [38] | Data and text document clustering | Improved Aquila optimizer with arithmetic optimization and differential evolution | A dynamic evolutionary approach for data and text document clustering using an improved Aquila optimizer
Zeng et al. [39] | Global optimization and engineering | Spiral Aquila optimizer with dynamic Gaussian mutation | Spiral Aquila optimizer with dynamic Gaussian mutation for global optimization in engineering applications
SMAO (presented in this paper) | High-dimensional optimization | Surrogate-model-based optimization | A Surrogate Model-based Aquila Optimizer (SMAO) that leverages machine learning techniques to approximate the objective function and solve high-dimensional, computationally expensive optimization problems
3.1 High-Dimensional Optimization Challenges
High-dimensional optimization presents formidable challenges due to the exponential growth in the solution space's complexity as the number of decision variables increases. Traditional optimization methods, including gradient-based techniques, often face the "curse of dimensionality," rendering them inefficient or impractical for high-dimensional problems [6, 7]. Moreover, when the objective function is computationally expensive to evaluate, the optimization process becomes even more daunting, as it necessitates a large number of function evaluations [11].
3.2 Aquila Optimizer (AO)
The AO is an optimization algorithm designed for efficiently exploring high-dimensional search spaces [8]. AO strikes a balance between exploration and exploitation, making it particularly suitable for complex optimization problems. Key components of AO are as follows:
(1) Initialization: Randomly initialize a population of candidate solutions within the search space.
(2) Exploration: Employ exploration operators to perturb candidate solutions, encouraging exploration of the solution space.
(3) Exploitation: Utilize exploitation operators to improve candidate solutions by leveraging the information gathered during exploration.
(4) Selection: Select the best-performing solutions from the population based on their fitness values.
(5) Termination Criteria: Determine termination conditions, such as a maximum number of iterations or achieving a convergence threshold.
Algorithm 1 shows the pseudocode for the AO algorithm.
Algorithm 1 Aquila Optimizer (AO) [8].
1: Initialize the population of candidate solutions randomly
2: while Termination criteria are not met do
3:   Perform exploration operations on a subset of solutions
4:   Perform exploitation operations on another subset of solutions
5:   Evaluate the fitness of all solutions
6:   Select the best-performing solutions based on fitness
7: end while
8: Return the best solution found
AO relies on fitness evaluation and operators for exploration and exploitation [8]. While the specifics of these formulas can vary depending on the problem, some commonly used expressions are given below.
Fitness Evaluation: The fitness of a candidate solution is typically determined by evaluating the objective function. In mathematical notation:

$$\mathrm{Fitness}(x) = f(x)$$

where $x$ is the candidate solution and $f(x)$ is the objective function.
Exploration and Exploitation Operators: These operators perturb or improve candidate solutions. The actual formulas depend on the chosen operators, which can include mutation, crossover, or other heuristics tailored to the problem at hand.
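To make these components concrete, the following is a minimal Python sketch of a fitness wrapper together with generic exploration and exploitation operators. It is a schematic illustration only: the perturbation rules and step sizes here are illustrative assumptions, not the exact AO update equations from [8].

import numpy as np

rng = np.random.default_rng(0)

def fitness(x, f):
    # Fitness(x) = f(x): the fitness of a candidate solution is simply the
    # value of the (possibly expensive) objective function at that point.
    return f(x)

def explore(x, lb, ub):
    # Exploration operator: a large uniform perturbation that pushes the
    # candidate toward unvisited regions of the search space.
    return np.clip(x + 0.5 * (ub - lb) * rng.uniform(-1.0, 1.0, x.shape), lb, ub)

def exploit(x, best, lb, ub):
    # Exploitation operator: a small random step toward the best solution
    # found so far, refining the current candidate.
    return np.clip(x + rng.uniform(0.0, 0.2) * (best - x), lb, ub)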
This overview provides a basic context for the AO algorithm, which serves as a foundational component of SMAO. SMAO leverages AO's exploration capabilities and integrates them with surrogate modeling techniques to efficiently solve high-dimensional optimization problems with computationally expensive objectives.
3.3 Compared Optimization Algorithms
In evaluating the effectiveness of SMAO, the surrogate model-based versions of several state-of-the-art optimization algorithms are considered for comparison. These algorithms represent a diverse set of approaches for tackling high-dimensional optimization problems with computationally expensive objective functions:
GOA: GOA [12] is a nature-inspired optimization algorithm known for its ability to swiftly converge to optimal solutions by mimicking the swift movements of gazelles in nature.
RSA: RSA [13] draws inspiration from the hunting strategies of reptiles and employs a population-based approach to explore complex search spaces efficiently.
PDO: PDO [14] is an optimization algorithm inspired by the foraging behavior of prairie dogs. It emphasizes exploration and information-sharing among candidate solutions.
FLA: FLA [15] is rooted in the principles of Fick's laws of diffusion and aims to optimize problems involving the diffusion of substances.
These algorithms are chosen for their diverse methodologies and their relevance to high-dimensional optimization with computationally expensive objectives. Our comparative analysis assesses their performance based on solution quality, convergence speed, and hit rate, providing insights into the efficacy of SMAO in this context.
In summary, the background of high-dimensional optimization highlights the formidable challenges posed by complex search spaces and computationally expensive objective functions. The AO serves as a foundation for SMAO, which leverages AO's exploration capabilities while integrating surrogate modeling techniques to address these challenges efficiently. The comparative analysis includes surrogate model-based versions of the above optimization algorithms, setting the stage for a comprehensive evaluation of SMAO's effectiveness.
4 Proposed Algorithm
Many real-world optimization problems involve complex computations. In such situations, evaluating fitness functions can be time-consuming, which limits the use of traditional optimization methods. To reduce computational costs, surrogate models or metamodels have been combined with Evolutionary Algorithms (EAs); the resulting methods are known as Surrogate-Assisted EAs (SAEAs).
SAEAs execute a limited number of evaluations of the real fitness functions and use these evaluations to train surrogate models. The surrogate models then act as approximations of the real functions and have negligible computational costs compared to evaluating the real functions.
RBFs are among the most widely used surrogate models in SAEAs. RBFs compute a weighted sum of predefined simple functions to approximate complex design landscapes. Given $t$ data points $x_1, x_2, \ldots, x_t$ with corresponding fitness values $f(x_1), f(x_2), \ldots, f(x_t)$, the estimated fitness value for a point $x$ under a trained RBF model is calculated using the following formula:

$$f_{RBF}(x) = \sum_{i=1}^{t} \omega_i \, \psi(\lVert x - x_i \rVert) + p(x)$$

Here, $\omega_i$ denotes the weights (coefficients) learned by the RBF model using the least squares method, $\lVert x - x_i \rVert$ is the Euclidean norm (the square root of the sum of squared differences in each dimension), and $\psi$ denotes the selected radial basis function. Several radial basis functions, such as the Gaussian ($\psi(r) = e^{-(\epsilon r)^2}$) and the multiquadric ($\psi(r) = \sqrt{1 + (\epsilon r)^2}$), can serve as the basis, where $\epsilon$ is a parameter known as the shape parameter, controlling the smoothness of the RBF model and influencing its generalization capability. Additionally, $p(x) = ax + b$ represents a linear function.
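As an illustration, here is a minimal NumPy sketch of fitting and querying such an RBF model with the multiquadric basis and a linear tail, solving for the weights by least squares as described above. The function names (rbf_fit, rbf_predict) and the plain least-squares solve are illustrative assumptions, not the paper's exact implementation.

import numpy as np

def rbf_fit(X, y, eps=1.0):
    # Fit f_RBF(x) = sum_i w_i * psi(||x - x_i||) + a.x + b with the
    # multiquadric basis psi(r) = sqrt(1 + (eps*r)^2), solving for the
    # stacked coefficients [w, a, b] by linear least squares.
    t = X.shape[0]
    r = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)  # pairwise ||x_i - x_j||
    A = np.hstack([np.sqrt(1.0 + (eps * r) ** 2), X, np.ones((t, 1))])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return X.copy(), coef, eps

def rbf_predict(model, Xq):
    # Evaluate the fitted surrogate at the query points Xq (one per row).
    Xc, coef, eps = model
    r = np.linalg.norm(Xq[:, None, :] - Xc[None, :, :], axis=2)
    A = np.hstack([np.sqrt(1.0 + (eps * r) ** 2), Xq, np.ones((Xq.shape[0], 1))])
    return A @ coef

For example, rbf_predict(rbf_fit(X, y), Xq) returns surrogate estimates for the query points Xq at negligible cost compared to the real objective.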
The proposed solution, SMAO, utilizes an RBF model. To do this, it first randomly generates a population (data) of size Tnpop and computes their fitness values. Then, it iterates the following steps until the maximum number of fitness function evaluations reaches MaxnFe:
(1) It creates an RBF model using this training data.
(2) It selects a parent population (X) of size Npop from this training data.
(3) Using the Aquila algorithm, it generates Npop offspring from the parents (X) and calculates the RBF value of each offspring x.
(4) It selects the offspring with the lowest RBF value, calculates its actual fitness value using the function f(x), and adds it to the training population (X_all) if it does not already exist there.
Figure 1 visually represents the SMAO algorithm's key steps, providing a comprehensive understanding of its workflow and its integration of surrogate modeling techniques to efficiently address high-dimensional optimization challenges.
Moreover, Algorithm 2 shows the pseudo-code of the SMAO algorithm. In this algorithm, bfType, Tnpop, and Npop denote the type of the basis functions, the initial population size (training data size), and the fixed population size, respectively. In line 1, Latin Hypercube Sampling (LHS) is employed to generate Tnpop aquilas (X_all), and then in line 2 the fitness of the aquilas is calculated (Xcost). In line 3, the main loop of the algorithm is started. In line 4, the rbf_build function is used to create an RBF model from the training data (X_all). Line 5 selects a parent population (X) of size Npop from this training data. In line 6, the Aquila algorithm is called to generate Npop children, whose RBF values are then calculated. In lines 8–13, the offspring with the lowest RBF value is selected, its actual fitness value is calculated using the real fitness function, and it is added to the training population (X_all) if it is not there already. Finally, the algorithm returns a solution in X_all with the lowest fitness.
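The LHS initialization in line 1 can be sketched with SciPy's quasi-Monte Carlo module as follows; the concrete dimension, sample size, and bounds here are examples only.

import numpy as np
from scipy.stats import qmc

dim, Tnpop = 30, 100
lb, ub = -5.12 * np.ones(dim), 5.12 * np.ones(dim)  # e.g., the F1 (Ellipsoid) bounds

# LHS stratifies every coordinate of the unit hypercube so the Tnpop points
# cover the space evenly, then the sample is scaled to [lb, ub].
sampler = qmc.LatinHypercube(d=dim, seed=0)
X_all = qmc.scale(sampler.random(n=Tnpop), lb, ub)  # shape (Tnpop, dim)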
5 Experiments
Table 2 provides a summary of the appropriate parameter values for the optimization algorithms being compared in the experimental evaluation. The parameters are configured for various dimensions (Dim = 30, 50, 100, and 200), with a fixed population size (Npop = 30) and a maximum number of function evaluations (MaxnFe = 1000). Each row represents a specific algorithm, and the corresponding parameter values are detailed to facilitate a fair and standardized comparison among the algorithms. bfType denotes the type of the basis functions, MQ stands for "Multiquadric", and the number of independent runs (Run_no) is 20.
Figure 1. The SMAO Algorithm Flowchart. (The flowchart depicts: specifying the inputs MaxnFe, f, lb, ub, dim, bfType, Tnpop, and Npop, together with the AO parameters; initializing the training population X_all of size Tnpop and evaluating it with f; then, while nFe <= MaxnFe, building an RBF model from X_all, selecting Npop parents, creating Npop children with the AO algorithm, evaluating the children with f_RBF, evaluating the best child with f and adding it to X_all if new, and incrementing nFe.)
Algorithm 2 The SMAO Algorithm.
Require: MaxnFe, f, lb, ub, dim, bfType, Tnpop, Npop, and also the original Aquila optimizer algorithm, i.e., AO()
Ensure: A solution with the lowest fitness
1: X_all ← Employ Latin Hypercube Sampling (LHS) to generate Tnpop aquilas
2: Xcost ← Evaluate X_all using the fitness function f; nfe ← Tnpop
3: while nfe <= MaxnFe do
4:   RBFmodel ← rbf_build(X_all, Xcost, bfType)
5:   X ← SelectParents(X_all, Npop)  // the Aquila algorithm is then called to generate Npop children
6:   X_child ← AO(X, Npop)
7:   X_child_cost ← Evaluate X_child using f_RBF
8:   Xmin ← Child with the lowest cost
9:   Xmin_cost ← f(Xmin)
10:  if Xmin ∉ X_all then
11:    X_all ← X_all + Xmin
12:    Xcost ← Xcost + Xmin_cost
13:  end if
14:  nfe ← nfe + 1
15: end while
16: return Solution in X_all with the lowest fitness
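For readers who prefer executable form, the following Python sketch mirrors Algorithm 2. It reuses the rbf_fit/rbf_predict helpers sketched in Section 4, aquila_offspring is a deliberately simplified placeholder for the original AO() step, and the best-Npop parent selection rule is likewise an assumption, since the pseudocode does not fix one.

import numpy as np
from scipy.stats import qmc

def aquila_offspring(parents, lb, ub, rng):
    # Placeholder for the original AO() step: contract each parent toward
    # the best parent (row 0) and add noise. The real operator applies
    # AO's exploration/exploitation update rules [8].
    noise = 0.05 * (ub - lb) * rng.standard_normal(parents.shape)
    return np.clip(parents + 0.3 * (parents[0] - parents) + noise, lb, ub)

def smao(f, lb, ub, dim, Tnpop=100, Npop=30, max_nfe=1000, eps=1.0, seed=0):
    rng = np.random.default_rng(seed)
    # Lines 1-2: LHS initialization and true fitness evaluation.
    X_all = qmc.scale(qmc.LatinHypercube(d=dim, seed=seed).random(Tnpop), lb, ub)
    X_cost = np.array([f(x) for x in X_all])
    nfe = Tnpop
    while nfe <= max_nfe:                                   # line 3
        model = rbf_fit(X_all, X_cost, eps=eps)             # line 4: surrogate
        parents = X_all[np.argsort(X_cost)[:Npop]]          # line 5 (best-Npop assumed)
        children = aquila_offspring(parents, lb, ub, rng)   # line 6
        child_cost = rbf_predict(model, children)           # line 7: cheap f_RBF values
        x_min = children[np.argmin(child_cost)]             # line 8
        x_min_cost = f(x_min)                               # line 9: one true evaluation
        if not any(np.allclose(x_min, x) for x in X_all):   # lines 10-13
            X_all = np.vstack([X_all, x_min])
            X_cost = np.append(X_cost, x_min_cost)
        nfe += 1                                            # line 14
    best = np.argmin(X_cost)                                # line 16
    return X_all[best], X_cost[best]

Under these assumptions, a call such as smao(lambda x: float(np.sum(x**2)), -5.12, 5.12, dim=30) runs the loop with Tnpop initial samples plus one true evaluation per iteration.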
Table 2. Appropriate Parameter Values for the Compared Algorithms (Dim = 30, 50, 100, and 200; Npop = 30; MaxnFe = 1000).
Algorithm | Appropriate Parameter Values
SMAO | bfType = MQ, Tnpop = (specific values)
SMGOA | PSRs = 0.34, S = 0.88
SMRSA | Alpha = 0.1, Beta = 0.005
SMPDO | Rho = 0.005, epsPD = 0.1
SMFLA | C1 = 0.5, C2 = 2, C3 = 0.1, C4 = 0.2, C5 = 2, D = 0.01
Here, for the sake of completeness, a brief overview of the compared algorithms (listed in Table 2) and their appropriate parameter values is given:
SMAO: The parameter bfType is set to MQ, and Tnpop takes the dimension-dependent values given below (100 for dim <= 50, otherwise 200).
Surrogate Model-based GOA (SMGOA): The parameters PSRs and S are defined as 0.34 and 0.88, respectively, influencing the exploration and exploitation characteristics of SMGOA.
Surrogate Model-based RSA (SMRSA): The parameters Alpha and Beta are configured as 0.1 and 0.005, respectively, governing SMRSA's exploration and exploitation strategies.
Surrogate Model-based PDO (SMPDO): The parameters Rho and epsPD are set to 0.005 and 0.1, respectively, influencing SMPDO's foraging behavior.
Surrogate Model-based FLA (SMFLA): Several parameters, including C1, C2, C3, C4, C5, and D, are specified with values to guide SMFLA's optimization process based on Fick's laws of diffusion.
These parameter configurations ensure that each algorithm operates consistently across different dimensions, enabling a fair and meaningful comparison of their performance in solving high-dimensional optimization problems with computationally expensive objectives.
The compared algorithms were executed using MATLAB 2017a on an Intel Core i5 CPU with 6 GB of RAM. The results showcased in the following tables (Tables 4-7) encompass values from 20 individual runtime trials, including the best, worst, average (Ave), and standard deviation (Std) values. The initial population size (training data size), i.e., Tnpop, is set to 100 when dim <= 50, and to 200 otherwise.
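As a compact restatement of these settings and those of Table 2, the experimental configuration can be recorded as plain Python dictionaries; the names mirror the paper's notation and nothing here goes beyond the reported values.

# Experimental configuration as reported in Table 2 and the text above.
common = {"Npop": 30, "MaxnFe": 1000, "dims": [30, 50, 100, 200], "runs": 20}

params = {
    "SMAO":  {"bfType": "MQ"},   # Tnpop set per dimension, see tnpop() below
    "SMGOA": {"PSRs": 0.34, "S": 0.88},
    "SMRSA": {"Alpha": 0.1, "Beta": 0.005},
    "SMPDO": {"Rho": 0.005, "epsPD": 0.1},
    "SMFLA": {"C1": 0.5, "C2": 2, "C3": 0.1, "C4": 0.2, "C5": 2, "D": 0.01},
}

def tnpop(dim):
    # Training data size rule stated in the text.
    return 100 if dim <= 50 else 200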
5.1 Benchmark Test Functions
Table 3 provides details of several benchmark test functions [16] that are commonly used in the evaluation of optimization algorithms. These functions serve as standardized challenges for assessing the performance of optimization techniques across different problem types. Below is a brief description of each test function:
F1–Ellipsoid: This is a unimodal function characterized by a single global optimum. It is defined over a bounded range from -5.12 to 5.12, with the optimum value at 0.
F2–Rosenbrock: F2 is a multimodal function with a narrow valley leading to the global minimum. It operates within the range [-2.048, 2.048] and has its optimum value at 0.
F3–Ackley: Ackley is a multimodal function known for its complex landscape. It spans a wide range from -32.768 to 32.768, and the global optimum is at 0.
F4–Griewank: F4 is another multimodal function with a broader search space ranging from -600 to 600. Like the previous functions, its global optimum is at 0.
F5–Shifted Rotated Rastrigin: This function is exceptionally complicated and multimodal. It operates within a narrow range of [-5, 5] and has a global minimum of -330.
F6–Rotated Hybrid Composition Function: F6 represents a highly complicated multimodal function. Its range is also [-5, 5], and it possesses a global optimum of 120.
F7–Rotated Hybrid Composition Function: Similar to F6, F7 is a very complex multimodal function with a range of [-5, 5]. Its global minimum is situated at 10.
These benchmark test functions [16] are used to rigorously evaluate optimization algorithms, including the proposed SMAO. They provide diverse challenges in terms of multimodality, dimensionality, and landscape complexity, making them suitable for comprehensive algorithm assessment.
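For concreteness, the first four functions can be implemented in a few lines of NumPy. The expressions below are the standard textbook forms (the Ellipsoid variant with coefficients i is one common choice); F5-F7 are shifted/rotated composition functions from [16] that require the official shift and rotation data files and are therefore omitted here.

import numpy as np

def ellipsoid(x):   # F1, unimodal; one common form: sum_i i * x_i^2
    i = np.arange(1, x.size + 1)
    return float(np.sum(i * x**2))

def rosenbrock(x):  # F2, narrow curved valley leading to the minimum at (1, ..., 1)
    return float(np.sum(100.0 * (x[1:] - x[:-1]**2)**2 + (1.0 - x[:-1])**2))

def ackley(x):      # F3, highly multimodal with a single global basin at the origin
    d = x.size
    return float(-20.0 * np.exp(-0.2 * np.sqrt(np.sum(x**2) / d))
                 - np.exp(np.sum(np.cos(2 * np.pi * x)) / d) + 20.0 + np.e)

def griewank(x):    # F4, many regularly spaced local optima
    i = np.arange(1, x.size + 1)
    return float(np.sum(x**2) / 4000.0 - np.prod(np.cos(x / np.sqrt(i))) + 1.0)

These forms match the ranges and optima listed in Table 3; for instance, ackley(np.zeros(30)) evaluates to approximately 0.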
Table 4 presents the results of the proposed SMAO algorithm compared to the other algorithms (SMGOA, SMRSA, SMPDO, and SMFLA) on a set of benchmark test functions with a dimension of 30, showcasing the results in terms of the "Best", "Worst", "Ave" (average), "Std" (standard deviation), and "Time" (computation time) metrics. In terms of the "Best" values, SMAO consistently outperforms or matches its counterparts in most functions, showcasing its effectiveness in finding solutions with the lowest fitness. However, the "Worst" values for SMAO are generally competitive but occasionally surpassed by other algorithms. In terms of average performance ("Ave"), SMAO again demonstrates strong results. The standard deviation ("Std") values suggest that SMAO provides solutions with relatively stable performance. Regarding computation time ("Time"), SMAO exhibits competitive efficiency compared to the other algorithms. In summary, the SMAO algorithm performs admirably across multiple metrics, showcasing its robustness and efficiency in solving the benchmark test functions.
Table 5 summarizes the performance of the different optimization algorithms on the benchmark test functions with a dimension of 50. For function F1, SMAO achieves the best result in terms of the "Best" metric, outperforming the other algorithms, while maintaining competitive values for the "Worst" and "Ave" metrics. Additionally, SMAO exhibits relatively low standard deviation ("Std") values, suggesting consistency in its performance. In terms of computation time ("Time"), SMAO generally performs well, with comparable or lower values compared to the other algorithms. This suggests that SMAO is a competitive algorithm across various metrics, showcasing its robustness and efficiency in solving the specified benchmark test functions. Further analysis across the different functions in the table would provide a comprehensive understanding of the algorithm's overall performance.
Table 3. Details of Benchmark Test Functions (Dim = 30, 50, 100, and 200).
Problem/Function (P/F) | Name | Characteristics | Range | Optimum Value
F1 | Ellipsoid | Unimodal | [-5.12, 5.12] | 0
F2 | Rosenbrock | Multimodal with narrow valley | [-2.048, 2.048] | 0
F3 | Ackley | Multimodal | [-32.768, 32.768] | 0
F4 | Griewank | Multimodal | [-600, 600] | 0
F5 | Shifted Rotated Rastrigin (F10 in [16]) | Very complicated multimodal | [-5, 5] | -330
F6 | Rotated hybrid composition function (F16 in [16]) | Very complicated multimodal | [-5, 5] | 120
F7 | Rotated hybrid composition function (F19 in [16]) | Very complicated multimodal | [-5, 5] | 10
Table 4. Results of Algorithms on the Benchmark Test Functions (Dim = 30).
Function | Metric | SMAO | SMGOA | SMRSA | SMPDO | SMFLA
F1
Best 1.43E-30 6.99E-06 7.60E+02 1.68E+02 3.27E-10
Worst 7.80E-20 9.62E-01 1.89E+03 3.52E+02 6.67E-08
Ave 1.95E-20 3.21E-01 1.17E+03 2.86E+02 2.28E-08
Std 3.90E-20 5.55E-01 6.23E+02 1.02E+02 3.81E-08
Time 0.146 0.572 0.641 0.731 0.258
F2
Best 2.87E+01 2.78E+01 2.21E+03 5.79E+02 3.71E+00
Worst 2.87E+01 2.83E+01 6.58E+03 6.83E+02 1.01E+01
Ave 2.87E+01 2.81E+01 4.40E+03 6.23E+02 6.72E+00
Std 1.28E-04 2.63E-01 2.19E+03 5.35E+01 3.22E+00
Time 0.154 0.602 0.732 0.814 0.161
F3
Best 6.08E-13 6.23E-03 8.88E-16 1.99E+01 4.63E-05
Worst 1.66E-06 7.28E-01 8.88E-16 2.05E+01 2.36E-04
Ave 4.16E-07 2.48E-01 8.88E-16 2.02E+01 1.18E-04
Std 8.31E-07 4.15E-01 0.00E+00 3.22E-01 1.03E-04
Time 0.159 0.603 0.832 0.936 0.289
F4
Best 0.00E+00 4.84E-04 0.00E+00 1.58E+00 9.25E-08
Worst 0.00E+00 8.30E-02 0.00E+00 1.85E+00 1.24E-05
Ave 0.00E+00 3.38E-02 0.00E+00 1.71E+00 4.32E-06
Std 0.00E+00 4.35E-02 0.00E+00 1.34E-01 7.01E-06
Time 0.160 0.603 0.762 0.843 0.580
F5
Best -4.78E+01 -3.15E+01 6.47E+02 1.33E+02 -5.63E+01
Worst 5.52E+01 4.54E+01 6.70E+02 2.52E+02 1.60E+02
Ave 9.49E-01 1.66E+00 6.62E+02 2.02E+02 2.39E+01
Std 4.22E+01 3.95E+01 1.31E+01 6.17E+01 1.19E+02
Time 0.174 0.962 1.315 1.731 1.624
F6
Best 7.13E+02 4.52E+02 1.46E+03 6.73E+02 4.25E+02
Worst 9.35E+02 5.08E+02 1.75E+03 7.95E+02 5.60E+02
Ave 8.08E+02 4.87E+02 1.61E+03 7.45E+02 4.77E+02
Std 9.34E+01 3.05E+01 1.45E+02 6.36E+01 7.24E+01
Time 0.179 2.821 1.935 2.834 2.532
F7
Best 9.10E+02 9.91E+02 9.10E+02 1.21E+03 9.23E+02
Worst 9.10E+02 1.00E+03 9.10E+02 1.30E+03 9.28E+02
Ave 9.10E+02 9.95E+02 9.10E+02 1.24E+03 9.25E+02
Std 0.00E+00 4.53E+00 0.00E+00 5.37E+01 2.67E+00
Time 0.165 2.943 2.184 2.951 3.821
Table 6 summarizes the performance of the different optimization algorithms on the benchmark test functions with a dimension of 100. Analyzing the SMAO algorithm's performance, it consistently achieves competitive results, indicated by its low "Best" values, reasonable "Worst" and "Ave" values, and relatively low "Std" values across all functions. In terms of computational efficiency, SMAO also exhibits favorable "Time" values compared to the other algorithms. Notably, SMAO outperforms SMGOA and SMRSA in terms of "Best" values, indicating superior optimization capabilities. Additionally, SMAO demonstrates competitive or better performance in terms of "Worst" and "Ave" values, signifying robustness and effectiveness. The "Std" values suggest that SMAO provides reliable and consistent results. Overall, the table underscores SMAO's efficacy in comparison to the other algorithms across diverse benchmark functions.
Table 5. Results of Algorithms on the Benchmark Test Functions (Dim = 50).
Function | Metric | SMAO | SMGOA | SMRSA | SMPDO | SMFLA
F1
Best 1.53E-25 1.32E-03 1.68E+03 9.96E+02 1.68E+03
Worst 1.25E-16 4.33E-02 4.54E+03 1.22E+03 4.54E+03
Ave 3.12E-17 1.57E-02 3.10E+03 1.07E+03 3.10E+03
Std 6.23E-17 2.39E-02 1.43E+03 1.30E+02 1.43E+03
Time 0.227 0.323 0.662 0.692 0.221
F2
Best 4.85E+01 4.85E+01 8.80E+03 1.02E+03 8.80E+03
Worst 4.85E+01 5.35E+01 1.36E+04 2.73E+03 1.36E+04
Ave 4.85E+01 5.05E+01 1.19E+04 1.85E+03 1.19E+04
Std 4.65E-06 2.63E+00 2.65E+03 8.56E+02 2.65E+03
Time 0.194 0.374 0.732 0.730 0.210
F3
Best 1.04E-10 2.67E-02 8.88E-16 2.04E+01 8.88E-16
Worst 5.91E-08 2.35E-01 1.21E+01 2.08E+01 1.21E+01
Ave 2.04E-08 9.73E-02 8.09E+00 2.06E+01 8.09E+00
Std 2.72E-08 1.19E-01 7.00E+00 1.92E-01 7.00E+00
Time 0.183 0.799 0.831 0.838 0.268
F4
Best 0.00E+00 5.46E-02 0.00E+00 2.21E+00 0.00E+00
Worst 2.23E-01 5.18E-01 0.00E+00 2.31E+00 0.00E+00
Ave 5.58E-02 2.26E-01 0.00E+00 2.27E+00 0.00E+00
Std 1.12E-01 2.54E-01 0.00E+00 5.27E-02 0.00E+00
Time 0.183 0.757 0.973 0.892 0.297
F5
Best 3.25E+02 2.92E+02 1.06E+03 4.17E+02 1.06E+03
Worst 4.04E+02 3.38E+02 1.53E+03 6.98E+02 1.53E+03
Ave 3.53E+02 3.19E+02 1.28E+03 5.24E+02 1.28E+03
Std 3.61E+01 2.40E+01 2.35E+02 1.52E+02 2.35E+02
Time 0.182 0.990 1.032 1.319 0.715
F6
Best 6.82E+02 5.46E+02 1.12E+03 6.57E+02 1.12E+03
Worst 1.20E+03 5.96E+02 1.62E+03 9.16E+02 1.62E+03
Ave 9.29E+02 5.63E+02 1.42E+03 7.76E+02 1.42E+03
Std 2.20E+02 2.86E+01 2.66E+02 1.31E+02 2.66E+02
Time 0.186 3.921 4.632 3.842 2.521
F7
Best 9.10E+02 1.07E+03 9.10E+02 1.23E+03 9.10E+02
Worst 9.10E+02 1.13E+03 1.47E+03 1.28E+03 1.47E+03
Ave 9.10E+02 1.10E+03 1.27E+03 1.26E+03 1.27E+03
Std 0.00E+00 2.70E+01 3.15E+02 2.51E+01 3.15E+02
Time 0.180 4.123 4.827 4.059 3.103
Table 7 presents a comprehensive analysis of algorithm performance on the benchmark test functions in a high-dimensional space (Dim = 200). In terms of the "Best" values, SMAO consistently demonstrates competitive performance, often achieving significantly lower values, particularly in functions F1, F3, F4, and F7, suggesting its effectiveness in finding optimal solutions. However, in terms of "Worst" values, SMAO shows higher values compared to SMGOA and SMPDO in some functions, indicating occasional suboptimal outcomes. The "Ave" values for SMAO generally fall within the range of the other algorithms, suggesting a balanced overall performance. Standard deviation ("Std") values for SMAO are relatively low, especially in F1 and F3, indicating consistent results across multiple runs. The time taken by SMAO ("Time") is comparable to or lower than that of the other algorithms in most cases, highlighting its efficiency. Overall, while SMAO demonstrates strong competitiveness in finding optimal solutions, its occasional suboptimal outcomes and efficient runtime make it a noteworthy algorithm in the context of these benchmark test functions.
Table 6. Results of Algorithms on the Benchmark Test Functions (Dim = 100).
Function | Metric | SMAO | SMGOA | SMRSA | SMPDO | SMFLA
F1
Best 1.85E-21 2.19E-01 7.86E+03 5.36E+03 2.57E-08
Worst 3.21E-13 2.64E+00 1.35E+04 8.17E+03 6.42E-07
Ave 8.02E-14 1.12E+00 1.03E+04 6.80E+03 2.54E-07
Std 1.60E-13 1.33E+00 2.89E+03 1.41E+03 3.38E-07
Time 0.261 0.415 0.672 0.739 0.247
F2
Best 9.80E+01 9.88E+01 1.98E+04 2.92E+03 4.50E+00
Worst 9.80E+01 1.89E+02 2.63E+04 3.71E+04 2.75E+01
Ave 9.80E+01 1.35E+02 2.37E+04 1.56E+04 1.61E+01
Std 1.51E-05 4.73E+01 3.47E+03 1.87E+04 1.15E+01
Time 0.263 0.796 0.593 0.842 0.236
F3
Best 5.58E-07 6.59E-02 8.88E-16 2.08E+01 1.77E-04
Worst 4.69E-06 3.59E-01 9.64E+00 2.08E+01 2.29E-04
Ave 2.02E-06 2.27E-01 6.43E+00 2.08E+01 2.11E-04
Std 1.92E-06 1.49E-01 5.57E+00 1.66E-02 2.94E-05
Time 0.261 0.727 0.741 0.992 0.308
F4
Best 0.00E+00 1.30E-01 0.00E+00 3.40E+00 4.28E-09
Worst 1.44E+00 9.69E-01 0.00E+00 3.61E+00 1.66E-06
Ave 6.43E-01 4.66E-01 0.00E+00 3.49E+00 7.41E-07
Std 7.54E-01 4.44E-01 0.00E+00 1.08E-01 8.45E-07
Time 0.241 0.490 0.842 0.941 0.340
F5
Best 1.66E+03 1.43E+03 2.48E+03 1.74E+03 1.31E+03
Worst 1.98E+03 1.52E+03 2.86E+03 1.85E+03 1.56E+03
Ave 1.77E+03 1.46E+03 2.63E+03 1.78E+03 1.43E+03
Std 1.52E+02 4.95E+01 2.05E+02 5.93E+01 1.26E+02
Time 0.247 1.495 0.942 1.043 1.931
F6
Best 9.54E+02 6.21E+02 1.58E+03 8.31E+02 5.55E+02
Worst 1.53E+03 6.72E+02 1.77E+03 9.81E+02 6.66E+02
Ave 1.26E+03 6.43E+02 1.71E+03 9.01E+02 5.96E+02
Std 3.03E+02 2.60E+01 1.06E+02 7.53E+01 6.07E+01
Time 0.279 2.831 1.425 2.531 2.741
F7
Best 9.10E+02 9.11E+02 1.44E+03 1.31E+03 1.11E+03
Worst 9.10E+02 1.35E+03 1.60E+03 1.41E+03 1.16E+03
Ave 9.10E+02 1.08E+03 1.53E+03 1.36E+03 1.14E+03
Std 5.63E-12 2.32E+02 8.36E+01 5.02E+01 2.47E+01
Time 0.253 3.192 2.462 3.674 3.216
Also, to further analyze the performance of SMAO, in addition to MQ (Multiquadric) it has been implemented with other basis function types (bfType): BH (Biharmonic), IMQ (Inverse multiquadric), TPS (Thin plate spline), and G (Gaussian). Table 8 shows the results. According to this table, SMAO attains its highest performance with the MQ basis function type.
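The tested basis function types can be sketched as follows. Since the paper does not spell out each formula, the expressions below are the common textbook forms and should be read as assumptions; in particular, BH is taken here as the linear biharmonic form psi(r) = r.

import numpy as np

def psi(r, kind="MQ", eps=1.0):
    # Common textbook radial basis functions for distance array r >= 0.
    if kind == "MQ":    # multiquadric
        return np.sqrt(1.0 + (eps * r) ** 2)
    if kind == "IMQ":   # inverse multiquadric
        return 1.0 / np.sqrt(1.0 + (eps * r) ** 2)
    if kind == "G":     # Gaussian
        return np.exp(-(eps * r) ** 2)
    if kind == "TPS":   # thin plate spline: r^2 * log(r), defined as 0 at r = 0
        rr = np.where(r > 0, r, 1.0)  # avoid log(0); the value is unused at r = 0
        return np.where(r > 0, r**2 * np.log(rr), 0.0)
    if kind == "BH":    # biharmonic (linear in r), assumed form
        return r
    raise ValueError(f"unknown basis function type: {kind}")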
Table 7. Results of Algorithms on the Benchmark Test Functions (Dim = 200).
Function | Metric | SMAO | SMGOA | SMRSA | SMPDO | SMFLA
F1
Best 2.46E-20 2.63E+00 1.99E+04 2.76E+04 2.27E-06
Worst 1.08E-17 5.93E+02 3.83E+04 3.20E+04 5.74E-05
Ave 4.56E-18 2.00E+02 2.98E+04 2.94E+04 2.12E-05
Std 4.81E-18 3.41E+02 9.25E+03 2.30E+03 3.14E-05
Time 2.147 1.532 1.943 2.031 1.734
F2
Best 1.97E+02 1.99E+02 4.65E+04 7.73E+04 6.30E+01
Worst 1.97E+02 1.99E+02 5.82E+04 8.06E+04 8.82E+01
Ave 1.97E+02 1.99E+02 5.36E+04 7.90E+04 7.60E+01
Std 7.01E-05 2.04E-01 6.25E+03 1.64E+03 1.26E+01
Time 1.921 1.738 2.104 2.132 1.842
F3
Best 1.30E-06 7.26E-02 8.88E-16 2.10E+01 1.01E-04
Worst 9.09E-05 3.77E-01 9.64E+00 2.11E+01 2.69E-04
Ave 3.14E-05 1.97E-01 3.21E+00 2.11E+01 1.95E-04
Std 4.19E-05 1.59E-01 5.57E+00 2.01E-02 8.59E-05
Time 1.920 2.165 2.526 2.363 1.923
F4
Best 0.00E+00 3.97E-01 0.00E+00 6.40E+00 6.68E-07
Worst 2.22E-16 8.80E-01 0.00E+00 7.15E+00 4.60E-05
Ave 5.55E-17 5.61E-01 0.00E+00 6.79E+00 1.65E-05
Std 1.11E-16 2.77E-01 0.00E+00 3.79E-01 2.56E-05
Time 1.915 2.421 2.462 2.542 2.194
F5
Best 4.46E+03 3.75E+03 5.66E+03 4.28E+03 3.64E+03
Worst 5.16E+03 3.99E+03 6.10E+03 4.40E+03 4.39E+03
Ave 4.95E+03 3.86E+03 5.92E+03 4.36E+03 4.00E+03
Std 3.29E+02 1.24E+02 2.28E+02 6.52E+01 3.77E+02
Time 3.050 3.142 3.241 3.621 2.842
F6
Best 1.04E+03 8.44E+02 1.72E+03 8.62E+02 8.18E+02
Worst 1.60E+03 9.84E+02 1.87E+03 1.15E+03 8.61E+02
Ave 1.44E+03 9.35E+02 1.81E+03 9.99E+02 8.45E+02
Std 2.68E+02 7.88E+01 7.79E+01 1.44E+02 2.34E+01
Time 2.147 3.532 3.642 3.732 3.103
F7
Best 9.10E+02 9.10E+02 1.40E+03 1.34E+03 9.10E+02
Worst 9.10E+02 9.24E+02 1.51E+03 1.36E+03 1.26E+03
Ave 9.10E+02 9.15E+02 1.45E+03 1.35E+03 1.13E+03
Std 2.87E-11 7.98E+00 5.79E+01 1.22E+01 1.94E+02
Time 2.423 3.842 3.742 3.942 3.214
5.2 Discussion
This section presents a detailed analysis of the results in Tables 4-8, encompassing a comprehensive statistical assessment. The evaluation covers various metrics, including Mean Rank, Rank, and Hit Rate, across the set of seven test functions, as illustrated in Figure 2.
5.2.1 Friedman Test
To establish the relative performance of each algorithm, the Friedman test, a non-parametric statistical hypothesis test well-suited for comparing multiple related samples [40], is used to ascertain the average ranking of each algorithm. Additionally, the Hit Rate associated with each algorithm provides valuable insights into its overall performance. This metric signifies the degree to which an algorithm achieves solutions closely resembling the optimal ones, as elucidated in Tables 4-8. The findings depicted in Figure 2 demonstrate that the SMAO algorithm consistently attains the highest rank, underscoring its proficiency in generating solutions that closely approximate the optimal outcomes with minimal deviation. Notably, the SMAO algorithm achieves a remarkable Hit Rate of 14.28% across the evaluated test functions, implying its capability to produce solutions aligning with optimal outcomes in a significant proportion of instances.
Table 8. Results of SMAO on the Benchmark Test Functions with Various Basis Function Types (Dim = 200).
Function | Metric | BH | IMQ | TPS | G | MQ
F1
Best 1.64E-07 3.95E-38 2.66E-30 3.48E-30 2.46E-20
Worst 1.67E-01 7.45E-25 2.01E-23 6.73E-26 1.08E-17
Ave 5.78E-02 2.51E-25 6.86E-24 2.29E-26 4.56E-18
Std 9.47E-02 4.28E-25 1.15E-23 3.85E-26 4.81E-18
Time 1.05 2.381 2.553 2.403 2.147
F2
Best 1.97E+02 1.97E+02 1.89E+01 1.97E+02 1.97E+02
Worst 1.98E+02 1.97E+02 8.62E+01 1.97E+02 1.97E+02
Ave 1.98E+02 1.97E+02 5.45E+01 1.97E+02 1.97E+02
Std 6.14E-01 1.01E-06 3.38E+01 3.08E-05 7.01E-05
Time 1.88 2.114 2.292 2.183 1.921
F3
Best 8.88E-16 2.22E-14 2.22E-14 8.88E-16 1.30E-06
Worst 8.88E-16 6.16E-06 1.22E-13 3.34E-11 9.09E-05
Ave 8.88E-16 2.05E-06 5.77E-14 1.19E-11 3.14E-05
Std 0.00E+00 3.55E-06 5.55E-14 1.86E-11 4.19E-05
Time 2.221 2.169 2.263 2.173 1.92
F4
Best 8.31E-03 7.86E-01 0.00E+00 0.00E+00 0.00E+00
Worst 1.01E+00 9.98E-01 0.00E+00 9.85E-01 2.22E-16
Ave 3.48E-01 9.21E-01 0.00E+00 6.08E-01 5.55E-17
Std 5.77E-01 1.17E-01 0.00E+00 5.32E-01 1.11E-16
Time 2.115 2.105 2.325 2.221 2.049
F5
Best 5.52E+03 5.10E+03 5.28E+03 5.10E+03 4.46E+03
Worst 5.61E+03 5.29E+03 5.31E+03 5.30E+03 5.16E+03
Ave 5.58E+03 5.22E+03 5.30E+03 5.22E+03 4.95E+03
Std 5.05E+01 1.04E+02 1.41E+01 1.05E+02 3.29E+02
Time 2.144 2.044 2.241 2.196 3.05
F6
Best 1.29E+03 1.75E+03 1.70E+03 1.76E+03 1.04E+03
Worst 1.71E+03 1.79E+03 1.71E+03 1.82E+03 1.60E+03
Ave 1.47E+03 1.77E+03 1.70E+03 1.79E+03 1.44E+03
Std 2.19E+02 2.01E+01 7.24E+00 3.49E+01 2.68E+02
Time 2.073 1.733 1.709 1.623 2.147
F7
Best 9.10E+02 9.10E+02 9.10E+02 9.10E+02 9.10E+02
Worst 9.10E+02 9.10E+02 9.10E+02 9.10E+02 9.10E+02
Ave 9.10E+02 9.10E+02 9.10E+02 9.10E+02 9.10E+02
Std 9.71E-03 0.00E+00 0.00E+00 0.00E+00 2.87E-11
Time 2.14 1.648 1.369 1.585 2.423
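A sketch of how such mean ranks and the Friedman statistic can be computed with SciPy is given below; the matrix here is a random stand-in, and in the paper's setting it would hold the per-function "Ave" values of the five algorithms from Tables 4-7.

import numpy as np
from scipy.stats import friedmanchisquare, rankdata

# Stand-in result matrix: rows = the 7 test functions, columns = the 5
# algorithms (SMAO, SMGOA, SMRSA, SMPDO, SMFLA).
rng = np.random.default_rng(1)
ave = rng.random((7, 5))

# Mean rank of each algorithm across the test functions (lower is better
# when the entries are objective values to be minimized).
mean_ranks = rankdata(ave, axis=1).mean(axis=0)

# Friedman test: do the algorithms differ significantly in their rankings?
stat, p = friedmanchisquare(*ave.T)
print(mean_ranks, stat, p)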
5.2.2 Wilcoxon Signed-Rank Test
The Wilcoxon signed-rank test is a non-parametric statistical method employed to compare two samples [41]. Much like the Friedman test, this analysis can be conducted using the SPSS toolbox. For a pair of algorithms denoted as X and Y, this test furnishes four key output statistics: (1) R-, representing the sum of negative ranks; (2) R+, indicating the sum of positive ranks; (3) R=, denoting the sum of equal ranks; and (4) Asymp. Sig., which serves as a p-value. The values of R- and R+ signify the cumulative ranks where X outperforms and underperforms Y, respectively, while R= represents the sum of ranks where X and Y exhibit equal performance. Furthermore, if the p-value is less than 0.05, it leads to the conclusion that X is significantly superior to Y. Table 9 displays the Wilcoxon signed-rank test results for the performance of SMAO compared to the other algorithms. For each pair of algorithms, the R-, R+, R=, and p-values are calculated. According to this table, in all pair comparisons the values of R- are much greater than those of R+ and R=, which shows that SMAO has better performance compared to the others. Moreover, in all pair comparisons the p-values are less than 0.05, so it is concluded that SMAO is significantly superior to the others.

Figure 2. The Statistical Analysis of the Reported Results. (Bar charts of the Mean Rank, Rank, and Hit Rate of the compared algorithms.)
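The R-, R+, and R= statistics and the corresponding p-value can be reproduced with SciPy as sketched below; the paired samples are random stand-ins for the per-problem results of two algorithms.

import numpy as np
from scipy.stats import rankdata, wilcoxon

# Stand-in paired results for algorithms X (e.g., SMAO) and Y over n problems;
# substitute the per-function values used for Table 9 when reproducing it.
rng = np.random.default_rng(2)
x = rng.random(20)
y = x + np.abs(rng.normal(0.1, 0.05, 20))  # here Y is worse on every problem

d = x - y
ranks = rankdata(np.abs(d))
R_minus = ranks[d < 0].sum()   # ranks where X outperforms Y (minimization)
R_plus = ranks[d > 0].sum()    # ranks where X underperforms Y
R_equal = ranks[d == 0].sum()  # ranks where the two tie

stat, p = wilcoxon(x, y)       # Asymp. Sig. corresponds to this p-value
print(R_minus, R_plus, R_equal, p)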
5.3 Convergence Analysis
Convergence analysis is a crucial aspect of comparing metaheuristic algorithms, as it provides insights into how quickly and effectively these algorithms approach optimal or near-optimal solutions [42]. By assessing the convergence behavior, valuable knowledge about an algorithm's performance in terms of solution quality and robustness can be obtained, helping us make informed choices when selecting the most suitable algorithm for solving real-world optimization problems. Understanding how different metaheuristics perform during the optimization process is essential for practitioners and researchers seeking to apply these techniques to a wide range of complex problems, ultimately enabling the development of more efficient and effective optimization strategies.
Table 9. Results of the Wilcoxon Signed-Rank Test.
Metric | SMAO vs. SMGOA | SMAO vs. SMRSA | SMAO vs. SMPDO | SMAO vs. SMFLA
R- | 18 | 18 | 23 | 15
R+ | 8 | 4 | 5 | 10
R= | 2 | 6 | 0 | 3
Asymp. Sig. | p < 0.05 | p < 0.05 | p < 0.05 | p < 0.05
As reported in the previous section, Tables 4–8 present the results of the various algorithms on benchmark test functions across different dimensions (30, 50, 100, and 200) using the metrics Best, Worst, Ave, Std, and Time. To analyze the convergence speed of the SMAO algorithm compared to the SMGOA, SMRSA, SMPDO, and SMFLA algorithms, the Best and Worst performance metrics are considered.
In general, the Best performance metric represents the algorithm's ability to find the optimal solution, while the Worst metric indicates the algorithm's robustness against convergence to suboptimal solutions or getting stuck. From Tables 4–8, it is evident that for all dimensions (30, 50, 100, and 200), the SMAO algorithm consistently achieves competitive or even superior Best results compared to the other algorithms, including SMGOA, SMRSA, SMPDO, and SMFLA. This suggests that SMAO converges relatively quickly to good solutions. However, it is crucial to consider the Worst results as well. While SMAO may excel in finding the best solutions, it may not always be the most robust algorithm against convergence to suboptimal solutions, especially in higher dimensions (100 and 200), where it sometimes performs worse than some other algorithms in terms of the Worst metric. Figures 3, 4, and 5 show the convergence of the algorithms on the test functions F2, F6, and F7, respectively, for the dimension values 30, 50, 100, and 200.
In summary, the SMAO algorithm appears to demonstrate strong convergence speed and effectiveness in finding optimal solutions across different dimensions compared to the other algorithms, as indicated by its competitive Best results. However, its robustness in avoiding suboptimal solutions may require further investigation, particularly in higher-dimensional spaces.
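Convergence curves such as those in Figures 3–5 are generally obtained by logging the best objective value found after each expensive evaluation. The sketch below shows one way to collect and plot such traces; the random-search driver and the sphere objective are hypothetical placeholders for the actual optimizers and benchmark functions.

import numpy as np
import matplotlib.pyplot as plt

def sphere(x):
    # Hypothetical stand-in for an expensive benchmark objective.
    return float(np.sum(x ** 2))

def best_so_far_trace(objective, dim, budget, rng):
    # A real experiment would draw candidates from each optimizer
    # (SMAO, SMGOA, ...) instead of sampling uniformly at random.
    best, trace = np.inf, []
    for _ in range(budget):
        candidate = rng.uniform(-100.0, 100.0, size=dim)
        best = min(best, objective(candidate))
        trace.append(best)
    return trace

rng = np.random.default_rng(0)
for dim in (30, 50, 100, 200):
    plt.plot(best_so_far_trace(sphere, dim, budget=500, rng=rng),
             label=f"Dim = {dim}")
plt.xlabel("Function evaluations")
plt.ylabel("Best objective value so far")
plt.yscale("log")
plt.legend()
plt.show()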
Figure 3. The Convergence Analysis of Algorithms on the Test Function of F2 (Dim: 30, 50, 100, and 200).
Figure 4. The Convergence Analysis of Algorithms on the Test Function of F6 (Dim: 30, 50, 100, and 200).
6 Conclusions
This paper introduced SMAO, a novel metaheuristic algorithm tailored for efficiently solving high-dimensional optimization problems characterized by computationally expensive objective functions. SMAO leverages RBFs to construct a surrogate model, which is iteratively optimized to guide the search process toward the global optimum.
Figure 5. The Convergence Analysis of Algorithms on the Test Function of F7 (Dim: 30, 50, 100, and 200).
Through extensive experimentation on benchmark test functions with dimensions ranging from 30 to 200, we demonstrated that SMAO outperforms the surrogate model-based versions of existing state-of-the-art algorithms, including GOA, RSA, PDO, and FLA. SMAO consistently showcased superior solution quality, faster convergence speed, and a higher hit rate,
establishing its effectiveness in addressing the challenges posed by high-dimensional, computationally expensive optimization problems.
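For readers who want a concrete picture of the surrogate-assisted loop summarized above, the sketch below uses SciPy's RBFInterpolator as the RBF surrogate: fit the model to all expensively evaluated points, search the cheap surrogate for a promising candidate, evaluate that candidate on the true objective, and refit. It is a minimal illustration under assumed settings (random probing in place of the Aquila search, a toy objective), not the SMAO implementation itself.

import numpy as np
from scipy.interpolate import RBFInterpolator

def expensive_objective(x):
    # Hypothetical stand-in for a costly black-box function.
    return float(np.sum(x ** 2))

rng = np.random.default_rng(1)
dim, lo, hi = 30, -100.0, 100.0

# Initial design: a small set of true (expensive) evaluations.
X = rng.uniform(lo, hi, size=(5 * dim, dim))
y = np.array([expensive_objective(p) for p in X])

for _ in range(50):                        # surrogate-assisted iterations
    surrogate = RBFInterpolator(X, y)      # RBF fit to all data so far

    # Cheap search on the surrogate; SMAO would run the Aquila Optimizer
    # here, whereas this sketch just takes the best of many random probes.
    probes = rng.uniform(lo, hi, size=(2000, dim))
    candidate = probes[np.argmin(surrogate(probes))]

    # One true evaluation per iteration, then refit with the new point.
    X = np.vstack([X, candidate])
    y = np.append(y, expensive_objective(candidate))

print("Best value found:", y.min())

Because each loop iteration spends only one true evaluation, the number of expensive calls grows linearly with the iteration count, which is the main source of the computational savings claimed for surrogate-assisted methods.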
The success of SMAO in tackling high-dimensional optimization with expensive objectives opens up several promising avenues for future research. First, exploring advanced surrogate modeling techniques beyond RBFs could enhance SMAO's accuracy and efficiency. Additionally, investigating techniques for automatically adapting surrogate models during optimization could further improve performance. Furthermore, the application of SMAO to real-world problems, such as engineering design and scientific research, warrants exploration to assess its practical utility. Finally, incorporating parallel and distributed computing strategies to exploit computational resources more effectively in high-dimensional spaces is an exciting direction for enhancing SMAO's scalability. Overall, SMAO's potential for advancing optimization in complex, resource-intensive domains provides fertile ground for continued research and development.
References
[1] Z. Zhang, Y. Gao, and W. Zuo. A
dual biogeography-based optimization algo-
rithm for solving high-dimensional global opti-
mization problems and engineering design prob-
lems. IEEE Access, 10:55988–56016, 2022.
doi:10.1109/ACCESS.2022.3177218.
[2] K. Benidis, Y. Feng, and D. P. Palomar. Sparse portfolios for high-dimensional financial index tracking. IEEE Transactions on Signal Processing, 66(1):155–170, 2017. doi:10.1109/TSP.2017.2762286.
[3] J. A. Carrillo, S. Jin, L. Li, and Y. Zhu. A consensus-based global optimization method for high dimensional machine learning problems. ESAIM: Control, Optimisation and Calculus of Variations, 27:S5, 2021. doi:10.1051/cocv/2020046.
[4] F. MiarNaeimi, G. Azizyan, and
M. Rashki. Horse herd optimization al-
gorithm: A nature-inspired algorithm for
high-dimensional optimization problems.
Knowledge-Based Systems, 213:106711, 2021.
doi:10.1016/j.knosys.2020.106711.
[5] A. Singh and N. Dulal. A survey on metaheuristics for solving large scale optimization problems. Int. J. Comput. Appl., 170(5):1–7, 2017. doi:10.5120/ijca2017914839.
[6] N. Aslimani and R. Ellaia. A new hybrid algo-
rithm combining a new chaos optimization ap-
proach with gradient descent for high dimen-
sional optimization problems. Computational
and Applied Mathematics, 37:2460–2488, 2018.
doi:10.1007/s40314-017-0454-9.
[7] E. K. Nyarko, R. Cupec, and D. Filko. A comparison of several heuristic algorithms for solving high dimensional optimization problems. International Journal of Electrical and Computer Engineering Systems, 5(1):1–8, 2014. URL https://hrcak.srce.hr/134402.
[8] L. Abualigah, D. Yousri, M. A. Elaziz, A. A.
Ewees, M. A. Al-Qaness, and A. H. Gan-
domi. Aquila optimizer: a novel meta-
heuristic optimization algorithm. Comput-
ers & Industrial Engineering, 157:107250, 2021.
doi:10.1016/j.cie.2021.107250.
[9] S. Mahajan, L. Abualigah, A. K. Pandit, and
M. Altalhi. Hybrid Aquila optimizer with arith-
metic optimization algorithm for global opti-
mization tasks. Soft Computing, 26(10):4863–
4881, 2022. doi:10.1007/s00500-022-06873-8.
[10] P. Jiang, Q. Zhou, and X. Shao. Surrogate Model-Based Engineering Design and Optimization. Springer, 2020. doi:10.1007/978-981-15-0731-1_7.
[11] G. Chen, K. Zhang, X. Xue, L. Zhang, C. Yao,
J. Wang, and J. Yao. A radial basis function sur-
rogate model assisted evolutionary algorithm for
high-dimensional expensive optimization prob-
lems. Applied Soft Computing, 116:108353, 2022.
doi:10.1016/j.asoc.2021.108353.
[12] J. O. Agushaka, A. E. Ezugwu, and L. Abualigah.
Gazelle optimization algorithm: a novel nature-
inspired metaheuristic optimizer. Neural Com-
puting and Applications, 35(5):4099–4131, 2023.
doi:10.1007/s00521-022-07854-6.
[13] L. Abualigah, M. A. Elaziz, P. Sumari, Z. W. Geem, and A. H. Gandomi. Reptile search algorithm (RSA): A nature-inspired meta-heuristic optimizer. Expert Systems with Applications, 191:116158, 2022. doi:10.1016/j.eswa.2021.116158.
[14] A. E. Ezugwu, J. O. Agushaka, L. Abuali-
gah, S. Mirjalili, and A. H. Gandomi. Prairie
dog optimization algorithm. Neural Comput-
ing and Applications, 34(22):20017–20065, 2022.
doi:10.1007/s00521-022-07530-9.
[15] F. A. Hashim, R. R. Mostafa, A. G.
Hussien, S. Mirjalili, and K. M. Sallam.
Fick’s law algorithm: A physical law-
based algorithm for numerical optimization.
Knowledge-Based Systems, 260:110146, 2023.
doi:10.1016/j.knosys.2022.110146.
[16] P. N. Suganthan, N. Hansen, J. J. Liang, K. Deb, Y.-P. Chen, A. Auger, and S. Tiwari. Problem definitions and evaluation criteria for the CEC 2005 special session on real-parameter optimization. KanGAL report, 2005. URL https://www.academia.edu/download/33925834/Tech-Report-May-30-05.pdf.
[17] J. Müller. SOCEMO: surrogate optimization of
computationally expensive multiobjective prob-
lems. INFORMS Journal on Computing, 29(4):
581–596, 2017. doi:10.1287/ijoc.2017.0749.
[18] J. C. García-García, R. García-Ródenas, and
E. Codina. A surrogate-based cooperative
optimization framework for computationally
expensive black-box problems. Optimiza-
tion and Engineering, 21(3):1053–1093, 2020.
doi:10.1007/s11081-020-09526-7.
[19] J. Yi, Y. Shen, and C. A. Shoemaker. A multi-fidelity RBF surrogate-based optimization framework for computationally expensive multi-modal problems with application to capacity planning of manufacturing systems. Structural and Multidisciplinary Optimization, 62:1787–1807, 2020. doi:10.1007/s00158-020-02575-7.
[20] H. Liu, J. Tian, Q. Yu, X. Liu, and G. Wang. A historical surrogate model ensemble assisted Bayesian evolutionary optimization algorithm for solving expensive many-objective problems. Available at SSRN 4537543. doi:10.2139/ssrn.4537543.
[21] X. Zhang, B. Yang, T. Yu, and L. Jiang. Dynamic surrogate model based optimization for MPPT of centralized thermoelectric generation systems under heterogeneous temperature difference. IEEE Transactions on Energy Conversion, 35(2):966–976, 2020. doi:10.1109/TEC.2020.2967511.
[22] M. Shi, L. Lv, W. Sun, and X. Song. A multi-fidelity surrogate model based on support vector regression. Structural and Multidisciplinary Optimization, 61:2363–2375, 2020. doi:10.1007/s00158-020-02522-6.
[23] R. S. Tabar, K. Wärmefjord, and R. Söderberg. A new surrogate model–based method for individualized spot welding sequence optimization with respect to geometrical quality. The International Journal of Advanced Manufacturing Technology, 106:2333–2346, 2020. doi:10.1007/s00170-019-04706-x.
[24] K. Bamdad, M. E. Cholette, and J. Bell. Build-
ing energy optimization using surrogate model
and active sampling. Journal of Building
Performance Simulation, 13(6):760–776, 2020.
doi:10.1080/19401493.2020.1821094.
[25] H. A. Pedrozo, S. R. Reartes, Q. Chen, M. S. Díaz, and I. E. Grossmann. Surrogate-model based MILP for the optimal design of ethylene production from shale gas. Computers & Chemical Engineering, 141:107015, 2020. doi:10.1016/j.compchemeng.2020.107015.
[26] S. Akyol. A new hybrid method based on Aquila optimizer and tangent search algorithm for global optimization. Journal of Ambient Intelligence and Humanized Computing, 14(6):8045–8065, 2023. doi:10.1007/s12652-022-04347-1.
[27] E. Demiral, C. Sahin, and K. Arslan. Aerodynamic shape optimization using multi-fidelity surrogate-based approach for computationally expensive problems. In AIAA AVIATION 2022 Forum, page 4161, 2022. doi:10.2514/6.2022-4161.
[28] M. H. Nadimi-Shahraki, S. Taghian, S. Mirjalili, and L. Abualigah. Binary Aquila optimizer for selecting effective features from medical data: A COVID-19 case study. Mathematics, 10(11):1929, 2022. doi:10.3390/math10111929.
[29] H. Yu, H. Jia, J. Zhou, and A. Hussien.
Enhanced Aquila optimizer algorithm for
global optimization and constrained engineer-
ing problems. Mathematical Biosciences
and Engineering, 19(12):14173–14211, 2022.
doi:10.3934/mbe.2022660.
[30] Y. Xiao, Y. Guo, H. Cui, Y. Wang, J. Li,
and Y. Zhang. IHAOAVOA: An improved
hybrid aquila optimizer and African vultures
optimization algorithm for global optimiza-
tion problems. Mathematical Biosciences
and Engineering, 19(11):10963–11017, 2022.
doi:10.3934/mbe.2022512.
[31] M. Li, J. Tang, and X. Meng. Multiple surrogate-
model-based optimization method using the mul-
timodal expected improvement criterion for ex-
pensive problems. Mathematics, 10(23):4467,
2022. doi:10.3390/math10234467.
[32] H. Liu, X. Zhang, H. Zhang, C. Li, and Z. Chen.
A reinforcement learning-based hybrid Aquila
optimizer and improved arithmetic optimiza-
tion algorithm for global optimization. Ex-
pert Systems with Applications, 224:119898, 2023.
doi:10.1016/j.eswa.2023.119898.
[33] M. Wu, J. Xu, L. Wang, C. Zhang, and
H. Tang. Adaptive multi-surrogate and
module-based optimization algorithm for high-
dimensional and computationally expensive
problems. Information Sciences, page 119308,
2023. doi:10.1016/j.ins.2023.119308.
[34] E. Baş. Binary Aquila optimizer for 0–1 knapsack problems. Engineering Applications of Artificial Intelligence, 118:105592, 2023. doi:10.1016/j.engappai.2022.105592.
[35] A. A. A. El-Mageed, A. A. Abohany, and A. Elashry. Effective feature selection strategy for supervised classification based on an improved binary Aquila optimization algorithm. Computers & Industrial Engineering, 181:109300, 2023. doi:10.1016/j.cie.2023.109300.
[36] B. Deng. Aquila optimizer with dynamic group
strategy for global optimization tasks. Concur-
rency and Computation: Practice and Experience,
page e7971, 2023. doi:10.1002/cpe.7971.
[37] M. Ahmadipour, M. M. Othman, R. Bo, M. S. Javadi, H. M. Ridha, and M. Alrifaey. Optimal power flow using a hybridization algorithm of arithmetic optimization and Aquila optimizer. Expert Systems with Applications, 235:121212, 2024. doi:10.1016/j.eswa.2023.121212.
[38] L. Abualigah and K. H. Almotairi. Dynamic evolutionary data and text document clustering approach using improved Aquila optimizer based arithmetic optimization algorithm and differential evolution. Neural Computing and Applications, 34(23):20939–20971, 2022. doi:10.1007/s00521-022-07571-0.
[39] L. Zeng, M. Li, J. Shi, and S. Wang. Spiral Aquila optimizer based on dynamic Gaussian mutation: Applications in global optimization and engineering. Neural Processing Letters, 55(8):11653–11699, 2023. doi:10.1007/s11063-023-11394-y.
[40] M. Friedman. A comparison of alternative tests of significance for the problem of m rankings. The Annals of Mathematical Statistics, 11(1):86–92, 1940. URL https://www.jstor.org/stable/2235971.
[41] F. Wilcoxon. Individual comparisons by ranking
methods. In Breakthroughs in Statistics: Method-
ology and Distribution, pages 196–202. Springer,
1992. doi:10.1007/978-1-4612-4380-9_16.
[42] W. J. Gutjahr. Convergence Analysis of Metaheuristics, pages 159–187. Springer US, Boston, MA, 2010. doi:10.1007/978-1-4419-1306-7_6.
Alireza Rouhi received his B.Sc. from Kharazmi University of Tehran in September 2000, his M.Sc. from Sharif University of Technology in June 2004, and his Ph.D. from the University of Isfahan in September 2017, all in the field of Software Engineering. He was recognized as an outstanding researcher among Ph.D. students at the Faculty of Computer Engineering, University of Isfahan, in 2017. Currently, he is an assistant professor at Azarbaijan Shahid Madani University, Tabriz, Iran. He is interested in Software Engineering in general and Formal Specification, Model Transformation, Metaheuristics, and Social Networks in particular.
Einollah Pira received his B.Sc. degree in computer engineering (software) from Kharazmi University, Tehran, Iran [1996–2000], his M.Sc. degree in computer engineering (software) from Sharif University of Technology, Tehran, Iran [2000–2002], and his Ph.D. degree in computer engineering (software) from Arak University, Iran [2013–2017]. Currently, he is an Assistant Professor at the Department of Computer Engineering, Azarbaijan Shahid Madani University, Tabriz, Iran. His research interests include model checking, formal methods, software testing, evolutionary computation, and machine learning.