IETE JOURNAL OF RESEARCH
https://doi.org/10.1080/03772063.2023.2194265
MCSO: Levy’s Flight Guided Modified Chicken Swarm Optimization
Satya Verma , Satya Prakash Sahu and Tirath Prasad Sahu
Department of Information Technology, National Institute of Technology, Raipur, India
ABSTRACT
This paper proposes a Modified Chicken Swarm Optimization (MCSO) that addresses and solves the local optima and early convergence problems of Chicken Swarm Optimization (CSO). CSO adopts the Swarm Intelligence (SI) of chickens to solve optimization problems by mathematically formulating the food-search behaviour of roosters, hens, and chicks. Hens follow their group rooster, whereas chicks follow their mother hen in search of food. The problem occurs whenever the rooster follows the wrong path and gets stuck in a local optimum, and so, consequently, do the mother hen and the chicks. This situation leads to early convergence and may prevent global optimization. Most existing research has focused on solving the local optima problem of hens; hence, there is a need to address the local optima problem of roosters as well. The paper offers a solution to this problem by using the randomness of Levy's flight. Levy's flight guides the roosters, hens, and chicks, allowing the chickens to choose a random direction when there is no other way to find the optimal solution. The inclusion of Levy's flight enhances the self-learning capability of the chickens. The MCSO is tested on benchmark functions, IEEE CEC-2017 functions, and an engineering problem. The results are validated by a comparative analysis with well-known SI algorithms and indicate that the MCSO provides competitive performance. The results are statistically verified with the win-tie-loss, Bonferroni-Dunn post-hoc, and Wilcoxon tests.
KEYWORDS
Chicken swarm optimization (CSO); Levy's flight; Local optima; Metaheuristic; Optimization; Swarm intelligence (SI)
1. INTRODUCTION

Optimization is the study of formulating a problem with mathematical tools in search of an optimal solution space. The most general approach is to follow random walks in search of optimality. Optimization algorithms are classified as deterministic, stochastic, and hybrids of the two. Deterministic algorithms are typically used for conventional problems. Stochastic algorithms can be heuristic or metaheuristic. The heuristic approach follows a random search with trial and error and is suitable whenever a good solution is required, not necessarily the best one. A metaheuristic is an improvement over a heuristic, with a trade-off between randomization and local search [1]. Optimization can be used to answer an extensive range of questions arising in different domains, and different problems may have different objectives, such as improvement over the current solution or speed.
Metaheuristic algorithms are interesting since they iterate towards the optimal solution. Blum and Roli [2] identified classifications of metaheuristic algorithms as "nature-inspired vs. non-nature-inspired", "population-based vs. single-point search", "dynamic vs. static objective function", "one vs. various neighbourhood structures", and "memory usage vs. memory-less methods". The beauty of metaheuristic-based optimization is that it can be applied to any application domain without much prior domain-specific knowledge. Two key elements of metaheuristic approaches are intensification and diversification. Intensification ensures the convergence of the best solution towards an optimum solution, whereas diversification ensures that the solution does not get stuck in a local optimum [2]. A trade-off between intensification and diversification may lead to a globally optimal solution [3]. Software tools termed Metaheuristic Optimization Frameworks (MOFs) offer the execution of metaheuristics to solve any specific problem. MOFs are useful in research since they provide a reusability option to the researcher [4]. A wide range of metaheuristic algorithms is available for solving optimization problems, but not all of them are suitable for all problems. Some may give the best results for one kind of problem and some for another; this concept is known as the "No Free Lunch (NFL)" theorem [5]. Most metaheuristic algorithms are designed by observing nature. Nature is full of surprises, with the ability to solve complex computational tasks [6]. Also, nature has limitless sources for observational and computational studies [3]. Nature-inspired algorithms have been found proficient in answering numerous optimization problems. These kinds of algorithms adopt the biological behaviour of nature [7].
SI is one such behavioural analysis of nature, where the social behaviour of swarms (bees, ants, flies, bats, wolves, chickens, etc.) is observed to prepare a computational model for optimization problems. SI algorithms are intended to be executed several times to get the optimal solution [6]. SI algorithms are used in almost every application area of engineering. Other fields such as the social sciences, industrial applications, business applications, etc. are also exploiting SI [8].
The SI-based algorithms follow the population-based approach. The popularity of SI-based algorithms is due to their reliable and adaptable nature. Such algorithms are capable of self-learning and easily adapt to external changes [8]. The behaviour of swarms is based on three general principles: separation, alignment, and cohesion. Separation is collision avoidance with the neighbouring swarms. Alignment is matching movement with the neighbouring swarms. Cohesion is the tendency towards the centre of the neighbourhood mass. Any swarm's objective is to survive; for survival, the swarm searches for food and distracts the outward enemy [9]. Researchers have formulated mathematical models for various SI-based algorithms, some of which have worked tremendously well in solving optimization problems [10-12]. Popular SI-based algorithms are the Bat Algorithm (BA) [13], Artificial Bee Colony (ABC) [14], Ant Colony Optimization (ACO) [15], Dragonfly Algorithm (DA) [9], Particle Swarm Optimization (PSO) [16], Grey Wolf Optimization (GWO) [17], Whale Optimization Algorithm (WOA) [18], CSO [19], etc.
PSO is a popular method for optimization problems. Computationally, PSO is not very complex with respect to memory usage and execution speed; it is based on the social milieu [16]. ACO is another popular SI-based algorithm, inspired by ants, which follows a stochastic search approach. In ACO, artificial ants are used to form the solution space for a particular optimization problem; these artificial ants interchange information to enhance the solution quality, like real ants [15]. ABC imitates the SI behaviour of honey bees. The idea is to search for food sources in a group-wise and step-wise manner like honey bees [14]. The Firefly Algorithm (FA) works on the blinking features of fireflies. Fireflies are attracted to each other by the intensity of their brightness regardless of gender; the idea is to compute the firefly's brightness for the solution space search [20]. BA uses the echolocation properties of microbats. Bats practise echolocation for distance sensing, with the ability to differentiate between food/prey and circumstantial barriers. Random fly patterns (with velocity, position, and frequency) of bats are utilised for the solution space search [21]. GWO imitates the hierarchical behaviour of grey wolves in prey hunting. The idea is to divide the wolves into four categories: alpha (leaders), beta (subordinates), omega (scapegoats), and delta (remaining wolves not coming under the alpha, beta, or omega category). Alpha wolves are dominant and lead the hunt. Beta wolves help the alpha wolf in decision-making activities. Omega wolves have the least priority and have to report to the dominant wolves. Delta wolves are subordinate to alpha and beta but dominate omega. The solution space is searched as per the locations of alpha, beta, and delta. Divergence from each other is adopted in prey searching, whereas convergence is followed in prey attacks [17]. DA adopts the SI of dragonflies, which search for food and navigate to avoid enemies. DA starts optimising by making a set of random solutions for a particular optimization problem. Convergence of dragonflies is required during optimization; to transition from exploration to exploitation, dragonflies must adjust their weights to survive. During the optimization process, dragonflies frequently observe other dragonflies to adjust their flying path. At the final stage of optimization, the increased neighbourhood area forms one group of the swarm for global optimum convergence. The food source and the enemy are identified based on the best and worst solutions discovered so far. As a result, the search tends to converge towards promising locations and diverge from less promising locations [9]. WOA adopts the hunting behaviour of humpback whales, which use a spiral bubble-net feeding mechanism for food hunting. The idea is to form a shrinking circle and a spiral path around the prey, which is used to update the whale's position in the solution space [18].
CSO is an optimization method that mimics the social behaviour of chickens and has performed comparatively better than competitive SI-based optimization methods [19]. CSO has been used to solve a variety of problems [22-24]. CSO has fast convergence; therefore, it is apt for problems where fast convergence is needed. However, due to the faster convergence speed, CSO may sometimes be trapped in local optima [25-27]. The local optima and early convergence problems of CSO are addressed in this paper, and the MCSO is proposed to solve them. The main contributions of the paper are:

i. The MCSO is proposed to handle the local optima and early convergence problem faced by the chicken swarms in basic CSO.
ii. The MCSO is tested on the benchmark functions, CEC-2017 functions, and an engineering application.
iii. The performance of the MCSO is statistically analysed and compared with competitive SI-based optimization algorithms.

The remaining paper is organised as follows: Section 2 gives the preliminary concept of CSO and the existing work that modified/improved CSO; Section 3 proposes a modified and improved version of CSO (MCSO); Section 4 provides experimental results with discussion; Section 5 gives the conclusion with future directions.
2. CSO
CSO is a nature-driven algorithm that mimics the chickens' behaviour during food search. Rooster, hen, and chick are the three categories of chicken, and groups are formed from these three categories: every group consists of a rooster with several hens and chicks. Grouping of the chicken swarm is done with the help of fitness values. The rooster is the leader of the group and has the best fitness value. Chicks roam around the mother hen and have the worst fitness values. The left-over chickens, which have neither the best nor the worst fitness, are considered hens. Hens choose their group randomly and follow the rooster, while chicks stick to their mother hen. The association between hens and chicks is established arbitrarily [19].

Let the number of hens, roosters, mother hens, and chicks be defined as HN, RN, MN, and CN, respectively. The values of HN, RN, MN, and CN identify the roosters, hens, and chicks based on fitness values. Each chicken's position at time t in d-dimensional space is given by equation (1):

$$P^t_{i,j} \quad (i \in [1, 2, \ldots, n],\ j \in [1, 2, \ldots, d]) \tag{1}$$

Chicken movement: Roosters with superior fitness values have priority for food access. The rooster's position is obtained by equations (2) and (3):

$$P^{t+1}_{i,j} = P^t_{i,j} \times (1 + \mathrm{Randn}(a, b)), \quad a = 0,\ b = \sigma^2 \tag{2}$$

$$\sigma^2 = \begin{cases} 1, & \text{if } FN_i \le FN_k, \\ \exp\!\left(\dfrac{FN_k - FN_i}{|FN_i| + \varepsilon}\right), & \text{otherwise}, \end{cases} \quad k \in [1, n],\ k \ne i \tag{3}$$

where Randn(a, b) – Gaussian distribution with mean a and variance b; ε – small constant to handle the division error; k – rooster index; FN – fitness value of P.
Hens can follow their group-mate roosters. Food access is snatched by the dominant hens, and the hen's position update is given by equations (4)-(6):

$$P^{t+1}_{i,j} = P^t_{i,j} + S_1 \times \mathrm{Rand} \times (P^t_{r1,j} - P^t_{i,j}) + S_2 \times \mathrm{Rand} \times (P^t_{r2,j} - P^t_{i,j}) \tag{4}$$

$$S_1 = \exp\!\left((FN_i - FN_{r1}) / (|FN_i| + \varepsilon)\right) \tag{5}$$

$$S_2 = \exp\!\left(FN_{r2} - FN_i\right) \tag{6}$$

where Rand – random number from 0 to 1; r1 ∈ [1, 2, ..., n] – rooster index for the ith hen; r2 ∈ [1, 2, ..., n] – chicken index (rooster or hen), r1 ≠ r2, FN_i > FN_{r1}, FN_i > FN_{r2}, S_2 < 1 < S_1.

Chicks travel around their mother, as given by equation (7):

$$P^{t+1}_{i,j} = P^t_{i,j} + FL \times (P^t_{m,j} - P^t_{i,j}) \tag{7}$$

where P^t_{m,j} – position of the ith chick's mother, m ∈ [1, 2, ..., n]; FL – parameter for chicks to follow the mother hen, FL ∈ [0, 2]. The working phenomenon of CSO is illustrated in Figure 1.

Figure 1: Working of CSO
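To make the update rules concrete, the following is a minimal NumPy sketch of equations (2)-(7); the function names and the way k, r1, r2, and m are supplied are illustrative assumptions rather than the authors' code (P is the n×d position matrix, FN the fitness vector):

```python
import numpy as np

EPS = np.finfo(float).eps  # small constant epsilon from equation (3)

def rooster_update(P, FN, i, k):
    """Equation (2): rooster i perturbs its position with Gaussian noise whose
    variance sigma^2 (equation (3)) depends on another random rooster k."""
    if FN[i] <= FN[k]:
        sigma2 = 1.0
    else:
        sigma2 = np.exp((FN[k] - FN[i]) / (abs(FN[i]) + EPS))
    # Randn(a, b) is Gaussian with mean a = 0 and variance b = sigma^2
    return P[i] * (1 + np.random.normal(0.0, np.sqrt(sigma2), size=P[i].shape))

def hen_update(P, FN, i, r1, r2):
    """Equations (4)-(6): hen i follows its group rooster r1 and another
    randomly chosen chicken r2 (rooster or hen, r1 != r2)."""
    s1 = np.exp((FN[i] - FN[r1]) / (abs(FN[i]) + EPS))
    s2 = np.exp(FN[r2] - FN[i])
    return (P[i] + s1 * np.random.rand() * (P[r1] - P[i])
                 + s2 * np.random.rand() * (P[r2] - P[i]))

def chick_update(P, i, m, FL):
    """Equation (7): chick i moves around its mother hen m, with FL in [0, 2]."""
    return P[i] + FL * (P[m] - P[i])
```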
2.1 Problem in CSO

Basic CSO easily collapses into a local optimum due to early convergence. This problem occurs due to the swarm behaviour of the chicks: hens follow their group roosters, whereas chicks follow their mother. So, the possibility exists that whenever the mother chooses the wrong path, the chicks will also definitely choose the wrong path [25-27]. Apt convergence is required throughout the optimization process, so the positions of roosters, hens, and chicks should be updated in such a way that the possibility of local optima is minimal. Researchers have proposed different solutions to the local optima and early convergence problems, which are discussed in the subsequent section.
2.2 Variants of CSO

Many variants and hybrid approaches are present in the literature that utilise CSO or improve the basic CSO. Wu et al. [25] proposed Improved CSO (ICSO), updating the position of the chicks to deal with the problem of local optima: the chicks learn from their rooster when updating their position. This modification handles the early convergence problem and provides a global optimum solution [25]. Wu et al. [26] applied a Markov chain for the convergence analysis of CSO [26]. Wu et al. [28] solved the problem of local optima by varying the scale of CSO; basic CSO does not have a variable scale but rather a fixed one. The hen's position update has more parameters than the rooster's and chick's position updates; therefore, the authors modified the position update equation of the hen based on the rooster's position and used a crossover operator to improve the CSO [28]. Torabi and Safi-Esfahani [29] combined Improved Raven Roosting Optimization (IRRO) and CSO: IRRO was offered to handle premature convergence, whereas CSO was used in IRRO to balance the local and global search capability [29]. He et al. [27] proposed an improved CSO with the integration of GA, PSO, and BAT optimization. The global search ability of CSO is good enough, but due to a lack of guidance, the chickens' movement misleads them; the equation of the chicken position update needed improvement, referring to both rooster and hen movement [27]. Deb et al. [30] modified the equation of the rooster position update [30]. Ajesh and Ravi [31] proposed JAYA-CSO to train an RNN classifier (weight tuning) for glaucoma detection [31]. Cui et al. [32] proposed an improved CSO for beamforming optimization in which the chicken swarm's positions were updated through PSO, GWO, and FA: the solution space of the rooster was updated by PSO and GWO, whereas the solution space of the hens and chicks was updated by FA [32]. Self-Adaptive CSO (SA-CSO) was proposed by Kumari et al. [33] for feature selection; the early convergence problem is solved by updating the solution space (fitness function), where SA-CSO randomly selects the candidate solution obtained in the CSO [33]. Wang et al. [34] offered a solution to enhance global searching, updating the hen position with Levy's flight and the chick position with a diminishing inertial weight to improve the chick's self-learning ability [34].
3. PROPOSED MCSO

This paper proposes the MCSO to solve the problems of local optima and early convergence. Most of the previous research focused on updating the position of the hen to improve the basic CSO. Very few have addressed the problem of what happens if the rooster falls into a local optimum and opts for early convergence: whenever a rooster gets trapped in a local optimum, the hens and chicks have no option but to follow their group rooster. This paper offers a solution to this problem by guiding the rooster, hen, and chick through Levy's flight distribution.

Levy's flight is quite useful in metaheuristic optimization methods whenever the search space provides no information for the further movement of an object. Levy's flight adds some randomness to the search space to handle this scenario and is therefore used by many researchers [35]. The MCSO is inspired by the concept of the Dragonfly Algorithm (DA), which utilises a random walk (Levy's flight) search for position updates when trapped with no neighbouring solutions [9]. The proposed MCSO uses Levy's flight to update the rooster, hen, and chick positions to handle the local optima and early convergence problem. The equation for the rooster position update in the MCSO is given by equation (8):

$$P^{t+1}_{i,j} = P^t_{i,j} \times (1 + \mathrm{Levys}(d)) \tag{8}$$

The equation for the hen position update in the MCSO is given by equation (9):

$$P^{t+1}_{i,j} = P^t_{i,j} + S_1 \times \mathrm{Levys}(d) \times (P^t_{r1,j} - P^t_{i,j}) + S_2 \times \mathrm{Levys}(d) \times (P^t_{r2,j} - P^t_{i,j}) \tag{9}$$

The equation for the chick position update in the MCSO is given by equation (10):

$$P^{t+1}_{i,j} = \left(P^t_{i,j} + FL \times (P^t_{m,j} - P^t_{i,j})\right) \times (1 + \mathrm{Levys}(d)) \tag{10}$$

where d is the dimension of the position vector, and the other parameters of equations (8), (9), and (10) are the same as in equations (2), (4), and (7), respectively.
Algorithm 1: MCSO

Levy's flight is computed by equation (11):

$$\mathrm{Levys}(x) = 0.01 \times \frac{r_1 \times \mathrm{Sigma}}{|r_2|^{1/\beta}} \tag{11}$$

where r1 and r2 – random numbers in (0, 1); β – constant.

Sigma is computed by equation (12):

$$\mathrm{Sigma} = \left( \frac{\Gamma(1+\beta) \times \sin\!\left(\frac{\pi\beta}{2}\right)}{\Gamma\!\left(\frac{1+\beta}{2}\right) \times \beta \times 2^{\frac{\beta-1}{2}}} \right)^{1/\beta} \tag{12}$$

where

$$\Gamma(x) = (x-1)! \tag{13}$$
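Equations (11)-(13) translate directly into a short helper; the following is a minimal NumPy sketch (the name levy_flight and the per-dimension vectorisation are illustrative assumptions), with Γ supplied by math.gamma:

```python
import math
import numpy as np

def levy_flight(d, beta=1.5):
    """Levy's flight step vector of dimension d, per equations (11)-(13)."""
    sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
             / (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    r1 = np.random.rand(d)   # random numbers in (0, 1)
    r2 = np.random.rand(d)
    return 0.01 * r1 * sigma / np.abs(r2) ** (1 / beta)
```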
The inclusion of Levy's flight in the rooster's, hen's, and chick's position updates is beneficial whenever the rooster/hen/chick is trapped in a local optimum and finds no other way to converge to a better position. By using the proposed MCSO, this problem is solved, and a better solution space can be obtained in comparison to the basic CSO. Algorithm 1 provides the pseudo-code of MCSO, and Figure 2 provides the flowchart of MCSO.

Figure 2: Flowchart of MCSO
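Since Algorithm 1 survives in this copy only as a caption, the following is a minimal sketch of the MCSO main loop under the Table 2 settings (RN = 0.2n, HN = 0.6n, g = 10, FL drawn from [0.5, 0.9]); it reuses levy_flight from the sketch above, and all names are illustrative rather than the authors' implementation:

```python
import numpy as np

def mcso(obj, n=50, d=30, iters=100, g=10, beta=1.5, lb=-100.0, ub=100.0):
    """Sketch of the MCSO loop: rebuild the hierarchy every g generations,
    then update positions with equations (8)-(10)."""
    eps = np.finfo(float).eps
    P = np.random.uniform(lb, ub, (n, d))       # initial chicken positions
    FN = np.apply_along_axis(obj, 1, P)         # fitness of every chicken
    best = P[FN.argmin()].copy()
    for t in range(iters):
        if t % g == 0:                          # re-rank chickens by fitness
            order = FN.argsort()
            roosters = order[:int(0.2 * n)]                 # RN best chickens
            hens = order[int(0.2 * n):int(0.8 * n)]         # HN middle chickens
            chicks = order[int(0.8 * n):]                   # CN worst chickens
            mothers = np.random.choice(hens, size=chicks.size)  # random links
        for i in roosters:                      # equation (8)
            P[i] = P[i] * (1 + levy_flight(d, beta))
        for i in hens:                          # equation (9)
            r1 = np.random.choice(roosters)
            r2 = r1
            while r2 == r1:                     # the paper requires r1 != r2
                r2 = np.random.choice(np.concatenate((roosters, hens)))
            s1 = np.exp((FN[i] - FN[r1]) / (abs(FN[i]) + eps))
            s2 = np.exp(min(FN[r2] - FN[i], 50.0))  # exponent clipped as a numerical guard
            P[i] = (P[i] + s1 * levy_flight(d, beta) * (P[r1] - P[i])
                         + s2 * levy_flight(d, beta) * (P[r2] - P[i]))
        for i, m in zip(chicks, mothers):       # equation (10)
            FL = np.random.uniform(0.5, 0.9)    # follow coefficient, Table 2
            P[i] = (P[i] + FL * (P[m] - P[i])) * (1 + levy_flight(d, beta))
        P = np.clip(P, lb, ub)                  # keep solutions in the search range
        FN = np.apply_along_axis(obj, 1, P)
        if FN.min() < obj(best):
            best = P[FN.argmin()].copy()
    return best
```

For example, `mcso(lambda x: (x ** 2).sum())` minimises the Sphere function (F1) over 100 iterations.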
4. EXPERIMENTAL RESULTS AND DISCUSSION

The experiments are performed in Python 3 on a Ryzen 5, 2.10 GHz CPU with 8 GB RAM. The MCSO is tested and evaluated in three cases, which are discussed subsequently. Firstly, MCSO is tested on the benchmark functions; secondly, on the CEC-2017 functions; and lastly, on one engineering application (the pressure vessel design problem). All three problems are minimization problems; therefore, the paper targets minimising the objective function. The results first illustrate the effectiveness of Levy's flight in the MCSO by comparing it with the uniform random distribution, since Levy's flight and the uniform random distribution are used for the position update of chicken swarms in the MCSO and CSO, respectively. The comparison of Levy's flight and the uniform distribution is provided in Figure 3, where the X-axis denotes the continuous space and the Y-axis denotes the probability distribution.
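The contrast shown in Figure 3 is easy to reproduce by sampling; a small sketch (reusing the levy_flight helper from Section 3, with illustrative sample sizes) highlights the heavy tail:

```python
import numpy as np

# Draw 10,000 step lengths from each distribution used for position updates:
# a uniform random step (basic CSO) vs. a Levy's flight step (MCSO).
uniform_steps = np.random.rand(10_000)
levy_steps = np.concatenate([levy_flight(1) for _ in range(10_000)])

# Uniform steps never exceed 1, while the Levy sample maximum is typically
# hundreds of times its median: the occasional long jumps that let a
# trapped chicken escape a local optimum.
print("uniform: median = %.4f, max = %.4f" % (np.median(uniform_steps), uniform_steps.max()))
print("levy   : median = %.4f, max = %.4f" % (np.median(levy_steps), levy_steps.max()))
```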
4.1 Benchmark Function Optimization

Figure 3: Levy's flight vs. uniform distribution

The MCSO is applied to 12 benchmark functions. Table 1 provides the details of these functions: dimension, search-space range, and optimum value. The MCSO's performance is compared with PSO, GWO, WOA, and CSO. Many variants of PSO, GWO, WOA, and CSO are present in the literature; the basic version of each algorithm is used for the experiments. The empirically selected parameters for PSO, GWO, WOA, CSO, and MCSO are listed in Table 2. The value of g may have a substantial effect on the convergence of the MCSO: for large values of g, the algorithm may not converge quickly to the global solution, while for small values of g, there is a possibility of local optima. The empirical results led to the conclusion that at g = 10, MCSO converges well while providing the global optimum solution to the problem. The parameter FL measures how quickly a chick follows its mother. FL is generated at random from the range [0, 2] to account for the variances between chickens; a smaller/higher value of FL is not a good choice, and the experiments provide the best results when the FL values are taken in the range [0.5, 0.9]. All the optimization methods are executed for 100 runs with 100 iterations in each run, and all the mentioned optimization methods are tested on the listed benchmark functions.
Table 3 provides the statistical outcome of the experiment, which consists of the execution time and the worst, best, and average (AVG) values of the objective function. Table 3 also provides the Standard Deviation (SD) of the objective functions to show the dispersion from the AVG value. These values should be minimal (close to 0), since the problem is minimization. The optimization method that provides the lowest worst, best, and AVG values is considered the best one; the bold values in the table indicate the best values. Figure 4 provides the convergence graph for the functions F1 to F6, and Figure 5 provides the convergence graph for the functions F7 to F12. In the convergence graphs, the X-axis denotes the number of iterations, and the Y-axis denotes the fitness values of the function. Figure 6 provides each optimization method's rank (for worst, best, AVG, and SD) obtained through the Bonferroni-Dunn post-hoc statistical test. Each function's optimum value is 0; therefore, the best-performing method acquires the highest rank and the worst-performing method acquires the lowest rank.
4.1.1 Statistical Test

Win-tie-loss and Bonferroni-Dunn post-hoc statistical tests are used to validate the performance of the MCSO. The win-tie-loss statistical test is used to find the winning percentage of the MCSO: in Table 3, "+" indicates a win, "−" indicates a loss, and "=" indicates a tie for the MCSO. The Bonferroni-Dunn test [36] is used to verify the statistical disparity of all optimization methods in the fitness values over the 12 benchmark functions. The significance level (α) is taken as 0.05 to compute the Critical Difference (CD), which is 1.6124520664710214.
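The reported CD follows from the Bonferroni-Dunn formula $CD = q_\alpha \sqrt{k(k+1)/(6N)}$ given in [36]; the following check (with the tabulated two-tailed critical value $q_{0.05} = 2.498$ for k = 5 methods) reproduces it:

```python
import math

k, N = 5, 12        # five methods (PSO, GWO, WOA, CSO, MCSO), twelve functions
q_alpha = 2.498     # Bonferroni-Dunn critical value for alpha = 0.05, k = 5 [36]
cd = q_alpha * math.sqrt(k * (k + 1) / (6 * N))
print(cd)           # 1.6124520664710214, the value reported above
```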
4.1.2 Discussion

Experimental results (Table 3) illustrate that the MCSO outperforms PSO, GWO, WOA, and CSO. The MCSO provides performance competitive with CSO for the functions F4, F5, and F7, whereas for all other functions, MCSO outperforms CSO. The SD obtained through MCSO is the lowest (close to 0) for a majority of benchmark functions. The lower SD signifies that the outcomes are clustered around the mean and proves the superiority of the MCSO. The lower values of all the parameters signify that the optimization method converges well towards the optimal value. The MCSO provides faster convergence due to the inclusion of Levy's flight. As illustrated in Figure 3, Levy's flight follows a random walk where the Levy distribution is used to find the step length; the computation time for calculating the steps is constant, and the probability distribution of Levy's flight is heavy-tailed. Therefore, the inclusion of Levy's flight accounts for the faster and more stable convergence of the MCSO. By analysing the convergence graphs (Figures 4 and 5), it is found that MCSO converges well towards the optimum value and remains stagnant after some iterations. Even at the initial iterations, the MCSO gets near-optimum values for the fitness functions, which is not the case for PSO, GWO, and WOA. The MCSO can afford superior results in less execution time. By analysing the win-tie-loss statistics (Table 3), it is concluded that MCSO provides the fastest execution 100% of the time. The win percentage of the MCSO for the worst and AVG values is 75%, whereas the tie percentage for the same is 25%. Similarly, the win percentage of the MCSO for the best value and SD is 66.67%, and the loss and tie percentages for the same are 8.33% and 25%, respectively.
Table 1: Benchmark functions (each with dimension 30 and optimum value 0)

Sphere (F1): $f(x)=\sum_{i=1}^{n} x_i^2$; search range $[-100, 100]$
Schwefel 2.22 (F2): $f(x)=\sum_{i=1}^{n} |x_i| + \prod_{i=1}^{n} |x_i|$; search range $[-10, 10]$
Schwefel 1.2 (F3): $f(x)=\sum_{i=1}^{n} \left( \sum_{j=1}^{i} x_j \right)^2$; search range $[-100, 100]$
Schwefel 2.21 (F4): $f(x)=\max_i \{|x_i|,\ 1 \le i \le n\}$; search range $[-100, 100]$
Rosenbrock (F5): $f(x)=\sum_{i=1}^{n-1} \left( 100(x_{i+1}-x_i^2)^2 + (1-x_i)^2 \right)$; search range $[-30, 30]$
Step (F6): $f(x)=\sum_{i=1}^{n} (\lfloor x_i+0.5 \rfloor)^2$; search range $[-100, 100]$
Quartic (F7): $f(x)=\sum_{i=1}^{n} i x_i^4 + \mathrm{random}(0,1)$; search range $[-1.28, 1.28]$
Schwefel (F8): $f(x)=\sum_{i=1}^{n} -x_i \sin\!\left(\sqrt{|x_i|}\right)$; search range $[-500, 500]$
Rastrigin (F9): $f(x)=\sum_{i=1}^{n} \left( x_i^2 - 10\cos(2\pi x_i) + 10 \right)$; search range $[-5.12, 5.12]$
Ackley (F10): $f(x)=-20\exp\!\left(-0.2\sqrt{\tfrac{1}{n}\sum_{i=1}^{n} x_i^2}\right) - \exp\!\left(\tfrac{1}{n}\sum_{i=1}^{n}\cos(2\pi x_i)\right) + 20 + e$; search range $[-32, 32]$
Griewank (F11): $f(x)=\tfrac{1}{4000}\sum_{i=1}^{n} x_i^2 - \prod_{i=1}^{n} \cos\!\left(\tfrac{x_i}{\sqrt{i}}\right) + 1$; search range $[-600, 600]$
Penalized 1 (F12): $f(x)=\tfrac{\pi}{n}\left\{ 10\sin^2(\pi x_1) + \sum_{i=1}^{n-1} (x_i-1)^2\left(1+10\sin^2(\pi x_{i+1})\right) + (x_n-1)^2 \right\} + \sum_{i=1}^{n} u(x_i, 10, 100, 4)$; search range $[-50, 50]$
Table 2: Selected parameters

PSO: Min Weight = 0.2, Max Weight = 0.9, Cognitive Learning Rate (c1) = 2, Social Learning Rate (c2) = 2, Population = 50, Dimension = 30, Iterations = 100
GWO: Number of Search Agents = 5, Dimension = 30, Iterations = 100
WOA: Number of Search Agents = 50, Spiral Coefficient (b) = 1, Dimension = 30, Iterations = 100
CSO: RN = 0.2n, HN = 0.6n, CN = n − RN − HN, MN = 0.1n, g = 10, FL ∈ [0.5, 0.9], Dimension = 30, Iterations = 100
MCSO: RN = 0.2n, HN = 0.6n, CN = n − RN − HN, MN = 0.1n, g = 10, FL ∈ [0.5, 0.9], β = 1.5, Dimension = 30, Iterations = 100
The overall win, tie, and loss percentages of the MCSO are 76.67%, 20%, and 3.33%, respectively.

The Bonferroni-Dunn test (Figure 6) indicates that the rank of the MCSO is the highest among PSO, GWO, WOA, and CSO.
4.2 CEC-2017 Benchmark Functions

The MCSO is tested on the IEEE CEC-2017 functions. Table 4 provides the CEC-2017 benchmark functions along with their descriptions. The performance of the MCSO is compared with PSO, GWO, WOA, and CSO; the parameters for these algorithms are the same as mentioned in Table 2. All the algorithms are run for 100 runs with 100 iterations in each run. The experimental outcomes on the CEC-2017 test functions are provided in Table 5 (CEC-1 to CEC-10), Table 6 (CEC-11 to CEC-20), and Table 7 (CEC-21 to CEC-30).

4.2.1 Statistical Test

Statistical tests are useful to analyse the performance of any method, since they consider more factors than only the performance difference between two approaches. Wilcoxon signed paired rank tests and win-tie-loss tests are performed to statistically analyse the performance of the MCSO. Table 8 provides the statistical outcome of the experiment, which consists of the p-values obtained through the Wilcoxon signed paired rank test and the win-tie-loss statistics obtained through the win-tie-loss tests.
Table 3: Fitness values of the proposed MCSO vs. other optimization methods
Columns: Benchmark Function, Method, Execution Time (Sec), Worst, Best, AVG, SD
F1 PSO 4.6721 65938.0769 5.3791 4108.6644 12254.9598
GWO 4.8511 67439.7348 0.0006 2774.8231 10168.4117
WOA 4.3209 66437.7263 0.9135 1783.4162 8531.5569
CSO 1.6719 1.2 0.9825 1.0479 0.0884
MCSO 1.6594 0.9063 0.7188 0.7894 0.0371
F2 PSO 4.3978 4.37E +09 5.1066 4.36E +07 4.37E +08
GWO 5.5516 3.61E +10 0.0045 4.21E +08 3.66E +09
WOA 3.9264 1.34E +12 0.9769 1.34E +10 1.34E +11
CSO 0.9249 1.0312 0.6875 0.9609 0.0752
MCSO 0.9006 0.8749 0.6249 0.7349 0.0447
F3 PSO 5.7618 1.39E +05 531.2193 10599.2252 24208.1895
GWO 5.8523 1.49E +05 182.7920 14338.6363 28298.8071
WOA 4.6250 1.08E +05 51034.53765 76609.2241 17650.1088
CSO 1.9448 24.5313 12.5312 17.5631 2.1896
MCSO 0.7031 16.5625 3.1562 8.5888 2.8406
F4 PSO 4.3581 87.4535 3.5205 13.9647 18.5202
GWO 5.0171 86.2837 0.3479 12.6178 23.2396
WOA 3.7494 89.1874 88.7585 88.7916 0.0963
CSO 0.5882 0.0313 0.0313 0.0313 0
MCSO 0.5680 0.0313 0.0313 0.0313 0
F5 PSO 4.1094 2.48E +08 542.6243 4.36E +06 2.72E +07
GWO 5.1489 2.69E +08 28.9391 8.15E +06 3.55E +07
WOA 4.1406 2.53E +08 28.7829 9.05E +06 3.89E +07
CSO 0.9219 2.00E-16 2.00E-16 2.00E-16 3.72E-31
MCSO 0.9093 2.00E-16 2.00E-16 2.00E-16 3.72E-31
F6 PSO 4.9429 62553.6475 2.7964 3908.5934 11506.4389
GWO 4.8853 66343.7886 3.3389 2440.6394 9169.1076
WOA 4.1494 72002.8020 2.1766 2561.8720 11432.4012
CSO 0.8836 2.2499 1.5625 1.7963 0.1875
MCSO 0.8599 1.9375 1.4999 1.7312 0.0903
F7 PSO 4.4236 69.1926 1.0927 21.1473 22.0595
GWO 4.8431 100.8400 0.0039 3.1089 14.1712
WOA 3.8978 96.8039 0.0032 4.3581 18.0927
CSO 0.3489 2E-08 2E-08 2E-08 6.65E-24
MCSO 0.3341 2E-08 2E-08 2E-08 6.65E-24
F8 PSO 5.0468 3413.7481 1876.2195 3243.4849 402.1872
GWO 4.7261 5389.8243 1773.0802 3424.7764 963.1805
WOA 3.9531 7325.4099 2753.8169 6979.5479 713.2579
CSO 1.9531 3.60E-15 2.10E-15 2.54E-15 6.84E-16
MCSO 1.4727 2.00E-16 2.00E-16 2.00E-16 3.72E-31
F9 PSO 4.1094 418.9510 211.9109 317.9808 49.2482
GWO 4.5503 447.1449 20.7585 92.4321 95.9674
WOA 4.0640 437.6292 1.65E-04 82.9844 119.6204
CSO 1.7675 0.3437 0.3125 0.3434 0.0031
MCSO 1.4920 9.43E-05 9.43E-05 9.43E-05 6.81E-20
F10 PSO 4.5937 20.8133 1.9125 6.7679 4.1514
GWO 4.6094 20.5379 0.0087 2.9649 5.4871
WOA 3.9522 20.4962 4.63E-09 2.3787 5.4958
CSO 2.5832 0.1289 0.1134 0.1194 6.17E-03
MCSO 2.2394 2.34E-10 1.41E-10 1.64E-10 4.05E-11
F11 PSO 4.8006 540.5451 7.4588 231.5258 188.0793
GWO 4.7656 542.2128 0.0549 20.7048 79.3605
WOA 4.0744 674.3651 1.55E-15 23.3744 100.0497
CSO 0.5538 0.0625 0.0314 0.0447 0.0128
MCSO 0.5241 0.0123 0.0019 0.0095 3.98E-03
F12 PSO 4.5633 5.03E +08 1.3552 1.01E +07 6.00E +07
GWO 4.8608 5.23E +08 0.5970 1.74E +07 7.33E +07
WOA 4.6377 5.34E +08 0.1440 2.07E +07 8.24E +07
CSO 4.5423 32.2013 24.5621 27.4803 1.7386
MCSO 4.3433 28.6549 23.2573 27.2071 0.6618
+/−/= Total 12/0/0 9/0/3 8/1/3 9/0/3 8/1/3
A p-value ≤ 0.05 signifies that the two samples are different. The symbols "+", "−", and "=" have the same significance as in Table 3.
Figure 4: Convergence graph (F1-F6)

4.2.2 Discussion

MCSO provides the best AVG and SD values and outperforms the other methods on 18 benchmark functions, i.e. CEC-1, CEC-2, CEC-4, CEC-7, CEC-8, CEC-9, CEC-10, CEC-11, CEC-12, CEC-15, CEC-18, CEC-19, CEC-21, CEC-24, CEC-25, CEC-27, CEC-28, and CEC-30. MCSO provides the best AVG values for CEC-5, CEC-6, CEC-13, CEC-20, and CEC-22, whereas MCSO provides the best SD values for CEC-14 and CEC-17. PSO provides the best AVG values for the functions CEC-14, CEC-16, CEC-17, CEC-23, CEC-26, and CEC-29. CSO provides the best AVG value for CEC-3. The Wilcoxon test (Table 8) provides p-values ≤ 0.05 for a majority of the functions, which indicates that the MCSO is better than the existing algorithms. The win-tie-loss statistics indicate that MCSO wins 76.67%, 100%, 93.33%, and 96.67% of the time against PSO, GWO, WOA, and CSO, respectively. The overall win percentages of MCSO for AVG and SD are 76.67% and 66.67%, respectively. Statistical analysis of the results shows that MCSO overall provides good results.
Figure 5: Convergence graph (F7-F12)

4.3 Pressure Vessel Design Problem

The pressure vessel design problem is frequently used to evaluate optimization methods [37]. The objective is to minimise the total cost incurred in the fabrication. Pressure vessel design is a four-constraint problem having four design factors, i.e. the thickness (a1), the thickness of the heads (a2), the inner radius (a3), and the length (a4) of the cylindrical section. The problem is mathematically formulated as follows:

Minimise

$$f(a) = 0.6224 a_1 a_3 a_4 + 1.7781 a_2 a_3^2 + 3.1661 a_1^2 a_4 + 19.84 a_1^2 a_3 \tag{14}$$

Subject to

$$g_1(a) = -a_1 + 0.0193 a_3 \le 0 \tag{15}$$

$$g_2(a) = -a_2 + 0.00954 a_3 \le 0 \tag{16}$$

$$g_3(a) = -\pi a_3^2 a_4 - \tfrac{4}{3}\pi a_3^3 + 1296000 \le 0 \tag{17}$$

$$g_4(a) = a_4 - 240 \le 0 \tag{18}$$
Figure 6: Bonferroni-Dunn test to rank the optimization methods
Table 4: CEC-2017 functions
Columns: Function, Description, Dimension, fmin
CEC-1 Shifted and Rotated Bent Cigar 30 100
CEC-2 Shifted and Rotated the Sum of Different Power 30 200
CEC-3 Shifted and Rotated Zakharov 30 300
CEC-4 Shifted and Rotated Rosenbrock 30 400
CEC-5 Shifted and Rotated Rastrigin 30 500
CEC-6 Shifted and Rotated Expanded Schaffer 30 600
CEC-7 Shifted and Rotated Lunacek Bi_Rastrigin 30 700
CEC-8 Shifted and Rotated Non-Continuous Rastrigin 30 800
CEC-9 Shifted and Rotated Levy 30 900
CEC-10 Shifted and Rotated Schwefel 30 1000
CEC-11 Hybrid Function 1 (N =3) 30 1100
CEC-12 Hybrid Function 2 (N =3) 30 1200
CEC-13 Hybrid Function 3 (N =3) 30 1300
CEC-14 Hybrid Function 4 (N =4) 30 1400
CEC-15 Hybrid Function 5 (N =4) 30 1500
CEC-16 Hybrid Function 6 (N =4) 30 1600
CEC-17 Hybrid Function 6 (N =5) 30 1700
CEC-18 Hybrid Function 6 (N =5) 30 1800
CEC-19 Hybrid Function 6 (N =5) 30 1900
CEC-20 Hybrid Function 6 (N =6) 30 2000
CEC-21 Composition Function 1 (N =3) 30 2100
CEC-22 Composition Function 2 (N =3) 30 2200
CEC-23 Composition Function 3 (N =4) 30 2300
CEC-24 Composition Function 4 (N =4) 30 2400
CEC-25 Composition Function 5 (N =5) 30 2500
CEC-26 Composition Function 6 (N =5) 30 2600
CEC-27 Composition Function 7 (N =6) 30 2700
CEC-28 Composition Function 8 (N =6) 30 2800
CEC-29 Composition Function 9 (N =3) 30 2900
CEC-30 Composition Function 10 (N =3) 30 3000
where

$$0.0625 \le a_1, a_2 \le 99 \times 0.0625, \quad 10 \le a_3, a_4 \le 200 \tag{19}$$
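The objective and constraints translate directly into Python; a short sketch (helper names are illustrative) also verifies that the MCSO design reported in Table 9 reproduces the stated cost:

```python
import math

def pressure_vessel_cost(a):
    """Fabrication cost, equation (14); a = (a1, a2, a3, a4)."""
    a1, a2, a3, a4 = a
    return (0.6224 * a1 * a3 * a4 + 1.7781 * a2 * a3 ** 2
            + 3.1661 * a1 ** 2 * a4 + 19.84 * a1 ** 2 * a3)

def pressure_vessel_constraints(a):
    """Constraint values g1-g4, equations (15)-(18); a design is feasible
    when every returned value is <= 0."""
    a1, a2, a3, a4 = a
    return [-a1 + 0.0193 * a3,
            -a2 + 0.00954 * a3,
            -math.pi * a3 ** 2 * a4 - (4.0 / 3.0) * math.pi * a3 ** 3 + 1296000,
            a4 - 240]

a = (0.77381, 0.38564, 40.24128, 178.53924)   # MCSO design from Table 9
print(round(pressure_vessel_cost(a), 4))      # ~5387.2, matching Table 9
print(pressure_vessel_constraints(a))         # constraint values for this design
```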
Many researchers have solved this problem through metaheuristic optimization [38]. The comparison of MCSO with PSO [16], GWO [17], WOA [18], CSO [19], and ICSO [25] is provided in Table 9, which shows that MCSO outperforms the other algorithms.
4.4 Time Complexity

The time complexity can be computed by considering the population size N, the dimension d, and the number of fitness evaluations m [29]. The time complexity of the MCSO is as follows:

Parameter initialization: $O(1)$   (20)
Initial solution: $O(N \times d)$   (21)
Iteration: $O(N \log N) + O(N \times d \times m) + O(m \times N \log N)$   (22)
Total time complexity: $O(1) + O(N \times d) + O(N \log N) + O(N \times d \times m) + O(m \times N \log N)$   (23)
5. CONCLUSION

This paper proposed the MCSO for solving optimization problems. MCSO handles the local optima and early convergence problems of basic CSO. Levy's flight was used to add randomness and thus helps in situations where a chicken is led to no neighbouring solution. Due to Levy's flight-based randomness, the MCSO converges faster to the global optimum solution. The proposed MCSO has been validated on twelve benchmark functions, the CEC-2017 functions, and the pressure vessel design problem. It is found from the experiments that MCSO performs efficiently on all the benchmark functions and the pressure vessel design problem. The MCSO was statistically analysed and compared with the other optimization methods (PSO, GWO, and WOA), basic CSO,
Table 5: AVG and SD for CEC-2017 functions (CEC-1 to CEC-10)
CEC-1 CEC-2 CEC-3 CEC-4 CEC-5 CEC-6 CEC-7 CEC-8 CEC-9 CEC-10
PSO AVG 4.76E +03 9.78E +03 3.54E +04 5.75E +02 5.74E +02 6.12E +02 8.45E +02 9.65E +02 1.37E +03 2.11E +03
SD 4.02E +03 7.87E +04 7.45E +03 3.66E +01 2.01E +01 3.42E-01 2.75E +01 2.16E +01 2.78E +02 1.35E +03
GWO AVG 3.92E +10 8.39E +08 8.78E +04 9.36E +03 8.47E +02 6.63E+02 1.65E +03 1.33E +03 9.74E +03 8.23E +03
SD 5.11E +09 1.73E +08 9.98E +03 1.25E +03 1.64E +01 7.35E +00 5.68E +01 1.81E +01 9.89E +02 4.22E +02
WOA AVG 8.33E +07 9.60E +05 2.72E +05 6.05E +02 7.87E +02 6.69E +02 1.31E +03 1.19E +03 9.55E +03 6.48E +03
SD 4.69E +07 2.31E +07 6.62E +04 4.98E +01 6.45E +01 1.39E +01 7.38E +01 6.14E +01 3.29E +03 9.77E +02
CSO AVG 4.55E +03 5.66E +03 9.88E +03 5.82E +02 7.86E +02 6.54E +02 1.35E +03 9.16E +02 4.87E +03 2.02E +03
SD 4.28E +03 4.27E +03 7.38E +03 3.78E +01 3.67E +01 6.69E +00 2.51E +01 2.11E +01 3.99E +02 7.36E +02
MCSO AVG 3.98E +03 3.24E +03 6.89E +04 5.12E +02 5.22E +02 6.07E +02 7.51E +02 8.80E +02 1.11E +03 1.87E +03
SD 3.42E +03 9.76E +02 9.62E +03 9.76E +00 1.82E +01 5.34E-01 9.37E +00 2.52E +00 3.20E +01 6.64E +01
Table 6: AVG and SD for CEC-2017 functions (CEC-11 to CEC-20)
CEC-11 CEC-12 CEC-13 CEC-14 CEC-15 CEC-16 CEC-17 CEC-18 CEC-19 CEC-20
PSO AVG 1.93E +03 5.99E +04 1.67E +04 2.27E +04 1.71E +04 2.53E +03 1.78E +03 9.89E +05 1.75E +04 2.79E +03
SD 5.38E +01 2.14E +05 1.67E +04 2.66E +04 2.75E +04 2.68E +02 1.46E +02 1.76E +05 1.54E +04 1.48E +02
GWO AVG 6.95E +03 8.99E +09 6.74E +09 1.78E +06 1.84E +08 5.16E +03 3.65E +03 1.83E +07 1.88E +08 2.98E +03
SD 1.56E +03 2.54E +09 2.98E +09 8.88E +05 1.47E +08 5.16E +02 3.21E +02 1.53E +07 9.73E +07 1.47E +02
WOA AVG 2.46E +03 9.32E +07 2.42E +05 2.54E +06 1.18E +05 3.79E +03 2.45E +03 3.87E +06 5.16E +06 2.87E +03
SD 1.31E +03 6.39E +07 2.98E +05 2.09E +06 8.21E +04 4.28E +02 2.65E +02 3.76E +06 4.89E +06 1.59E +02
CSO AVG 1.47E +03 4.67E +04 2.38E +04 5.57E +04 4.76E +04 3.99E +03 4.55E +03 9.86E +05 1.33E +04 2.87E +03
SD 1.84E +02 2.11E +07 5.97E +03 4.98E +04 3.48E +04 6.76E +02 3.47E +02 1.32E +06 8.86E +04 2.87E +02
MCSO AVG 1.11E +03 3.27E +03 5.48E +03 2.56E +04 4.11E +03 2.99E +03 3.62E +03 9.43E +04 8.23E +03 2.12E +03
SD 1.65E +01 4.56E +03 8.32E +03 2.87E +03 4.03E +03 2.98E +03 9.81E +01 5.11E +04 6.12E +03 2.43E +03
Table 7: AVG and SD for CEC-2017 functions (CEC-21 to CEC-30)
CEC-21 CEC-22 CEC-23 CEC-24 CEC-25 CEC-26 CEC-27 CEC-28 CEC-29 CEC-30
PSO AVG 2.57E +03 4.76E +03 2.54E +03 2.62E +03 2.86E +03 4.42E +03 3.78E +03 3.34E +03 3.65E +03 3.85E +03
SD 4.12E +01 2.25E +03 2.76E +01 4.87E +01 1.73E +01 4.76E +02 1.55E +01 2.85E +01 1.87E +02 4.44E +03
GWO AVG 2.54E +03 8.65E +03 3.34E +03 3.98E +03 4.45E +03 9.88E +03 4.78E +03 3.92E +03 6.49E +03 3.32E +03
SD 3.22E +01 1.98E +03 1.12E +02 1.88E +02 3.62E +02 3.29E +02 2.74E +02 4.14E +02 5.33E +02 8.12E +02
WOA AVG 2.65E +03 6.97E +03 3.16E +03 3.20E +03 3.11E +03 7.87E +03 3.84E +03 3.28E +03 4.98E +03 9.76E +03
SD 5.43E +01 2.01E +03 1.07E +02 9.18E +01 3.76E +01 9.39E +02 8.22E +01 3.39E +01 4.37E +02 6.34E +02
CSO AVG 2.76E +03 3.59E +03 3.35E +03 3.21E +03 2.45E +03 7.15E +03 3.82E +03 3.03E +03 5.82E +03 3.21E +03
SD 4.89E +01 1.37E +02 1.42E +02 1.31E +02 1.59E +01 1.17E +03 3.25E +02 3.22E +01 7.99E +02 2.01E +02
MCSO AVG 2.24E +03 3.66E +03 3.17E +03 2.52E +03 2.62E +03 5.14E +03 3.57E +03 2.99E +03 4.22E +03 3.13E +03
SD 2.13E +01 1.23E +03 1.41E +02 2.49E +01 1.69E +00 4.26E +02 6.87E +00 1.24E +01 4.52E +02 1.29E +02
and ICSO. MCSO outperformed all the mentioned methods. For benchmark function optimization, MCSO wins 76.67% of the time, gives competitive performance 20% of the time, and loses 3.33% of the time when compared to the competitive optimization methods. For the CEC-2017 functions, MCSO wins 76.67% of the time and loses 23.33% of the time. MCSO provided the best values for the pressure vessel design problem. Overall, MCSO outperformed on all the optimization problems. The work proposed in this paper can be extended by testing the proposed MCSO on other optimization problems such as feature optimization or model optimization. Also, the proposed work can be extended to solve multi-objective optimization problems. In future work, Levy's flight-based randomness can be replaced by other random distribution methods such as intermittent search.
DISCLOSURE STATEMENT

No potential conflict of interest was reported by the author(s).

DECLARATION OF COMPETING INTEREST

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

ORCID

Satya Verma http://orcid.org/0000-0001-8298-1057
Satya Prakash Sahu http://orcid.org/0000-0002-9886-9518
Tirath Prasad Sahu http://orcid.org/0000-0001-9985-5241
Table 8: Statistical tests for CEC-2017 functions (MCSO vs. PSO, GWO, WOA, and CSO)
Columns: Function; then, for each comparison (MCSO vs. PSO, GWO, WOA, CSO), the Wilcoxon p-value and the win/loss/tie symbol (+/−/=)
CEC-1 1.45E-07 +1.81E-06 +1.54E-06 +1.26E-06 +
CEC-2 1.76E-06 +2.01E-06 +1.32E-06 +1.79E-06 +
CEC-3 1.23E-01 1.76E-06 +1.98E-06 +1.92E-01
CEC-4 1.87E-01 +1.69E-06 +4.27E-05 +1.32E-04 +
CEC-5 5.48E-04 +1.82E-06 +1.86E-06 +2.12E-06 +
CEC-6 0.61E +00 +1.78E-06 +1.91E-06 +3.24E-06 +
CEC-7 1.91E-06 +1.78E-06 +1.69E-06 +1.98E-06 +
CEC-8 3.04E-02 +1.68E-06 +1.75E-06 +1.67E-06 +
CEC-9 8.58E-06 +1.59E-06 +1.89E-06 +1.73E-06 +
CEC-10 1.69E-02 +1.87E-06 +2.91E-06 +6.45E-05 +
CEC-11 1.55E-02 +1.82E-06 +1.64E-06 +1.92E-06 +
CEC-12 1.62E-06 +1.45E-06 +1.72E-06 +1.87E-06 +
CEC-13 3.36E-02 +1.79E-06 +1.69E-06 +1.79E-06 +
CEC-14 3.51E-01 1.64E-06 +2.56E-05 +2.01E-06 +
CEC-15 1.73E-06 +1.51E-06 +2.31E-06 +1.02E-05 +
CEC-16 3.21E-02 2.12E-06 +1.54E-06 +1.05E-06 +
CEC-17 2.19E-01 3.65E-06 +1.34E-06 3.44E-06 +
CEC-18 1.73E-06 +1.86E-06 +9.97E-03 +2.18E-06 +
CEC-19 2.25E-06 +2.91E-06 +1.95E-06 +1.84E-06 +
CEC-20 0.98E-01 +1.73E-06 +1.82E-06 +1.57E-06 +
CEC-21 7.12E-01 +1.97E-06 +1.72E-02 +3.99E-06 +
CEC-22 3.89E-02 +2.53E-06 +1.45E-04 +1.94E-06 +
CEC-23 8.47E-02 1.45E-06 +8.34E-06 1.59E-06 +
CEC-24 1.62E-06 +2.69E-06 +1.88E-06 +2.01E-06 +
CEC-25 1.85E-06 +3.11E-06 +1.96E-06 +2.46E-06 +
CEC-26 6.98E-01 1.89E-06 +1.96E-06 +1.68E-06 +
CEC-27 3.21E-02 +1.72E-06 +1.74E-06 +1.89E-06 +
CEC-28 1.82E-06 +3.01E-02 +1.83E-06 +1.81E-06 +
CEC-29 8.63E-01 1.25E-06 +1.69E-06 +1.76E-06 +
CEC-30 1.85E-06 +1.73E-06 +1.81E-06 +1.92E-06 +
Total (+/−/=) 23/7/0 30/0/0 28/2/0 29/1/0
Table 9: Comparative performance for the pressure vessel design problem
Columns: Optimization Method, optimum variables a1, a2, a3, a4, Optimum Cost
PSO [16] 0.77896 0.38468 40.32091 200.00000 5,891.3879
GWO [17] 0.77904 0.38466 40.32779 199.65029 5,889.3689
WOA [18] 0.81250 0.43750 42.09827 176.63899 6059.7410
CSO [19] 0.80412 0.40271 41.23561 182.45631 5885.5908
ICSO [25] 0.79376 0.39139 40.86429 183.96583 5753.9043
MCSO 0.77381 0.38564 40.24128 178.53924 5387.2063
REFERENCES

1. M. O. Okwu, and L. K. Tartibu, "Bat algorithm," in Metaheuristic Optimization: Nature-Inspired Algorithms Swarm and Computational Intelligence, Theory and Applications, Vol. 927, Cham: Springer, 2021, pp. 71-84. DOI: 10.1007/978-3-030-61111-8_8.
2. C. Blum, and A. Roli, "Metaheuristics in combinatorial optimization: overview and conceptual comparison," ACM Comput. Surv., Vol. 35, no. 3, pp. 268-308, 2003. DOI: 10.1145/937503.937505.
3. R. S. Parpinelli, and H. S. Lopes, "New inspirations in swarm intelligence: A survey," Int. J. Bio-Inspired Comput., Vol. 3, no. 1, pp. 1-16, 2011. DOI: 10.1504/IJBIC.2011.038700.
4. J. A. Parejo, A. Ruiz-Cortés, S. Lozano, and P. Fernandez, "Metaheuristic optimization frameworks: A survey and benchmarking," Soft Comput., Vol. 16, no. 3, pp. 527-61, 2012. DOI: 10.1007/s00500-011-0754-8.
5. Y. C. Ho, and D. L. Pepyne, "Simple explanation of the no-free-lunch theorem and its implications," J. Optim. Theory Appl., Vol. 115, no. 3, pp. 549-70, 2002. DOI: 10.1023/A:1021251113462.
6. X. Yang, "From swarm intelligence to metaheuristics: nature-inspired optimization algorithms," Computer (Long Beach, Calif.), Vol. 49, no. 9, pp. 52-9, 2016.
7. P. Agarwal, and S. Mehta, "Nature-inspired algorithms: state-of-art, problems and prospects," Int. J. Comput. Appl., Vol. 100, no. 14, pp. 14-21, 2014. DOI: 10.5120/17593-8331.
8. A. Chakraborty, and A. K. Kar, "Swarm intelligence: a review of algorithms," in Nature-Inspired Computing and Optimization. Modeling and Optimization in Science and Technologies, Vol. 10, S. Patnaik, X. S. Yang, and K. Nakamatsu, Eds. Cham: Springer, 2017, pp. 475-94. DOI: 10.1007/978-3-319-50920-4_19.
9. S. Mirjalili, "Dragonfly algorithm: A new meta-heuristic optimization technique for solving single-objective, discrete, and multi-objective problems," Neural Comput. Appl., Vol. 27, no. 4, pp. 1053-73, 2016. DOI: 10.1007/s00521-015-1920-1.
10. J. Tang, G. Liu, and Q. Pan, "A review on representative swarm intelligence algorithms for solving optimization problems: applications and trends," IEEE/CAA J. Autom. Sin., Vol. 8, no. 10, pp. 1627-43, 2021. DOI: 10.1109/JAS.2021.1004129.
11. K. Kaur, and Y. Kumar, "Swarm intelligence and its applications towards various computing: A systematic review," Proc. Int. Conf. Intell. Eng. Manag. ICIEM, Vol. 2020, pp. 57-62, 2020. DOI: 10.1109/ICIEM48762.2020.9160177.
12. B. A. S. Emambocus, M. B. Jasser, and A. Amphawan, "A survey on the optimization of artificial neural networks using swarm intelligence algorithms," IEEE Access, Vol. 11, pp. 1280-94, 2023. DOI: 10.1109/access.2022.3233596.
13. M. O. Okwu, and L. K. Tartibu, "Bat algorithm," in Metaheuristic Optimization: Nature-Inspired Algorithms Swarm and Computational Intelligence, Theory and Applications, Springer, 2021, pp. 71-84.
14. D. Karaboga, and B. Basturk, "A powerful and efficient algorithm for numerical function optimization: artificial bee colony (ABC) algorithm," J. Glob. Optim., Vol. 39, no. 3, pp. 459-71, 2007. DOI: 10.1007/s10898-007-9149-x.
15. M. Dorigo, M. Birattari, and T. Stützle, "Ant colony optimization," IEEE Comput. Intell. Mag., Vol. 1, pp. 28-39, 2006. DOI: 10.1109/MCI.2006.329691.
16. J. Kennedy, and R. Eberhart, "Particle swarm optimization," in Proceedings of ICNN'95 - International Conference on Neural Networks, 1995, Vol. 4, pp. 1942-1948. DOI: 10.1109/ICNN.1995.488968.
17. S. Mirjalili, S. M. Mirjalili, and A. Lewis, "Grey wolf optimizer," Adv. Eng. Softw., Vol. 69, pp. 46-61, 2014. DOI: 10.1016/j.advengsoft.2013.12.007.
18. S. Mirjalili, and A. Lewis, "The whale optimization algorithm," Adv. Eng. Softw., Vol. 95, pp. 51-67, 2016. DOI: 10.1016/j.advengsoft.2016.01.008.
19. X. Meng, Y. Liu, X. Gao, and H. Zhang, "A new bio-inspired algorithm: chicken swarm optimization," Lect. Notes Comput. Sci. (Including Subser. Lect. Notes Artif. Intell. Lect. Notes Bioinformatics), Vol. 8794, pp. 86-94, 2014. DOI: 10.1007/978-3-319-11857-4_10.
20. X. S. Yang, "Firefly algorithm, stochastic test functions and design optimization," Int. J. Bio-Inspired Comput., Vol. 2, no. 2, pp. 78-84, 2010. DOI: 10.1504/IJBIC.2010.032124.
21. X. S. Yang, and A. H. Gandomi, "Bat algorithm: A novel approach for global engineering optimization," Eng. Comput. (Swansea, Wales), Vol. 29, no. 5, pp. 464-83, 2012. DOI: 10.1108/02644401211235834.
22. G. Fu, H. Gong, H. Gao, T. Gu, and Z. Cao, "Integrated thermal error modeling of machine tool spindle using a chicken swarm optimization algorithm-based radial basic function neural network," Int. J. Adv. Manuf. Technol., Vol. 105, no. 5-6, pp. 2039-55, 2019. DOI: 10.1007/s00170-019-04388-5.
23. S. Verma, S. P. Sahu, and T. P. Sahu, "Discrete wavelet transform-based feature engineering for stock market prediction," Int. J. Inf. Technol., Vol. 15, pp. 1179-88, 2023. DOI: 10.1007/s41870-023-01157-2.
24. Z. Wang, W. Zhang, Y. Guo, M. Han, B. Wan, and S. Liang, "A multi-objective chicken swarm optimization algorithm based on dual external archive with various elites," Appl. Soft Comput., Vol. 133, pp. 109920, 2023. DOI: 10.1016/j.asoc.2022.109920.
25. D. Wu, F. Kong, W. Gao, Y. Shen, and Z. Ji, "Improved chicken swarm optimization," in 2015 IEEE Int. Conf. Cyber Technol. Autom. Control Intell. Syst. (IEEE-CYBER 2015), 2015, pp. 681-686. DOI: 10.1109/CYBER.2015.7288023.
26. D. Wu, S. Xu, and F. Kong, "Convergence analysis and improvement of the chicken swarm optimization algorithm," IEEE Access, Vol. 4, pp. 9400-12, 2016. DOI: 10.1109/ACCESS.2016.2604738.
27. D. He, G. Lu, and Y. Yang, "Research on optimization of train energy-saving based on improved chicken swarm optimization," IEEE Access, Vol. 7, pp. 121675-84, 2019. DOI: 10.1109/ACCESS.2019.2937656.
28. Y. Wu, B. Yan, and X. Qu, "Improved chicken swarm optimization method for reentry trajectory optimization," Math. Probl. Eng., Vol. 2018, pp. 1-13, 2018. DOI: 10.1155/2018/8135274.
29. S. Torabi, and F. Safi-Esfahani, "A hybrid algorithm based on chicken swarm and improved raven roosting optimization," Soft Comput., Vol. 23, no. 20, pp. 10129-71, 2019. DOI: 10.1007/s00500-018-3570-6.
30. S. Deb, X. Z. Gao, K. Tammi, K. Kalita, and P. Mahanta, "A new teaching-learning-based chicken swarm optimization algorithm," Soft Comput., Vol. 24, no. 7, pp. 5313-31, 2020. DOI: 10.1007/s00500-019-04280-0.
31. F. Ajesh, and R. Ravi, "Hybrid features and optimization-driven recurrent neural network for glaucoma detection," Int. J. Imaging Syst. Technol., Vol. 30, no. 4, pp. 1143-61, 2020. DOI: 10.1002/ima.22435.
32. L. Cui, Y. Zhang, and Y. Jiao, "Robust array beamforming via an improved chicken swarm optimization approach," IEEE Access, Vol. 9, pp. 73182-93, 2021. DOI: 10.1109/ACCESS.2021.3081138.
33. N. Kumari, R. K. Dwivedi, A. K. Bhatt, and R. Belwal, "Automated fruit grading using optimal feature selection and hybrid classification by self-adaptive chicken swarm optimization: grading of mango," Neural Comput. Appl., Vol. 34, no. 2, pp. 1285-306, 2022. DOI: 10.1007/s00521-021-06473-x.
34. H. Wang, Z. Chen, and G. Liu, "An improved chicken swarm optimization algorithm for feature selection," in Proceeding of 2021 International Conference on Wireless Communications, Networking and Applications, Z. Qian, M. A. Jabbar, and X. Li, Eds. Singapore: Springer Nature Singapore, 2022, pp. 177-186.
35. M. Chawla, and M. Duhan, "Levy flights in metaheuristics optimization algorithms - A review," Appl. Artif. Intell., Vol. 32, no. 9-10, pp. 802-21, 2018. DOI: 10.1080/08839514.2018.1508807.
36. J. Demšar, "Statistical comparisons of classifiers over multiple data sets," J. Mach. Learn. Res., Vol. 7, pp. 1-30, 2006.
37. X. S. Yang, C. Huyck, M. Karamanoglu, and N. Khan, "True global optimality of the pressure vessel design problem: A benchmark for bio-inspired optimisation algorithms," Int. J. Bio-Inspired Comput., Vol. 5, no. 6, pp. 329-35, 2013. DOI: 10.1504/IJBIC.2013.058910.
38. S. Das, T. P. Sahu, and R. R. Janghel, "PSO-based group-oriented crow search algorithm (PGCSA)," Eng. Comput. (Swansea, Wales), Vol. 38, no. 2, pp. 545-71, 2021. DOI: 10.1108/EC-07-2019-0305.
AUTHORS

Satya Verma received her BE in computer science and engineering from Pt. Ravishankar Shukla University, Raipur, India in 2003. She received an M Tech degree in computer technology from the National Institute of Technology Raipur, India in 2011. Presently she is working as a research scholar in the Department of Information Technology, National Institute of Technology Raipur, India. She has worked as faculty in various engineering colleges and a university in Chhattisgarh, India. Her research interests include data mining, machine learning, optimization techniques, image processing, and time-series analysis.

Corresponding author. Email: satya.ritu@gmail.com

Satya Prakash Sahu is an associate professor in the Department of Information Technology at the National Institute of Technology Raipur, India. He received his BE and M Tech degrees in computer science and engineering from Rajiv Gandhi Technological University, Bhopal, India, and his PhD degree from the National Institute of Technology Raipur, India. He has more than 17 years of teaching and research experience. His primary research areas are artificial intelligence, computer vision, digital image processing, soft computing, and medical imaging.

Email: spsahu.it@nitrr.ac.in

Tirath Prasad Sahu is an assistant professor in the Department of Information Technology at the National Institute of Technology Raipur, India. He received his M Tech degree in computer science and engineering from Samrat Ashok Technological Institute, Vidisha, in 2012 and his PhD in computer science and engineering from the National Institute of Technology Raipur, India in 2018. His research interests include data mining, text analytics, bio-informatics, optimization techniques, and image processing.

Email: tpsahu.it@nitrr.ac.in
... When , , and conditions are present, these numbers are respectively referred to as Interval Type-2 Flat Trapezoidal Fuzzy Number (IT2FTrFN) and Perfect Interval Type-2 Trapezoidal Fuzzy Number (PIT2TrFN). (19) and (20) For the case where , and k is crisp, certain arithmetic and ranking operations for GIT2TrFNs are defined as follows [20]- [22]: Addition Operation: (21) Subtraction Operation: (22) Multiplication Operation: ...
... Although the field of ranking methods on IT2FS is noticeably narrow and requires new approaches, similarity and uncertainty measures have drawn broad methodology. While there exists extensive methodology and research interest in similarity and uncertainty measures for IT2FSs, the field of ranking methods is constrained, requiring fresh perspectives and innovative solutions [20], [22]-[25]. ...
... Without supporting data, it is challenging to adopt the under-body heating system widely due to its higher cost. While prior research has indicated the effectiveness of under-body heating systems for supine surgery [19,20], no reports of their efficacy for procedures carried out in the lithotomy position exist [21]. ...
Book
Full-text available
Mühendislik, köprüler, tüneller, yollar, araçlar ve binalar dahil olmak üzere makineler, yapılar ve diğer öğeleri tasarlamak ve inşa etmek için bilimsel ilkelerin kullanılmasıdır. Mühendislik disiplini, her biri uygulamalı matematik, uygulamalı bilim ve uygulama türlerinin belirli alanlarına özel vurgu yapan, geniş bir yelpazede uzmanlaşmış mühendislik alanlarını kapsar. Mühendislik bir ülkenin kalkınmasında ve gelişmesinde çok önemli rol oynayan disiplinler arası ortaklığın meydana getirdiği bir üst bilim dalıdır. Mühendislik, matematiksel ve doğal bilim dallarından, ders çalışma, deney yapma ve uygulama yolları ile kazanılmış bilgileri akıllıca kullanarak, doğanın kuvvetleri ve maddelerini insanoğlu yararına sunmak üzere ekonomik olan yöntemler geliştiren bir meslektir. Çünkü mühendislik yaklaşımı; işi sorun çözmek olan insan yaklaşımıdır. Mühendislik yaklaşımı içinde bulunan çalışanlar, görülmeyeni görerek, düşünülmeyeni bularak, optimum çözümleri hedefleyip durumdan maksimum faydayı çıkarmayı bilirler. Diğer taraftan mühendislik nedir, sorusu ile aklımıza çok geniş ve detaylı bir tanımlama gelse de genel olarak mühendislik, problemleri çözebilmek için gerekli olan bilim ve matematiğin uygulanmasıdır diyebiliriz. Mühendisler, bir şeylerin nasıl çalıştığını anlar ve bilimsel keşiflerin pratik hayatta kullanımı için yöntemler bulur. Yayınlanan bu kitap; çeşitli mühendislik dallarında hocalık yapan akademisyenlerin sunmuş olduğu gerek kendi özgün çalışmaları ve gerekse literatürden aktarılan derleme çalışmaların bir araya getirilmiş sunumlarından meydana getirilmiştir. Burada amaç konuyla ilgilenen mühendis ve akademisyenlerin önemli sayılacak mühendislik çalışmalarını bir arada bulmalarıdır. Dolayısıyla kitabın önemli bir boşluğu dolduracağı ve genç araştırmacılara faydalı olacağı kanaatindeyim. Bu bağlamda; kitaba bilimsel katkı sunan, kitabı baskıya hazırlayan kısacası emeği geçen herkese teşekkür ederiz. Yayımlanan bu kitabın gerek mühendislere ve gerekse genç akademisyenlere faydalı olmasını diler, tüm mühendis ve genç akademisyenlere başarılar dilerim. Prof. Dr. Kamil KAYGUSUZ Makine & Kimya Mühendisi ve Enerji Uzmanı Karadeniz Teknik Üniversitesi Öğretim Üyesi Türkiye Bilimler Akademisi Asli Üyesi
... CSO, a nature-inspired algorithm that models the behaviour of chickens in search of food, groups the swarm into three categories: roosters, hens, and chicks. The algorithm works as follows: the swarm consists of groups, each comprising a rooster, hens, and chicks (Gullu, 2021; Meng et al., 2014; Verma et al., 2023). Roosters are the individuals with the best fitness values and at the same time act as the leaders of their groups. ...
... is maintained by assigning randomly selected hens the role of mothers of randomly selected chicks (Gullu, 2021; M, A. O, 2018; Meng et al., 2014; Verma et al., 2023). In the next stage, position-shifting operations are performed to search for food. ...
... chicks, was introduced to the literature in 2014 by Xianbing Meng, Yu Liu, Xiaozhi Gao, and Hengzhen Zhang (M, A. O, 2018; Meng et al., 2014; Verma et al., 2023). ...
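To make the grouping and leader-update mechanics in these excerpts concrete, here is a minimal Python sketch of the standard CSO role assignment and rooster move as formulated by Meng et al. (2014); the function names, the minimization convention, and the assumption of at least two roosters are illustrative choices, not code from the cited works.

```python
import numpy as np

def assign_roles(fitness, n_roosters, n_hens):
    """Rank the swarm by fitness (minimization assumed): the best
    individuals become roosters, the middle group hens, the rest chicks."""
    order = np.argsort(fitness)
    return (order[:n_roosters],
            order[n_roosters:n_roosters + n_hens],
            order[n_roosters + n_hens:])

def rooster_update(x, fitness, roosters, eps=1e-12):
    """Standard CSO rooster move: Gaussian perturbation whose variance
    grows when a randomly chosen peer rooster has better fitness.
    Assumes at least two roosters so a peer k != i always exists."""
    x_new = x.copy()
    for i in roosters:
        k = np.random.choice([r for r in roosters if r != i])
        sigma2 = 1.0 if fitness[i] <= fitness[k] else np.exp(
            (fitness[k] - fitness[i]) / (abs(fitness[i]) + eps))
        x_new[i] = x[i] * (1.0 + np.sqrt(sigma2) * np.random.randn(x.shape[1]))
    return x_new
```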
Article
Full-text available
Anaemia occurs when the haemoglobin (Hgb) value falls below a certain reference range; its diagnosis and treatment require numerous blood tests, radiological images, and other examinations. By processing patients' medical data with artificial intelligence and machine learning methods, disease predictions can be made for newly ill individuals, and these predictions can serve as decision-support mechanisms for physicians. Such methods are very important in reducing the margin of error in physicians' diagnoses, and the evaluation of data records in health institutions matters for both patients and hospitals. In this study, six hybrid models are proposed to classify non-anaemia records, Hgb-anaemia, folate deficiency anaemia (FDA), iron deficiency anaemia (IDA), and B12 deficiency anaemia by combining the TreeBagger, Crow Search Algorithm (CSA), Chicken Swarm Optimization (CSO), and JAYA methods. The proposed hybrid models are analysed with two different approaches, with and without the SMOTE technique, to achieve high performance by better emphasizing the importance of parameters. To solve the multiclass anaemia classification problem, fuzzy logic-based parameter optimization is applied to improve the class-based accuracy as well as the overall accuracy on the dataset. The proposed methods are evaluated using ROC criteria to build a prediction model that determines the anaemia type of anaemic patients. On a dataset taken from the Kaggle database, the six proposed hybrid methods outperformed other studies using the same dataset as well as similar studies in the literature.
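The SMOTE branch of the analysis described above can be reproduced with the imbalanced-learn library; the synthetic dataset and split parameters below are placeholders for the Kaggle data, so this is only a sketch of the resampling step, not the authors' pipeline.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from imblearn.over_sampling import SMOTE

# Placeholder for the (imbalanced) multiclass anaemia dataset.
X, y = make_classification(n_samples=500, n_classes=3, n_informative=6,
                           weights=[0.7, 0.2, 0.1], random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42)

# Oversample only the training split so the test set stays untouched.
X_res, y_res = SMOTE(random_state=42).fit_resample(X_train, y_train)
```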
... When the roosters get trapped in local optima, the hens and chicks converge too early, reducing the effect of global optimization. To address this problem, Verma et al. [16] introduced the Levy flight strategy to solve the problems of local optima and premature convergence. Ahmed et al. [17] improved the search capability of CSO by applying logistic and tent chaotic maps to help the CSO swarm better explore the search space. ...
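Mantegna's algorithm is a common way to draw the heavy-tailed steps that such a Levy flight strategy relies on. The sketch below shows the step generator and, in a comment, one hypothetical way a stagnating agent could be nudged toward the global best; the 0.01 scaling and the guided-update form are illustrative, not the exact MCSO equations.

```python
import numpy as np
from math import gamma, sin, pi

def levy_step(dim, beta=1.5):
    """Draw a Levy-distributed step vector via Mantegna's algorithm."""
    sigma_u = (gamma(1 + beta) * sin(pi * beta / 2) /
               (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = np.random.randn(dim) * sigma_u
    v = np.random.randn(dim)
    return u / np.abs(v) ** (1 / beta)

# Hypothetical guided move for a stagnating agent, scaled by its
# distance from the best-known position x_best:
#   x_i = x_i + 0.01 * levy_step(dim) * (x_i - x_best)
```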
Article
Full-text available
Bio-inspired optimization algorithms are competitive solutions for engineering design problems. Chicken swarm optimization (CSO) combines the advantages of differential evolution and particle swarm optimization, drawing inspiration from the foraging behavior of chickens. However, the CSO algorithm may perform poorly on complex optimization problems because it has a high risk of falling into a local optimum. To address these challenges, a new CSO variant combining Padé approximation, random learning, and population reduction techniques (PRPCSO) was proposed in this work. First, a Padé approximation strategy was incorporated to help agents converge quickly to the region of the true solution. A Padé approximant is a rational function whose power series expansion agrees with that of the approximated function up to a defined number of terms; the fitting function used in this strategy employs such a rational function, and its extreme points are calculated mathematically, which can significantly improve the accuracy of the solution. Second, a random learning mechanism encouraged agents to learn from other good agents, yielding better local exploitation capability than traditional CSO. A distinctive idea of this mechanism is that, when selecting random individuals, it selects from the same type of high-performing agents rather than completely at random. Third, a new intelligent population size shrinking strategy was designed to dynamically adjust the population size and prevent premature convergence; it creatively considers the number of fitness function calls and the variation in recent optimal solutions. To validate the algorithm's efficacy, PRPCSO was rigorously tested across 23 standard test functions and six kinds of practical engineering problems. PRPCSO was then compared with several mainstream algorithms, and the results unequivocally established its superior performance in most instances, highlighting its substantial practical utility in real engineering applications.
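For reference, the idea of a Padé approximant can be illustrated with a standard textbook example, unrelated to PRPCSO's specific fitting function: the [1/1] approximant of e^x reproduces its Taylor series through second order.

```latex
% Textbook example: the [1/1] Pade approximant of e^x agrees with
% its Taylor expansion 1 + x + x^2/2 up to order 2.
\[
  e^{x} \approx R_{1/1}(x) = \frac{1 + x/2}{1 - x/2},
  \qquad
  R_{1/1}(x) = 1 + x + \tfrac{x^{2}}{2} + O(x^{3}).
\]
```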
Article
Full-text available
Artificial Neural Networks (ANNs) are becoming increasingly useful in numerous areas, as they have a myriad of applications. Prior to using an ANN, the network structure needs to be determined and the network needs to be trained. The structure is usually chosen by trial and error, while the training, which consists of finding the optimal connection weights and biases, is usually done using gradient-descent algorithms. Swarm intelligence algorithms have been found favorable both for determining the network structure and for training ANNs: they can determine the structure in an intelligent way, and they are better than conventional algorithms at finding optimal connection weights and biases. Recently, a number of swarm intelligence algorithms have been employed for optimizing different types of neural networks. However, there is no comprehensive survey of the swarm intelligence algorithms used for optimizing ANNs. In this paper, we present a review of the different types of ANNs optimized using swarm intelligence algorithms, the ways the ANNs are optimized, the different swarm intelligence algorithms used, and the applications of the ANNs optimized by swarm intelligence algorithms.
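As a concrete illustration of swarm-based ANN training, the sketch below uses PSO to search the flattened weight vector of a one-hidden-layer network. The inertia and acceleration coefficients are common textbook defaults, and the whole routine is a minimal sketch under those assumptions rather than any specific surveyed method.

```python
import numpy as np

def mse(w, X, y, n_hidden):
    """Unpack a flat weight vector into a 1-hidden-layer net and score it."""
    d = X.shape[1]
    W1 = w[:d * n_hidden].reshape(d, n_hidden)
    W2 = w[d * n_hidden:].reshape(n_hidden, 1)
    pred = np.tanh(X @ W1) @ W2
    return np.mean((pred.ravel() - y) ** 2)

def pso_train(X, y, n_hidden=4, n_particles=20, iters=200):
    """Gradient-free training: each particle is a candidate weight vector."""
    dim = X.shape[1] * n_hidden + n_hidden
    pos = np.random.randn(n_particles, dim)
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_f = np.array([mse(p, X, y, n_hidden) for p in pos])
    gbest = pbest[np.argmin(pbest_f)].copy()
    for _ in range(iters):
        r1, r2 = np.random.rand(2)
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos += vel
        f = np.array([mse(p, X, y, n_hidden) for p in pos])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = pos[improved], f[improved]
        gbest = pbest[np.argmin(pbest_f)].copy()
    return gbest
```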
Chapter
Full-text available
In recent years, feature selection has become more and more important in data mining. Its goal is to reduce the dimensionality of datasets while at least maintaining classification accuracy. Several studies have applied the chicken swarm optimization (CSO) algorithm to feature selection, with remarkable results compared with traditional swarm intelligence algorithms. However, feature selection presents a complex search space, and the CSO algorithm still has the drawback of quickly becoming stuck in local minima. An improved chicken swarm optimization algorithm (ICSO) is proposed in this paper, which introduces the Levy flight strategy into the hen location update and a nonlinearly decreasing inertia weight into the chick location update to increase the global search ability and avoid the local minimum problem. Comparison with three other algorithms on eighteen UCI datasets shows that the ICSO algorithm can greatly reduce redundant features while ensuring classification accuracy.
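A minimal sketch of the chick move with a decreasing inertia weight follows; the quadratic decay schedule, the coefficient ranges, and the mother-mapping representation are plausible assumptions, since the chapter's exact settings are not given here.

```python
import numpy as np

def chick_update(x, chicks, mother_of, t, T,
                 w_max=0.9, w_min=0.4, fl_low=0.5, fl_high=0.9):
    """Chick move with a nonlinearly decreasing inertia weight, in the
    spirit of the ICSO chapter; the exact schedule there may differ."""
    # One plausible nonlinear schedule: quadratic decay from w_max to w_min.
    w = w_max - (w_max - w_min) * (t / T) ** 2
    x_new = x.copy()
    for i in chicks:
        fl = np.random.uniform(fl_low, fl_high)  # follow coefficient
        m = mother_of[i]                         # index of the mother hen
        x_new[i] = w * x[i] + fl * (x[m] - x[i])
    return x_new
```

For binary feature selection, each continuous position would then be squashed (e.g. through a sigmoid) and thresholded into a 0/1 mask over the features.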
Article
Full-text available
Post-harvest grading is an essential process that affects fruit quality, evaluation, health, and the export market. Although sorting and grading can be performed manually, as is current practice, the process is tedious, labor-intensive, slow, and error-prone; hence, smart automation is required. Advances in computer vision are reaching every area where there is even a minimal opportunity for smart automation. In this paper, intelligent automation for mango fruit grading is designed and developed. Initially, fruit segmentation is done by the active contour model, and abnormality segmentation is performed using an enhanced fuzzy-based K-means clustering approach, followed by extraction of discrete Fourier transform (DFT), local binary pattern (LBP), gray-level co-occurrence matrix (GLCM), and shape features. Self-adaptive chicken swarm optimization (SA-CSO) is used to reduce and optimize the feature vector. The quality of the fruits is finally categorized based on surface defects and maturity classification. For defect classification, the optimal abnormality-segmented features are fed to a K-nearest neighbors (KNN) classifier. The optimally selected fruit-segmented features are subjected to a fuzzy classifier, and the fruit-segmented images are subjected to a convolutional neural network (CNN). As an improvement, the proposed SA-CSO is used to optimize the hybrid classifier to maximize classification accuracy. Maturity is classified by the hybrid fuzzy classifier and CNN as ripe, partially ripe, or unripe. Finally, the defect and maturity outputs are used to decide the quality as good, average, or bad. A comparative analysis of diverse performance metrics proves the effectiveness of the proposed model over other traditional algorithms.
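The LBP and GLCM descriptors named above can be extracted with scikit-image; the distances, angles, and histogram binning below are illustrative choices, not the paper's exact settings.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops, local_binary_pattern

def texture_features(gray):
    """Extract a few GLCM and LBP texture descriptors from an 8-bit
    grayscale patch (a segmented fruit region, for instance)."""
    glcm = graycomatrix(gray, distances=[1], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    feats = [graycoprops(glcm, p).mean()
             for p in ("contrast", "homogeneity", "energy", "correlation")]
    # Uniform LBP with P=8 yields codes 0..9, hence the 10-bin histogram.
    lbp = local_binary_pattern(gray, P=8, R=1, method="uniform")
    hist, _ = np.histogram(lbp, bins=10, range=(0, 10), density=True)
    return np.concatenate([feats, hist])
```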
Article
Full-text available
Robust array beamforming is a challenging task in radar, sonar, and communications owing to direction of arrival (DOA) mismatch and sensor position errors, and enhancing the robustness of beamforming is a key issue in antenna arrays. This paper focuses on a novel approach, the improved chicken swarm optimization (ICSO) method, to solve the optimization model of the conventional linearly constrained minimum variance (LCMV) beamformer based on a support vector machine (SVM), countering the mismatch problems as well as controlling the sidelobe level (SLL). The ICSO method draws on the outstanding early-iteration convergence of the particle swarm optimization (PSO) algorithm, the dominance of the alpha wolf in the grey wolf optimization (GWO) algorithm, and the innovative mutual attraction mechanism of the firefly algorithm (FA); these three strategies are introduced into the solution update of the conventional chicken swarm optimization (CSO) algorithm to achieve better optimization capability. Moreover, an operation of removing duplicate solutions is proposed to enhance the utilization of the population. For the SVM-based LCMV beamforming algorithm, the linear ε-insensitive loss function is adopted to reconstruct the final cost function of LCMV by penalizing the errors between the actual and ideal array responses. Finally, simulations evaluate the performance of the swarm intelligent optimization algorithms under an ideal scenario without mismatch and an actual scenario with mismatch. The results demonstrate that the developed ICSO algorithm obtains excellent robustness in different scenarios compared with the PSO, FA, GWO, and CSO optimization algorithms.
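For context, the conventional LCMV design that the SVM-based cost function reformulates is the textbook constrained problem with a well-known closed-form solution, where R is the array covariance matrix, C the constraint matrix, and f the desired response vector:

```latex
% Standard LCMV beamformer: minimize output power subject to
% linear response constraints.
\[
  \min_{\mathbf{w}} \; \mathbf{w}^{H}\mathbf{R}\mathbf{w}
  \quad \text{s.t.} \quad \mathbf{C}^{H}\mathbf{w} = \mathbf{f},
  \qquad
  \mathbf{w}_{\mathrm{LCMV}}
  = \mathbf{R}^{-1}\mathbf{C}\bigl(\mathbf{C}^{H}\mathbf{R}^{-1}\mathbf{C}\bigr)^{-1}\mathbf{f}.
\]
```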
Book
Full-text available
This book exemplifies how algorithms are developed by mimicking nature. It emphasizes the social behaviour of insects, animals, and other natural entities in terms of converging power and benefits. Major nature-inspired algorithms discussed in this text include the bee colony algorithm, ant colony algorithm, grey wolf optimization algorithm, whale optimization algorithm, firefly algorithm, bat algorithm, ant lion optimization algorithm, grasshopper optimization algorithm, butterfly optimization algorithm, and others. The algorithms have been arranged in chapters to help readers gain better insight into nature-inspired systems and swarm intelligence. All the MATLAB codes are provided in the appendices of the book to enable readers to practise solving the examples included in all sections. This book is for undergraduate students, postgraduate researchers, and experts in Engineering and Applied Sciences, Natural and Formal Sciences, Economics, Humanities and Social Sciences. https://www.springer.com/gp/book/9783030611101#
Article
Stock market prediction is an interesting area of research where Technical Indicators (TI) play an important role. However, predicting stock market movement is difficult due to the presence of noise and irregularities in stock data. Data de-noising and decomposition techniques are apt to handle such noise, but a decomposition technique may generate a large feature vector that needs to be handled carefully. Therefore, a suitable and effective feature engineering component must be included in the prediction model. To handle these issues, this paper proposes a stock market prediction model comprising modules for TI computation, feature engineering, and stock market prediction. In the proposed feature engineering component, the Discrete Wavelet Transform (DWT) is offered for data decomposition, and Chicken Swarm Optimization (CSO) is offered to handle the large number of features generated through DWT by selecting the optimal feature subset. The proposed feature engineering component is named DWT-CSO. Stock market trend prediction is performed by Machine Learning (ML) and Deep Learning (DL) models. Datasets of Indian (NIFTY50 and BSE) and US (S&P500 and DJI) stock indices are used for experimentation. The proposed DWT-CSO provided improved performance: the prediction models' accuracy increased by 19.59% (for S&P500), 18.33% (for DJI), 19.43% (for NIFTY50), and 15.89% (for BSE). The performance of DWT-CSO is statistically analysed with the Wilcoxon rank-sum test.
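The decomposition step can be sketched with the PyWavelets library: the series is split into approximation and detail coefficients, and their concatenation forms the large feature vector that a selector such as CSO would then prune. The db4 wavelet and three decomposition levels below are assumptions for illustration, not the paper's reported settings.

```python
import numpy as np
import pywt

def dwt_features(close_prices, wavelet="db4", level=3):
    """Decompose a price series with the DWT and flatten the
    coefficient arrays [cA3, cD3, cD2, cD1] into one feature vector."""
    coeffs = pywt.wavedec(np.asarray(close_prices, dtype=float),
                          wavelet, level=level)
    return np.concatenate(coeffs)
```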
Article
Swarm intelligence algorithms are a subset of the artificial intelligence (AI) field that is gaining popularity for resolving different optimization problems and has been widely utilized in various applications. In the past decades, numerous swarm intelligence algorithms have been developed, including ant colony optimization (ACO), particle swarm optimization (PSO), artificial fish swarm (AFS), bacterial foraging optimization (BFO), and artificial bee colony (ABC). This review surveys the most representative swarm intelligence algorithms in chronological order, highlighting their functions and strengths across 127 research publications. It provides an overview of the various swarm intelligence algorithms and their advanced developments, and briefly describes their successful applications to optimization problems in engineering. Finally, opinions and perspectives on the trends and prospects of this relatively new research domain are presented to support future developments.
Article
Purpose The purpose of this paper is to modify the crow search algorithm (CSA) to enhance both exploration and exploitation capability by including two novel approaches. The positions of the crows are updated in two ways based on the awareness probability (AP). With AP, the position of a crow is updated by considering its velocity, calculated in a fashion similar to particle swarm optimization (PSO), to enhance the exploitation capability. Without AP, the crows are subdivided into groups according to their weights, and the crows are updated by considering the leaders of the groups distributed over the search space, to enhance the exploration capability. The performance of the proposed PSO-based group-oriented CSA (PGCSA) is assessed by solving benchmark equations, and the algorithm is further validated against recently published algorithms on engineering problems. Design/methodology/approach In this paper, two novel approaches are implemented in the two phases of CSA (with and without AP), which together constitute the PGCSA algorithm for solving engineering benchmark problems. Findings The proposed algorithm is applied to two types of problems: eight unconstrained benchmark equations and six engineering problems. Originality/value The PGCSA algorithm demonstrates superior competence in solving engineering problems and is substantiated through hypothesis testing with a paired t-test.
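A minimal sketch of the with-AP move described in this abstract, written as a PSO-style velocity update; the coefficient values and the function shape are illustrative assumptions rather than the published PGCSA equations.

```python
import numpy as np

def crow_update_with_ap(x, v, pbest, gbest, w=0.7, c1=1.5, c2=1.5):
    """PSO-style velocity move applied, per the abstract, when a crow
    acts with awareness probability: pull toward personal and global
    bests, damped by an inertia weight w."""
    r1, r2 = np.random.rand(*x.shape), np.random.rand(*x.shape)
    v_new = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    return x + v_new, v_new
```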
Chapter
Bat algorithm (BA) is an innovative population-based technique belonging to the swarm intelligence group. This meta-heuristic provides a more suitable solution technique than numerous prevalent classical and heuristic techniques. This chapter is an exposition of the communication and navigational patterns of bats and micro-bat echolocation (EL), the development of the algorithm, and a solved numerical problem. The illustration and implementation of the BA for a typical optimization problem based on a numerical equation is demonstrated using MATLAB code. The model considers 10 bats and a maximum of 1000 iterations. The best solution obtained by BA is [1.3138 1.8528 0.261 0.83905 0.34859 1.1436 0.73859 1.7623 0.16537 0.28717], and the best optimal value of the objective function found is 0.23604. BA is considered useful in engineering, business, transportation, and other fields of human endeavour.
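The chapter's MATLAB demonstration follows Yang's standard BA update equations; a compact Python transcription of that core step (frequencies, velocities, positions) might look like the sketch below, with the frequency range as an illustrative default.

```python
import numpy as np

def bat_step(x, v, x_best, f_min=0.0, f_max=2.0):
    """One standard BA move (Yang's formulation): a random frequency
    pulls each bat toward the current best solution x_best."""
    beta = np.random.rand(x.shape[0], 1)          # one draw per bat
    freq = f_min + (f_max - f_min) * beta         # frequency f_i
    v_new = v + (x - x_best) * freq               # velocity update
    return x + v_new, v_new                       # position update
```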