Black Hole Algorithm and Its Applications
Santosh Kumar, Deepanwita Datta and Sanjay Kumar Singh
Abstract Bio-inspired computation is a field of study that connects together numerous subfields of connectionism (neural networks), social behavior and emergence within artificial intelligence and machine learning, providing algorithms for complex problem optimization. Bio-inspired computation is motivated by nature and, over the last few years, it has encouraged numerous advanced algorithms and sets of computational tools for dealing with complex combinatorial optimization problems. Black Hole is a new bio-inspired metaheuristic approach based on the observable behavior of the black hole phenomenon. It is a population-based algorithmic approach like genetic algorithms (GAs), the ant colony optimization (ACO) algorithm, particle swarm optimization (PSO), firefly and other bio-inspired computation algorithms. The objective of this book chapter is to provide a comprehensive study of the black hole approach and its applications in different research fields such as data clustering, image processing, data mining, computer vision, science and engineering. This chapter provides a stepping stone for future researchers to unveil how metaheuristic and bio-inspired computing algorithms can improve the solutions of hard or complex optimization problems.
Keywords: Metaheuristic · Black hole · Swarm intelligence · K-means · Clustering
S. Kumar (✉) · D. Datta · S.K. Singh
Department of Computer Science and Engineering, Indian Institute of Technology
(Banaras Hindu University), Varanasi, India
e-mail: santosh.rs.cse12@iitbhu.ac.in
D. Datta
e-mail: welcomedeepanwita@gmail.com
S.K. Singh
e-mail: sks.cse@iitbhu.ac.in
© Springer International Publishing Switzerland 2015
A.T. Azar and S. Vaidyanathan (eds.), Computational Intelligence Applications
in Modeling and Control, Studies in Computational Intelligence 575,
DOI 10.1007/978-3-319-11017-2_7
1 Introduction
Bio-inspired computation and swarm intelligence based algorithms have attracted significant attention in recent years for solving complex and combinatorial optimization problems in data clustering, feature selection, maximization of matching scores for human authentication in biometrics [1], computer vision, data mining and machine learning. Motivated by natural and social behavioral phenomena, bio-inspired computation algorithms have become a significant research area during recent years, from both the multidisciplinary and the scientific research perspective. In the last 30 years, great interest has been devoted to bio-inspired metaheuristics, which have encouraged and provided successful algorithms and computational simulation tools for dealing with complex optimization problems (ISA Trans [2]). Several of these approaches are motivated by natural processes and generally start with an initial set of variables and then evolve to obtain the global minimum or maximum of the objective function; there has been an escalating interest in algorithms motivated by the behavior of natural phenomena, which many scientists and researchers incorporate to solve hard optimization problems. Hard problems cannot be solved to optimality, or to any guaranteed bound, by any exact (deterministic) method within a reasonable time limit [3-10]. These are computational problems such as optimization of objective functions [11, 12], pattern recognition [1, 13], control objectives [2, 4, 14], image processing [15, 16] and filter modeling [17, 18]. Different heuristic approaches have been implemented by researchers so far. For example, the genetic algorithm [10] is the most well-known and most widely used evolutionary computation technique; it was developed in the early 1970s at the University of Michigan by John Holland and his students, whose research interests were devoted to the study of adaptive systems [19]. The basic genetic algorithm is very general and it can be implemented differently according to the problem: representation of solutions (chromosomes), selection strategy, type of crossover (the recombination operator) and mutation operators. The fixed-length binary string is the most common representation of the chromosomes applied in GAs, and simple bit-manipulation operations allow the implementation of crossover and mutation. Emphasis is mainly concentrated on crossover as the main variation operator, which combines multiple (generally two) individuals that have been selected together by exchanging some of their parts. An exogenous parameter p_c ∈ [0.6, 1.0] (the crossover rate) indicates the probability per individual of undergoing crossover. After evaluating the fitness value of each individual in the selection pool, individuals for producing offspring are selected using a selection strategy. A few of the popular selection schemes are roulette-wheel selection, tournament selection and ranking selection. After the crossover operation, individuals are subjected to the mutation process. Mutation introduces some randomness into the search to prevent the optimization process from getting trapped in local optima. Typically, the mutation rate is applied with less than 1 % probability, but the appropriate value of the mutation rate for a given optimization problem is an open research issue.
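The crossover and mutation operators described above can be sketched as follows. This is a minimal illustration on fixed-length binary strings, not the chapter's own implementation; the parameter values p_c and p_m are the typical ranges mentioned in the text.

```python
import random

def crossover(parent_a, parent_b, pc=0.8):
    """One-point crossover on fixed-length binary strings.

    With probability pc (the crossover rate, typically in [0.6, 1.0]),
    the parents exchange their tails after a random cut point.
    """
    if random.random() < pc:
        point = random.randint(1, len(parent_a) - 1)
        return (parent_a[:point] + parent_b[point:],
                parent_b[:point] + parent_a[point:])
    return parent_a[:], parent_b[:]

def mutate(chromosome, pm=0.01):
    """Bit-flip mutation: each bit flips with a small probability pm."""
    return [1 - bit if random.random() < pm else bit for bit in chromosome]

a = [0] * 8
b = [1] * 8
child_a, child_b = crossover(a, b, pc=1.0)
# With pc=1.0 crossover always occurs, so each child mixes genes from
# both parents while keeping the original chromosome length.
```

A full GA would wrap these operators in a loop of selection, crossover, mutation and replacement; only the two variation operators are shown here.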
Simulated annealing [9] is inspired by the annealing technique used by metallurgists to obtain a "well ordered" solid state of minimal energy (while avoiding the "metastable" structures characteristic of local energy minima). The ant colony optimization (ACO) algorithm is a metaheuristic technique for solving problems that is motivated by the social behavior of ants in finding shortest paths: real ants walk randomly until they find food and return to their nest while depositing pheromone on the ground in order to mark their preferred path and attract other ants to follow it [6, 20, 21]. Particle swarm optimization (PSO) was introduced by James Kennedy and Russell Eberhart as a global optimization technique in 1995. It uses the metaphor of the flocking behavior of birds to solve optimization problems [22]. The firefly algorithm is a population-based metaheuristic algorithm. It has become an increasingly popular tool of swarm intelligence that has been applied in almost all research areas of optimization, as well as in science and engineering practice. Fireflies have their flashing light, which serves two fundamental functions: (1) to attract mating partners and (2) to warn potential predators. The flashing lights also comply with physical rules: the light intensity I of a source decreases as the distance r increases according to I ∝ 1/r². This phenomenon inspired [23] to develop the firefly algorithm [23-25]. The bat-inspired algorithm is a metaheuristic optimization algorithm invented by Yang et al. [26-28]; it is based on the echolocation behavior of microbats with varying pulse rates of emission and loudness. There is also the honey bee algorithm [29], among others. Such algorithms are progressively analyzed, deployed and improved by different researchers in many different research fields [3, 5, 27, 28, 30-32]. These algorithms are used to solve different optimization problems, but there is no single algorithm that achieves the best solution for all optimization problems. Numerous algorithms give a better solution for some particular problems than others. Hence, searching for new heuristic optimization algorithms is an open problem [29], and it requires much exploration of new metaheuristic algorithms for solving hard problems.
Recently, a metaheuristic approach known as the black hole heuristic approach has been developed for solving hard or complex optimization and data clustering problems, the latter being NP-hard. The black hole heuristic algorithm is inspired by the black hole phenomenon, and the black hole algorithm (BH) starts with an initial population of candidate solutions to an optimization problem and an objective function that is calculated for them, similar to other population-based algorithms.
2 Heuristic and Metaheuristic Algorithms
2.1 Heuristic Algorithms
The word "heuristic" is Greek and means "to know", "to find", "to discover" or "to guide an investigation" by a trial-and-error methodology [33]. Specifically, heuristics are techniques which search for near-optimal solutions of a problem at a reasonable computational cost, without being able to guarantee either feasibility or optimality, or even, in many cases, to state how close to optimality a particular feasible solution is [34]. The main algorithmic characteristic of heuristics is that they mimic physical or biological processes motivated by natural phenomena. Quality solutions to complex optimization problems can be found in a reasonable amount of time; however, there is no guarantee that optimal solutions are reached. A heuristic is a technique designed for solving a problem more quickly when classic methods are too slow, or for finding an approximate solution when classic methods or deterministic approaches fail to provide any exact solution of a hard or complex problem. This is achieved by trading optimality, completeness, accuracy or precision for speed. Heuristic search exploits additional knowledge about the problem that helps direct the search to more promising paths [23].
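As a concrete illustration of trading optimality for speed, the following sketch applies the classic nearest-neighbour heuristic to a small travelling-salesman instance. The city coordinates are invented for the example; the chapter does not discuss this particular heuristic.

```python
import math

def nearest_neighbour_tour(points, start=0):
    """Greedy TSP heuristic: always visit the closest unvisited city.

    Runs in O(n^2) time and returns a feasible tour quickly, but with
    no guarantee of optimality -- the defining trade-off of a heuristic.
    """
    unvisited = set(range(len(points))) - {start}
    tour = [start]
    while unvisited:
        last = points[tour[-1]]
        nxt = min(unvisited, key=lambda i: math.dist(last, points[i]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

cities = [(0, 0), (0, 1), (1, 1), (1, 0)]
tour = nearest_neighbour_tour(cities)
```

The heuristic exploits problem knowledge (geometric distance) to steer the search, exactly as described above, yet on adversarial instances its tour can be far longer than the optimum.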
2.2 Metaheuristic Algorithms
Metaheuristic algorithms are the master strategies that modify and update the solutions produced by other heuristics, which are normally generated in the quest for local optima. These nature-inspired algorithms are becoming popular and powerful in solving optimization problems. The prefix "meta" is Greek and means "upper level", "beyond" or "higher level" methodology, and metaheuristics generally perform better than simple heuristic approaches. Metaheuristic algorithms are a conceptual set of heuristic approaches used to find the optimal solution of a combinatorial optimization problem. The term "metaheuristic" was introduced by F. Glover. In addition, metaheuristic algorithms use a certain trade-off between randomization and local search for finding optimal and near-optimal solutions. Local search is a general methodology for finding high-quality solutions to complex or hard combinatorial optimization problems in a reasonable amount of time. It is basically an iterative search approach that explores neighborhoods of solutions, trying to enhance the current solution by local changes [23, 35].
2.2.1 Characteristics of Metaheuristics
Each metaheuristic search process depends on balancing two major components which are involved throughout the search for the optimal solution. These two main components are known as diversification (exploration) and intensification (exploitation) [35].
2.2.2 Diversication or Exploration
The diversification phase ensures that the algorithm explores the search space efficiently and thoroughly, and helps to generate diverse solutions. On the other hand, too much diversification increases the probability of finding the true global optimum, but it often slows the exploration process, with a much lower rate of convergence.
2.2.3 Intensication or Exploitation
Intensification uses local information in the search process to generate better solutions. Too much intensification may lead to rapid convergence, often to a local optimum or a wrong solution for the problem, and reduces the probability of finding the global optimum of a complex problem.
Therefore, fine tuning and a fine balance and trade-off between the intensification and diversification characteristics of a metaheuristic approach are required. Together, these two components determine the quality of the solutions to complex combinatorial optimization problems: intensification around the best solutions ensures that the search will converge to the optimum, while diversification via randomization avoids the solutions being trapped or stuck at local minima and, at the same time, increases the diversity of the solutions. Solving hard optimization problems with multiple, often conflicting, objectives is in general a very difficult task. A good combination of these two major components (diversification and intensification) will usually ensure that the global optimum of a given hard or complex problem is achievable, and it provides a way to solve large population-based problem instances by delivering an efficient solution in a reasonable amount of time.
In short, metaheuristics are high-level strategies for exploring search spaces using different algorithmic approaches. Of great importance hereby is that a dynamic balance is kept between diversification and intensification. The term diversification generally refers to the exploration of the search space, whereas the term intensification refers to the exploitation of the accumulated search experience [36].
3 Classification of Bio-inspired Metaheuristic Algorithms
Bio-inspired metaheuristic algorithms can be classified as population-based, trajectory-based, swarm intelligence based, artificial intelligence based, and bio-insect behavior based approaches. Some of the most famous algorithms are genetic algorithms (GAs), simulated annealing (SA) [9], artificial immune systems (AIS) [7], ant colony optimization (ACO) [6], particle swarm optimization (PSO) [22] and the bacterial foraging algorithm (BFA) [37]. Genetic algorithms are inspired by Darwinian evolutionary theory [10]. The simulated annealing method mimics the thermodynamic process of cooling molten metal to reach the minimum free-energy state; it works with a single point and at each iteration builds a new point according to the Boltzmann probability distribution [9, 38]. Artificial immune systems (AIS) simulate biological immune systems [7]. The field of ant colony optimization (ACO) studies models derived from the observation of real ants' behavior and uses these models as a source of inspiration for the design of innovative and novel approaches to the solution of optimization and distributed control problems; the main idea is that the self-organizing principles which allow the highly coordinated behavior of ants can be exploited to coordinate transport. The bacterial foraging algorithm (BFA) comes from the search and optimal foraging of bacteria, and particle swarm optimization (PSO) simulates the behavior of flocks of birds and fish schooling, which search for the best solution in both the local and the global search space [5, 22, 38]. Based on their bio-inspired characteristics, various algorithms are described below (Table 1).
Unlike exact algorithms (which are guaranteed to find the optimal solution and to prove its optimality for every finite-size instance of a combinatorial optimization problem, within an instance-dependent run time), metaheuristic algorithms aim to find a near-optimal solution of a given hard problem within a reasonable amount of time. The applications of metaheuristics fall into a large number of areas; some of them are as follows:
• Engineering design, topological optimization, structural optimization in electronics and VLSI design, aerodynamics-based structural design.
• Fluid dynamics, the telecommunication field, automotive and robotics design, and robotic roadmap-planning optimization.
• Data mining and machine learning: data mining in bioinformatics and computational biology.
• System modeling, simulation and identification in chemistry, physics and biology.
• Image processing and control signal processing: feature extraction from data and feature selection with the help of metaheuristic approaches.
• Planning in routing-based problems, robotic planning, scheduling and production problems, logistics and transportation, supply chain management, and the environment.
4 Black Hole Phenomena
In the eighteenth century, John Michell and Pierre-Simon de Laplace first put forward the idea of black holes: based on Newton's laws, they conceived of a star that turns invisible to the human eye, although during that period it was not recognized as a black hole. In 1967, the American physicist John Wheeler first named the phenomenon of mass collapsing a black hole [42]. A black hole forms in space when a star of massive size collapses. The gravitational power of the black hole is so strong that even light cannot escape from it. The gravity of such a body is so
Table 1 Description of bio-inspired algorithms
1. Genetic algorithms (GAs) [10]: Genetic algorithms are search and optimization techniques that evolve a population of candidate solutions to a given problem, using natural genetic variation and natural selection operators.
2. Simulated annealing (SA) algorithm [9]: Simulated annealing was developed by modeling the steel annealing process; it gradually decreases the temperature (T).
3. Ant colony optimization (ACO) [6]: Ant colony optimization is inspired by the behavior of a real ant colony, which is able to find the shortest path between its nest and a food source (destination).
4. Particle swarm optimization (PSO) algorithm [22]: Particle swarm optimization is developed based on swarm behavior, such as that of fish and bird schooling in nature.
5. The gravitational search algorithm (GSA) [39]: It is constructed based on the law of gravity and the notion of mass interactions. In the GSA algorithm, the searcher agents are a collection of masses that interact with each other based on Newtonian gravity and the laws of motion.
6. Intelligent water drops algorithm [40]: It is inspired by observing natural water drops that flow in rivers and how natural rivers find almost optimal paths to their destination. In the IWD algorithm, several artificial water drops cooperate to change their environment in such a way that the optimal path is revealed as the one with the lowest soil on its links.
7. Firefly algorithm (FA) [23, 41]: The firefly algorithm (FA) was inspired by the flashing behavior of fireflies in nature. FA is a nature-inspired optimization algorithm that imitates the flash pattern and characteristics of fireflies. It is used in data analysis and to identify homogeneous groups of objects based on the values of their attributes.
8. Honey bee mating optimization (HBMO) algorithm [29]: It is inspired by the process of marriage in real honey bees.
9. Bat algorithm (BA): It is inspired by the echolocation behavior of bats. The capability of echolocation in bats is fascinating, as they can find their prey and recognize different types of insects even in complete darkness.
10. Harmony search optimization algorithm: It is inspired by the improvising process of composing a piece of music. The act of finding harmony in music is similar to finding the optimal solution in an optimization process.
11. Big Bang–Big Crunch (BB–BC) optimization: It is based on one of the theories of the evolution of the universe. It is composed of the big bang and big crunch phases. In the big bang phase the candidate solutions are spread at random in the search space, and in the big crunch phase a contraction procedure calculates a center of mass for the population.
(continued)
strong because matter has been squeezed into a tiny space, and anything that crosses the boundary of the black hole will be consumed by it and vanish; nothing can get away from its enormous power. The sphere-shaped boundary of a black hole in space is known as the event horizon. The radius of the event horizon is termed the Schwarzschild radius. At this radius, the escape speed is equal to the speed of light, and once light passes through, even it cannot escape. Nothing can escape from within the event horizon because nothing can go faster than light. The Schwarzschild radius (R) is calculated by R = 2GM/c², where G is the gravitational constant (6.67 × 10⁻¹¹ N (m/kg)²), M is the mass of the black hole, and c is the speed of light. If a star moves close to the event horizon or crosses the Schwarzschild radius, it will be absorbed into the black hole and permanently disappear. The existence of black holes can be discerned by their effect on the objects surrounding them [43, 44].
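The formula can be checked numerically. As a hedged illustration, plugging in the mass of the Sun (about 1.989 × 10³⁰ kg, a textbook value not given in the chapter) yields a radius of roughly 3 km:

```python
G = 6.674e-11   # gravitational constant, N m^2 / kg^2
c = 2.998e8     # speed of light, m/s

def schwarzschild_radius(mass_kg):
    """Radius R = 2GM / c^2 at which the escape speed equals c."""
    return 2 * G * mass_kg / c**2

r_sun = schwarzschild_radius(1.989e30)
# ~2.95e3 m: the Sun, if compressed inside this radius, would be a black hole.
```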
4.1 Black Hole
A black hole is a region of space-time (x, y, t) whose gravitational field is so strong and powerful that nothing can escape from it. The theory of general relativity predicts that a sufficiently compact mass will deform space-time to form a black hole. Around a black hole there is a mathematically defined surface, called the event horizon, that marks the point of no return. If anything moves close to the event horizon or crosses the Schwarzschild radius, it will be absorbed into the black hole and permanently disappear. The existence of black holes can be discerned by their effect on the objects surrounding them [45]. The hole is called black because it absorbs all the light that hits the horizon, reflecting nothing, just like a perfect black body in thermodynamics [46, 47]. A black hole has only three independent physical properties: mass (M), charge (Q) and angular momentum (J). A charged black hole repels other like charges, just like any other charged object in space. The simplest black holes have mass but neither electric charge nor angular momentum [48, 49].
Table 1 (continued)
12. Black hole (BH) algorithm: It is inspired by the black hole phenomenon. The basic idea of a black hole is simply a region of space that has so much mass concentrated in it that there is no way for a nearby object to escape its gravitational pull. Anything falling into a black hole, including light, is forever gone from our universe.
4.2 Black Hole Algorithm
The basic idea of a black hole is simply a region of space that has so much mass
concentrated in it that there is no way for a nearby object to escape its gravitational
pull. Anything falling into a black hole, including light, is forever gone from our
universe.
4.2.1 Terminology of Black Hole Algorithm
Black hole: In the black hole algorithm, the best candidate among all the candidates at each iteration is selected as the black hole.
Stars: All the other candidates form the normal stars. The creation of the black hole is not random; it is one of the real candidates of the population.
Movement: All the candidates are moved towards the black hole based on their current location and a random number.
1. The black hole algorithm (BH) starts with an initial population of candidate solutions to an optimization problem and an objective function that is calculated for them.
2. At each iteration of the black hole algorithm, the best candidate is selected to be the black hole and the rest form the normal stars. After the initialization process, the black hole starts pulling the stars around it.
3. If a star gets too close to the black hole, it will be swallowed by the black hole and is gone forever. In such a case, a new star (candidate solution) is randomly generated, placed in the search space, and starts a new search.
4.3 Calculation of Fitness Value for Black Hole Algorithm
1. Initial population: P(t) = {x₁ᵗ, x₂ᵗ, x₃ᵗ, ..., xₙᵗ}, a randomly generated population of candidate solutions (the stars) placed in the search space of some problem or function.
2. Find the total fitness of the population:

f_i = Σ_{i=1}^{pop_size} eval(p(t))  (1)

3. f_BH = Σ_{i=1}^{pop_size} eval(p(t))  (2)

where f_i and f_BH are the fitness values of the ith star in the initialized population and of the black hole. The population is evaluated and the best candidate among the stars, i.e. the one with the best fitness value f_i, is selected to be the black hole, and the remaining candidates form the normal stars. The black hole has the capability to absorb the stars that surround it. After initializing the first black hole and the stars, the black hole starts absorbing the stars around it and all the stars start moving towards it.
4.3.1 Absorption Rate of Stars by Black Hole
The black hole starts absorbing the stars around it, and all the stars start moving towards the black hole. The absorption of stars by the black hole is formulated as follows:

X_i(t + 1) = X_i(t) + rand × (X_BH − X_i(t)),  i = 1, 2, 3, ..., N  (3)

where X_i(t) and X_i(t + 1) are the locations of the ith star at iterations t and (t + 1) respectively, X_BH is the location of the black hole in the search space, rand is a random number in the interval [0, 1], and N is the number of stars (candidate solutions). While moving towards the black hole, a star may reach a location with a lower cost than the black hole. In such a case, the black hole moves to the location of that star and vice versa. The black hole algorithm then continues with the black hole in the new location, and the stars start moving towards this new location.
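In vectorized form, the star-movement update of Eq. (3) can be sketched as follows. This is a minimal NumPy illustration; the population size, dimension, bounds, and the choice of one random number per star are assumptions made for the example, not prescriptions from the chapter.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

n_stars, dim = 10, 2
stars = rng.uniform(-5, 5, size=(n_stars, dim))   # candidate locations X_i(t)
black_hole = np.zeros(dim)                         # best candidate X_BH (origin here)

# Eq. (3): each star takes a random step towards the black hole.
rand = rng.random((n_stars, 1))                    # one rand in [0, 1] per star
stars_next = stars + rand * (black_hole - stars)
```

Because rand lies in [0, 1], every star lands somewhere on the segment between its current location and the black hole, so the population contracts towards the best candidate.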
4.3.2 Probability of Crossing the Event Horizon During Star Movement
In the black hole algorithm, the probability of crossing the event horizon of the black hole while stars move towards it is used to gather more optimal data points from the search space of the problem. Every star (candidate solution) that crosses the event horizon of the black hole will be sucked in by the black hole. Every time a candidate (star) dies, i.e. is sucked in by the black hole, another candidate solution (star) is generated and distributed randomly over the search space of the defined problem and begins a new search in the solution space. This is done to keep the number of candidate solutions constant. The next iteration takes place after all the stars have been moved. The radius of the event horizon (R) of the black hole is calculated using the following equation:

R = f_BH / Σ_{i=1}^{N} f_i  (4)

where f_i and f_BH are the fitness values of the ith star and the black hole, and N is the number of stars (candidate solutions). When the distance between a candidate solution and the black hole (best candidate) is less than R, that candidate is collapsed, and a new candidate is created and distributed randomly in the search space.
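The replacement rule of Eq. (4) can be sketched as follows. Euclidean distance in the search space and uniform re-initialization within box bounds are assumptions consistent with common descriptions of the algorithm; the function and variable names are illustrative only.

```python
import numpy as np

def replace_collapsed_stars(stars, fitness, bh_index, bounds, rng):
    """Re-generate every star that falls inside the event horizon.

    R = f_BH / sum(f_i); a star closer to the black hole than R is
    'collapsed' and replaced by a fresh random candidate, which keeps
    the population size constant.
    """
    low, high = bounds
    radius = fitness[bh_index] / float(np.sum(fitness))
    dist = np.linalg.norm(stars - stars[bh_index], axis=1)
    for i in range(len(stars)):
        if i != bh_index and dist[i] < radius:
            stars[i] = rng.uniform(low, high, size=stars.shape[1])
    return stars

rng = np.random.default_rng(1)
stars = rng.uniform(-5, 5, size=(6, 2))
fitness = rng.random(6)
# Assuming a minimization problem, the lowest-cost star plays the black hole.
stars = replace_collapsed_stars(stars, fitness, int(np.argmin(fitness)), (-5, 5), rng)
```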
4.4 Pseudo Code for Black Hole Algorithm
1. Initialize a population of stars with random locations in the search space, P(t) = {x₁ᵗ, x₂ᵗ, x₃ᵗ, ..., xₙᵗ}: a randomly generated population of candidate solutions (the stars) is placed in the search space of some problem or function.
Loop
2. For each ith star, evaluate the objective function:

f_i = Σ_{i=1}^{pop_size} eval(p(t)),  f_BH = Σ_{i=1}^{pop_size} eval(p(t))

3. Select the star that has the best fitness value as the black hole.
4. Change the location of each star according to Eq. (3):

X_i(t + 1) = X_i(t) + rand × (X_BH − X_i(t))

5. If a star reaches a location with a lower cost than the black hole, exchange their locations.
6. If a star crosses the event horizon of the black hole:
7. Calculate the event horizon radius (R):

R = f_BH / Σ_{i=1}^{N} f_i

8. When the distance between a candidate solution and the black hole (best candidate) is less than R, that candidate is collapsed and a new candidate is created and distributed randomly in the search space.
9. Replace it with a new star at a random location in the search space.
10. else break
11. If a termination criterion (a maximum number of iterations or a sufficiently good fitness) is met, exit the loop.
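Putting the steps above together, a minimal end-to-end sketch might look as follows. It minimizes the sphere function Σx², a standard benchmark not discussed in the chapter; the box bounds, population size, and iteration budget are arbitrary choices for the example, and the Euclidean event-horizon test with uniform re-initialization follows the pseudo code.

```python
import numpy as np

def black_hole_optimize(objective, dim, n_stars=20, low=-5.0, high=5.0,
                        max_iter=200, seed=0):
    """Minimize `objective` with a sketch of the black hole algorithm."""
    rng = np.random.default_rng(seed)
    stars = rng.uniform(low, high, size=(n_stars, dim))        # Step 1
    cost = np.array([objective(s) for s in stars])             # Step 2
    for _ in range(max_iter):
        bh = int(np.argmin(cost))                              # Step 3: best star
        # Step 4: every star takes a random step towards the black hole (Eq. 3);
        # the black hole itself stays put because its step has zero length.
        rand = rng.random((n_stars, 1))
        stars = stars + rand * (stars[bh] - stars)
        cost = np.array([objective(s) for s in stars])
        # Step 5: the best star after moving becomes the new black hole.
        bh = int(np.argmin(cost))
        # Steps 6-9: stars inside the event horizon (Eq. 4) are collapsed
        # and replaced by fresh random stars, keeping the population size.
        radius = cost[bh] / float(np.sum(cost))
        dist = np.linalg.norm(stars - stars[bh], axis=1)
        for i in range(n_stars):
            if i != bh and dist[i] < radius:
                stars[i] = rng.uniform(low, high, size=dim)
                cost[i] = objective(stars[i])
    best = int(np.argmin(cost))
    return stars[best], cost[best]

# Sphere function: a convex benchmark whose minimum is 0 at the origin.
sphere = lambda x: float(np.sum(x ** 2))
best_x, best_f = black_hole_optimize(sphere, dim=2)
```

Since the black hole never moves and only loses its role to a strictly better star, the best cost is non-increasing across iterations; the random restarts inside the event horizon supply the diversification discussed in Sect. 2.2.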
When applying the black hole algorithm to data clustering, a candidate solution to the clustering problem corresponds to a one-dimensional (1-D) array. Every candidate solution is considered as k initial cluster centers, with the individual units in the array being the cluster center dimensions. Figure 1 illustrates a candidate solution for a problem with three clusters, where all the data objects have four features.
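The encoding described above can be sketched as follows, with three clusters and four features per object as in the example; the placeholder centre values and the nearest-centre assignment rule are illustrative assumptions.

```python
import numpy as np

k, d = 3, 4                                  # three clusters, four features
# One candidate solution (star) is a flat 1-D array of k * d values ...
candidate = np.arange(k * d, dtype=float)    # placeholder values for illustration

# ... interpreted as k initial cluster centres of dimension d.
centres = candidate.reshape(k, d)

def assign(objects, centres):
    """Assign each object to its nearest cluster centre (Euclidean)."""
    dists = np.linalg.norm(objects[:, None, :] - centres[None, :, :], axis=2)
    return np.argmin(dists, axis=1)

objects = np.array([[0.0, 1.0, 2.0, 3.0],
                    [8.0, 9.0, 10.0, 11.0]])
labels = assign(objects, centres)
# The two objects coincide with the first and third centres, so they are
# assigned to clusters 0 and 2 respectively.
```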
4.5 Advantages of the Black Hole Algorithm
• It has a simple structure and is easy to implement.
• It is free from parameter-tuning issues. The genetic algorithm, by contrast, relies on local search over schemata (S) of high order o(S) (compactness) and long defining length δ(S); improving the fine-tuning capability of a genetic algorithm, which is a must for high-precision problems, requires a new mutation operator over the traditional one for binary-string chromosomes, and such an operator uses only local knowledge, so it can get stuck at a local optimum.
• In the reported experiments, the black hole algorithm converged to the global optimum in all runs, while other heuristic algorithms, such as the genetic algorithm, ant colony optimization and simulated annealing, may get trapped in local optima.
5 Applications of the Black Hole Metaheuristic Algorithm
Nature-inspired metaheuristic algorithms have been used in many fields such as computer science [51-53], clustering analysis, industry [54], agriculture [55] and computer vision [56-58], which is about computing visual properties of the real world. Automatic circle detection in digital still images has been considered an important and complex task for the computer vision community, which has devoted a tremendous amount of research to seeking an optimal circle detector; the electromagnetism optimization (EMO) bio-inspired algorithm provides a circle detector method which treats the overall detection process as an optimization problem. Further application areas include forecasting [59], medicine and biology [60], scheduling [61], data mining [62, 63], economics [64] and engineering [65]. The following are applications of the black hole algorithm in data clustering, together with its performance analysis.
Fig. 1 Learning problems: dots correspond to points without any labels. Points with labels are denoted by plus signs, asterisks, and crosses. In (c), the must-link and cannot-link constraints are denoted by solid and dashed lines, respectively [50]. (a) Supervised. (b) Partially labelled. (c) Partially constrained. (d) Unsupervised
5.1 Cluster Analysis
Clustering is an important unsupervised classification approach, in which a set of patterns, usually vectors in a multi-dimensional space (observations, data items, or feature vectors), are grouped into clusters based on some similarity metric between data objects; a distance measurement is used to find the similarity and dissimilarity of the different objects in the database [66]. The main idea is to classify a given data set into a certain number of clusters by minimizing the distances between objects inside each cluster. Cluster analysis is the organization of a collection of patterns (usually represented as vectors of measurements, or points in a multidimensional space) into clusters based on similarity [50, 67]. Clustering is often used for different types of applications in image processing, statistical data analysis, medical image analysis, and other fields of science and engineering research. In addition, it is a main task of exploratory data mining and a common technique for statistical data analysis used in many fields, including machine learning, pattern recognition, image analysis, information retrieval and bioinformatics.
5.1.1 The Data Analysis Problem Is Specified as Follows
Given N objects, assign each object to one of K clusters so as to minimize the sum
of squared Euclidean distances between each object and the center of the cluster to
which it is allocated:

F(O, Z) = \sum_{i=1}^{N} \sum_{j=1}^{K} W_{ij} \, \lVert O_i - Z_j \rVert^2

where \lVert O_i - Z_j \rVert^2 is the squared Euclidean distance between a data
object O_i and the cluster center Z_j; N and K are the number of data objects and
the number of clusters, respectively; and W_{ij} is the association weight of data
object O_i with cluster j:

W_{ij} = \begin{cases} 1 & \text{if object } i \text{ is assigned to cluster } j, \\ 0 & \text{if object } i \text{ is not assigned to cluster } j. \end{cases}
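For a hard assignment the weights W_{ij} reduce to one cluster index per object, so the criterion can be evaluated directly. A minimal sketch (function and variable names are our own):

```python
def clustering_cost(objects, centers, assign):
    """Sum of squared Euclidean distances F(O, Z) for a hard assignment.

    assign[i] is the index j of the cluster that object i belongs to,
    i.e. W_ij = 1 exactly when assign[i] == j.
    """
    total = 0.0
    for obj, j in zip(objects, assign):
        total += sum((o - z) ** 2 for o, z in zip(obj, centers[j]))
    return total

# Two 1-D clusters with centers 0 and 10.
objs = [(0.0,), (1.0,), (9.0,), (11.0,)]
cost = clustering_cost(objs, centers=[(0.0,), (10.0,)], assign=[0, 0, 1, 1])
# cost = 0 + 1 + 1 + 1 = 3.0
```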
5.2 Data Clustering
The goal of data clustering, also known as cluster analysis, is to discover the natural
grouping of a set of patterns, points or objects. An operational definition of clustering
can be stated as follows: given a representation of n objects, find K groups
based on a measure of similarity such that the similarities between objects in the
same group are high while the similarities between objects in different groups are
low. Figure 2 demonstrates that clusters may differ in terms of their shape, size, and
density. The presence of noise in the data makes the detection of the clusters even
more difficult; an ideal cluster can be defined as a set of points that is compact and
isolated. While humans are excellent cluster seekers in two and possibly three
dimensions, we need automatic algorithms for high-dimensional data. It is this
challenge, along with the unknown number of clusters in the given data, that has
resulted in the thousands of clustering algorithms that have been published and that
continue to appear. An example of clustering is shown in Fig. 2. In pattern
recognition, data analysis is concerned with predictive modeling: given some training
data, the task is to predict the behavior of the unseen test data. This task is also
referred to as learning.
5.2.1 Classification of Machine Learning
Data can be classified by machine learning algorithms as follows.
Supervised Learning and Unsupervised Learning
Supervised learning is the machine learning approach of inferring a function from
labeled training data. The training data consist of a set of training examples. Let the
labeled training data be X = \{(x_n, y_n) \in \mathcal{X} \times \mathcal{Y} : n \le N\},
where \mathcal{X} is the input space and \mathcal{Y} a finite label set. It is
assumed that each (x_n, y_n) is drawn independently from a fixed but unknown
probability distribution p, i.e. (x_n, y_n) \sim p(x, y). Unfortunately, supervised
learning is very expensive and time consuming, since a huge amount of labeled
data (x_n, y_n) must be collected. One learning approach that deals with this issue
is to exploit unsupervised learning. The main aim is then to learn a classification
model from both the labeled data X = \{(x_n, y_n) \in \mathcal{X} \times \mathcal{Y} : n \le N\}
and the unlabelled data \{x_j\}_{j=N+1}^{N+M}, where N \ll M. Clustering is a
more difficult and challenging problem than classification.
Fig. 2 Diversity of clusters. The seven clusters in (a) [denoted by seven different colors in (b)]
differ in shape, size and density. Although these clusters are apparent to a data analyst, none of the
available clustering algorithms can detect all of them (source: [50]). a Input data. b Desired
clustering
Semi-supervised Learning
In semi-supervised classification, the labels of only a small portion of the training
data set are available. The unlabeled data, instead of being discarded, are also used
in the learning process. In semi-supervised clustering, instead of specifying the
class labels, pair-wise constraints are specified, which is a weaker way of encoding
the prior knowledge. A pair-wise must-link constraint corresponds to the requirement
that two objects should be assigned the same cluster label, whereas the cluster
labels of two objects participating in a cannot-link constraint should be different
[50, 68]. Constraints can be particularly beneficial in data clustering, where precise
definitions of the underlying clusters are absent. Figure 1 illustrates this spectrum of
different types of learning problems of interest in pattern recognition and machine
learning.
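Such pairwise constraints are straightforward to check against a candidate clustering; a minimal sketch (function and parameter names are our own):

```python
def satisfies_constraints(labels, must_link, cannot_link):
    """Check a cluster labelling against pairwise constraints.

    labels[i] is the cluster label of object i; must_link and cannot_link
    are lists of index pairs (i, j). Must-link pairs require equal labels,
    cannot-link pairs require different labels.
    """
    return (all(labels[i] == labels[j] for i, j in must_link)
            and all(labels[i] != labels[j] for i, j in cannot_link))

# Objects 0,1 share a cluster and objects 1,2 are separated: both satisfied.
ok = satisfies_constraints([0, 0, 1, 1], must_link=[(0, 1), (2, 3)],
                           cannot_link=[(1, 2)])
# ok = True
```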
5.3 K-means Clustering
Cluster analysis is prevalent in any discipline that involves the analysis of multivariate
data. The K-means clustering algorithm was first published in 1955. It is difficult to
exhaustively list the numerous scientific fields and applications that have utilized
clustering techniques, as well as the thousands of published algorithms. Image
segmentation, an important problem in computer vision, can be formulated as a
clustering problem. Documents can be clustered to generate topical hierarchies for
efficient information access. Clustering is also used to group customers into different
types for efficient marketing, to group service-delivery engagements for
workforce management and planning, and to study genome data in biology
[50]. Data clustering has been used for the following three main purposes.
Underlying structure: to gain insight into data, generate hypotheses, detect
anomalies, and identify salient features.
Natural classification: to identify the degree of similarity among forms or
organisms (phylogenetic relationship).
Compression: as a method for organizing the data and summarizing it through
cluster prototypes.
Among the classical clustering algorithms, K-means is the most well-known
algorithm due to its simplicity and efficiency.
5.3.1 Classification of Clustering Algorithms
Clustering algorithms can be broadly divided into two categories: hierarchical
clustering and partitional clustering.
Hierarchical Algorithm
Hierarchical clustering constructs a hierarchy of groups by splitting a large cluster
into smaller ones and merging smaller clusters into a larger one [69]. There are two
main approaches: (1) the divisive approach, which splits a large cluster into two or
more smaller clusters; and (2) the agglomerative approach, which builds a larger
cluster by merging two or more smaller clusters, recursively finding nested clusters
by starting with each data point in its own cluster and successively merging the
most similar pair of clusters to form a cluster hierarchy. The input to a hierarchical
algorithm is an n × n similarity matrix, where n is the number of objects to be
clustered [50].
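The agglomerative approach can be illustrated directly from a pairwise matrix. The following single-link sketch works on a distance matrix rather than a similarity matrix, merges the closest pair of clusters until k remain, and is written for clarity rather than efficiency (names are our own):

```python
def single_link_agglomerative(dist, k):
    """Agglomerative single-link clustering from an n x n distance matrix.

    Starts with every point in its own cluster and repeatedly merges the
    pair of clusters with the smallest minimum inter-cluster distance,
    until only k clusters remain. Returns a list of index sets.
    """
    clusters = [{i} for i in range(len(dist))]
    while len(clusters) > k:
        best = None
        for a in range(len(clusters)):
            for b in range(a + 1, len(clusters)):
                # Single-link: distance between closest members.
                d = min(dist[i][j] for i in clusters[a] for j in clusters[b])
                if best is None or d < best[0]:
                    best = (d, a, b)
        _, a, b = best
        clusters[a] |= clusters[b]   # merge the closest pair
        del clusters[b]
    return clusters

# Points on a line: 0, 1, 10, 11 form two natural clusters.
pts = [0.0, 1.0, 10.0, 11.0]
D = [[abs(p - q) for q in pts] for p in pts]
out = single_link_agglomerative(D, k=2)
# out = [{0, 1}, {2, 3}]
```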
Partitioned Algorithm
Partitional clustering algorithms find all the clusters simultaneously, as a partition of
the data, without a hierarchical structure. The most widely used partitional clustering
approaches are prototype-based clustering algorithms, where each cluster is
represented by its center; the objective function, or square-error function, is the sum
of the distances from the patterns to the centers [70]. A partitional algorithm can use
either an n × d pattern matrix, where n objects are embedded in a d-dimensional
feature space, or an n × n similarity matrix. Note that a similarity matrix can easily be
derived from a pattern matrix, but ordination methods such as multi-dimensional
scaling (MDS) are needed to derive a pattern matrix from a similarity matrix. The
most well-known hierarchical algorithms are single-link and complete-link; the
most popular and simplest partitional algorithm is K-means. Since partitional
algorithms are preferred in pattern recognition due to the nature of the available data,
our coverage here is focused on these algorithms [50].
5.4 K-means Algorithm
K-means has a rich and diverse history, as it was independently discovered in
different scientific fields [71], notably by Lloyd. It is a popular partitional clustering
algorithm and essentially a function-minimization technique, where the main
objective function is the square error.
Let X = \{x_i\}, i = 1, 2, \ldots, n, be the set of n d-dimensional points to be
clustered into a set of K clusters C = \{c_k\}, k = 1, 2, \ldots, K. The K-means
algorithm finds a partition such that the squared error between the empirical mean of
a cluster and the points in the cluster is minimized. Let \mu_k be the mean of cluster
c_k. The squared error between \mu_k and the points in cluster c_k is defined as

J(c_k) = \sum_{x_i \in c_k} \lVert x_i - \mu_k \rVert^2.

The goal of K-means is to minimize the sum of the squared error over all K clusters,

J(C) = \sum_{k=1}^{K} \sum_{x_i \in c_k} \lVert x_i - \mu_k \rVert^2.

Minimizing this objective function is known to be an NP-hard problem (even for
K = 2). Thus K-means, which is a greedy algorithm, can only be guaranteed to
converge to a local minimum, even though a recent study has shown that, with
large probability, K-means converges to the global optimum when the clusters are
well separated. K-means starts with an initial partition into K clusters and assigns
patterns to clusters so as to reduce the squared error. Since the squared error always
decreases with an increase in the number of clusters K (with J(C) = 0 when K = n),
it can be minimized only for a fixed number of clusters [64]. Recently, efficient
hybrid methods combining evolutionary and bio-inspired metaheuristics with
K-means have been used to overcome the local-minimum problem in clustering
[72, 73].
5.4.1 Steps of the K-Means Algorithm
1. Select an initial partition with K clusters.
2. Generate a new partition by assigning each pattern to its closest cluster center.
3. Compute the new cluster centers.
4. Repeat steps 2 and 3 until cluster membership stabilizes (Figs. 3 and 4).
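The steps above can be written out in a few lines. The following is a minimal sketch of Lloyd's iteration using plain Python lists and squared Euclidean distance, intended as an illustration rather than a production implementation (names and defaults are our own):

```python
import random

def kmeans(points, k, iters=100, seed=0):
    """Lloyd's K-means: alternate assignment and center update."""
    rng = random.Random(seed)
    # Step 1: pick k distinct data points as the initial centers.
    centers = [list(p) for p in rng.sample(points, k)]
    assign = None
    for _ in range(iters):
        # Step 2: assign each pattern to its closest cluster center.
        new_assign = [
            min(range(k),
                key=lambda j: sum((a - b) ** 2 for a, b in zip(p, centers[j])))
            for p in points
        ]
        if new_assign == assign:   # step 4: membership stabilized
            break
        assign = new_assign
        # Step 3: recompute each center as the mean of its members.
        for j in range(k):
            members = [p for p, a in zip(points, assign) if a == j]
            if members:
                centers[j] = [sum(col) / len(members) for col in zip(*members)]
    return centers, assign

# Usage: two well-separated 2-D clusters.
pts = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10)]
centers, labels = kmeans(pts, k=2)
```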
Fig. 3 Illustration of the K-means algorithm. a Two-dimensional input data with three clusters;
b three seed points selected as cluster centers and initial assignment of the data points to clusters;
c updated intermediate cluster labels
The K-means algorithm requires three user-specified parameters: the number of
clusters K, the cluster initialization, and the distance metric. The most critical choice
is K. While no perfect mathematical criterion exists, a number of heuristics are
available for choosing K. Typically, K-means is run independently for different
values of K, and the partition that appears most meaningful to the domain expert is
selected. Different initializations can lead to different final clusterings, because
K-means only converges to a local minimum. One way to overcome the local
minima is to run the K-means algorithm, for a given K, with multiple different
initial partitions and choose the partition with the smallest squared error. K-means
is typically used with the Euclidean metric for computing the distance between
points and cluster centers; as a result, K-means finds spherical or ball-shaped
clusters in data. K-means with the Mahalanobis distance metric has been used to
detect hyper-ellipsoidal clusters [74].
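The multiple-initialization remedy amounts to a best-of-n wrapper around any randomized clustering routine. A minimal sketch, where the run_once(seed) interface is our own illustration of how such a routine might be packaged:

```python
def best_of_restarts(run_once, n_restarts=10):
    """Run a randomized clustering routine once per seed and keep the
    partition with the smallest squared error.

    run_once(seed) must return a (squared_error, partition) pair; any
    randomized partitional method such as K-means fits this interface.
    """
    return min((run_once(seed) for seed in range(n_restarts)),
               key=lambda result: result[0])

# Toy stand-in for a clustering run: pretend seed 3 finds the best partition.
err, part = best_of_restarts(lambda seed: ((seed - 3) ** 2, f"partition-{seed}"))
# err = 0, part = "partition-3"
```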
5.5 Advantages and Disadvantages of the K-Means
Clustering
Among the classical clustering algorithms, the K-means clustering algorithm is the
most well-known due to its simplicity and efficiency. It suffers from two problems.
First, it needs the number of clusters before starting, i.e. the number of clusters must
be known a priori. Second, its performance strongly depends on the initial
centroids, so it may get stuck in locally optimal solutions and its convergence rate
is affected [56]. To overcome these shortcomings of K-means, many heuristic
approaches have been applied over the last two decades.
Fig. 4 a and b Intermediate iterations updating cluster labels and their centers, and the final
clustering obtained by K-means
5.6 Evolutionary Computation Algorithms for Cryptography
and Cryptanalysis
Cryptography is the methodology and study of techniques for secure communication
in the presence of third parties, and cryptanalysis is the study of analyzing
information systems in order to uncover their hidden aspects. The cryptanalysis
of different cipher problems can be formulated as NP-hard combinatorial
problems. Metaheuristic algorithms provide a very powerful tool for the cryptanalysis
of simple substitution ciphers using a ciphertext-only attack, and for the automated
cryptanalysis of classical simple substitution ciphers [75-77].
Recently, efficient hybrid evolutionary optimization algorithms combining
evolutionary methods and swarm intelligence have played a significant role in the
field of cryptography; such a scheme exhibits a dynamical system that is sensitive to
initial conditions and generates apparently random behavior while remaining
completely deterministic. Hussein et al. present a new encryption scheme based on
a new chaotic map derived from a simplified model of Swarm Intelligence (SI) [78],
addressing several problems of cryptography [79-81].
5.7 Short-Term Scheduling Based System Optimization
by the Black Hole Algorithm
A major challenge currently facing electric power system operators is how to
optimally manage the power generating units over a scheduling horizon of one day,
considering all of the practical equality, inequality and dynamic constraints. These
constraints comprise the load plus transmission-losses balance, valve-point effects,
prohibited operating zones, multi-fuel options, line flow constraints, operating
reserve, and minimum on/off times. No exact optimization method is available for
short-term thermal generation scheduling (STGS): the problem has a high-dimensional,
highly constrained, non-convex, non-smooth and non-linear nature and needs an
efficient algorithm to be solved. To this end, a new optimization approach, known
as the gradient-based modified teaching-learning-based optimization combined
with black hole (MTLBO-BH) algorithm, has been proposed to seek the optimum
operational cost [82-84].
6 Discussion
The field of metaheuristic approaches for combinatorial optimization problems is a
rapidly growing field of research, owing to the great importance of combinatorial
optimization problems for the scientific as well as the industrial world. Over the
last decade, metaheuristic approaches have played a significant role in solving
complex problems in science, computer vision, computer science, data analysis,
data clustering and mining, clustering analysis, industry, weather forecasting,
medical and biological research, economy, and various multi-disciplinary
engineering research fields. In addition, metaheuristics are useful in computer
vision, image processing, machine learning and pattern recognition, where they can
be deployed to find an optimal set of discriminant values in the form of
eigenvectors (for faces, fingerprints and other biometric characteristics) and to
incorporate these values for identification purposes. In biometrics and computer
vision, face recognition has always been a major challenge for machine learning
and pattern recognition researchers. Introducing the intelligence in machines to
identify humans from their face images (stored in a template database) requires
handling variations due to illumination conditions, pose, facial expression, scale,
disguise, etc., and hence becomes a complex task in computer vision. Face
recognition is a classification problem for human recognition, and such
classification problems can be addressed by designing Radial Basis Function
neural networks with metaheuristic approaches (such as firefly, particle swarm
optimization and the black hole algorithm). These algorithms can be used at the
match-score level in biometrics and can select the most discriminant set of optimal
features for face identification and classification.
Recently, the black hole methodology has played a major role in modeling and
simulating natural phenomena for solving complex problems. The motivation for
this new heuristic optimization algorithm is the black hole phenomenon. Further, it
has a simple structure, is easy to implement, and is free from the parameter-tuning
issues of algorithms such as the genetic algorithm. The black hole algorithm can be
applied to solve the clustering problem and can be run on different benchmark
datasets. In future research, the algorithm can also be utilized in many other areas of
application. In addition, the application of BH in combination with other
algorithms may be effective. Metaheuristics support managers in decision-making
with robust tools that provide high-quality solutions to important applications in
business, engineering, economics and science within reasonable time horizons.
7 Conclusion and Future Direction
We conclude that the new black hole algorithm is population-based, like particle
swarm optimization, firefly, the genetic algorithm, the BAT algorithm and other
evolutionary methods. It is free from parameter-tuning issues, unlike the genetic
algorithm and others, and it does not suffer from the premature convergence
problem. This implies that the black hole algorithm is potentially more powerful in
solving NP-hard problems (e.g. the data clustering problem), which is to be
investigated further in future studies. A further improvement on the convergence of
the algorithm is to vary the randomization parameter so that it decreases gradually
as the optima are approached. In wireless sensor networks (WSNs), the density of
deployment, the scale, and constraints on battery, storage, bandwidth and
computational resources create serious challenges for developers. The main issues
of node deployment, coverage and mobility are often formulated as optimization
problems, and most optimization techniques suffer from slow or weak convergence
to the optimal solutions, which motivates high-performance optimization methods
that produce high-quality solutions using minimum resources. The bio-inspired
black hole algorithm can provide a model to solve optimization problems in WSNs
due to its simplicity, good solutions, fast convergence and minimal computational
complexity. These can form important topics for further research in computer
networking. Furthermore, as a relatively straightforward extension, the black hole
algorithm can be modified to solve multi-objective optimization problems. In
addition, the application of the black hole algorithm in combination with other
algorithms may form an exciting area for further research.
References
1. Tan, X., Bhanu, B.: Fingerprint matching by genetic algorithms. Pattern Recogn. 39, 465–477 (2006)
2. Karakuzu, C.: Fuzzy controller training using particle swarm optimization for nonlinear system control. ISA Trans. 47(2), 229–239 (2008)
3. Rajabioun, R.: Cuckoo optimization algorithm. Appl. Soft Comput. 11, 5508–5518 (2011)
4. Tsai, H.C., Lin, Y.H.: Modification of the fish swarm algorithm with particle swarm optimization formulation and communication behavior. Appl. Soft Comput. 11, 5367–5374 (2011)
5. Baojiang, Z., Shiyong, L.: Ant colony optimization algorithm and its application to neuro-fuzzy controller design. J. Syst. Eng. Electron. 18, 603–610 (2007)
6. Dorigo, M., Maniezzo, V., Colorni, A.: The ant system: optimization by a colony of cooperating agents. IEEE Trans. Syst. Man Cybern. Part B 26(1), 29–41 (1996)
7. Farmer, J.D., et al.: The immune system, adaptation and machine learning. Phys. D Nonlinear Phenom. 22(1–3), 187–204 (1986)
8. Kim, D.H., Abraham, A., Cho, J.H.: A hybrid genetic algorithm and bacterial foraging approach for global optimization. Inf. Sci. 177, 3918–3937 (2007)
9. Kirkpatrick, S., Gelatt, C.D., Vecchi, M.P.: Optimization by simulated annealing. Science 220, 671–680 (1983)
10. Tang, K.S., Man, K.F., Kwong, S., He, Q.: Genetic algorithms and their applications. IEEE Signal Process. Mag. 13(6), 22–37 (1996)
11. Du, W., Li, B.: Multi-strategy ensemble particle swarm optimization for dynamic optimization. Inf. Sci. 178, 3096–3109 (2008)
12. Yao, X., Liu, Y., Lin, G.: Evolutionary programming made faster. IEEE Trans. Evol. Comput. 3, 82–102 (1999)
13. Liu, Y., Yi, Z., Wu, H., Ye, M., Chen, K.: A tabu search approach for the minimum sum-of-squares clustering problem. Inf. Sci. 178(12), 2680–2704 (2008)
14. Kim, T.H., Maruta, I., Sugie, T.: Robust PID controller tuning based on the constrained particle swarm optimization. Automatica 44(4), 1104–1110 (2008)
15. Cordón, O., Damas, S., Santamaría, J.: A fast and accurate approach for 3D image registration using the scatter search evolutionary algorithm. Pattern Recogn. Lett. 27, 1191–1200 (2006)
16. Yang, X.S.: Firefly algorithms for multimodal optimization. In: Proceedings of Stochastic Algorithms: Foundations and Applications (SAGA 2009) (2009)
17. Kalinli, A., Karaboga, N.: Artificial immune algorithm for IIR filter design. Eng. Appl. Artif. Intell. 18, 919–929 (2005)
18. Lin, Y.L., Chang, W.D., Hsieh, J.G.: A particle swarm optimization approach to nonlinear rational filter modeling. Expert Syst. Appl. 34, 1194–1199 (2008)
19. Holland, J.H.: Adaptation in Natural and Artificial Systems. University of Michigan Press, Ann Arbor (1975)
20. Jackson, D.E., Ratnieks, F.L.W.: Communication in ants. Curr. Biol. 16, R570–R574 (2006)
21. Goss, S., Aron, S., Deneubourg, J.L., Pasteels, J.M.: Self-organized shortcuts in the Argentine ant. Naturwissenschaften 76, 579–581 (1989)
22. Kennedy, J., Eberhart, R.C.: Particle swarm optimization. Proc. IEEE Int. Conf. Neural Networks 4, 1942–1948 (1995)
23. Yang, X.S.: Nature-Inspired Metaheuristic Algorithms. Luniver Press (2010)
24. Tarasewich, P., McMullen, P.R.: Swarm intelligence: power in numbers. Commun. ACM 45, 62–67 (2002)
25. Senthilnath, J., Omkar, S.N., Mani, V.: Clustering using firefly algorithm: performance study. Swarm Evol. Comput. 1(3), 164–171 (2011)
26. Yang, X.S.: Firefly algorithm. In: Engineering Optimization, pp. 221–230 (2010)
27. Yang, X.S.: Bat algorithm for multi-objective optimization. Int. J. Bio-inspired Comput. 3(5), 267–274 (2011)
28. Tripathi, P.K., Bandyopadhyay, S., Pal, S.K.: Multi-objective particle swarm optimization with time variant inertia and acceleration coefficients. Inf. Sci. 177, 5033–5049 (2007)
29. Karaboga, D.: An idea based on honey bee swarm for numerical optimization. Technical Report TR06, Erciyes University (2005)
30. Ellabib, I., Calamai, P., Basir, O.: Exchange strategies for multiple ant colony system. Inf. Sci. 177, 1248–1264 (2007)
31. Hamzaçebi, C.: Improving genetic algorithms' performance by local search for continuous function optimization. Appl. Math. Comput. 196(1), 309–317 (2008)
32. Lozano, M., Herrera, F., Cano, J.R.: Replacement strategies to preserve useful diversity in steady-state genetic algorithms. Inf. Sci. 178, 4421–4433 (2008)
33. Lazar, A.: Heuristic knowledge discovery for archaeological data using genetic algorithms and rough sets. In: Heuristic and Optimization for Knowledge Discovery, pp. 263–278. IGI Global (2014)
34. Russell, S.J., Norvig, P.: Artificial Intelligence: A Modern Approach. Prentice Hall, Upper Saddle River (2010)
35. Glover, F., Laguna, M.: Tabu Search. Kluwer Academic Publishers (1997). ISBN 079239965X
36. Blum, C., Roli, A.: Metaheuristics in combinatorial optimization: overview and conceptual comparison. ACM Comput. Surv. 35(3), 268–308 (2003)
37. Gazi, V., Passino, K.M.: Stability analysis of social foraging swarms. IEEE Trans. Syst. Man Cybern. Part B 34(1), 539–557 (2004)
38. Deb, K.: Optimization for Engineering Design: Algorithms and Examples. PHI Learning Pvt. Ltd., New Delhi (2009)
39. Rashedi, E.: Gravitational Search Algorithm. M.Sc. Thesis, Shahid Bahonar University of Kerman, Kerman (2007)
40. Shah-Hosseini, H.: The intelligent water drops algorithm: a nature-inspired swarm-based optimization algorithm. Int. J. Bio-inspired Comput. 1(1), 71–79 (2009)
41. Dos Santos, C.L., et al.: A multiobjective firefly approach using beta probability. IEEE Trans. Magn. 49(5), 2085–2088 (2013)
42. Talbi, E.G.: Metaheuristics: From Design to Implementation. Wiley, London (2009)
43. Giacconi, R., Kaper, L., van den Heuvel, E., Woudt, P.: Black hole research past and future. In: Black Holes in Binaries and Galactic Nuclei: Diagnostics, Demography and Formation, pp. 3–15. Springer, Berlin, Heidelberg (2001)
44. Pickover, C.: Black Holes: A Traveler's Guide. Wiley, London (1998)
45. Frolov, V.P., Novikov, I.D.: Phys. Rev. D 42, 1057 (1990)
46. Schutz, B.F.: Gravity from the Ground Up. Cambridge University Press, Cambridge (2003). ISBN 0-521-45506-5
47. Davies, P.C.W.: Thermodynamics of black holes. Rep. Prog. Phys. 41 (1978)
48. Heusler, M.: Stationary black holes: uniqueness and beyond. Living Rev. Relativ. 1, 6 (1998)
49. Nemati, M., Momeni, H., Bazrkar, N.: Binary black holes algorithm. Int. J. Comput. Appl. 79(6), 36–42 (2013)
50. Jain, A.K.: Data clustering: 50 years beyond K-means. Pattern Recogn. Lett. 31(8), 651–666 (2010)
51. Akay, B., Karaboga, D.: A modified artificial bee colony algorithm for real-parameter optimization. Inf. Sci. 192, 120–142 (2012)
52. El-Abd, M.: Performance assessment of foraging algorithms vs. evolutionary algorithms. Inf. Sci. 182, 243–263 (2012)
53. Ghosh, S., Das, S., Roy, S., Islam, M.S.K., Suganthan, P.N.: A differential covariance matrix adaptation evolutionary algorithm for real parameter optimization. Inf. Sci. 182, 199–219 (2012)
54. Fox, B., Xiang, W., Lee, H.: Industrial applications of the ant colony optimization algorithm. Int. J. Adv. Manuf. Technol. 31, 805–814 (2007)
55. Geem, Z., Cisty, M.: Application of the harmony search optimization in irrigation. In: Recent Advances in Harmony Search Algorithm, pp. 123–134. Springer, Berlin (2010)
56. Selim, S.Z., Ismail, M.A.: K-means-type algorithms: a generalized convergence theorem and characterization of local optimality. IEEE Trans. Pattern Anal. Mach. Intell. 6, 81–87 (1984)
57. Wang, J., Peng, H., Shi, P.: An optimal image watermarking approach based on a multi-objective genetic algorithm. Inf. Sci. 181, 5501–5514 (2011)
58. Picard, D., Revel, A., Cord, M.: An application of swarm intelligence to distributed image retrieval. Inf. Sci. 192, 71–81 (2012)
59. Chaturvedi, D.: Applications of genetic algorithms to load forecasting problem. In: Soft Computing, pp. 383–402. Springer, Berlin (2008)
60. Christmas, J., Keedwell, E., Frayling, T.M., Perry, J.R.B.: Ant colony optimization to identify genetic variant association with type 2 diabetes. Inf. Sci. 181, 1609–1622 (2011)
61. Guo, Y.W., Li, W.D., Mileham, A.R., Owen, G.W.: Applications of particle swarm optimization in integrated process planning and scheduling. Robot. Comput.-Integr. Manuf. 25(2), 280–288 (2009)
62. Rana, S., Jasola, S., Kumar, R.: A review on particle swarm optimization algorithms and their applications to data clustering. Artif. Intell. Rev. 35, 211–222 (2011)
63. Yeh, W.C.: Novel swarm optimization for mining classification rules on thyroid gland data. Inf. Sci. 197, 65–76 (2012)
64. Zhang, Y., Gong, D.W., Ding, Z.: A bare-bones multi-objective particle swarm optimization algorithm for environmental/economic dispatch. Inf. Sci. 192, 213–227 (2012)
65. Marinakis, Y., Marinaki, M., Dounias, G.: Honey bees mating optimization algorithm for the Euclidean traveling salesman problem. Inf. Sci. 181, 4684–4698 (2011)
66. Anderberg, M.R.: Cluster Analysis for Applications. Academic Press, New York (1973)
67. Hartigan, J.A.: Clustering Algorithms. Wiley, New York (1975)
68. Valizadegan, H., Jin, R., Jain, A.K.: Semi-supervised boosting for multi-class classification. In: 19th European Conference on Machine Learning (ECML), pp. 15–19 (2008)
69. Ding, C., He, X.: Cluster merging and splitting in hierarchical clustering algorithms. In: Proc. IEEE ICDM 2002, pp. 1–8 (2002)
70. Leung, Y., Zhang, J., Xu, Z.: Clustering by scale-space filtering. IEEE Trans. Pattern Anal. Mach. Intell. 22, 1396–1410 (2000)
71. Révész, P.: On a problem of Steinhaus. Acta Math. Acad. Sci. Hung. 16(3–4), 311–331 (1965)
72. Niknam, T., et al.: An efficient hybrid evolutionary optimization algorithm based on PSO and SA for clustering. J. Zhejiang Univ. Sci. A 10(4), 512–519 (2009)
73. Niknam, T., Amiri, B.: An efficient hybrid approach based on PSO, ACO and k-means for cluster analysis. Appl. Soft Comput. 10(1), 183–197 (2010)
74. Ding, C., He, X.: K-means clustering via principal component analysis. In: Proceedings of the 21st International Conference on Machine Learning, p. 29 (2004)
75. Uddin, M.F., Youssef, A.M.: Cryptanalysis of simple substitution ciphers using particle swarm optimization. In: IEEE Congress on Evolutionary Computation, pp. 677–680 (2006)
76. Clerc, M., Kennedy, J.: The particle swarm: explosion, stability, and convergence in a multidimensional complex space. IEEE Trans. Evol. Comput. 6, 58–73 (2002)
77. Danziger, M., Amaral Henriques, M.A.: Computational intelligence applied on cryptology: a brief review. IEEE Latin Am. Trans. 10(3), 1798–1810 (2012)
78. Chee, Y., Xu, D.: Chaotic encryption using discrete-time synchronous chaos. Phys. Lett. A 348(3–6), 284–292 (2006)
79. Hussein, R.M., Ahmed, H.S., El-Wahed, W.: New encryption schema based on swarm intelligence chaotic map. In: Proceedings of the 7th International Conference on Informatics and Systems (INFOS), pp. 1–7 (2010)
80. Chen, G., Mao, Y.: A symmetric image encryption scheme based on 3D chaotic cat maps. Chaos Solitons Fractals 21, 749–761 (2004)
81. Liu, H.: Chaotic dynamic characteristics in swarm intelligence. Appl. Soft Comput. 7, 1019–1026 (2007)
82. Azizipanah-Abarghooee, R., et al.: Short-term scheduling of thermal power systems using hybrid gradient based modified teaching-learning optimizer with black hole algorithm. Electr. Power Syst. Res. 108, 16–34 (2014)
83. Bard, J.F.: Short-term scheduling of thermal-electric generators using Lagrangian relaxation. Oper. Res. 36(5), 756–766 (1988)
84. Yu, I.K., Song, Y.H.: A novel short-term generation scheduling technique of thermal units using ant colony search algorithms. Int. J. Electr. Power Energy Syst. 23, 471–479 (2001)

Supplementary resource (1)

... The black hole optimization algorithm is a robust stochastic optimization technique based on an explanation of the behavior of a black hole in space (Kumar et al., 2015). The steps outlined below explain how to model the black hole anomaly's BHA: ...
... Schema for Black Holes(Kumar et al., 2015). ...
... where M is the black hole mass, G is the gravitational constant and c is the light speed. According to the Black Hole theories [18], all objects that enter into the event horizon can not escape due to the massive gravitational attraction force. (c) Illustration of white-hole, black-hole and wormhole, respectively from left to right. ...
Article
Full-text available
Control algorithms have been proposed based on knowledge related to nature-inspired mechanisms, including those based on the behavior of living beings. This paper presents a review focused on major breakthroughs carried out in the scope of applied control inspired by the gravitational attraction between bodies. A control approach focused on Artificial Potential Fields was identified, as well as four optimization metaheuristics: Gravitational Search Algorithm, Black-Hole algorithm, Multi-Verse Optimizer, and Galactic Swarm Optimization. A thorough analysis of ninety-one relevant papers was carried out to highlight their performance and to identify the gravitational and attraction foundations, as well as the universe laws supporting them. Included are their standard formulations, as well as their improved, modified, hybrid, cascade, fuzzy, chaotic and adaptive versions. Moreover, this review also deeply delves into the impact of universe-inspired algorithms on control problems of dynamic systems, providing an extensive list of control-related applications, and their inherent advantages and limitations. Strong evidence suggests that gravitation-inspired and black-hole dynamic-driven algorithms can outperform other well-known algorithms in control engineering, even though they have not been designed according to realistic astrophysical phenomena and formulated according to astrophysics laws. Even so, they support future research directions towards the development of high-sophisticated control laws inspired by Newtonian/Einsteinian physics, such that effective control-astrophysics bridges can be established and applied in a wide range of applications.
... The algorithm principle is based on the black hole phenomenon and the stars that move toward the black hole. There is a possibility that a star reaches a better location in continuous space or a better objective in discrete space [31,32]. Thus, the new star replaces the black hole. ...
Chapter
Full-text available
Black hole algorithm (BHA) is a popular metaheuristic algorithm proposed and applied for data clustering in 2013. BHA has been applied to continuous and discrete problems, and it has also been hybridized with other algorithms in the literature. The pure BHA shows better performance than others in discrete optimization, such as traveling salesman problems; however, it requires improvement with competitive heuristics. Many heuristics are often used to construct the initial tour of a salesman, such as the nearest neighbor algorithm (NN), nearest insertion algorithm (NI), cheapest insertion algorithm (CI), random insertion algorithm (RI), furthest insertion algorithm (FI), and minimal spanning tree algorithm (MST). In addition, in this study the black hole algorithm is combined with popular heuristics, such as swap (insert), reverse (2-opt swap), and swap-reverse (3-opt swap), and tested with proper parameters. In the experiments, classical datasets are used via the TSP library. The experimental results are given as best and average solutions (or deviations) and CPU time for all datasets. Moreover, the hybrid algorithms demonstrate a better performance rate in reaching optimality. Finally, the hybrid algorithms solve the discrete optimization problem in a short computing time for all datasets.
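The star update described in the excerpts above (stars drift toward the current best solution, and are re-born at random if they cross the event horizon) can be sketched in a few lines. This is a minimal illustrative sketch of the continuous BHA; the objective function, bounds and parameter values are assumptions, not taken from the cited study:

```python
import random

def black_hole_optimize(objective, dim, n_stars=20, iters=200, bounds=(-5.0, 5.0)):
    """Minimal Black Hole Algorithm sketch (minimization)."""
    lo, hi = bounds
    stars = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_stars)]
    for _ in range(iters):
        fitness = [objective(s) for s in stars]
        bh_idx = min(range(n_stars), key=lambda i: fitness[i])
        bh = stars[bh_idx][:]  # current black hole = best star so far
        # Event horizon radius: best fitness over the sum of all fitnesses.
        radius = fitness[bh_idx] / (sum(fitness) or 1e-12)
        for i in range(n_stars):
            if i == bh_idx:
                continue
            # Move each star toward the black hole by a random fraction.
            stars[i] = [x + random.random() * (b - x) for x, b in zip(stars[i], bh)]
            # A star crossing the event horizon is swallowed and re-born at random.
            dist = sum((x - b) ** 2 for x, b in zip(stars[i], bh)) ** 0.5
            if dist < radius:
                stars[i] = [random.uniform(lo, hi) for _ in range(dim)]
    fitness = [objective(s) for s in stars]
    best = min(range(n_stars), key=lambda i: fitness[i])
    return fitness[best], stars[best]

# Sphere function: global minimum 0 at the origin.
best_f, best_x = black_hole_optimize(lambda v: sum(x * x for x in v), dim=2)
```

Note the parameterless character mentioned in the excerpts: apart from population size and iteration budget, the update rule itself has nothing to tune.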
... One advantage of the BHA is its parameterless nature, which simplifies its implementation. Unlike some other heuristics, the BHA has the ability to converge to the global optimum in all runs and is not prone to being trapped in local optima [29, 39–42]. In this study, the BHA was selected as the feature selection method for enhancing the detection rate of IDS due to its simplicity, ease of implementation, and absence of specific parameters. ...
Article
Full-text available
The detection rate of network intrusion detection systems mainly depends on relevant features; however, the selection of attributes or features is considered an issue in NP-hard problems. It is an important step in machine learning and pattern recognition. The major aim of feature selection is to determine the feature subset from the current/existing features that will enhance the learning performance of the algorithms, in terms of accuracy and learning time. This paper proposes a new hybrid filter-wrapper feature selection method that can be used in classification problems. The information gain ratio algorithm (GR) represents the filter feature selection approach, and the black hole algorithm (BHA) represents the wrapper feature selection approach. The comparative analysis of network intrusion detection methods focuses on accuracy and false positive rate. GBA shines with exceptional results: achieving 96.96% accuracy and a mere 0.89% false positive rate. This success can be traced to GBA's improved initialization via the GR technique, which effectively removes irrelevant features. By assigning these features almost zero weights, GBA hones its ability to accurately spot intrusions while drastically reducing false alarms. These standout outcomes underline GBA's superiority over other methods, showcasing its potential as a reliable solution for bolstering network security.
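The information gain ratio (GR) used as the filter stage above is a standard quantity from decision-tree learning: the information gain of a feature about the labels, normalized by the feature's own entropy. A small self-contained sketch for discrete features (illustrative only, not the paper's implementation):

```python
from collections import Counter
from math import log2

def entropy(values):
    """Shannon entropy of a sequence of discrete values, in bits."""
    n = len(values)
    return -sum((c / n) * log2(c / n) for c in Counter(values).values())

def gain_ratio(feature, labels):
    """Information gain of `feature` about `labels`, divided by the
    feature's split information (its own entropy)."""
    n = len(labels)
    cond = 0.0
    for v in set(feature):
        idx = [i for i, f in enumerate(feature) if f == v]
        cond += (len(idx) / n) * entropy([labels[i] for i in idx])
    info_gain = entropy(labels) - cond
    split_info = entropy(feature)
    return info_gain / split_info if split_info > 0 else 0.0

# Feature perfectly predicts the label -> gain ratio 1.0
print(gain_ratio(["a", "a", "b", "b"], [0, 0, 1, 1]))  # -> 1.0
```

A filter-wrapper scheme like the one described would rank features by such a score first, then let the wrapper search (here, the BHA) refine the subset.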
... Azizipanah-Abarghooee et al. 2014 proposed a new optimization approach, known as the gradient-based modified teaching-learning-based optimization combined with black hole (MTLBO-BH) algorithm, to search for the optimum operating cost. Kumar et al. 2015 explained that BHA can provide a model for solving wireless sensor network optimization problems because it is independent of parameter-tuning problems. Gao et al. 2016 proposed the BHA combined with the Spencer method to examine the stability of the high set slope of an airport. ...
Article
Full-text available
Logistic regression is generally preferred when there is no large difference between the occurrence frequencies of the two possible outcomes of the considered event. However, for rarely occurring events such as wars, economic crises and natural disasters, which have relatively small occurrence frequencies compared to general events, logistic regression gives biased parameter estimates and therefore underestimates the occurrence probability of the rare events. In this study, the black hole algorithm (BHA) is proposed and used to obtain unbiased parameter estimates for rare events, instead of the classical logistic regression approach. To estimate the logistic regression parameters in cases where the dichotomous event groups are rare, we propose a BHA-based approach. For samples with different degrees of rareness, we obtain the parameter values together with their bias and root mean square errors for BHA and logistic regression, and then compare them. Moreover, we also investigate the classification performance of the two methods on real-life data. As a result, BHA gives less biased estimates than logistic regression on both simulated and real-life data.
Article
Effective control of blasting outcomes depends on a thorough understanding of rock geology and the integration of geological characteristics with blast design parameters. This study underscores the importance of adapting blast design parameters to geological conditions to optimize the utilization of explosive energy for rock fragmentation. To achieve this, data on fifty geo-blast design parameters were collected and used to train machine learning algorithms. The objective was to develop predictive models for estimating the blast oversize percentage, incorporating seven controlled components and one uncontrollable index. The study employed a combination of hybrid long-short-term memory (LSTM), support vector regression, and random forest algorithms. Among these, the LSTM model enhanced with the tree seed algorithm (LSTM-TSA) demonstrated the highest prediction accuracy when handling large datasets. The LSTM-TSA soft computing model was specifically leveraged to optimize various blast parameters such as burden, spacing, stemming length, drill hole length, charge length, powder factor, and joint set number. The estimated percentage oversize values for these parameters were determined as 0.7 m, 0.9 m, 0.65 m, 1.4 m, 0.7 m, 1.03 kg/m3, 35%, and 2, respectively. Application of the LSTM-TSA model resulted in a significant 28.1% increase in the crusher's production rate, showcasing its effectiveness in improving blasting operations.
Chapter
Due to the advent of Next Generation Sequencing and multiple innovative experimental techniques there is an exponential increase in biological data. It is necessary to capture meaningful information and valuable knowledge from this data which can immensely benefit all living things on the planet. Recent advances in machine learning and deep learning have indeed been able to handle this with the deployment of novel algorithms. Data imbalance frequently occurs in biological data mining tasks. This is because in several omics related problems data belonging to the positive class is much less than the data belonging to the negative class. Such imbalance can affect performance and produce faulty results and conclusions. Nature inspired Evolutionary and Metaheuristic algorithms are robust and can handle data imbalance quite efficiently. In this work we have described the use of these algorithms for tackling data imbalance in biological data. We have provided lucid explanations of these algorithms along with potentially important case studies.
Chapter
Software testing is an important phase in the development of any software. It is crucial to be able to test the software before it can be used by users. This paper discusses a converter tool that is first used to convert a mobile application into Java source code. Software testing is then performed using the Eclipse Plug-in Tool (EPiT) to generate test cases automatically. The Java Unit (JUnit) testing framework is also used to generate the test cases. Both EPiT and JUnit are then compared in terms of the time taken to generate the test cases.
Article
In this paper, we present a heuristic for the non‐unicost set covering problem using local branching. Local branching eliminates the need to define a problem specific search neighbourhood for any particular (zero‐one) optimisation problem. It does this by incorporating a generalised Hamming distance neighbourhood into the problem, and this leads naturally to an appropriate neighbourhood search procedure. We apply our approach to the non‐unicost set covering problem. Computational results are presented for 65 test problems that have been widely considered in the literature. Our results indicate that our heuristic is better than six of the eight other heuristics we examined, slightly worse than that of one heuristic, but that there is a single heuristic that outperforms all others. We believe that the work described here illustrates that the potential for using local branching, operating as a stand‐alone matheuristic, has not been fully exploited in the literature.
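The generalised Hamming-distance neighbourhood mentioned above is conventionally enforced with a local branching constraint around a binary incumbent solution $\bar{x}$ with radius $k$ (this is the standard Fischetti-Lodi formulation, not necessarily the paper's exact one):

```latex
\Delta(x,\bar{x}) \;=\; \sum_{j:\,\bar{x}_j = 1} (1 - x_j) \;+\; \sum_{j:\,\bar{x}_j = 0} x_j \;\le\; k
```

Adding this single linear cut restricts the search to solutions that flip at most $k$ variables of the incumbent, which is what makes the neighbourhood search problem-independent.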
Article
Full-text available
The Henry Gas Solubility Optimization (HGSO) is a physics-based metaheuristic inspired by Henry's law, which describes the solubility of a gas in a liquid under specific pressure conditions. Since its introduction by Hashim et al. in 2019, HGSO has gained significant attention for its unique features, including minimal adaptive parameters and a balanced exploration-exploitation trade-off, leading to favorable convergence. This study provides an up-to-date survey of HGSO, covering the historical development of HGSO, its modifications, and its hybridizations with other algorithms, showcasing its adaptability and potential for synergy. Recent variants of HGSO are categorized into modified, hybridized, and multi-objective versions, and the review explores its main applications, demonstrating its effectiveness in solving complex problems. The evaluation includes a discussion of the algorithm's strengths and weaknesses. This comprehensive review, featuring graphical and tabular comparisons, not only indicates potential future directions in the field but also serves as a valuable resource for researchers seeking a deep understanding of HGSO and its advanced versions. As physics-based metaheuristic algorithms gain prominence for solving intricate optimization problems, this study provides insights into the adaptability and applications of HGSO across diverse domains.
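Henry's law, the physical inspiration named above, states that at constant temperature the solubility of a gas in a liquid is proportional to its partial pressure. In the proportionality form (with $S_g$ the solubility, $H(T)$ Henry's constant at temperature $T$, and $P_g$ the partial pressure; the HGSO paper's operators build on this relation):

```latex
S_{g} = H(T)\,P_{g}
```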
Article
Full-text available
Evolutionary programs are gaining popularity in many engineering and scientific applications due to their enormous advantages, such as adaptability and the ability to handle non-linear, ill-defined and probabilistic problems. With specific reference to genetic algorithms (GAs), some parameters that influence convergence to the optimal value are the population size (popsize), the crossover probability (Pc) and the mutation probability (Pm). Normally these values are prescribed initially and do not vary during the execution of the program, although they greatly affect the performance of the GA. The present chapter deals with the development of an improved genetic algorithm (IGA) that introduces variation in the values of parameters such as the population size (popsize), the crossover probability (Pc) and the mutation probability (Pm). The aim of this variation is to minimize the convergence time. This work presents a method of dynamically varying the operating parameters of the GA program using fuzzy state theory (FST) so that final convergence is obtained in a shorter time. Also, in this chapter a function has been developed and optimized for the long-term load forecasting problem using the IGA. This technique does not require any prior assumption of a function for load forecasting; further, it does not need any functional relationship between dependent and independent variables. The results obtained by this technique are compared with data available from the Central Electricity Authority (CEA), India to demonstrate the effectiveness of the proposed technique.
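The roles of popsize, Pc and Pm described above can be seen in a minimal GA sketch (a toy one-max problem with fixed, illustrative parameter values; the chapter's IGA instead varies these parameters dynamically via fuzzy rules):

```python
import random

def ga_onemax(n_bits=20, popsize=30, pc=0.8, pm=0.05, gens=100):
    """Minimal GA sketch: tournament selection, one-point crossover with
    probability pc, per-bit flip mutation with probability pm."""
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(popsize)]
    fit = sum  # one-max fitness: number of ones in the bitstring
    for _ in range(gens):
        nxt = []
        while len(nxt) < popsize:
            # Binary tournament selection for two parents.
            a, b = (max(random.sample(pop, 2), key=fit) for _ in range(2))
            child = a[:]
            if random.random() < pc:  # one-point crossover
                cut = random.randrange(1, n_bits)
                child = a[:cut] + b[cut:]
            # Bit-flip mutation, applied independently to each bit.
            child = [bit ^ (random.random() < pm) for bit in child]
            nxt.append(child)
        pop = nxt
    return max(pop, key=fit)

best = ga_onemax()
```

Raising pm keeps exploration high but disrupts good solutions late in the run, which is exactly the tension the dynamic-parameter scheme described above is meant to resolve.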
Chapter
The goal of this research is to investigate and develop heuristic tools in order to extract meaningful knowledge from archeological large-scale data sets. Database queries help us to answer only simple questions. Intelligent search tools integrate heuristics with knowledge discovery tools and they use data to build models of the real world. We would like to investigate these tools and combine them within the genetic algorithm framework. Some methods, taken from the area of soft computing techniques, use rough sets for data reduction and the synthesis of decision algorithms. However, because the problems are NP-hard, using a heuristic approach by combining Boolean reasoning with genetic algorithms seems to be one of the best approaches in terms of efficiency and flexibility. We will test our tools on several large-scale archeological data sets generated from an intensive archaeological survey of the Valley of Oaxaca in Highland Mesoamerica.