
Information Sciences 267 (2014) 191–200


How novel is the novel black hole optimization approach?


Adam P. Piotrowski *, Jaroslaw J. Napiorkowski, Pawel M. Rowinski
Institute of Geophysics, Polish Academy of Sciences, Ks. Janusza 64, 01-452 Warsaw, Poland

Article info

Article history:
Received 23 April 2013
Received in revised form 8 January 2014
Accepted 15 January 2014
Available online 24 January 2014

Keywords:
Black hole algorithm
Particle Swarm Optimization
Evolutionary algorithm
Global optimization
No Free Lunch theorems
Nature-inspired heuristic

Abstract

Due to the abundance of novel optimization algorithms in recent years, the problem of large similarities among methods that are named differently is becoming troublesome and general. The question arises whether a novel source of inspiration is sufficient to breed an optimization algorithm with a novel name, even if its search properties are almost the same as, or are even a simplified variant of, the search properties of an older and well-known method. In this paper it is rigidly shown that the recently proposed heuristic approach called the black hole optimization is in fact a simplified version of Particle Swarm Optimization with inertia weight. Additionally, because a large number of metaheuristics developed during the last decade are claimed to be nature-inspired, a short discussion on the inspirations of optimization algorithms is presented.

© 2014 Elsevier Inc. All rights reserved.

1. Introduction

In recent years a lot of novel metaheuristics have been proposed, and most of them were, according to their inventors' claims, inspired by some processes, behaviors or philosophies that have been widely known to human beings for years. Examples include the algorithms inspired by the process of evolution [29,2,22,46,1], the behavior of animals [14,12,8,24], cooperation [32,9], the harmony of music [21], chemical reactions [31], physical laws [27,38] or philosophical concepts [26,7]. However, not all inspirations lead to truly successful algorithms [10].
As plenty of metaheuristics exist [6], some of them, although they use different names and are claimed to be inspired by different entities, in fact share large similarities with, or are simply an extension of, the others. An inspiring discussion on the large and important similarities between a few types of Genetic Algorithms [25,44] and the basic variants of Biogeography-Based Optimization [41], Differential Evolution [46], the (μ,λ)-Evolution Strategy [33] and Particle Swarm Optimization (PSO) [14] has been given by Simon et al. [42]. However, the above-mentioned newer metaheuristics are extensions of Genetic Algorithms, and hence open new possibilities that could be (and usually are) further successfully extended and examined in the future. But it seems difficult to accept that an algorithm which is a simplification of a well-known older method should bear a novel name and be called a new approach. If such a path were followed, soon plenty of novel names (but not necessarily truly novel methods) could emerge in the literature. A good example is the recently proposed black hole optimization approach [23]: this new metaheuristic is de facto a significant simplification of PSO with inertia weight, as will be shown in the next sections.

* Corresponding author. Tel.: +48 22 6915 858; fax: +48 22 6915 915.
E-mail address: adampp@igf.edu.pl (A.P. Piotrowski).

http://dx.doi.org/10.1016/j.ins.2014.01.026
0020-0255/© 2014 Elsevier Inc. All rights reserved.

2. Black hole optimization approach

The optimization method called the black hole approach [23] is a population-based algorithm. As the method is said to be inspired by the black hole phenomenon, the solutions that are moving in the search space are called stars. The meaning of the stars is exactly the same as that of individuals in Evolutionary Algorithms, particles in PSO, points in the Nelder–Mead algorithm [34], etc. In this elitist method, the best solution found so far is called the black hole. The algorithm works as follows. First, N + 1 stars, x_i ∈ R^D, i = 1, . . . , N + 1 (where N is the population size), are randomly initialized in the search space. Their fitness is evaluated and the best one is termed the black hole, x_BH. As the black hole is static (it does not move until a better solution is found by the other stars), the number of individuals that search for the optimum is equal to N. Then in each generation every star moves towards the black hole according to the following equation:

x_i(t + 1) = x_i(t) + rand_i(0, 1) \cdot \big( x_{BH} - x_i(t) \big), \quad i = 1, \ldots, N    (1)
where rand_i(0, 1) is a random number generated within the interval [0, 1]. Note that, according to [23], in each generation only a single rand_i(0, 1) is generated for each individual i. Then the fitness of each ith star at the new location x_i(t + 1) is evaluated. If the fitness of x_i(t + 1) is better than the fitness of x_BH, then x_i(t + 1) becomes the new black hole and the former black hole becomes a star.
In the black hole algorithm, a star that comes too close to the black hole (closer than the so-called event horizon) disappears. The radius of the event horizon (R) is defined by

R = \frac{f_{BH}}{\sum_{i=1}^{N} f_i}    (2)

If a star disappears, a new star is randomly generated in the search space, hence the number of stars (population size) is
constant.
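For concreteness, the following is a minimal sketch of the algorithm summarized above, written in Python with NumPy. Minimization is assumed, the raw objective values are used as the "fitness" in Eq. (2), and the Euclidean distance is used to check whether a star has crossed the event horizon; these details, like the parameter defaults, are illustrative assumptions of this sketch rather than specifications taken from [23].

import numpy as np

def black_hole_optimization(f, lb, ub, n_stars=30, max_gen=1000, rng=None):
    # Minimal sketch of the black hole algorithm summarized above.
    # Minimization is assumed; raw objective values play the role of
    # "fitness" in the event-horizon radius of Eq. (2).
    rng = np.random.default_rng() if rng is None else rng
    lb, ub = np.asarray(lb, float), np.asarray(ub, float)
    dim = lb.size
    pop = rng.uniform(lb, ub, size=(n_stars + 1, dim))   # N + 1 random stars
    fit = np.array([f(x) for x in pop])
    best = np.argmin(fit)
    x_bh, f_bh = pop[best].copy(), fit[best]             # best star becomes the black hole
    stars = np.delete(pop, best, axis=0)                 # the remaining N stars keep moving
    fit = np.delete(fit, best)

    for _ in range(max_gen):
        # Eq. (1): a single random number per star, movement towards the black hole
        r = rng.uniform(0.0, 1.0, size=(n_stars, 1))
        stars = stars + r * (x_bh - stars)
        fit = np.array([f(x) for x in stars])

        # elitism: a star that improves on the black hole exchanges roles with it
        best = np.argmin(fit)
        if fit[best] < f_bh:
            stars[best], x_bh = x_bh, stars[best].copy()
            fit[best], f_bh = f_bh, fit[best]

        # Eq. (2): stars inside the event horizon are re-initialized at random
        radius = f_bh / np.sum(fit)
        crossed = np.linalg.norm(stars - x_bh, axis=1) < radius
        if crossed.any():
            stars[crossed] = rng.uniform(lb, ub, size=(int(crossed.sum()), dim))
            fit[crossed] = np.array([f(x) for x in stars[crossed]])

    return x_bh, f_bh

Note that, in this sketch as in the original description, the restart of stars crossing the event horizon is the only component that re-introduces diversity, a point that anticipates the discussion in Section 4.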
According to [23] the proposed approach is not the first optimization algorithm inspired by black holes, as one of the PSO variants was already based on their behavior [53]. However, the black hole optimization approach proposed in [23] and summarized above is claimed to be completely different from the black hole PSO. Unfortunately, when comparing the algorithm defined in [23] with the basic PSO method with inertia weight [40], one may see that the black hole approach [23] is just a simplification of the latter. As the black hole approach was proposed for data clustering in [23], it must be stressed here that, among plenty of other algorithms (some of the most recent include [16,3,19]), also a large number of PSO variants have been successfully applied to this task; a detailed review may be found in [37].

3. Particle swarm optimization

Particle Swarm Optimization [14] is a very popular stochastic population-based algorithm, inspired by the behavior of swarms of animals. In PSO the solutions in the D-dimensional search space are called particles. The initial positions x_i(0) of the N particles (i = 1, . . . , N) are usually generated randomly within the bounds of the search space. The initial velocities v_i(0) of each particle are usually generated from a pre-specified interval, which frequently depends on the differences between the upper and lower bounds of the search space. The fitness value is evaluated for each particle. Then in each generation (t) the particles move through the search space according to the following equations:

v_i^j(t + 1) = w \cdot v_i^j(t) + c_1 \cdot rand1_i^j(0, 1) \cdot \big( pbest_i^j(t) - x_i^j(t) \big) + c_2 \cdot rand2_i^j(0, 1) \cdot \big( gbest^j(t) - x_i^j(t) \big)
x_i^j(t + 1) = x_i^j(t) + v_i^j(t + 1)    (3)

where j = 1, . . . , D; pbest_i(t) and gbest(t) are the best position visited during the search by the ith particle and the best position visited by any particle in the swarm, respectively; rand1_i^j(0, 1) and rand2_i^j(0, 1) are two random numbers generated at each generation from the [0, 1] interval for each i and j index separately; and c1 and c2 are acceleration coefficients (algorithm parameters to be set by the user). As may be seen, for each ith particle three vectors are remembered: its current position x_i(t), the best position pbest_i(t) visited by the ith particle since the initialization of the search, and the ith particle's current velocity v_i(t). The parameter w is the so-called inertia weight. Its value may be a function of time or not; its precise definition significantly depends on the variant used. Although w was not used in the first PSO algorithm proposed by Eberhart and Kennedy [14], it was quickly added in [40] to balance the global and local search abilities. Today, PSO approaches are considered among the most popular and successful metaheuristics [15].
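As a counterpart to the sketch in Section 2, a minimal Python/NumPy sketch of PSO with inertia weight, implementing Eq. (3) for minimization, is given below. The constant inertia weight, the velocity initialization and the clipping of positions to the bounds are illustrative choices of this sketch, not prescriptions of [40] (the experiments in Section 4 use a linearly decreasing w instead).

import numpy as np

def pso_inertia_weight(f, lb, ub, n_particles=30, max_gen=1000,
                       w=0.7, c1=1.49, c2=1.49, rng=None):
    # Minimal sketch of PSO with inertia weight, Eq. (3); minimization assumed.
    rng = np.random.default_rng() if rng is None else rng
    lb, ub = np.asarray(lb, float), np.asarray(ub, float)
    dim = lb.size
    x = rng.uniform(lb, ub, size=(n_particles, dim))
    v = 0.1 * rng.uniform(-(ub - lb), ub - lb, size=(n_particles, dim))
    fit = np.array([f(p) for p in x])
    pbest, pbest_fit = x.copy(), fit.copy()
    g = np.argmin(pbest_fit)
    gbest, gbest_fit = pbest[g].copy(), pbest_fit[g]

    for _ in range(max_gen):
        # independent random numbers for every particle i and dimension j
        r1 = rng.uniform(size=(n_particles, dim))
        r2 = rng.uniform(size=(n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)  # Eq. (3), velocity
        x = np.clip(x + v, lb, ub)                                 # Eq. (3), position (clipped to bounds)
        fit = np.array([f(p) for p in x])

        improved = fit < pbest_fit                                 # update personal bests
        pbest[improved], pbest_fit[improved] = x[improved], fit[improved]
        g = np.argmin(pbest_fit)                                   # update the global best
        if pbest_fit[g] < gbest_fit:
            gbest, gbest_fit = pbest[g].copy(), pbest_fit[g]

    return gbest, gbest_fit

With w = 0, c1 = 0, c2 = 1 and a single random number shared across all dimensions, this update degenerates to the star movement of Section 2, which is the subject of the next section.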

4. Discussion

The black hole algorithm is composed of two parts: the movement of stars described by Eq. (1) and the re-initialization of stars that cross the D-dimensional hypersphere, called the event horizon, around the black hole, with the radius defined by Eq. (2).
The first part is the core of the black hole approach: it fully determines the movement of solutions in the search space. However, let us consider the movement of particles in PSO with inertia weight (Eq. (3)). If one sets w = 0, c1 = 0, c2 = 1 and uses the same rand2_i^j(0, 1) for every j (in other words, generates a single rand2_i(0, 1) for every individual in each generation), then Eq. (3) simplifies to Eq. (1). Such a simplified equation would define the moves of all PSO particles (called stars in the black hole approach) but the best particle in the swarm. Indeed, in PSO the particle whose best visited position equals gbest(t) takes part in the search, but in the black hole approach x_BH does not. Note, however, that this simply means that the movement of x_BH in the black hole approach is defined by Eq. (3) with w = 0, c1 = 0 and c2 = 0 (mixing notations from PSO and the black hole approach, one may say that for x_BH the velocity v_BH(t) = 0 for each t), and that the population size (say, N_PSO) in PSO is equal to the population size (say, N_BH + 1) in the black hole approach. In other words, the movement of stars in the black hole algorithm is performed according to a much simplified rule of movement of particles in PSO with inertia weight, the algorithm which was introduced in 1998 [40].
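To make the substitution explicit: setting w = 0, c_1 = 0 and c_2 = 1 in Eq. (3), and using a single random number rand2_i(0, 1) shared by all dimensions j, gives

v_i^j(t + 1) = rand2_i(0, 1) \cdot \big( gbest^j(t) - x_i^j(t) \big),
x_i^j(t + 1) = x_i^j(t) + rand2_i(0, 1) \cdot \big( gbest^j(t) - x_i^j(t) \big),

which, after renaming gbest(t) as x_{BH}, is exactly the star movement rule of Eq. (1).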
Due to the proposed simplification of PSO with inertia weight, the so-called ith star in the black hole approach moves only along the [x_BH, x_i] line segment, always towards x_BH. Any star may change direction only if one of the stars finds a better solution than the solution occupied by the current black hole and becomes the new black hole. Apparently the concept of the event horizon had to be introduced because otherwise the stars would relatively quickly converge to the point in the search space occupied by the black hole (the algorithm does not have exploration capabilities). This concept is the only difference between the much simplified PSO with inertia weight and the black hole algorithm. However, the event horizon does not allow the method to intensify exploration or accumulate any knowledge about the already visited solutions; it is rather a simple restart approach performed for each star separately. Note that a similar restart PSO variant, in which all but the best particles in the swarm are restarted when they converge too close to each other, is also known from the literature [20]. Other variants of PSO re-initialization may be found in [30,13]. But all these approaches are still modified PSO variants, by no means a novel kind of metaheuristic.
Finally, because the radius of the event horizon in the search space is defined according to the fitness of the stars and the fitness of the black hole, the radius value may differ significantly from iteration to iteration. Note also that if the problem is bounded within [LB, UB]^D, and the fitness at the global minimum and maximum is f(x_min) and f(x_max), respectively, then the lowest radius of the event horizon, R_l, is obtained when all stars but the black hole are located at the global maximum. In such a case

R_l = \frac{f(x_{BH})}{N \cdot f(x_{max})}, \qquad f(x_{min}) \le f(x_{BH}) \le f(x_{max})    (4)

and one may note that some problems may exist for which the radius R_LU of the ball that comprises [LB, UB]^D is so small that the event horizon encompasses the whole search space:

R \ge R_l = \frac{f(x_{BH})}{N \cdot f(x_{max})} > R_{LU}    (5)
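As a purely hypothetical numerical illustration (the values below are assumptions chosen for this example, not taken from [23] or [47]): with N = 30 stars, f(x_max) = 100 and f(x_BH) = 90, Eq. (4) gives R_l = 90 / (30 \cdot 100) = 0.03, so for any problem whose feasible region fits into a ball of radius R_LU < 0.03 the event horizon covers the entire search space and every star is immediately re-initialized at random.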

In the case of problems with a flat fitness landscape, even if R_LU is larger than 1/N, it may happen that the global optimum would be located within the event horizon. This occurs if, at any time during the computation, the distance between x_BH and x_min becomes smaller than R_l. In such a case no star would be able to reach the global optimum. Hence, it is difficult to accept that the concept of the event horizon introduces a novelty to PSO that deserves a novel name for the simplified version of the well-known algorithm. It may be added here that some other researchers who refer to the black hole approach in recent papers address it as a kind of PSO [51] or, more broadly speaking, of swarm intelligence methods [17].
The question may arise why such a simplified PSO variant as the so-called black hole approach works successfully in data clustering, as shown in [23]. Would the black hole algorithm also be successful in solving other popular problems? According to the No Free Lunch theorems [50], the expected performance over all possible problems of all non-revisiting algorithms that use only information about the objective function from the already visited samples would be equal, if the search space and the space of possible fitness values are finite; the last two assumptions are always met in practice when digital computers are used. According to the proof of the No Free Lunch theorems [50,28], over all possible problems the expected performance of any heuristic is the same as the expected performance of simple random search. This means that if some complicated, or advanced, metaheuristics outperform the simple ones (including random search) on some problems, and such a case is frequently reported in the literature, then the simple metaheuristics (including random search) must outperform the more advanced ones on some other problems, which are probably rarely studied. Also, as shown empirically in recent studies [7,26], some very simple algorithms may outperform the elaborate ones even on a large number of widely used benchmark problems [47]. Following the same empirical path, the performance of PSO with inertia weight (denoted hereafter as PSO_iw) is compared with the performance of a number of its simplified variants, including the so-called black hole approach, on the very popular set of 10-, 30- and 50-dimensional CEC2005 problems proposed in [47], the basic properties of which are specified in Table 1.
Table 1
Summary of CEC 2005 test problems.

Problem   Type                   f(x*)   Initialization range   Bounds
F1        Unimodal               -450    [-100, 100]^D          [-100, 100]^D
F2        Unimodal               -450    [-100, 100]^D          [-100, 100]^D
F3        Unimodal               -450    [-100, 100]^D          [-100, 100]^D
F4        Unimodal               -450    [-100, 100]^D          [-100, 100]^D
F5        Unimodal               -310    [-100, 100]^D          [-100, 100]^D
F6        Basic multimodal       390     [-100, 100]^D          [-100, 100]^D
F7        Basic multimodal       -180    [0, 600]^D             None
F8        Basic multimodal       -140    [-32, 32]^D            [-32, 32]^D
F9        Basic multimodal       -330    [-5, 5]^D              [-5, 5]^D
F10       Basic multimodal       -330    [-5, 5]^D              [-5, 5]^D
F11       Basic multimodal       90      [-0.5, 0.5]^D          [-0.5, 0.5]^D
F12       Basic multimodal       -460    [-π, π]^D              [-π, π]^D
F13       Expanded multimodal    -130    [-3, 1]^D              [-3, 1]^D
F14       Expanded multimodal    -300    [-100, 100]^D          [-100, 100]^D
F15       Hybrid composition     120     [-5, 5]^D              [-5, 5]^D
F16       Hybrid composition     120     [-5, 5]^D              [-5, 5]^D
F17       Hybrid composition     120     [-5, 5]^D              [-5, 5]^D
F18       Hybrid composition     10      [-5, 5]^D              [-5, 5]^D
F19       Hybrid composition     10      [-5, 5]^D              [-5, 5]^D
F20       Hybrid composition     10      [-5, 5]^D              [-5, 5]^D
F21       Hybrid composition     360     [-5, 5]^D              [-5, 5]^D
F22       Hybrid composition     360     [-5, 5]^D              [-5, 5]^D
F23       Hybrid composition     360     [-5, 5]^D              [-5, 5]^D
F24       Hybrid composition     260    [-5, 5]^D              [-5, 5]^D
F25       Hybrid composition     260    [2, 5]^D               None

Tests are performed with the following classical PSO_iw parameter settings: N = 30, c1 = c2 = 1.49, w(gen) = 0.9 - (0.5 \cdot gen)/maxgen (as for example in [48]), where gen is the generation number and maxgen is the maximum number of generations. We start the test from PSO_iw and then, in each consecutive variant, simplify the algorithm. The tested variants include: 1. PSO_iw; 2. PSO with inertia weight equal to 0 (PSO_iw = 0; in this variant the information about past velocities is lost); 3. PSO with inertia weight equal to 0 and c1 equal to 0 (PSO_iw = 0_c1 = 0; in this variant the information about both the past velocities and the particle's best historical position is lost); 4. PSO with inertia weight equal to 0, c1 equal to 0 and c2 equal to 1 (PSO_iw = 0_c1 = 0_c2 = 1; this variant is similar to the previous one, but differs by eliminating c2, which is set to unity); 5. the black hole approach (BH), which is equal to PSO_iw = 0_c1 = 0_c2 = 1 with rand2_i^j(0, 1) (see Eq. (3)) kept the same for each jth index and with re-initialization of particles/stars that enter the event horizon (consult Eq. (2)); and 6. the black hole approach with N = 100 (BH100). Unfortunately, in [23] the population size of the black hole approach was not specified, which may affect the present comparison. To alleviate the problem, two black hole variants are tested with different population sizes: one with the same population size as the PSO variants (N = 30), and the other with N = 100.
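The simplification chain tested above can be summarized, in terms of the PSO sketch from Section 3, as the following schematic configuration table. The keys shared_rand2 and event_horizon are hypothetical flags introduced here to mark the structural changes of variants 5 and 6 that go beyond mere parameter values; they are not part of any published implementation.

# Schematic summary of the tested variants in terms of Eq. (3).
variants = {
    "PSO_iw":             dict(N=30,  w="0.9 - 0.5 * gen / maxgen", c1=1.49, c2=1.49),
    "PSO_iw=0":           dict(N=30,  w=0.0, c1=1.49, c2=1.49),
    "PSO_iw=0_c1=0":      dict(N=30,  w=0.0, c1=0.0, c2=1.49),
    "PSO_iw=0_c1=0_c2=1": dict(N=30,  w=0.0, c1=0.0, c2=1.0),
    "BH":                 dict(N=30,  w=0.0, c1=0.0, c2=1.0,
                               shared_rand2=True, event_horizon=True),
    "BH100":              dict(N=100, w=0.0, c1=0.0, c2=1.0,
                               shared_rand2=True, event_horizon=True),
}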
Each variant is applied 30 times for each problem. The median, mean and standard deviations of each variant's performance, as well as the variants' ranking (1 means the best, 6 the lowest performance), are given in Tables 2–4.
The results show both expected and unexpected issues. Firstly, as one might suppose, PSO_iw turned out to be the best method for almost all problems; this may allow one to conclude that its simplified variants are in fact not needed. Secondly, as long as the random numbers rand1_i^j(0, 1) and rand2_i^j(0, 1) are generated for each jth dimension independently (variants 2–4), the more simplified the chosen PSO variant is, the poorer is the achieved performance. However, when the information about previous velocities and best positions is missing, and in addition a single rand2_i(0, 1) is generated for all jth indices and the individuals that come too close to the best particle are re-generated (which jointly leads to the so-called black hole approach), the results tend to improve. Thirdly, although the results achieved by the black hole approach are almost always way behind the results achieved by PSO_iw, in a few cases one or both tested black hole variants outperform PSO_iw. The superiority of the black hole approach is noted for problem 8, irrespective of the dimensionality. It is also seen for the 10-dimensional versions of problems 12, 22 (only when the mean is compared) and 25, and the 50-dimensional versions of problems 18, 19 (in both cases, only when the median is compared) and 20. Interestingly, none of these problems is unimodal, most of them being hybrid composite functions, widely considered difficult to solve. The fitness landscape of problem 8 is bumpy and deceptive: as the algorithms are unable to gain the information that the global optimum lies on the bounds, each method finds its way to one among plenty of poor local minima that share very similar function values. This results in only a marginal difference between the performance of all tested variants. Problems 19 and 20 are modified variants of problem 18, which partly clarifies the good performance of the simplified algorithm for the 50-dimensional variants of all three functions.
How to explain such results? As PSO_iw turns out to be better than the black hole variants in the case of 21–24 out of 25 problems, depending on the dimensionality and the measure, its superiority is very clear. However, there are still some problems for which the black hole approach performs better. This may confirm the possibility of reasonable performance of the simplified PSO_iw variants in data clustering [23]. It is also in accordance with the empirical findings regarding very simple algorithms given in [7,26] and with theoretical issues implied by the No Free Lunch theorems [50], which are frequently overlooked by researchers. On the other hand, in the studies regarding No Free Lunch [39] it was shown that among all problems there are many that cannot be of interest to anyone, at least because the global maximum and the global minimum are neighbors, hence the solutions of such problems cannot be robust. If such problems are excluded from consideration, the No Free Lunch theorems do not hold [39] and the expected performance of some algorithms (possibly the more advanced ones) may be
Table 2
Results obtained for 10-dimensional problems. Algorithms' ranking is provided in the Rank column according to the median (left) and mean (right) statistics.

PSO_iw BH BH100 PSO_iw = 0

Median Mean Std Rank Median Mean Std Rank Median Mean Std Rank Median Mean Std Rank

F1 4.500E+02 4.500E+02 1.447E07 1 1 1.163E+02 1.183E+02 1.167E+02 2 2 1.900E+02 2.058E+02 1.641E+02 3 3 2.277E+03 2.571E+03 1.543E+03 4 4
F2 4.500E+02 4.500E+02 1.034E07 1 1 1.461E+03 1.431E+03 2.435E+02 3 2 1.438E+03 1.458E+03 3.390E+02 2 3 5.828E+03 5.498E+03 1.762E+03 4 4
F3 6.946E+04 1.016E+05 1.168E+05 1 1 1.886E+06 2.007E+06 9.171E+05 2 2 2.711E+06 3.018E+06 1.384E+06 3 3 1.707E+07 1.988E+07 1.353E+07 4 4
F4 4.500E+02 4.500E+02 9.871E08 1 1 2.917E+03 2.908E+03 6.287E+02 3 3 2.443E+03 2.412E+03 7.221E+02 2 2 5.655E+03 6.288E+03 2.393E+03 4 4
F5 1.805E+02 2.062E+02 2.440E+02 1 1 9.609E+03 9.366E+03 1.015E+03 3 3 9.290E+03 9.191E+03 1.248E+03 2 2 1.218E+04 1.191E+04 2.125E+03 4 4
F6 3.916E+02 3.986E+02 2.423E+01 1 1 4.545E+06 6.062E+06 4.014E+06 2 2 8.258E+06 9.052E+06 5.142E+06 3 3 2.405E+08 4.059E+08 4.199E+08 5 5
F7 1.798E+02 1.797E+02 2.367E01 1 1 1.783E+02 1.781E+02 1.072E+00 2 2 1.781E+02 1.780E+02 1.075E+00 3 3 1.207E+03 1.198E+03 2.679E+02 4 4

F8 1.197E+02 1.197E+02 9.291E02 2 2 1.197E+02 1.197E+02 7.959E02 4 4 1.198E+02 1.197E+02 9.047E02 1 1 1.197E+02 1.197E+02 1.028E01 3 3
F9 3.270E+02 3.263E+02 1.716E+00 1 1 2.985E+02 2.983E+02 1.219E+01 4 4 3.027E+02 3.012E+02 1.002E+01 3 3 3.030E+02 3.028E+02 8.511E+00 2 2
F10 3.191E+02 3.178E+02 5.422E+00 1 1 2.866E+02 2.868E+02 1.539E+01 4 4 2.883E+02 2.896E+02 1.490E+01 3 3 2.908E+02 2.902E+02 1.673E+01 2 2
F11 9.279E+01 9.282E+01 1.355E+00 1 1 9.723E+01 9.729E+01 1.156E+00 3 3 9.712E+01 9.695E+01 1.422E+00 2 2 9.794E+01 9.787E+01 1.391E+00 4 4
F12 1.729E+02 1.399E+03 3.021E+03 2 2 2.722E+02 7.300E+02 2.726E+03 1 1 2.115E+02 5.645E+03 1.086E+04 3 3 9.839E+03 1.367E+04 1.179E+04 4 4
F13 1.294E+02 1.293E+02 2.568E01 1 1 1.282E+02 1.281E+02 8.689E01 3 3 1.280E+02 1.279E+02 9.656E01 4 4 1.284E+02 1.284E+02 5.698E01 2 2
F14 2.970E+02 2.970E+02 4.697E01 1 1 2.966E+02 2.966E+02 3.081E01 3 3 2.967E+02 2.967E+02 3.428E01 2 2 2.965E+02 2.965E+02 4.417E01 4 4
F15 4.200E+02 3.931E+02 1.871E+02 1 1 4.809E+02 5.144E+02 1.372E+02 3 3 4.646E+02 5.107E+02 1.497E+02 2 2 5.348E+02 5.898E+02 2.049E+02 4 4
F16 2.389E+02 2.607E+02 7.756E+01 1 1 3.200E+02 3.204E+02 3.532E+01 2 2 3.390E+02 3.384E+02 4.508E+01 3 3 3.392E+02 3.698E+02 8.954E+01 4 4
F17 2.404E+02 2.427E+02 1.286E+01 1 1 3.406E+02 3.429E+02 3.509E+01 2 2 3.482E+02 3.674E+02 5.067E+01 3 3 3.788E+02 3.757E+02 6.247E+01 4 4
F18 8.100E+02 6.415E+02 2.996E+02 1 1 1.074E+03 1.067E+03 7.144E+01 2 2 1.083E+03 1.077E+03 4.475E+01 3 3 1.145E+03 1.153E+03 8.081E+01 4 4
F19 9.334E+02 8.102E+02 2.549E+02 1 1 1.085E+03 1.079E+03 3.743E+01 3 3 1.076E+03 1.063E+03 6.046E+01 2 2 1.169E+03 1.167E+03 4.693E+01 4 4
F20 9.015E+02 7.393E+02 2.823E+02 1 1 1.081E+03 1.082E+03 3.558E+01 2 2 1.092E+03 1.088E+03 4.809E+01 3 3 1.165E+03 1.167E+03 6.294E+01 4 4
F21 8.600E+02 1.006E+03 3.022E+02 1 1 1.559E+03 1.492E+03 1.854E+02 3 3 1.443E+03 1.350E+03 2.640E+02 2 2 1.603E+03 1.623E+03 1.001E+02 4 4
F22 1.139E+03 1.160E+03 6.049E+01 1 3 1.199E+03 1.055E+03 2.607E+02 2 2 1.200E+03 1.040E+03 2.614E+02 3 1 1.351E+03 1.364E+03 8.659E+01 4 4
F23 1.198E+03 1.235E+03 2.839E+02 1 1 1.522E+03 1.502E+03 1.135E+02 2 2 1.560E+03 1.510E+03 1.544E+02 3 3 1.662E+03 1.666E+03 6.915E+01 4 4
F24 4.600E+02 5.567E+02 1.402E+02 1 1 9.927E+02 9.959E+02 1.774E+02 3 3 9.234E+02 9.799E+02 1.960E+02 2 2 1.568E+03 1.539E+03 1.017E+02 4 4
F25 1.044E+03 9.698E+02 2.409E+02 3 3 6.818E+02 7.265E+02 1.390E+02 1 1 6.901E+02 8.182E+02 2.183E+02 2 2 1.883E+03 1.870E+03 1.248E+02 4 4
PSO_iw = 0_c1 = 0 PSO_iw = 0_c1 = 0_c2 = 1 PSO_iw = 0_c1 = 0 PSO_iw = 0_c1 = 0_c2 = 1

Median Mean Std Rank Median Mean Std Rank Median Mean Std Rank Median Mean Std Rank

F1 2.805E+03 3.507E+03 2.366E+03 5 5 4.897E+03 4.628E+03 2.015E+03 6 6 F14 2.959E+02 2.959E+02 3.096E01 5 5 2.959E+02 2.959E+02 4.045E01 6 6
F2 6.417E+03 6.410E+03 1.893E+03 5 5 8.122E+03 8.496E+03 3.118E+03 6 6 F15 7.434E+02 7.404E+02 1.763E+02 5 6 7.717E+02 7.398E+02 2.279E+02 6 5
F3 3.676E+07 4.669E+07 4.609E+07 5 5 5.972E+07 7.423E+07 6.366E+07 6 6 F16 4.041E+02 4.186E+02 9.497E+01 5 5 5.061E+02 5.190E+02 1.116E+02 6 6
F4 9.307E+03 9.824E+03 3.213E+03 6 6 8.576E+03 9.746E+03 3.602E+03 5 5 F17 4.962E+02 5.367E+02 1.514E+02 5 6 5.064E+02 5.168E+02 1.091E+02 6 5
F5 1.327E+04 1.349E+04 1.804E+03 5 5 1.536E+04 1.515E+04 1.392E+03 6 6 F18 1.190E+03 1.199E+03 8.644E+01 5 5 1.242E+03 1.250E+03 9.705E+01 6 6
F6 1.523E+08 3.350E+08 3.692E+08 4 4 9.912E+08 1.315E+09 1.099E+09 6 6 F19 1.223E+03 1.214E+03 8.063E+01 5 5 1.251E+03 1.270E+03 9.011E+01 6 6
F7 1.668E+03 1.644E+03 3.614E+02 5 5 2.686E+03 2.714E+03 2.745E+02 6 6 F20 1.208E+03 1.199E+03 6.517E+01 5 5 1.231E+03 1.233E+03 7.098E+01 6 6
F8 1.195E+02 1.195E+02 1.673E01 5 5 1.194E+02 1.194E+02 1.511E01 6 6 F21 1.657E+03 1.656E+03 6.577E+01 5 5 1.694E+03 1.695E+03 6.050E+01 6 6
F9 2.925E+02 2.900E+02 1.291E+01 5 5 2.818E+02 2.833E+02 1.285E+01 6 6 F22 1.430E+03 1.459E+03 1.100E+02 5 5 1.504E+03 1.518E+03 1.169E+02 6 6
F10 2.775E+02 2.749E+02 1.747E+01 5 5 2.525E+02 2.565E+02 1.696E+01 6 6 F23 1.685E+03 1.679E+03 7.214E+01 5 5 1.706E+03 1.696E+03 5.679E+01 6 6
F11 9.904E+01 9.901E+01 1.593E+00 5 5 1.003E+02 1.000E+02 1.656E+00 6 6 F24 1.579E+03 1.560E+03 9.150E+01 5 5 1.611E+03 1.589E+03 1.110E+02 6 6
F12 2.442E+04 2.781E+04 2.377E+04 5 5 4.649E+04 5.802E+04 3.484E+04 6 6 F25 2.040E+03 2.015E+03 9.557E+01 5 5 2.295E+03 2.294E+03 4.692E+01 6 6
F13 1.272E+02 1.268E+02 1.481E+00 5 5 1.265E+02 1.264E+02 1.183E+00 6 6

Table 3
Results obtained for 30-dimensional problems. Algorithms' ranking is provided in the Rank column according to the median (left) and mean (right) statistics.

PSO_iw BH BH100 PSO_iw = 0

Median Mean Std Rank Median Mean Std Rank Median Mean Std Rank Median Mean Std Rank

F1 4.500E+02 4.500E+02 5.241E08 1 1 1.604E+04 1.589E+04 1.097E+03 3 3 1.574E+04 1.581E+04 1.260E+03 2 2 3.779E+04 3.808E+04 7.105E+03 4 4
F2 4.500E+02 4.500E+02 6.637E05 1 1 1.290E+04 1.295E+04 5.347E+02 2 2 1.345E+04 1.347E+04 6.052E+02 3 3 3.550E+04 3.826E+04 9.706E+03 4 4
F3 4.788E+06 6.450E+06 6.145E+06 1 1 8.404E+07 8.310E+07 9.863E+06 2 2 8.620E+07 8.614E+07 1.080E+07 3 3 4.671E+08 4.747E+08 2.070E+08 4 4
F4 2.065E+02 1.421E+02 2.387E+02 1 1 2.230E+04 2.256E+04 2.151E+03 2 2 2.274E+04 2.278E+04 1.739E+03 3 3 4.515E+04 4.557E+04 1.112E+04 4 4
F5 3.402E+03 3.342E+03 9.045E+02 1 1 2.490E+04 2.492E+04 1.490E+03 3 3 2.446E+04 2.458E+04 1.256E+03 2 2 3.210E+04 3.120E+04 3.508E+03 4 4
F6 4.096E+02 4.192E+02 3.170E+01 1 1 1.660E+09 1.669E+09 1.912E+08 2 2 1.786E+09 1.809E+09 2.132E+08 3 3 9.966E+09 1.141E+10 4.303E+09 4 4
F7 1.800E+02 1.800E+02 1.443E02 1 1 1.799E+02 1.799E+02 9.758E02 2 2 1.789E+02 1.790E+02 2.085E01 3 3 7.786E+03 7.834E+03 7.683E+02 4 4

F8 1.191E+02 1.191E+02 6.435E02 3 3 1.191E+02 1.191E+02 7.686E02 2 2 1.192E+02 1.192E+02 1.032E01 1 1 1.190E+02 1.191E+02 6.132E02 4 4
F9 2.952E+02 2.960E+02 1.001E+01 1 1 1.216E+02 1.065E+02 4.651E+01 2 2 8.458E+01 8.690E+01 4.120E+01 4 4 9.805E+01 8.918E+01 3.402E+01 3 3
F10 2.653E+02 2.634E+02 2.530E+01 1 1 4.058E+01 3.739E+01 5.160E+01 2 2 4.606E+01 5.399E+01 4.443E+01 3 3 8.332E+01 9.461E+01 5.520E+01 4 4
F11 1.101E+02 1.101E+02 2.829E+00 1 1 1.270E+02 1.263E+02 2.890E+00 3 3 1.271E+02 1.266E+02 2.714E+00 4 4 1.258E+02 1.262E+02 3.043E+00 2 2
F12 4.868E+04 6.368E+04 4.835E+04 1 1 1.683E+05 1.614E+05 4.094E+04 2 2 1.886E+05 1.785E+05 3.952E+04 3 3 6.117E+05 6.246E+05 1.449E+05 4 4
F13 1.271E+02 1.270E+02 7.889E01 1 1 1.146E+02 1.145E+02 3.566E+00 3 3 1.150E+02 1.146E+02 4.835E+00 2 2 1.106E+02 1.098E+02 5.453E+00 4 4
F14 2.876E+02 2.878E+02 5.619E01 1 1 2.869E+02 2.869E+02 3.565E01 4 4 2.869E+02 2.870E+02 3.383E01 3 2 2.870E+02 2.869E+02 4.151E01 2 3
F15 5.104E+02 4.847E+02 7.553E+01 1 1 8.685E+02 8.836E+02 7.142E+01 2 2 8.852E+02 9.236E+02 9.241E+01 3 3 1.101E+03 1.092E+03 6.751E+01 4 4
F16 5.200E+02 4.485E+02 1.928E+02 1 1 7.036E+02 7.086E+02 8.273E+01 2 2 7.057E+02 7.145E+02 6.421E+01 3 3 9.053E+02 9.342E+02 1.475E+02 4 4
F17 2.756E+02 3.732E+02 1.864E+02 1 1 7.418E+02 7.512E+02 9.359E+01 2 3 7.596E+02 7.467E+02 6.806E+01 3 2 9.261E+02 9.159E+02 1.727E+02 4 4
F18 9.319E+02 9.341E+02 1.187E+01 1 1 1.199E+03 1.188E+03 8.086E+01 3 2 1.198E+03 1.201E+03 2.725E+01 2 3 1.264E+03 1.262E+03 4.228E+01 4 4
F19 9.322E+02 9.326E+02 3.046E+00 1 1 1.209E+03 1.174E+03 9.308E+01 3 2 1.205E+03 1.195E+03 6.179E+01 2 3 1.257E+03 1.260E+03 3.739E+01 4 4
F20 9.308E+02 9.316E+02 3.853E+00 1 1 1.189E+03 1.171E+03 7.536E+01 2 2 1.205E+03 1.191E+03 5.911E+01 3 3 1.261E+03 1.262E+03 3.530E+01 4 4
F21 8.600E+02 9.020E+02 1.395E+02 1 1 1.643E+03 1.644E+03 1.335E+01 3 3 1.640E+03 1.642E+03 1.644E+01 2 2 1.670E+03 1.671E+03 2.771E+01 4 4
F22 1.280E+03 1.283E+03 2.204E+01 1 1 1.649E+03 1.648E+03 4.332E+01 3 3 1.623E+03 1.625E+03 5.161E+01 2 2 1.666E+03 1.674E+03 6.528E+01 4 4
F23 8.942E+02 9.749E+02 1.643E+02 1 1 1.648E+03 1.650E+03 1.420E+01 3 3 1.648E+03 1.649E+03 1.709E+01 2 2 1.674E+03 1.678E+03 2.580E+01 4 4
F24 4.600E+02 6.191E+02 3.236E+02 1 1 1.622E+03 1.618E+03 1.842E+01 4 4 1.612E+03 1.610E+03 1.735E+01 3 2 1.610E+03 1.613E+03 2.524E+01 2 3
F25 4.901E+02 4.906E+02 9.178E+00 1 1 5.963E+02 9.347E+02 4.915E+02 2 2 6.311E+02 9.834E+02 5.131E+02 3 3 1.982E+03 1.982E+03 5.474E+01 4 4
PSO_iw = 0_c1 = 0 PSO_iw = 0_c1 = 0_c2 = 1 PSO_iw = 0_c1 = 0 PSO_iw = 0_c1 = 0_c2 = 1

Median Mean Std Rank Median Mean Std Rank Median Mean Std Rank Median Mean Std Rank

F1 3.994E+04 4.118E+04 6.442E+03 5 5 4.669E+04 4.734E+04 7.163E+03 6 6 F14 2.862E+02 2.862E+02 3.411E01 5 5 2.860E+02 2.861E+02 2.508E01 6 6
F2 5.360E+04 5.421E+04 1.091E+04 6 6 5.051E+04 5.227E+04 9.394E+03 5 5 F15 1.144E+03 1.130E+03 9.763E+01 5 5 1.188E+03 1.187E+03 1.003E+02 6 6
F3 6.092E+08 6.551E+08 2.550E+08 5 5 6.886E+08 7.705E+08 3.225E+08 6 6 F16 1.001E+03 9.866E+02 1.319E+02 5 5 1.030E+03 1.048E+03 1.304E+02 6 6
F4 6.174E+04 6.292E+04 1.378E+04 5 5 6.236E+04 6.445E+04 1.569E+04 6 6 F17 1.218E+03 1.186E+03 1.354E+02 5 5 1.238E+03 1.227E+03 1.559E+02 6 6
F5 3.268E+04 3.349E+04 4.083E+03 5 5 3.685E+04 3.732E+04 4.072E+03 6 6 F18 1.301E+03 1.295E+03 3.886E+01 5 5 1.315E+03 1.317E+03 3.236E+01 6 6
F6 1.398E+10 1.393E+10 4.134E+09 5 5 1.761E+10 1.828E+10 6.937E+09 6 6 F19 1.295E+03 1.297E+03 4.391E+01 5 6 1.298E+03 1.293E+03 4.509E+01 6 5
F7 9.104E+03 9.085E+03 5.866E+02 5 5 1.085E+04 1.099E+04 4.312E+02 6 6 F20 1.291E+03 1.289E+03 4.105E+01 5 5 1.303E+03 1.306E+03 3.367E+01 6 6
F8 1.190E+02 1.190E+02 1.235E01 5 5 1.189E+02 1.189E+02 8.768E02 6 6 F21 1.694E+03 1.691E+03 2.594E+01 5 5 1.699E+03 1.699E+03 2.548E+01 6 6
F9 3.660E+01 4.147E+01 3.381E+01 5 5 1.096E+01 1.357E+01 3.140E+01 6 6 F22 1.722E+03 1.739E+03 9.064E+01 5 5 1.790E+03 1.784E+03 9.149E+01 6 6
F10 1.704E+02 1.739E+02 6.251E+01 5 5 1.860E+02 1.818E+02 5.660E+01 6 6 F23 1.697E+03 1.692E+03 2.398E+01 5 5 1.706E+03 1.700E+03 2.468E+01 6 6
F11 1.298E+02 1.294E+02 3.554E+00 5 5 1.311E+02 1.305E+02 3.191E+00 6 6 F24 1.638E+03 1.639E+03 2.337E+01 5 5 1.641E+03 1.645E+03 3.032E+01 6 6
F12 8.399E+05 8.442E+05 2.173E+05 5 5 9.136E+05 9.591E+05 2.604E+05 6 6 F25 2.080E+03 2.077E+03 3.983E+01 5 5 2.154E+03 2.152E+03 3.304E+01 6 6
F13 1.010E+02 1.015E+02 7.650E+00 6 6 1.048E+02 1.028E+02 8.391E+00 5 5
Table 4
Results obtained for 50-dimensional problems. Algorithms' ranking is provided in the Rank column according to the median (left) and mean (right) statistics.

PSO_iw BH BH100 PSO_iw = 0

Median Mean Std Rank Median Mean Std Rank Median Mean Std Rank Median Mean Std Rank

F1 4.500E+02 4.500E+02 3.205E08 1 1 4.180E+04 4.165E+04 1.913E+03 2 2 4.225E+04 4.251E+04 1.692E+03 3 3 8.581E+04 8.520E+04 1.096E+04 4 4
F2 4.494E+02 4.466E+02 7.851E+00 1 1 3.419E+04 3.420E+04 9.896E+02 2 2 3.538E+04 3.531E+04 1.333E+03 3 3 1.077E+05 1.114E+05 2.322E+04 4 4
F3 3.528E+07 3.864E+07 1.951E+07 1 1 7.658E+08 7.694E+08 9.071E+07 2 2 8.338E+08 8.185E+08 8.401E+07 3 3 2.942E+09 2.847E+09 6.138E+08 4 4
F4 8.874E+03 9.612E+03 4.283E+03 1 1 5.425E+04 5.494E+04 3.855E+03 2 2 5.576E+04 5.500E+04 4.419E+03 3 3 1.078E+05 1.127E+05 2.614E+04 4 4
F5 7.768E+03 8.050E+03 1.671E+03 1 1 2.975E+04 2.969E+04 1.019E+03 3 3 2.932E+04 2.928E+04 9.564E+02 2 2 3.425E+04 3.426E+04 2.559E+03 4 4
F6 4.687E+02 4.582E+02 4.250E+01 1 1 4.993E+09 4.925E+09 4.649E+08 2 2 5.212E+09 5.220E+09 4.484E+08 3 3 3.187E+10 3.171E+10 7.294E+09 5 4
F7 1.800E+02 1.800E+02 1.541E02 1 1 1.799E+02 1.799E+02 4.955E02 2 2 1.789E+02 1.790E+02 2.203E01 3 3 1.228E+04 1.198E+04 1.056E+03 4 4

F8 1.189E+02 1.189E+02 5.385E02 3 3 1.190E+02 1.190E+02 8.659E02 2 2 1.190E+02 1.190E+02 7.408E02 1 1 1.189E+02 1.189E+02 4.450E02 5 5
F9 2.529E+02 2.434E+02 2.665E+01 1 1 9.529E+01 1.140E+02 5.688E+01 2 2 1.208E+02 1.459E+02 7.675E+01 3 3 2.155E+02 2.079E+02 4.430E+01 4 4
F10 2.265E+02 2.225E+02 2.053E+01 1 1 4.801E+02 4.893E+02 7.827E+01 3 3 4.715E+02 4.734E+02 8.845E+01 2 2 5.729E+02 5.617E+02 9.123E+01 4 4
F11 1.329E+02 1.332E+02 4.658E+00 1 1 1.574E+02 1.572E+02 3.388E+00 3 3 1.593E+02 1.584E+02 4.146E+00 4 4 1.570E+02 1.563E+02 3.565E+00 2 2
F12 2.969E+05 3.066E+05 1.458E+05 1 1 1.385E+06 1.382E+06 7.471E+04 2 2 1.414E+06 1.444E+06 1.247E+05 3 3 3.818E+06 3.873E+06 6.060E+05 4 4
F13 1.227E+02 1.227E+02 1.639E+00 1 1 9.404E+01 9.408E+01 9.187E+00 3 3 9.597E+01 9.535E+01 8.249E+00 2 2 6.632E+01 6.046E+01 2.094E+01 4 4
F14 2.787E+02 2.787E+02 7.446E01 1 1 2.774E+02 2.774E+02 4.433E01 3 3 2.773E+02 2.773E+02 3.744E01 4 4 2.776E+02 2.776E+02 6.305E01 2 2
F15 5.200E+02 4.897E+02 8.436E+01 1 1 9.486E+02 1.003E+03 1.079E+02 2 2 9.860E+02 1.036E+03 1.153E+02 3 3 1.224E+03 1.196E+03 9.318E+01 4 4
F16 2.596E+02 3.239E+02 1.361E+02 1 1 8.608E+02 8.647E+02 5.831E+01 2 2 8.865E+02 8.843E+02 7.702E+01 3 3 1.062E+03 1.059E+03 9.218E+01 4 4
F17 2.833E+02 3.179E+02 1.140E+02 1 1 9.192E+02 9.120E+02 8.340E+01 3 2 9.191E+02 9.336E+02 7.551E+01 2 3 1.049E+03 1.045E+03 1.192E+02 4 4
F18 9.625E+02 9.633E+02 1.307E+01 3 1 9.100E+02 9.860E+02 1.403E+02 2 2 9.100E+02 1.057E+03 1.612E+02 1 3 1.288E+03 1.293E+03 2.704E+01 4 4
F19 9.642E+02 9.529E+02 6.174E+01 3 1 9.100E+02 9.750E+02 1.323E+02 2 2 9.100E+02 1.060E+03 1.633E+02 1 3 1.286E+03 1.291E+03 2.582E+01 4 4
F20 9.635E+02 9.637E+02 7.019E+00 3 2 9.100E+02 9.548E+02 1.164E+02 2 1 9.100E+02 1.002E+03 1.437E+02 1 3 1.288E+03 1.295E+03 3.410E+01 4 4
F21 1.415E+03 1.228E+03 2.574E+02 1 1 1.707E+03 1.707E+03 1.721E+01 3 3 1.707E+03 1.705E+03 1.272E+01 2 2 1.749E+03 1.751E+03 2.119E+01 4 4
F22 1.332E+03 1.332E+03 1.659E+01 1 1 1.805E+03 1.802E+03 5.485E+01 3 3 1.794E+03 1.777E+03 5.080E+01 2 2 1.819E+03 1.823E+03 6.291E+01 4 4
F23 1.406E+03 1.240E+03 2.328E+02 1 1 1.705E+03 1.704E+03 1.536E+01 2 2 1.706E+03 1.706E+03 1.728E+01 3 3 1.735E+03 1.743E+03 2.838E+01 4 4
F24 4.600E+02 4.918E+02 1.744E+02 1 1 1.649E+03 1.646E+03 1.668E+01 3 3 1.639E+03 1.642E+03 1.771E+01 2 2 1.670E+03 1.665E+03 2.853E+01 4 4
F25 6.027E+02 8.513E+02 4.363E+02 1 1 1.660E+03 1.666E+03 4.890E+01 2 2 1.676E+03 1.668E+03 5.580E+01 3 3 2.144E+03 2.146E+03 4.143E+01 4 4
PSO_iw = 0_c1 = 0 PSO_iw = 0_c1 = 0_c2 = 1 PSO_iw = 0_c1 = 0 PSO_iw = 0_c1 = 0_c2 = 1

Median Mean Std Rank Median Mean Std Rank Median Mean Std Rank Median Mean Std Rank

F1 8.701E+04 8.850E+04 9.857E+03 5 5 9.128E+04 9.301E+04 1.008E+04 6 6 F14 2.764E+02 2.765E+02 3.725E01 6 6 2.765E+02 2.765E+02 3.316E01 5 5
F2 1.475E+05 1.479E+05 3.437E+04 5 5 1.767E+05 1.799E+05 4.888E+04 6 6 F15 1.271E+03 1.252E+03 1.065E+02 5 5 1.356E+03 1.357E+03 7.781E+01 6 6
F3 3.681E+09 3.745E+09 1.355E+09 5 5 4.624E+09 4.579E+09 1.206E+09 6 6 F16 1.122E+03 1.117E+03 1.179E+02 5 5 1.184E+03 1.190E+03 1.138E+02 6 6
F4 1.595E+05 1.851E+05 6.413E+04 5 5 1.773E+05 1.994E+05 7.345E+04 6 6 F17 1.288E+03 1.289E+03 1.225E+02 5 5 1.339E+03 1.351E+03 1.104E+02 6 6
F5 3.623E+04 3.622E+04 2.838E+03 5 5 3.889E+04 3.799E+04 3.096E+03 6 6 F18 1.314E+03 1.312E+03 2.963E+01 6 5 1.312E+03 1.313E+03 3.707E+01 5 6
F6 3.054E+10 3.332E+10 1.001E+10 4 5 3.591E+10 3.767E+10 9.071E+09 6 6 F19 1.301E+03 1.306E+03 2.865E+01 5 5 1.314E+03 1.312E+03 2.445E+01 6 6
F7 1.346E+04 1.353E+04 7.399E+02 5 5 1.536E+04 1.531E+04 4.364E+02 6 6 F20 1.317E+03 1.314E+03 2.719E+01 6 6 1.298E+03 1.303E+03 2.686E+01 5 5
F8 1.189E+02 1.189E+02 8.488E02 4 4 1.188E+02 1.188E+02 7.593E02 6 6 F21 1.768E+03 1.762E+03 2.285E+01 5 5 1.771E+03 1.769E+03 2.186E+01 6 6
F9 2.597E+02 2.593E+02 4.537E+01 5 5 3.077E+02 3.075E+02 5.008E+01 6 6 F22 1.889E+03 1.895E+03 7.203E+01 5 5 1.919E+03 1.948E+03 1.092E+02 6 6
F10 6.872E+02 6.858E+02 1.052E+02 5 5 6.960E+02 7.197E+02 1.126E+02 6 6 F23 1.765E+03 1.765E+03 1.969E+01 5 5 1.766E+03 1.766E+03 1.689E+01 6 6
F11 1.632E+02 1.629E+02 4.030E+00 5 5 1.645E+02 1.642E+02 4.804E+00 6 6 F24 1.699E+03 1.697E+03 3.130E+01 6 6 1.691E+03 1.693E+03 2.479E+01 5 5
F12 4.389E+06 4.405E+06 9.152E+05 5 5 5.049E+06 5.100E+06 9.623E+05 6 6 F25 2.220E+03 2.221E+03 3.564E+01 5 5 2.277E+03 2.273E+03 3.508E+01 6 6
F13 5.437E+01 4.798E+01 2.456E+01 5 5 4.812E+01 3.999E+01 2.754E+01 6 6


better than the expected performance of the others (possibly the more naïve ones). However, one may still expect that the performance of a very simple metaheuristic would be better than the performance of the more advanced ones for some among a number of problems that may be of interest. Such expectations are confirmed theoretically by the so-called Focused No Free Lunch theorems [49] and empirically for selected problems in [35,36]. Interestingly, it was shown in [36] that random sampling does outperform some elaborate Differential Evolution algorithms on the 10-dimensional problem 8 from [47], which is in accordance with the present results. Having said that, the better performance of the so-called black hole approach relative to its more elaborate ancestor (or to some other advanced metaheuristics), achieved for some specific problems, can no longer be surprising.
To conclude, it has been shown that the black hole approach is only a simplification of PSO with inertia weight, and hence it is not a novel approach and does not deserve a novel name. It has also been verified that this simplification leads to a substantial deterioration of the results for the vast majority of benchmark problems. However, one must also truthfully admit that for some problems a much simplified variant may perform better than its more elaborate ancestor, which is not unexpected according to [7,26,35,36,39].

5. Some comments on nature-inspired optimization methods

During the last fifty years, nature has become a source of inspiration for plenty of novel heuristic optimization algorithms [11,5]. However, the question whether any inspiration is needed at all to create an efficient optimization approach seems still unanswered [4]. If the inspiration is useful, can we say that some of its sources are better or worse than others? Are methods inspired by nature, or by living organisms, somehow privileged among metaheuristics? If not, why are they so plentiful today? Let us now discuss briefly, and from a few different points of view, the possible relevance of inspirations for optimization methods.
From the first, probably the simplest and most pragmatic point of view, each inspiration is equally good if it helps the inventors to develop successful methods. Such a pragmatic approach may be supported by the well-known fact that even the observation of living entities can only help us think metaphorically, thus enhancing our intuition and, as claimed in [45], giving the mind greater flexibility in exploring the solutions. Indeed, the complexity of living organisms, or of interactions between organisms, is by no means comparable to the complexity of any known metaheuristic, hence all claimed inspirations from nature in optimization algorithms are rather intuitive and vague. One may also be convinced by Fogel [18], who noted that there is virtually no limit to the types of variation operators that can be devised, nor any reason to be constrained by nature for inspiration.
However, other points of view are also worth attention. The laws of nature, including physical laws or the non-living bodies that exist in the universe, may be considered a better source of inspiration than, for example, the harmony of music [21] or philosophical concepts [1,7,26], as such laws are human-independent, not subjective, and lead to effects that are truly observed in the universe. For example, physical objects tend to low energy states, the gravitational force tends to flatten the planets' surfaces, and black holes devour nearby stars, clearing the space around them. Some may argue that such well-tested physical phenomena, realized in a number of algorithms like simulated annealing [27], the gravitational search algorithm [38] and the quantum-inspired gravitational search algorithm [43], black hole PSO [53], or chemical-reaction-inspired metaheuristics [31], lead to firmer inspirations than the subjective concepts created by human beings.
But from yet another point of view the living bodies may be the best sources of inspiration, as their survival depends on their own and their ancestors' past choices. Similarly, metaheuristics process the knowledge they accumulate in order to make, hopefully, the right choices. But why should algorithms based on pure physical or chemical laws lead to optimal results? The physical objects just follow the forces that act on them. They simply do not compete, do not accumulate any knowledge, do not make any choices. If one puts aside quantum mechanics and the lack of determinism in the universe, from a coarse point of view the fate of physical, non-living objects looks predictable. Physical objects do not have any impact on their fate, or on the selection of whether they survive or are soon destroyed. Possibly this is the reason why most of the novel heuristics have been motivated by the biological features of living creatures, like the process of evolution, immune systems, the behavior of swarms of animals, or other types of animal activity like, for example, cuckoo search [52]. Although, as discussed at the beginning of this section, the question whether such motivations are important remains an open issue [4], the biological inspirations may have some merit, as life as we know it has indeed been a subject of optimization for about four billion years. During that time the biological processes at different levels (organic chemical reactions, physiology, genetics, immune systems, senses, behaviors, etc.) were tuned in such a way that the individuals (both at the cell and the large-creature level) which were somehow better fitted to the environment, and were able to reproduce effectively, survived when the others disappeared. As the evolutionary principles that we may observe today allowed the species that followed them to survive (hence they may be termed successful), one may hope that using them as a source of inspiration may result in the construction of a successful optimizer. Moreover, all known living organisms belong to one of the species, and some form of cooperation between individuals has developed within many species. The behavior of animals that cooperate must help them to survive, otherwise the cooperation would be a waste of energy and there would be little reason for such a behavior to become widespread among individuals of a species without any gain. Hence it also seems to be a good source of inspiration for optimization methods. In other words, all living organisms must choose their response to the stimuli from the environment. The ones that make better choices more frequently, avoid fatal choices and are able to propagate the
knowledge about the rules that lead to good choices to the next generation have a higher chance to survive. In this respect, the living creatures are being optimized. This, however, cannot be said about non-living physical objects, such as black holes.
Possibly one of the proposed lines of reasoning may convince the reader, but it is expected that one may add many different arguments to such a discussion. An open question thus remains whether the terms drawn from the theories of the physical universe or of life, or from any other source of inspiration, are informative at all and should be used for intentionally human-designed algorithms. However, in our opinion, as so many novel metaheuristics are proposed each year and most of them are said to be inspired by very different entities, some discussion on the sources of their inspirations is needed, at least to better understand and systematize the algorithmic philosophies, which should not be coined only to allure the reader and make the paper more citable.

6. Conclusions

In the present paper the recently proposed black hole optimization approach [23] is presented as a simplification of the well-known Particle Swarm Optimization with inertia weight [40]. The theoretical discussion is supported by empirical tests performed on the popular CEC2005 benchmark set [47]. It is shown that the performance of the so-called black hole approach is, for the vast majority of the considered problems, noticeably inferior to the performance of Particle Swarm Optimization with inertia weight. However, for a few benchmarks the opposite results are obtained, which supports both theoretical and empirical claims, quite abundant in the literature [7,26,35,36,39], that very simple algorithms may sometimes win against the more elaborate ones. A more general question is posed: whether an idiosyncratic case of a more general method may be termed an alternative method bearing a completely different name. The authors express the concern that such an approach may jeopardize achievements in the field of Evolutionary Computing. In addition, a short debate on the usefulness of sources of inspiration in creating novel metaheuristic optimization algorithms is presented.

References

[1] J. Akhtar, B.B. Koshul, M.M. Awais, A framework for evolutionary algorithms based on Charles Sanders Peirce's evolutionary semiotics, Inf. Sci. 236 (2013) 93–108.
[2] T. Bäck, H.P. Schwefel, An overview of evolutionary algorithms for parameter optimization, Evol. Comput. 1 (1) (1993) 1–23.
[3] L. Bai, J. Liang, C. Sui, C. Dang, Fast global k-means clustering based on local geometrical information, Inf. Sci. 245 (2013) 168–180.
[4] H.G. Beyer, B. Sendhoff, I. Wegener, How to analyse evolutionary algorithms, Theoret. Comput. Sci. 287 (2002) 101–130.
[5] E. Bonabeau, M. Dorigo, G. Theraulaz, Inspiration for optimization from social insect behaviour, Nature 406 (6791) (2000) 39–42.
[6] I. Boussaid, J. Lepagnot, P. Siarry, A survey on optimization metaheuristics, Inf. Sci. 237 (2013) 82–117.
[7] F. Caraffini, F. Neri, G. Iacca, A. Mol, Parallel memetic structures, Inf. Sci. 227 (2013) 60–82.
[8] S.C. Chu, P.W. Tsai, Computational intelligence based on the behavior of cats, Int. J. Innovative Comput. Control 3 (1) (2007) 163–173.
[9] P. Civicioglu, Artificial cooperative search algorithm for numerical optimization problems, Inf. Sci. 229 (2013) 58–76.
[10] M. Crepinsek, S.H. Liu, L. Mernik, A note on teaching-learning-based optimization algorithm, Inf. Sci. 212 (2012) 79–93.
[11] L.N. de Castro, Fundamentals of natural computing: an overview, Phys. Life Rev. 4 (2007) 1–36.
[12] M. Dorigo, V. Maniezzo, A. Colorni, Ant system: optimization by a colony of cooperating agents, IEEE Trans. Syst. Man Cybern. Part B Cybern. 26 (1) (1996) 29–41.
[13] W. Du, B. Li, Multi-strategy ensemble particle swarm optimization for dynamic optimization, Inf. Sci. 178 (15) (2008) 3096–3109.
[14] R.C. Eberhart, J. Kennedy, A new optimizer using particle swarm theory, in: Proc. 6th Int. Symp. Micromachine Human Sci., Nagoya, Japan, 1995, pp. 39–43.
[15] M. El-Abd, Performance assessment of foraging algorithms vs. evolutionary algorithms, Inf. Sci. 182 (2012) 243–263.
[16] M.B. Ferraro, P. Giordani, On possibilistic clustering with repulsion constraints for imprecise data, Inf. Sci. 245 (2013) 63–75.
[17] I. Fister, I. Fister Jr., X.S. Yang, J. Brest, A comprehensive review of firefly algorithms, Swarm Evol. Comput. 13 (2013) 34–46.
[18] D.B. Fogel, What is evolutionary computation?, IEEE Spectr. 37 (2) (2000) 26–32.
[19] R. Forsati, M. Mahdavi, M. Shamsfard, M.R. Meybodi, Efficient stochastic algorithms for document clustering, Inf. Sci. 220 (2013) 269–291.
[20] J. Garcia-Nieto, E. Alba, Restart particle swarm optimization with velocity modulation: a scalability test, Soft Comput. 15 (2011) 2221–2232.
[21] Z.W. Geem, J.H. Kim, G.V. Loganathan, A new heuristic optimization algorithm: harmony search, Simulation 76 (2) (2001) 60–68.
[22] N. Hansen, A. Ostermeier, Adapting arbitrary normal mutation distributions in evolution strategies: the covariance matrix adaptation, in: Proc. IEEE Int. Conf. Evol. Comput., Nagoya, Japan, 1996, pp. 312–317.
[23] A. Hatamlou, Black hole: a new heuristic optimization approach for data clustering, Inf. Sci. 222 (2013) 175–184.
[24] S. He, Q.H. Wu, J.R. Saunders, Group search optimizer: an optimization algorithm inspired by animal search behavior, IEEE Trans. Evol. Comput. 13 (5) (2009) 973–990.
[25] J.H. Holland, Adaptation in Natural and Artificial Systems, University of Michigan Press, Ann Arbor, 1975.
[26] G. Iacca, F. Neri, E. Mininno, Y.S. Ong, M.H. Lim, Ockham's razor in memetic computing: three stage optimal memetic exploration, Inf. Sci. 188 (2012) 17–43.
[27] S. Kirkpatrick, C.D. Gelatt, M.P. Vecchi, Optimization by simulated annealing, Science 220 (1983) 671–680.
[28] M. Koppen, D.H. Wolpert, W.G. Macready, Remarks on a recent paper on the "No free lunch" theorems, IEEE Trans. Evol. Comput. 5 (3) (2001) 295–296.
[29] J.R. Koza, Genetic Programming: On the Programming of Computers by Means of Natural Selection, MIT Press, 1992.
[30] R.A. Krohling, E. Mendel, Bare bones particle swarm optimization with Gaussian or Cauchy jumps, in: Proceedings of the IEEE Congress on Evolutionary Computation (CEC), 2009, pp. 3285–3291.
[31] A.Y.S. Lam, V.O.K. Li, Chemical reaction inspired metaheuristic for optimization, IEEE Trans. Evol. Comput. 14 (3) (2010) 381–399.
[32] A.D. Masegosa, D.A. Pelta, J.L. Verdegay, A centralised cooperative strategy for continuous optimisation: the influence of cooperation in performance and behaviour, Inf. Sci. 219 (2013) 73–92.
[33] Z. Michalewicz, Genetic Algorithms + Data Structures = Evolution Programs, Springer, 1998.
[34] J.A. Nelder, R. Mead, A simplex method for function minimization, Comput. J. 7 (4) (1965) 308–313.
[35] M. Oltean, Searching for a practical evidence for the no free lunch theorems, in: Bioinspired Approaches to Advanced Information Technology, Springer, Lausanne, Switzerland, 2004.
[36] A.P. Piotrowski, Adaptive Memetic Differential Evolution with global and local neighborhood-based mutation operators, Inf. Sci. 241 (2013) 164–194.
[37] S. Rana, S. Jasola, R. Kumar, A review on particle swarm optimization algorithms and their applications to data clustering, Artif. Intell. Rev. 35 (3) (2011) 211–222.
[38] E. Rashedi, H. Nezamabadi-pour, S. Saryazdi, GSA: a gravitational search algorithm, Inf. Sci. 179 (2009) 2232–2248.
[39] C. Schumacher, M.D. Vose, L.D. Whitley, The no free lunch and problem description length, in: Proc. Genet. Evolut. Comput. Conf., 2001, pp. 565–570.
[40] Y. Shi, R.C. Eberhart, A modified particle swarm optimizer, in: Proceedings of the IEEE Congress on Evolutionary Computation (CEC), 1998, pp. 69–73.
[41] D. Simon, Biogeography-based optimization, IEEE Trans. Evol. Comput. 12 (6) (2008) 702–713.
[42] D. Simon, R. Rarick, M. Ergezer, D.W. Du, Analytical and numerical comparisons of biogeography-based optimization and genetic algorithms, Inf. Sci. 181 (7) (2011) 1224–1248.
[43] M. Soleimanpour-Moghadam, H. Nezamabadi-pour, M.M. Farsangi, A quantum inspired gravitational search algorithm for numerical function optimization, Inf. Sci. 267 (2014) 83–100.
[44] M. Srinivas, L.M. Patnaik, Genetic algorithms: a survey, Computer 27 (6) (1994) 17–26.
[45] K.C.B. Steer, A. Wirth, S.K. Halgamuge, The rationale behind seeking inspiration from nature, in: R. Chiong (Ed.), Nature-Inspired Algorithms for Optimization, SCI 193, Springer-Verlag, Berlin Heidelberg, Germany, 2009, pp. 51–76.
[46] R. Storn, K.V. Price, Differential evolution – a simple and efficient heuristic for global optimization over continuous spaces, J. Global Optim. 11 (4) (1997) 341–359.
[47] P.N. Suganthan, N. Hansen, J.J. Liang, K. Deb, Y.P. Chen, A. Auger, S. Tiwari, Problem Definitions and Evaluation Criteria for the CEC 2005 Special Session on Real-Parameter Optimization, Nanyang Technol. Univ., Singapore, Tech. Rep. KanGAL #2005005, IIT Kanpur, India, 2005.
[48] Y. Wang, B. Li, T. Weise, J. Wang, B. Yuan, Q. Tian, Self-adaptive learning based particle swarm optimization, Inf. Sci. 181 (20) (2011) 4515–4538.
[49] D. Whitley, J. Rowe, Focused no free lunch theorems, in: Proc. Genet. Evol. Comput. Conf., 2008, pp. 811–818.
[50] D.H. Wolpert, W.G. Macready, No free lunch theorems for optimization, IEEE Trans. Evol. Comput. 1 (1) (1997) 67–82.
[51] Y.L. Wu, T.F. Ho, S.J. Shyu, B.M.T. Lin, Discrete particle swarm optimization with scout particles for library materials acquisition, Sci. World J. (2013), http://dx.doi.org/10.1155/2013/636484. Art. ID 636484.
[52] X.S. Yang, S. Deb, Cuckoo search via Lévy flights, World Congr. Nat. Biologically Inspired Comput. 2009 (2009) 210–214.
[53] J. Zhang, K. Liu, Y. Tan, X. He, Random black hole particle swarm optimization and its application, in: IEEE International Conference on Neural Networks and Signal Processing, ICNNSP, 2008, pp. 359–365.
