
2014 UKSim-AMSS 16th International Conference on Computer Modelling and Simulation

Improved Hybridized Bat Algorithm for Global Numerical Optimization


Adis Alihodzic
Faculty of Mathematics
University of Sarajevo
Sarajevo, Bosnia and Herzegovina
adis_mtkn@yahoo.com

Milan Tuba
Faculty of Computer Science
Megatrend University Belgrade
Belgrade, Serbia
tuba@ieee.org
individual agents give the intelligent global behaviour. The
most prominent classes of the swarm-intelligence systems are:
ant colony optimization (ACO), particle swarm optimization
(PSO), artificial bee colony (ABC), seeker optimization
algorithm (SOA), firefly algorithm (FA), cuckoo search (CS),
krill herd (KH) and bat algorithm (BA).
Particle swarm optimization is one of the first swarm
intelligence metaheuristics. It emulates the social behavior of fish
or birds [6]. Ant colony optimization simulates the foraging
behavior of ants. Many successful implementations of this
behavior of ants. Many successful implementations of this
approach can be found in the literature [7], [8], [9]. Artificial
bee colony was inspired by foraging behavior of honey bee
swarm. This metaheuristic was successfully applied to constrained problems [10], [11], [12], [13]. Seeker optimization
algorithm models the human search procedure that employs human
reasoning, interactions, and past experience. Global optimization problems were successfully solved with this technique
[14]. Firefly algorithm (FA) is based on the flashing patterns
and behaviour of fireflies. It has been proven that firefly
algorithm is very efficient in dealing with multimodal, global
optimization problems [15], [16], [17]. Cuckoo search is a
robust optimization technique that simulates search process by
utilizing Levy flights [18], [19], [20]. Social behavior of
Antarctic krills was inspiration for emergence of the krill herd
(KH) algorithm [21], [22].
Bat algorithm was introduced by Yang in 2010 [23]. It is a
metaheuristic search algorithm inspired by the echolocation
characteristic of bats [24], [25]. The primary purpose of the bats'
echolocation is to serve as a hunting strategy. Comprehensive
reviews of swarm intelligence involving the bat algorithm were
presented by Zhang and Wang [27] and by Huang et al. [26].
Furthermore, Tsai et al. proposed an evolved bat algorithm to
improve the performance of the standard BA with better efficiency
[28]. Basically, the bat algorithm is powerful at intensification,
but at times it may get trapped into some local optima, so that it
may not perform diversification very well. It was the main
reason for research and improvements of this standard bat
algorithm. One of the improvements to the standard bat
algorithm is the recently developed hybrid bat algorithm
(HBA) [29]. However, by analyzing the HBA algorithm it can
be seen that further improvements are possible. In this paper we
propose additional modifications and tuning of the HBA that
almost uniformly improve the results.
The rest of the paper is organized as follows. Section 2
briefly describes global numerical optimization problems, the
differential evolution (DE) algorithm, and the bat algorithm
(BA). Section 3 gives a detailed description of our proposed
modified bat-inspired differential evolution (MBDE)

Abstract: Swarm intelligence algorithms have been successfully applied to intractable optimization problems. The bat algorithm
is one of the latest optimization metaheuristics and research
about its capabilities and possible improvements is at an early
stage. This algorithm has recently been hybridized with
differential evolution and improved results were demonstrated
on standard benchmark functions for unconstrained optimization. In this paper, in order to further enhance the performance
of this hybridized algorithm, a modified bat-inspired differential
evolution algorithm is proposed. The modifications include operators for mutation and crossover and modified elitism during
selection of the best solution. It also involves the introduction of
new loudness and pulse rate functions in order to establish a better
balance between exploration and exploitation. We used the same
five standard benchmark functions to verify the proposed algorithm. Experimental results show that in almost all cases our
proposed method outperforms the hybrid bat algorithm.
Keywords: bat algorithm; swarm intelligence; metaheuristic
optimization; global optimization

I. INTRODUCTION

Optimization is present and applicable in almost all aspects
of human civilization. Many optimization problems in business
and industry belong to the group of intractable continuous or
discrete optimization problems. Different approaches have
been developed for solving such optimization problems.
A possible classification divides these approaches into
the following groups: classical methods, stochastic algorithms,
population based algorithms and other approaches. Classical
methods, for the same initial values, follow the same path and
always give the same final results. Stochastic algorithms are
based on the randomization and they give different final results
each time, even when starting from the same initial values [1].
Population based algorithms deal with a set of solutions and try
to improve them at each iteration step. Although differences
between individual runs exist, they find similar suboptimal
solutions each time. These algorithms can be classified into
evolutionary algorithms [2], [3] and swarm intelligence based
algorithms [4]. One of the most common representatives of
evolutionary algorithms is differential evolution (DE). The differential evolution algorithm is a stochastic search algorithm
with a self-organizing tendency and it does not use
derivatives [5].
Swarm intelligence is a research branch that models
populations of interacting agents. Although a centralized
component that controls the behaviour of individuals does not
exist in swarm intelligence systems, local interactions among

the differential weight F and the crossover probability Cr. In


this paper, we use the binomial DE/rand/1/bin scheme. The
outline of the basic DE algorithm can be described as shown in
Fig. 1.

algorithm. The experimental and comparative results of the


implemented algorithm are discussed in Section 4. Finally,
Section 5 is the conclusion.
II. RELATED WORK

In this section we present a brief background on
optimization problems, the differential evolution (DE)
algorithm, and the bat algorithm (BA).

Begin
Step 1: Initialization. Set the generation counter t=1;
Initialize the population x with n randomly generated
solutions; Set the weight F and crossover probability Cr.
Step 2: Repeat.
for i=1:n do
  Generate randomly three different integers a, b, c
  from {1,2,...,n} such that a ≠ b ≠ c ≠ i; Generate
  randomly an integer jr from {1,2,...,d};
  for j=1:d do
    if (rand <= Cr or j = jr) then
      vi,jt = xc,jt + F(xa,jt - xb,jt)
    else
      vi,jt = xi,jt
    end if
  end for
  Select and update solutions by Eq. (4)
end for
Update the counters such as t=t+1;
Step 3: Until t > MaxGeneration
Step 4: Post-process and output the best found solution
End
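To make the pseudocode above concrete, the following minimal Python sketch implements the same DE/rand/1/bin scheme (mutation by Eq. (2), binomial crossover by Eq. (3), greedy selection by Eq. (4)). The objective function, bounds, population size and parameter values shown here are illustrative assumptions, not the settings used in this paper.

import numpy as np

def de_rand_1_bin(f, bounds, n=20, F=0.5, Cr=0.9, max_gen=1000, seed=1):
    # Minimal DE/rand/1/bin sketch: mutation, binomial crossover, greedy selection.
    rng = np.random.default_rng(seed)
    d = len(bounds)
    low, high = np.array(bounds, dtype=float).T
    x = rng.uniform(low, high, (n, d))            # initial population
    fit = np.array([f(xi) for xi in x])
    for t in range(max_gen):
        for i in range(n):
            # three mutually different indices, all different from i
            a, b, c = rng.choice([k for k in range(n) if k != i], 3, replace=False)
            v = x[c] + F * (x[a] - x[b])          # donor vector, Eq. (2)
            jr = rng.integers(d)                  # at least one donor component survives
            mask = rng.random(d) <= Cr
            mask[jr] = True
            u = np.where(mask, v, x[i])           # binomial crossover, Eq. (3)
            u = np.clip(u, low, high)
            fu = f(u)
            if fu <= fit[i]:                      # greedy selection, Eq. (4)
                x[i], fit[i] = u, fu
    best = int(np.argmin(fit))
    return x[best], fit[best]

# Example usage on a simple sphere function:
# best_x, best_f = de_rand_1_bin(lambda z: float(np.sum(z**2)), [(-15, 15)] * 10)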

A. Optimization Problems
In mathematics and computer science, optimization
problems are those in which a set of unknowns {x1, ..., xn} has
to be determined so that an objective function f is minimized
(or maximized) while a number of constraints are satisfied. In
an optimization problem, the equalities and inequalities that
represent the limitations imposed on the decision-making
problem and that must be satisfied are called constraints. A
global optimization problem can be defined as follows:
$$\min_{x} f(x) \qquad (1)$$

where a point x* = {x1*, ..., xn*} from the set A of feasible points
in the search space is either a global minimum if f(x*) ≤ f(x), or a
global maximum if f(x*) ≥ f(x), for all x that belong to the set A.
Hence, the goal of the optimization process is to find the values of
the decision variables xk that yield a minimum or maximum of the
objective function f.
B. Differential Evolution Algorithm (DE)
Differential evolution (DE) was introduced by Storn and
Price [5]. It is a stochastic search algorithm with a self-organizing tendency. Thus, it is a population based, derivative-free algorithm. It uses a special recombination operator that
performs a linear combination of a number of individuals
(usually three) and one parent (which is subject to be replaced)
to create one child. The algorithm consists of three main parts:
mutation, crossover and selection.
Differential mutation randomly chooses three distinct
vectors xa, xb, xc at some cycle t, and then generates a donor
vector u^t by the following scheme:

$$u_i^t = x_c^t + F (x_a^t - x_b^t), \quad i = 1, \ldots, n \qquad (2)$$

Fig. 1. Differential evolution algorithm (DE)

C. Bat Algorithm (BA)


Bat algorithm is a new population based metaheuristic
approach proposed by Xin-She Yang [23]. The algorithm
exploits the so-called echolocation of bats. Echolocation is
a type of sonar that bats use to detect prey and to avoid
obstacles. Bats emit sound pulses and listen for the echoes that
reflect from obstacles, and they navigate by using the time delay
from emission to reflection. The pulse rate can be represented by a
value in the range from 0 to 1, where 0 means that there is no
emission and 1 means that the bats' emission rate is at its
maximum. In order to transform this behavior of bats into an
algorithm, Yang used three generalized rules:
- All bats use echolocation to sense distance, and they also know the surroundings in some magical way;
- Bats fly randomly with velocity vi at position xi with a fixed frequency fmin, varying wavelength and loudness A0 to search for prey. They can automatically adjust the wavelength of their emitted pulses and adjust the rate of pulse emission r from [0,1], depending on the proximity of their target;
- Although the loudness can vary in many ways, it is assumed that the loudness varies from a positive large value A0 to a minimum constant value Amin.

where F belongs to the closed interval [0,1] and denotes the
differential weight that scales the rate of modification.
Differential crossover is controlled by a crossover
probability Cr (Cr belongs to the interval [0,1]) and a
uniformly distributed random number ri from the interval [0,1].
Crossover can be expressed as:

$$v_{i,j}^t = \begin{cases} u_{i,j}^t, & \text{if } r_i \le C_r \\ x_{i,j}^t, & \text{otherwise} \end{cases} \qquad j = 1, \ldots, d \qquad (3)$$

Differential selection can be expressed as follows:

$$x_i^{t+1} = \begin{cases} v_i^t, & \text{if } f(v_i^t) \le f(x_i^t) \\ x_i^t, & \text{otherwise} \end{cases} \qquad (4)$$

It can be seen that the selection selects the solution with the
best fitness. Also, the overall search efficiency is controlled by


bats at this time step t. We can see that the scaling factor ε also
determines the direction and intensity of the random walk. Note that
the procedure for updating the velocity and position of the bats
is similar to that of the PSO algorithm. Thus, the BA algorithm
can be seen as a well-established balance between the standard
PSO algorithm and an intensive local search controlled by the
loudness and pulse rate. The local search is launched with a
proximity that depends on the rate ri of pulse emission for the i-th
bat. Since the loudness usually decreases once a bat has found its
prey, while the rate of pulse emission increases, the loudness
can be chosen as any value of convenience. Mathematically,
these characteristics are captured with the following equations:

Based on these approximations and idealizations, the basic
steps of the bat algorithm (BA) can be described as shown in
Fig. 2.
In the BA algorithm, initialization of the bat population is
performed randomly and each bat is defined by its location xit,
velocity vit, frequency fit, loudness Ait, and emission pulse
rate rit in a D-dimensional search space. New solutions xit
are generated by moving the virtual bats according to the
following equations:

$$f_i = f_{min} + (f_{max} - f_{min})\beta \qquad (5)$$

$$v_i^t = v_i^{t-1} + (x_i^t - x^*) f_i \qquad (6)$$

$$x_i^t = x_i^{t-1} + v_i^t \qquad (7)$$

where β is a random vector drawn from a uniform distribution
on the closed interval [0,1]. Here x* is the current global best
location (solution), which is located after comparing all the
solutions among all the bats. Initially, each bat is randomly
assigned a frequency drawn uniformly from the interval
[fmin, fmax]. In our implementation, we use fmin = 0 and fmax = 2.
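A minimal sketch of the bat movement described by Eqs. (5)-(7), assuming positions and velocities stored as NumPy arrays (all names are illustrative):

import numpy as np

def move_bats(x, v, x_star, f_min=0.0, f_max=2.0, rng=None):
    # One movement step per Eqs. (5)-(7).
    if rng is None:
        rng = np.random.default_rng()
    n = x.shape[0]
    beta = rng.random((n, 1))                  # random vector drawn uniformly from [0, 1]
    freq = f_min + (f_max - f_min) * beta      # Eq. (5): frequency assigned to each bat
    v = v + (x - x_star) * freq                # Eq. (6): velocity update towards the best
    x = x + v                                  # Eq. (7): position update
    return x, v, freq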

$$A_i^{t+1} = \alpha A_i^t \qquad (9)$$

$$r_i^{t+1} = r_i^0 (1 - e^{-\gamma t}) \qquad (10)$$

where α and γ are constants. Parameter α plays a similar role to
the cooling factor of a cooling schedule in simulated annealing.
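The loudness and pulse rate updates of Eqs. (9)-(10) can be sketched as follows; the values α = γ = 0.9 are used here only as placeholders.

import math

def update_loudness_and_pulse(A_i, r_i0, t, alpha=0.9, gamma=0.9):
    # Eq. (9): the loudness decays geometrically once better solutions are accepted.
    A_next = alpha * A_i
    # Eq. (10): the pulse rate grows from 0 towards its initial value r_i0 over time t.
    r_next = r_i0 * (1.0 - math.exp(-gamma * t))
    return A_next, r_next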
Hybridization of the bat algorithm (HBA) [29] improved the
results; however, the fine-tuning of parameters can be further
improved, as well as the functions for the pulse rate and loudness.
The HBA algorithm gives no guarantee that the best solutions
will not be corrupted by the mutation and crossover operators.
Since the improvements obtained by this algorithm are not
completely satisfactory for all the benchmark test problems, we
propose a new method that gives better results in terms of
accuracy, convergence speed and establishing a good balance
between diversification and intensification. Therefore, in this
paper, we merge two approaches to produce a new modified
bat-inspired differential evolution (MBDE) algorithm
according to the principles of the bat algorithm and differential
evolution. The experimental results indicate that the proposed
MBDE algorithm performs more efficiently and accurately
than the hybrid bat algorithm.

Begin
Step 1: Initialization. Set the generation counter t=1;
Initialize the population of n bats randomly, each bat
corresponding to a potential solution of the given
problem; Define loudness Ai, pulse frequency Qi and
the initial velocities vi (i = 1, 2, ..., N); Set pulse
rate ri.
Step 2: Repeat. Generate new solutions by adjusting
frequency, and updating velocities and
locations/solutions, Eqs. (5)-(7);
  if (rand > ri) then
    Select a solution among the best solutions;
    Generate a local solution around the selected
    best solution;
  end if
  Generate a new solution by flying randomly;
  if (rand < Ai && f(xi) < f(x*)) then
    Accept the new solutions;
    Increase ri and reduce Ai, Eqs. (9)-(10);
  end if
  Rank the bats and find the current best x*;
  t=t+1;
Step 3: Until the termination criterion is satisfied
or t > MaxGeneration
Step 4: Post-process the results and visualization.
End
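As a complement to the pseudocode above, the local search branch and the loudness-gated acceptance of the BA can be sketched as follows; the random walk corresponds to Eq. (8), and all names and bound handling are illustrative assumptions.

import numpy as np

def local_search_step(x_star, A_mean, low, high, rng):
    # Random walk around the current best solution, Eq. (8):
    # x_new = x* + eps * <A>, with eps drawn uniformly from [-1, 1].
    eps = rng.uniform(-1.0, 1.0, size=x_star.shape)
    return np.clip(x_star + eps * A_mean, low, high)

def maybe_accept(x_i, f_i, x_new, f_new, A_i, rng):
    # Loudness-gated acceptance used in Fig. 2: a better solution is accepted
    # only with probability proportional to the current loudness A_i.
    if rng.random() < A_i and f_new < f_i:
        return x_new, f_new, True
    return x_i, f_i, False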

III. MODIFIED BAT ALGORITHM

Based on the description of the DE and BA algorithms in the
previous section, we now describe how we merge them to obtain
the modified bat algorithm based on differential evolution,
which modifies the solutions with poor fitness in order to
achieve a good balance between intensification and diversification. The framework of the MBDE is presented in Fig. 3.
In order to increase the diversification of the BA, so as to
avoid trapping into local optima, the main modification of the
basic BA consists of adding the differential operators of mutation
and crossover from DE, with the aim of speeding up the convergence
of the BA while preserving the attractive characteristics of that
algorithm.
The difference between the MBDE and BA is that the
mutation and crossover operators are used to improve the
original BA by generating a new solution for each bat. Therefore,
the MBDE can efficiently explore the new search space with
the DE and exploit the population information with the BA. In
this way, the MBDE algorithm can avoid trapping into local
optima. This algorithm includes three modifications.
The first modification is that we use logarithmically equally
spaced numbers from 1 to 0 for the loudness and, instead of
Eq. (10), the formula rit+1 = ri0 (1 - pow(-t)) for the pulse rates. It will

Fig. 2. Bat algorithm (BA)

A random walk with direct exploitation is used for the local
search that modifies the current best solution according to the
equation:

$$x_{new} = x^* + \epsilon A^t \qquad (8)$$

where ε, drawn from the interval [-1,1], is a random scaling
factor, while A^t = <A_i^t> is the average loudness of all


be shown that the best results are obtained for the initial pulse
rate ri0 = 0.9, initial loudness A0 = 0.95 and γ = 0.9.
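One possible reading of this first modification is sketched below. Logarithmic spacing cannot reach 0 exactly, so a small positive floor is assumed for the last loudness value, and the pulse rate rule is written with γ as the base of the power term; both details are assumptions made only for illustration, using ri0 = 0.9, A0 = 0.95 and γ = 0.9 as reported above.

import numpy as np

def loudness_schedule(A0=0.95, floor=1e-3, max_gen=1000):
    # Logarithmically equally spaced loudness values decreasing from A0;
    # the positive floor is an assumption, since log spacing cannot reach 0.
    return A0 * np.logspace(0.0, np.log10(floor), max_gen)

def modified_pulse_rate(t, r0=0.9, gamma=0.9):
    # Assumed reading of the modified pulse rate rule: r grows from 0 towards r0.
    return r0 * (1.0 - gamma ** t)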

solutions, a new solution for each bat is generated locally using
a random walk by Eq. (8) when a random real number rand,
drawn from a uniform distribution, is greater than the pulse rate ri.
If rand ≤ ri, we use the mutation and crossover operators of
DE to update the new solution, in order to increase the diversity
of the bats and improve the search efficiency, as shown in Eq. (3).
Through testing benchmarks in Section IV.B, it was found
that setting the parameter of differential weight F to 0.75 and
crossover probability Cr to 0.95 produced the best results.
The third modification is the introduction of some elitism
into the MBDE in order to select the best solution. In this way
we prevent the best solutions from being corrupted by the
differential operators. Hence, the best solution is saved in
an auxiliary vector, and even if the differential operators ruin it,
we can revert to it if needed.
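A minimal sketch of this elitism, assuming the population and fitness values are NumPy arrays and the elite solution is kept in an auxiliary vector (names are illustrative):

import numpy as np

def elitist_restore(population, fitness, elite_x, elite_f):
    # If the differential operators have corrupted the stored best solution,
    # reinsert the saved elite over the current worst individual.
    if fitness.min() > elite_f:
        worst = int(np.argmax(fitness))
        population[worst] = elite_x.copy()
        fitness[worst] = elite_f
    # Refresh the saved elite from the (possibly repaired) population.
    best = int(np.argmin(fitness))
    return population[best].copy(), float(fitness[best])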

Begin
Step 1: Initialization. Set the generation counter t=1;
Initialize the population of n bats randomly, each bat
corresponding to a potential solution of the given
problem; Define loudnesses Ai, pulse rates ri, pulse
frequencies Qi, the initial velocities vi and ri0
(i = 1, 2, ..., N); Set the differential weight F and
crossover probability Cr;
Step 2: Evaluate the fitness value of each bat as
determined by the objective function.
Step 3: Repeat.
for i=1:N do
  Generate a new solution xit by adjusting the
  frequency fit and updating the velocity vit and
  solution xit-1, Eqs. (5)-(7).
  if (rand > rit) then
    Select a solution among the best solutions x*;
    Generate a local solution xut around the best
    selected solution, Eq. (8);
  else
    xut = xit; // keep previous solution
  end if
  Generate random integers a, b, c from {1,2,...,n}
  and jr from {1,2,...,d}, such that a ≠ b ≠ c ≠ i;
  for j=1:d do
    if (rand <= Cr or j = jr) then // crossover
      xdt(j) = xct(j) + F(xat(j) - xbt(j)); // mutation
    else
      xdt(j) = xit(j); // take components by Eq. (7)
    end if
  end for
  Evaluate the fitness of the offspring xut, xit, xdt;
  Select the offspring xlBt with the best fitness
  value among the above mentioned offspring;
  if (rand < A0 && f(xit) < f(xlBt))
    Accept the new solution xlBt;
    Increase ri and reduce Ai;
  end
  Sort the bats and find the current global best x*;
end for
t=t+1;
Step 4: Until the termination criterion is satisfied
or t > MaxGeneration
Step 5: Post-process the results and visualization.
End
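The following self-contained Python sketch renders one possible implementation of the loop above. The control parameters F = 0.75, Cr = 0.95, ri0 = 0.9 and A0 = 0.95 follow the values reported in the text, while the initialization, bound handling and exact update schedules are assumptions; the authors' implementation was written in C# and may differ in detail.

import numpy as np

def mbde(f, bounds, n=20, max_gen=1000, f_min=0.0, f_max=2.0,
         F=0.75, Cr=0.95, r0=0.9, A0=0.95, alpha=0.9, gamma=0.9, seed=1):
    # Sketch of the modified bat-inspired differential evolution (MBDE) loop of Fig. 3.
    rng = np.random.default_rng(seed)
    low, high = np.array(bounds, dtype=float).T
    d = len(bounds)
    x = rng.uniform(low, high, (n, d))
    v = np.zeros((n, d))
    fit = np.array([f(xi) for xi in x])
    A = np.full(n, A0)                           # loudness of each bat
    r = np.full(n, r0)                           # pulse rate of each bat
    best = int(np.argmin(fit))
    x_star, f_star = x[best].copy(), fit[best]   # elitist copy of the global best
    for t in range(1, max_gen + 1):
        for i in range(n):
            # Eqs. (5)-(7): frequency, velocity and position update.
            freq = f_min + (f_max - f_min) * rng.random()
            v[i] = v[i] + (x[i] - x_star) * freq
            xi = np.clip(x[i] + v[i], low, high)
            # Local search around the best solution (Eq. (8)) or keep the bat move.
            if rng.random() > r[i]:
                xu = np.clip(x_star + rng.uniform(-1, 1, d) * A.mean(), low, high)
            else:
                xu = xi
            # DE mutation and crossover (Eqs. (2)-(3)) on top of the bat move.
            a, b, c = rng.choice([k for k in range(n) if k != i], 3, replace=False)
            jr = rng.integers(d)
            mask = rng.random(d) <= Cr
            mask[jr] = True
            xd = np.clip(np.where(mask, x[c] + F * (x[a] - x[b]), xi), low, high)
            # Pick the offspring with the best fitness among the three candidates.
            candidates = [xi, xu, xd]
            values = [f(z) for z in candidates]
            k = int(np.argmin(values))
            # Loudness-gated acceptance, then pulse rate and loudness updates.
            if rng.random() < A[i] and values[k] < fit[i]:
                x[i], fit[i] = candidates[k], values[k]
                A[i] *= alpha
                r[i] = r0 * (1.0 - np.exp(-gamma * t))
            # Elitism: the stored global best is never corrupted by the operators.
            if fit[i] < f_star:
                x_star, f_star = x[i].copy(), fit[i]
    return x_star, f_star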

IV. BENCHMARK FUNCTIONS AND EXPERIMENTAL RESULTS

A. Benchmark Functions
In this paper, our test suite consists of five standard
benchmark test functions, as used in [29]. The test suite
includes many different types of problems such as unimodal,
multimodal, separable, non-separable, convex and continuous
functions. The unimodal functions contain a single global minimum
with no or a single local minimum, while multimodal functions
contain more than one local optimum. A separable function
of n variables can be written as a sum of n functions of just one
variable, while a nonseparable function of n variables cannot be
written in this form because its variables are not independent.
Griewank's function is a multimodal and nonseparable
function. It has the global minimum f(x*) = 0 located at the
point x* = (0, 0, ..., 0). It can be defined as:
$$f_1(x) = \frac{1}{4000}\sum_{i=1}^{D} x_i^2 - \prod_{i=1}^{D} \cos\left(\frac{x_i}{\sqrt{i}}\right) + 1 \qquad (11)$$

subject to -600 ≤ xi ≤ 600.


The Rosenbrock function is a unimodal and nonseparable
function. It has the value f(x*) = 0 at its global minimum,
which is located at x* = (1, 1, ..., 1). It can be defined as:
$$f_2(x) = \sum_{i=1}^{D-1}\left[100 (x_i^2 - x_{i+1})^2 + (x_i - 1)^2\right] \qquad (12)$$

subject to -15 ≤ xi ≤ 15.
The Sphere function is a unimodal, continuous, convex
and separable function. It has the value f(x*) = 0 at its global
minimum, which is located at x* = (0, 0, ..., 0). It can be defined
as:
$$f_3(x) = \sum_{i=1}^{D} x_i^2 \qquad (13)$$

Fig. 3. Modified bat algorithm

subject to -15 ≤ xi ≤ 15.

The second modification is to add the mutation and
crossover operators in an attempt to increase the diversity of the
bats, improve the search efficiency and speed up the
convergence to the optimum. For the local search
part, once a solution is selected among the current best

The Rastrigin function is a multimodal and additively
separable function, with the addition of cosine modulation to
produce many local minima. It has several local optima
arranged in a regular lattice, but it only has one global


optimum f(x*) = 0 located at the point x* = (0, 0, ..., 0). It can be
defined as:

$$f_4(x) = 10 D + \sum_{i=1}^{D} \left(x_i^2 - 10 \cos(2 \pi x_i)\right) \qquad (14)$$

subject to -15 ≤ xi ≤ 15.

second column, we use the statistical measures best, worst, mean
and standard deviation for comparing the obtained
results between the MBDE and HBA. In the remaining five
columns, the results of the MBDE and HBA algorithms are
presented for the five benchmark functions denoted as f1, f2, f3, f4,
and f5.
In all experiments for each algorithm the same size of
population of 20 bats was used and the algorithms were
repeated for 25 runs.
From Table I it can be seen that the obtained results are
evidently much better for our proposed method compared to
the HBA in terms of search precision and robustness. We can
note that our proposed method gave the best results for
multimodal test functions such as f1 and f4. In the case when
the dimensionality of the test functions is equal to 10, the MBDE
is much superior to the HBA. The reason for obtaining such
good results lies in the fine-tuning of the control parameters,
both in the DE and BA parts of the algorithm. Furthermore, these
results are also a consequence of the modified pulse rate and
loudness functions. One of the main reasons for achieving high
precision, especially for the mentioned functions f1 and f4, is the
adjusted elitism that keeps the best solutions from being corrupted
by the local search or by the mutation and crossover operators.
Introducing such elitism established a good balance between
exploration and exploitation.
Table I shows that the mean of the HBA algorithm is better
than the mean of the MBDE algorithm only once. In all other
cases the MBDE outperformed the HBA. It can be seen in Table I
that the best results for the MBDE are obtained by optimizing
the functions f1 and f4 with the dimension D=10, while the
worst results are obtained by optimizing the function f2 with the
dimension D=30. If we compare the results of the MBDE and
HBA algorithms, we can conclude that the performance of both
algorithms is very similar for problem f2 for the dimensions
D=20 and D=30.

The Ackley function is a multimodal, continuous and
nonseparable function. It is obtained by modulating an
exponential function with a cosine wave of moderate
amplitude. Originally, it was formulated by Ackley only for the
two-dimensional case. It is presented here in a generalized,
scalable version. Its topology is characterized by an almost flat
(due to the dominating exponential) outer region and a central
hole or peak where the modulations by the cosine wave
become more and more influential. It has the global minimum
f(x*) = 0 located at the point x* = (0, 0, ..., 0). It can be defined as:

$$f_5(x) = 20 + e - 20 e^{-0.2 \sqrt{\frac{1}{D}\sum_{i=1}^{D} x_i^2}} - e^{\frac{1}{D}\sum_{i=1}^{D} \cos(2 \pi x_i)} \qquad (15)$$

subject to -32 ≤ xi ≤ 32.
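For reference, the five benchmark functions (11)-(15) can be written compactly as follows (x is a NumPy array; the stated bounds apply):

import numpy as np

def griewank(x):      # f1, Eq. (11), -600 <= xi <= 600
    i = np.arange(1, len(x) + 1)
    return np.sum(x**2) / 4000.0 - np.prod(np.cos(x / np.sqrt(i))) + 1.0

def rosenbrock(x):    # f2, Eq. (12), -15 <= xi <= 15
    return np.sum(100.0 * (x[:-1]**2 - x[1:])**2 + (x[:-1] - 1.0)**2)

def sphere(x):        # f3, Eq. (13), -15 <= xi <= 15
    return np.sum(x**2)

def rastrigin(x):     # f4, Eq. (14), -15 <= xi <= 15
    return 10.0 * len(x) + np.sum(x**2 - 10.0 * np.cos(2.0 * np.pi * x))

def ackley(x):        # f5, Eq. (15), -32 <= xi <= 32
    d = len(x)
    return (20.0 + np.e
            - 20.0 * np.exp(-0.2 * np.sqrt(np.sum(x**2) / d))
            - np.exp(np.sum(np.cos(2.0 * np.pi * x)) / d))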
B. Experimental Results
The proposed method has been implemented in the C#
programming language. All tests were performed on an Intel Core
i7-3770K, 3.5 GHz with 16 GB of RAM. The experimental
results, both for the proposed MBDE method and for the hybrid bat
algorithm HBA, are summarized in Table I. In this article, we
used the same format of table as in paper [27], in order to make
the comparison of the obtained results easier.
TABLE I. RESULTS OF COMPARISON BETWEEN THE MBDE AND HBA ALGORITHMS OVER 25 MULTI-THREADED RUNS

Alg.   D    Val      f1          f2          f3          f4          f5
MBDE   10   Best     0           2.13E-14    4.56E-29    0           3.51E-14
MBDE   10   Worst    0           4.30E+00    4.91E-23    0           2.68E-12
MBDE   10   Mean     0           3.70E-01    2.63E-24    0           3.88E-13
MBDE   10   StDev    0           1.06E+00    9.10E-24    0           5.12E-13
MBDE   20   Best     0           1.73E-05    8.62E-23    0           8.56E-12
MBDE   20   Worst    5.55E-16    1.68E+01    4.87E-17    2.13E-14    4.76E-09
MBDE   20   Mean     1.48E-16    1.13E+01    2.44E-18    2.31E-15    7.75E-10
MBDE   20   StDev    1.06E-16    3.11E+00    8.76E-18    4.57E-15    1.20E-09
MBDE   30   Best     2.22E-16    7.24E+00    3.09E-18    0           1.62E-09
MBDE   30   Worst    1.32E-14    8.62E+01    1.36E-13    1.55E-13    3.07E-07
MBDE   30   Mean     9.77E-16    2.67E+01    1.40E-14    1.49E-14    4.23E-08
MBDE   30   StDev    2.30E-15    1.57E+01    3.42E-14    3.06E-14    6.31E-08
HBA    10   Best     2.25E-09    6.34E-02    4.83E-09    5.12E+00    6.31E-04
HBA    10   Worst    3.97E-05    5.10E+02    2.89E-03    2.38E+01    2.00E+01
HBA    10   Mean     3.18E-06    6.22E+01    1.26E-04    1.55E+01    1.16E+01
HBA    10   StDev    1.14E-07    7.73E+00    1.66E-07    1.69E+01    1.78E+01
HBA    20   Best     1.01E-07    9.73E-03    4.83E-04    1.89E-03    3.70E-05
HBA    20   Worst    2.96E+01    9.24E+01    5.47E+01    1.77E+01    5.48E+01
HBA    20   Mean     8.56E-07    1.10E-01    5.87E-03    2.18E-02    3.82E-05
HBA    20   StDev    2.17E+00    2.00E+01    1.60E+01    6.18E+00    1.95E+01
HBA    30   Best     6.38E-06    8.28E+00    3.37E-01    1.62E+00    5.43E-04
HBA    30   Worst    3.57E+01    2.17E+02    9.97E+01    3.98E+01    9.85E+01
HBA    30   Mean     6.42E-05    6.59E+01    3.09E+00    1.29E+01    2.53E-03
HBA    30   StDev    3.12E+00    2.00E+01    1.72E+01    5.03E+00    1.94E+01

V. CONCLUSION

In this paper we presented an improved hybridization of the
basic bat algorithm with the well-known DE algorithm. The bat
algorithm has previously been successfully hybridized with the
differential evolution algorithm (HBA) [29]. Our proposed
modified bat-inspired differential evolution (MBDE) algorithm
has been implemented and tested on the set of five standard
benchmark functions. From the analysis of the experimental
results we concluded that our proposed MBDE algorithm
generates better quality solutions on most multimodal and
unimodal benchmark problems compared to the HBA
algorithm and it also improves convergence rate without losing
the robustness of the basic bat algorithm. Therefore, the
proposed MBDE modifications, namely the operators
for mutation and crossover, the modified elitism during selection
of the best solution and the new loudness and pulse rate functions,
improve the performance over the HBA algorithm and show
that the bat algorithm, as a recently developed algorithm, has
potential for further improvements.
ACKNOWLEDGMENT

Table I consists of seven columns in order to facilitate side by
side comparison. In the first column, we show the dimensions
of the benchmark test functions: 10, 20 or 30, as in [29]. In the

This research was supported by the Ministry of Education,
Science and Technological Development of the Republic of
Serbia, Grant No. III-44006.


[15] I. Fister, I. Fister Jr., X. S. Yang, J. Brest, A comprehensive review of


firefly algorithms, Swarm and Evolutionary Computation, vol. 13, no.
1, pp. 34-46, 2013.
[16] A. Gandomi, X.-S. Yang, S. Talatahari, A. Alavi, Firefly algorithm
with chaos, Communications in Nonlinear Science and Numerical
Simulation, vol. 18, no. 1, pp. 89-98, 2013.
[17] X.-S. Yang, S. Hosseini, A. Gandomi, Firefly Algorithm for solving
non-convex economic dispatch problems with valve loading effect,
Applied Soft Computing, 2012, vol. 12. no. 3, pp. 1180-1186.
[18] I. Brajevic, and M. Tuba, Cuckoo search and firefly algorithm applied
to multilevel image thresholding, in Cuckoo Search and Firefly
Algorithm: Theory and Applications, ser. Studies in Computational
Intelligence, X.-S. Yang, Ed., vol. 516. Springer International
Publishing, 2014, pp. 115-139.
[19] E. Valian, S. Mohanna, S. Tavakoli, Improved Cuckoo Search
Algorithm for Global Optimization, International Journal of Communications and Information Technology, 2011, vol. 1, no. 1, pp. 31-44.
[20] X.S. Yang, S. Deb, Engineering optimization by cuckoo search,
International Journal of Mathematical Modeling and Numerical
Optimisation, vol. 1, no. 4, 2010, pp. 330-343.
[21] S. Saremi, S. M. Mirjalili, S. Mirjalili, Chaotic Krill Herd Optimization
Algorithm, Procedia Technology, 2014, vol. 12, pp. 180-185.
[22] A. H. Gandomi and A. H. Alavi, Krill herd: A new bio-inspired
optimization algorithm, Communications in Nonlinear Science and
Numerical Simulation, vol. 17, no. 12, 2012, pp. 4831-4845.
[23] X.-S. Yang, A new metaheuristic bat-inspired algorithm, in: Nature
Inspired Cooperative Strategies for Optimization (NISCO 2010) (Eds. J.
R. Gonzalez et al.), Springer, SCI 284, 2010, pp. 65-74.
[24] X. S. Yang, Bat algorithm for multi-objective optimisation,
International Journal of Bio-Inspired Computation, vol. 3, no. 5, 2011,
pp. 267-274.
[25] G. Wang, L. Guo, H. Duan, L. Liu, and H. Wang, A Bat algorithm with
mutation for UCAV path planning, The Scientific World Journal, 2012,
vol. 2012, Article ID 418946, doi:10.1100/2012/418946, 15 pages.
[26] G. Q. Huang, W. J. Zhao, Q. Q. Lu, Bat algorithm with global
convergence for solving large-scale optimization problem, Application
Research of Computers, 2013, vol. 30, no. 3, pp. 1-10.
[27] J. W. Zhang, G. G. Wang, Image matching using a bat algorithm with
mutation, Applied Mechanics and Materials, 2012, vol. 203, no. 1, pp.
88-93.
[28] P. W. Tsai, J. S. Pan, B. Y. Liao, M. J. Tsai, V. Istanda, Bat Algorithm
Inspired Algorithm for Solving Numerical Optimization Problems,
Applied Mechanics and Materials, 2012, vol. 148, pp. 134-137.
[29] I. Fister Jr., D. Fister, X.-S. Yang, A hybrid bat algorithm,
Elektrotehniški vestnik, vol. 80, 2013, pp. 1-7.

REFERENCES
[1] J. Dréo, P. Siarry, A. Pétrowski and E. Taillard, Metaheuristics for
Hard Optimization, Springer Berlin Heidelberg, 2006, p. 372.
[2] A. E. Eiben and J. E. Smith, Introduction to Evolutionary Computing,
Springer-Verlag, Berlin, 2003, p. 300.
[3] M. Črepinšek, S.-H. Liu, M. Mernik, Analysis of exploration and
exploitation in evolutionary algorithms by ancestry trees, Int. J. of
Innovative Computing and Applications, vol. 3, no. 1, pp. 11-19, 2011.
[4] X.-S. Yang, Z. H. Cui, R. B. Xiao, A. H. Gandomi, and M.
Karamanoglu, Swarm Intelligence and Bio-Inspired Computation:
Theory and Applications, Elsevier Science, USA, 2013, p. 450.
[5] R. Storn, K. Price, Differential evolution - a simple and efficient
heuristic for global optimization over continuous spaces, Journal of
Global Optimization, vol. 11, no. 4, 1997, pp. 341-359.
[6] J. Kennedy, R. C. Eberhart, Particle swarm optimization, in: IEEE
International Conference on Neural Networks, (Piscataway, NJ), 1995,
pp. 1942-1948.
[7] R. Jovanovic, and M. Tuba, An ant colony optimization algorithm with
improved pheromone correction strategy for the minimum weight vertex
cover problem, Applied Soft Computing, vol. 11, no. 8, pp. 5360-5366,
December 2011.
[8] R. Jovanovic, and M. Tuba, Ant colony optimization algorithm with
pheromone correction strategy for the minimum connected dominating
set problem, Computer Science and Information Systems (ComSIS),
vol. 10, no. 1, pp. 133-149, January 2013.
[9] M. Tuba, and R. Jovanovic, Improved ant colony optimization algorithm
with pheromone correction strategy for the traveling salesman problem,
International Journal of Computers, Communications & Control,
vol. 8, no. 3, pp. 477-485, June 2013.
[10] N. Bacanin, and M. Tuba, Artificial bee colony (ABC) algorithm for
constrained optimization improved with genetic operators, Studies in
Informatics and Control, vol. 21, no. 2, pp. 137-146, June 2012.
[11] I. Brajevic, and M. Tuba, An upgraded artificial bee colony algorithm
(ABC) for constrained optimization problems, Journal of Intelligent
Manufacturing, vol. 24, no. 4, pp. 729-740, August 2013.
[12] M. Tuba, and N. Bacanin, "Artificial bee colony algorithm hybridized
with firefly metaheuristic for cardinality constrained mean-variance
portfolio problem," Applied Mathematics & Information Sciences, vol.
8, no. 6, 2014.
[13] M. Subotic, and M. Tuba, "Parallelized Multiple Swarm Artificial Bee
Colony Algorithm (MS-ABC) for Global Optimization," Studies in
Informatics and Control, vol. 23, no. 1, pp. 117-126, 2014.
[14] M. Tuba, I. Brajevic, and R. Jovanovic, Hybrid seeker optimization
algorithm for global optimization, Applied Mathematics & Information
Sciences, vol. 7, no. 3, pp. 867-875, May 2013.

