
Dynamic Selection Parameter Based Differential Evolution Algorithm

Avjeet Singh
Research Scholar, Motilal Nehru National Institute of Technology Allahabad, Allahabad
avijeet.mnnit.cs@gmail.com

Abstract: Differential evolution (DE) is one of the most effective nature-inspired algorithms used to solve complex optimization problems. The standard DE algorithm starts with a large budget of function evaluations, which increases the computation cost. In this paper, a dynamic selection based parameter is used to enhance the performance of the differential evolution algorithm. The proposed selection parameter improves the convergence speed. Comparisons with other DE variants such as DE-F, JADEb, DE, DEAE, DEb, DE-PAL, CMAES and MVDE establish that the proposed dynamic selection parameter can enhance the performance of the differential evolution algorithm.

Index terms: Differential evolution algorithm, Endocrine, COCO platform

I. INTRODUCTION

Global optimization methods are applicable to many complex problems and aim to provide the best solution in terms of reaching the true optimum, a fast convergence rate, and a small number of control parameters, which makes them easy to use in real applications. In 1995, Storn and Price introduced the "Differential Evolution" (DE) algorithm, a population-based stochastic search algorithm for constrained and unconstrained continuous optimization problems. DE is a preferred optimization technique for the following reasons [1, 14]:

• DE has only three parameters, i.e. F (mutation), Cr (crossover) and NP (population size), to control the algorithm, and these are tuned to improve the performance of DE.
• DE has a reduced computation cost.
• DE is easy to implement for different applications such as engineering and statistics.

The basic differential evolution algorithm is depicted in Algorithm 1. Fundamentally, a DE search goes through four steps, i.e. initialization, mutation, crossover and selection.

Algorithm 1 Basic Differential Evolution
1: procedure DEA
2: Population initialization; evaluate the fitness value
3: Apply the mutation strategy
4: Generate a donor vector: v_{i,G} = {v_{1,i,G}, v_{2,i,G}, ..., v_{D,i,G}}
5: Apply the crossover strategy
6: Generate a trial vector: u_{i,G+1} = (u_{1,i,G+1}, u_{2,i,G+1}, ..., u_{D,i,G+1})
7: Apply the selection strategy
8: The better solution is kept
9: If the termination condition is met, stop
10: end procedure

To prevent solutions from converging prematurely to local rather than global optima, a new dynamic selection parameter scheme for DE is proposed in this paper. This variant is named the Dynamic Selection Parameter Based Differential Evolution Algorithm (DSPDE) because it tries to maintain the same environment that was created during the initial generations. To check the performance of the proposed dynamic selection based parameter, the algorithm is compared with other standard DE algorithms. The performance analysis shows that this variant outperforms the other variants of DE.
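The four steps of Algorithm 1 can be sketched in a few lines of Python. This is a minimal, illustrative DE/rand/1/bin implementation, not the authors' code; the sphere test function and the specific values F = 0.5, CR = 0.9 are assumptions for the sake of the example:

```python
import numpy as np

def de_rand_1_bin(f, bounds, NP=50, F=0.5, CR=0.9, max_gen=200, seed=0):
    """Minimal DE/rand/1/bin: initialization, mutation, crossover, selection."""
    rng = np.random.default_rng(seed)
    D = len(bounds)
    lo, hi = np.array(bounds, dtype=float).T
    # 1. Initialization: NP random D-dimensional vectors inside the bounds
    pop = lo + rng.random((NP, D)) * (hi - lo)
    fit = np.array([f(x) for x in pop])
    for _ in range(max_gen):
        for i in range(NP):
            # 2. Mutation: donor vector from three distinct random individuals
            r1, r2, r3 = rng.choice([j for j in range(NP) if j != i], 3, replace=False)
            donor = pop[r1] + F * (pop[r2] - pop[r3])
            # 3. Binomial crossover: build the trial vector
            j_rand = rng.integers(D)            # guarantees one donor component
            mask = rng.random(D) <= CR
            mask[j_rand] = True
            trial = np.clip(np.where(mask, donor, pop[i]), lo, hi)
            # 4. Selection: keep the better of trial and target
            f_trial = f(trial)
            if f_trial < fit[i]:
                pop[i], fit[i] = trial, f_trial
    return pop[fit.argmin()], fit.min()

# Example usage on the 5-D sphere function in the BBOB search space [-5, 5]
best_x, best_f = de_rand_1_bin(lambda x: np.sum(x**2), [(-5, 5)] * 5)
```

With this budget (50 individuals for 200 generations) the sphere function is driven very close to its optimum at the origin, which is the behaviour the introduction describes.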

The performance of DE relies on its control parameters, and tuning these parameters becomes difficult as the problem moves from low to high dimensions. An unsuitable parameter setting increases the number of generations needed for local and global search in high dimensions. Changing a parameter changes the overall behaviour; if the outcome is not good, the parameter must be changed again, and a fixed constant value is not effective at maintaining diversity on higher-dimensional problems. Using the dynamic selection parameter, the convergence rate of the DE algorithm can be improved.

The rest of the paper is organized as follows: Section II gives the basic concepts of DE; Section III reviews related work; Section IV explains the proposed approach using the dynamic selection based parameter; Section V describes the experimental results and the analysis comparing DE variants; and Section VI presents the conclusion and future work.

II. BASIC CONCEPTS OF DE
Differential Evolution [17] is a random population-based algorithm for continuous function optimization. Basically, a population of NP D-dimensional parameter vectors is involved in differential evolution, encoding the candidate solutions

x_{i,G} = {x_{1,i,G}, x_{2,i,G}, ..., x_{D,i,G}}

where i = 1, 2, 3, ..., NP, G is the generation index, D is the dimension, and the population consists of the NP parameter vectors x_{i,G}.

Mutation:

The mutation operator produces a donor vector v_{i,G} for each individual target vector x_{i,G} in the current population. For each target vector x_{i,G} at generation G, the corresponding donor vector

v_{i,G} = {v_{1,i,G}, v_{2,i,G}, ..., v_{D,i,G}}

is generated by

v_{i,G} = x_{r1,G} + F · (x_{r2,G} − x_{r3,G})

where r1 ≠ r2 ≠ r3 ≠ i are randomly chosen indexes r1, r2, r3 ∈ [1, NP]. F is a real number (F ∈ [0, 2]) that controls the amplification of the difference vector (x_{r2,G} − x_{r3,G}).

Crossover:

After completion of the mutation phase, crossover is used to generate a trial vector from each pair of target vector x_{i,G} and its corresponding donor vector v_{i,G}:

u_{i,G+1} = (u_{1,i,G+1}, u_{2,i,G+1}, ..., u_{D,i,G+1})

with each component given by

u_{j,i,G+1} = v_{j,i,G+1}, if r(j) ≤ CR or j = rn(i)
u_{j,i,G+1} = x_{j,i,G}, if r(j) > CR and j ≠ rn(i)

where j = 1, 2, ..., D, r(j) ∈ [0, 1] is the j-th evaluation of a uniform random number generator, CR ∈ [0, 1] is the crossover constant, and rn(i) ∈ {1, 2, ..., D} is a randomly picked index which guarantees that u_{i,G+1} gets at least one element from v_{i,G}.

Selection:

The selection scheme keeps the better of the target and trial vectors:

x_{i,G+1} = u_{i,G+1}, if f(u_{i,G+1}) < f(x_{i,G})
x_{i,G+1} = x_{i,G}, otherwise

If the trial vector u_{i,G+1} yields a better cost function value than the target x_{i,G}, then x_{i,G+1} is set to u_{i,G+1}; otherwise, the old value x_{i,G} is retained. Many variants of DE have since been developed; they are detailed in Section III.

III. RELATED WORKS

Differential evolution performs well on standard benchmark test functions and real-world optimization problems [14] such as multi-objective, non-convex, non-differentiable, non-linear constrained and dynamic problems [1, 14, 28]. The fundamental issues DE confronts are that mutation strategies and control parameters need to be decided well before DE is applied to a problem, and that every optimization problem needs a particular choice of mutation factor and control parameters [14].

To improve the performance and applicability of DE, numerous variants of DE for continuous, single-objective and multi-objective optimization problems, with various decision rules, have been produced; they are briefly described below.

Jason Teo [30] proposed DESAP with two procedures: an absolute encoding technique and a relative encoding strategy for the population size. The proposed algorithm with dynamically self-adapting population sizes performed better than regular differential evolution algorithms with static populations, and furthermore demonstrated that an absolute encoding strategy for the self-adapting population delivered a better outcome than the relative encoding approach.

In 2003 [15], a hybrid PSO termed DEPSO with a DE operator was proposed. It uses bell-shaped mutation conditioned on the population diversity along the evolution while retaining the self-organizing PSO dynamics.

To diminish the computational time and enhance the performance of DE, Tasoulis et al. [3] proposed its parallelization in a virtual parallel environment in 2004. The proposed model maps each subpopulation to a processor, enabling the diverse subpopulations to move autonomously towards a solution.

R. Mallipeddi et al. [31] proposed using an ensemble of mutation strategies and control parameters with DE (EPSDE). The authors maintain a pool of distinct mutation techniques along with a pool of values for each control parameter throughout the evolution procedure. The performance of EPSDE is assessed on a set of benchmark problems and compared with conventional DE and several state-of-the-art parameter-adaptive DE variants.

In 2014, Ruhul A. Sarker et al. [33] proposed dynamically selecting three sets of parameters: the amplification factor, the crossover rate and the population size, over the course of a single run. The performance of the proposed algorithm was analyzed on three sets of optimization test problems, two of them constrained and the third unconstrained. The proposed algorithm saves computational time by 14.48 percent in comparison with a DE algorithm using a single combination of parameters.

Price [10] proposed DE/current-to-rand/ using arithmetic recombination, which replaces the binomial crossover methodology with a rotationally invariant arithmetic line recombination operator that produces the trial vector using a linear combination of the target vector and its corresponding donor vector.

Mendes and Mohais [32] proposed DynDE for solving dynamic optimization problems using DE, with randomized F and CR parameters. Quantum individuals, Brownian individuals and entropic differential evolution are compared to each other for increasing the diversity during the run.

In 2003 [5], Fan and Lampinen proposed a variant of DE using a trigonometric mutation operator with a probability of Mt = 0.05 and the standard mutation strategy with a probability of (1 − Mt).

Omran et al. [11] presented SDE with a self-adaptive scaling factor parameter for the mutation factor F. The CR value for each individual was generated by the self-adaptive parameter from a normal distribution N(0.5, 0.15).

In 2008, Zhang, Min et al. [36] proposed DSS-MDE, dynamic stochastic selection within the framework of multimember differential evolution, to handle constrained problems. The approach adjusts the comparison probability dynamically, and the experiments show that the dynamic setting speeds up convergence.

In 2008, Das et al. [43] proposed a scheme in which, by borrowing the vector difference operator from DE, the performance of PSO is enhanced with regard to solution quality, time to find the solution (convergence speed), frequency of finding the optimal solution and scalability.

IV. PROPOSED APPROACH

DE performance depends on its control parameters, and these control parameters have to be changed over time. Differential evolution suffers from the problem that a change of control parameters behaves differently in different dimensions: as the number of generations increases, the local and global search shows that fixed control parameters do not cope well with increasing generations on different dimensions. This difficulty is removed by the dynamic selection based parameter. In this approach, the mutation (F) and crossover (Cr) values are changed: if the current setting does not produce a better solution, the dynamic selection based parameter is applied. When DE runs for an increasing number of generations of function evaluations, it loses diversity; maintaining diversity therefore requires the dynamic selection based parameter, which randomly chooses a constant value according to the population size. The initial function evaluations (increase or decrease) drive the dynamic selection based parameter. The proposed strategy follows the concept of (DSPDE \ rand \ 1); the procedure is explained in Algorithm 2.

Algorithm 2 (DSPDE \ rand \ 1)
1: procedure Start
2: Population initialization
3: Dynamic selection parameter:
4: F dynamic selection (0.1 to 2)
5: Cr dynamic selection (0.1 to 1)
6: Evaluate the fitness value
7: Mutation strategy
8: Generate a donor vector: v_{i,G} = x_{r1,G} + δ · (x_{r2,G} − x_{r3,G})
9: Crossover strategy
10: Generate a trial vector: apply the binomial crossover
11: Selection strategy
12: Evaluate the fitness of x_{i,G} and u_{i,G+1}
13: if un_improved_solutions ≥ ((5 * NP) / 5) then
14: counter_index = counter_index + 1
15: end if
16: if counter_index ≥ 3 then
17: x_{i,G} = (rand * rand / 2) * x_{i,G}
18: else if the fitness of u_{i,G} is better than the parent then
19: select it as the solution
20: end if
21: Choose the best vectors; if they do not improve, go back to the dynamic selection parameter
22: If the termination condition is met, stop
23: end procedure

1) Selection strategy:

Find the better candidate solution by comparing fitness across the population. If the counter index becomes greater than three, we use the stored vectors and see if they have better fitness. The better solution is saved to the archive, and the old solution is compared with the new solution; after the comparison, the best candidate solution replaces the old solution in the archive [7]. The procedure is explained in Algorithm 2.

2) Dynamic selection based parameter:

If the total number of vectors has not improved, the convergence rate and the diversity need to be improved, so the mutation factor and crossover value are changed. The dynamic parameter is then applied, with scale factor δ and crossover value Cr, where δ = random/2 + (0.0 to 1). The random value is selected based on the population size and dimension.
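One possible reading of the stagnation-triggered part of Algorithm 2 (steps 13 to 21) can be sketched as follows. The threshold of 3 stagnant generations, the re-draw ranges F ∈ [0.1, 2] and Cr ∈ [0.1, 1], and the counter names come from the algorithm listing; everything else (resetting the counter on progress, the class structure, the helper names) is an assumption for illustration, not the authors' implementation:

```python
import random

def dynamic_selection_parameters(rng=random):
    """Re-draw F and Cr as in steps 4-5 of Algorithm 2."""
    F = rng.uniform(0.1, 2.0)
    Cr = rng.uniform(0.1, 1.0)
    return F, Cr

class StagnationMonitor:
    """Counts generations in which the whole population failed to improve
    (steps 13-16); after `patience` such generations a reset is signalled."""
    def __init__(self, NP, patience=3):
        self.NP = NP
        self.patience = patience
        self.counter_index = 0

    def update(self, unimproved_solutions):
        # step 13: the listing's ((5 * NP) / 5) simplifies to NP,
        # i.e. no individual improved in this generation
        if unimproved_solutions >= (5 * self.NP) // 5:
            self.counter_index += 1
        else:
            self.counter_index = 0   # assumption: reset when progress is made
        return self.counter_index >= self.patience

# Example: after three fully stagnant generations, go back to step 3
monitor = StagnationMonitor(NP=50)
for gen in range(5):
    if monitor.update(unimproved_solutions=50):
        F, Cr = dynamic_selection_parameters()
```

The design point this sketch captures is that DSPDE does not keep F and Cr fixed for the whole run: stagnation across the population is what triggers a fresh random draw of both parameters.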
V. EXPERIMENTAL RESULTS

The COCO platform has been used for the 24 Black-Box Optimization Benchmarking (BBOB) noiseless test functions [4], which are listed in Table II. The variants are verified in the COCO framework with respect to the following criteria:

1) Comparing the proposed variant, the Dynamic Selection Parameter Based Differential Evolution Algorithm (DSPDE), with the DE variants DE-PAL, MVDE, DE, DEAE, CMAES, DE-F, JADEb and DEb on the dimensions 2D, 3D, 10D, 20D & 40D for all functions (f1-f24), the multi-modal functions with weak global structure (f20-f24) and the multi-modal functions (f15-f19).

2) Empirical (cumulative) distribution functions (ECDF) of run lengths in 20D.

3) Expected running time (ERT) loss ratio of the proposed variants in 20 dimensions.

TABLE I
CONTROL PARAMETERS

S.No.  Parameter                 Value
1.     Population size (NP)      50
2.     Mutation (scale factor)   δ ∈ [0, 2]
3.     Crossover rate            Cr ∈ [0, 1]
4.     Dimension (low)           2, 3 & 5
5.     Dimension (high)          10, 20 & 40

TABLE II
BENCHMARK FUNCTIONS

Group                                    f#   Function name
Separable                                F1   Sphere
                                         F2   Ellipsoidal
                                         F3   Rastrigin
                                         F4   Büche-Rastrigin
                                         F5   Linear Slope
Low or moderate conditioning             F6   Attractive Sector
                                         F7   Step Ellipsoidal
                                         F8   Rosenbrock, original
                                         F9   Rosenbrock, rotated
Unimodal with high conditioning          F10  Ellipsoidal
                                         F11  Discus
                                         F12  Bent Cigar
                                         F13  Sharp Ridge
                                         F14  Different Powers
Multi-modal with adequate                F15  Rastrigin
global structure                         F16  Weierstrass
                                         F17  Schaffers F7
                                         F18  Schaffers F7, moderately ill-conditioned
                                         F19  Composite Griewank-Rosenbrock F8F2
Multi-modal with weak                    F20  Schwefel
global structure                         F21  Gallagher's Gaussian 101-me Peaks
                                         F22  Gallagher's Gaussian 21-hi Peaks
                                         F23  Katsuura
                                         F24  Lunacek bi-Rastrigin

A. Testing Framework & Benchmark Functions

DSPDE is tested using the COCO framework. The search space is [-5, 5], i.e. the donor and trial vectors must lie inside the region [-5, 5]. The majority of the benchmark functions have their optima in the [-4, 4] region. The problems are solved as minimization. Fifteen instances of each function were taken. The terminal conditions are either exhausting the function evaluation budget or reaching the target accuracy. The population size is taken as 50 and is kept fixed. The control parameters are given in Table I. The maximum number of function evaluations FE is 10000*D, where D is the dimension. The experiments were done on a PC with an Intel Core i5 CPU at 3.40 GHz, 4 GB of installed memory (RAM), Windows 10 Pro 64-bit and an x64-based processor.

B. Result Analysis

The proposed variant DSPDE is compared with the standard DE algorithms JADEb [23], DE [16], DEAE [22], DEb [23], DE-PAL [24], CMAES [26], MVDE [25] and DE-F [27] on the dimensions 2D, 3D, 5D, 10D, 20D & 40D. The data sets for these standard DE algorithms are available at http://coco.gforge.inria.fr.
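The evaluation protocol described above (fifteen instances per function, a budget of 10000*D function evaluations) can be expressed as a small harness. This is an illustrative sketch only; the optimizer callable, its signature and the function registry are placeholders, not the COCO API:

```python
def run_experiment(optimizer, functions, dimensions,
                   instances=15, budget_per_dim=10_000):
    """Run each benchmark function `instances` times per dimension,
    giving the optimizer a budget of budget_per_dim * D evaluations."""
    results = {}
    for D in dimensions:
        budget = budget_per_dim * D          # FE = 10000 * D, as in the paper
        for f_id, func in functions.items():
            for inst in range(instances):
                # one seed per instance keeps the runs reproducible
                results[(f_id, D, inst)] = optimizer(func, D, budget, seed=inst)
    return results

# Placeholder optimizer that just echoes its budget (illustration only)
demo = run_experiment(lambda f, D, b, seed: b,
                      functions={"F1": lambda x: sum(xi * xi for xi in x)},
                      dimensions=[2, 3])
```

In a real benchmarking run the optimizer would be DSPDE itself and the functions would come from the COCO suite; the harness only shows how the per-dimension budget scales.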
Fig. 1. Comparison of the proposed and standard DE in 2-D
Fig. 2. Comparison of the proposed and standard DE in 3-D
Fig. 3. Comparison of the proposed and standard DE in 10-D
Fig. 4. Comparison of the proposed and standard DE in 20-D
Fig. 5. Comparison of the proposed and standard DE in 40-D
Fig. 6. Comparison of the proposed and standard DE in 2-D
Fig. 7. Comparison of the proposed and standard DE in 3-D
Fig. 8. Comparison of the proposed and standard DE in 10-D
Fig. 9. Comparison of the proposed and standard DE in 20-D
Fig. 10. Comparison of the proposed and standard DE in 40-D
Fig. 11. ECDF convergence graph for achieving target functions in 20-D
Fig. 12. ERT loss graph for achieving target functions in 20-D
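As background for curves like Fig. 11: an ECDF of run lengths plots, for each budget, the fraction of (function, target) pairs solved within that budget, with run lengths normalized by the dimension. A minimal sketch of this computation follows; the run-length values here are made-up placeholders, not the paper's results:

```python
import numpy as np

def ecdf_of_run_lengths(run_lengths, budgets, dimension):
    """Fraction of trials that reached their target within each budget,
    with run lengths expressed in function evaluations per dimension."""
    normalized = np.asarray(run_lengths, dtype=float) / dimension
    total = len(normalized)
    return [np.count_nonzero(normalized <= b) / total for b in budgets]

# Hypothetical run lengths (function evaluations) for 6 trials in 20-D;
# inf marks a trial whose target was never reached
runs = [400, 2000, 8000, 40000, 120000, np.inf]
budgets = [10, 100, 1000, 10000]        # evaluations / D
fractions = ecdf_of_run_lengths(runs, budgets, dimension=20)
```

Plotting `fractions` against `budgets` on a log axis gives the familiar monotone ECDF curve; unreached targets simply never contribute, which is why such curves can plateau below 1.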
1) Multi-modal with weak global structure (F20-F24):

The result analysis confirms the improvement of the convergence rate on the multi-modal functions with weak global structure. Comparing performance against the best-2009 reference on the observed target functions, the proposed DSPDE variant shows better performance in 2 dimensions (figure 1), 3 dimensions (figure 2), 10 dimensions (figure 3), 20 dimensions (figure 4) and 40 dimensions (figure 5).

2) All functions (F1-F24):

For the Black-Box Optimization Benchmark functions (F1-F24), the ERT (Expected Running Time) "best 2009" line is observed for each single target. The proposed approach is compared with the standard DE algorithms against the best-2009 line on the observed target functions. The improved convergence rate of the proposed DSPDE variant is verified in 2 dimensions (figure 6), 3 dimensions (figure 7), 10 dimensions (figure 8), 20 dimensions (figure 9) and 40 dimensions (figure 10).

3) ECDF:

The ECDF shows the running time, as the number of function evaluations divided by the search space dimension (D), for reaching the best achieved fopt + Δf with Δf = 10^n, where n = {1, -1, -4, -8}. These n values represent targets of increasing difficulty on the set of 24 benchmark function trials for the two algorithms, the best-2009 line and the proposed DSPDE variant. The ECDF run lengths of the two algorithms in 20 dimensions are shown in figure 11.

4) Expected Run Time performance:

The ERT (Expected Run Time) loss ratios for achieving the target precision over the expected run length, compared to the best-2009 line in 20 dimensions, are shown for the proposed variants in figure 12.

VI. CONCLUSION

In this paper, a dynamic selection based parameter is applied to improve the convergence rate in high-dimensional search spaces. Instead of letting the DE algorithm run to the maximum generation of function evaluations from the start, the dynamic selection approach is applied. This approach helps maintain diversity in the local and global search space.

REFERENCES

[1] S. Das and P. N. Suganthan, "Differential evolution: a survey of the state-of-the-art," IEEE Trans. Evol. Comput., vol. 15, no. 1, pp. 4-31, Feb. 2011.
[2] D. Zaharie, "Critical values for the control parameters of differential evolution algorithms," in Proc. 8th Int. Mendel Conf. Soft Comput., 2002, pp. 62-67.
[3] D. K. Tasoulis, N. G. Pavlidis, V. P. Plagianakos, and M. N. Vrahatis, "Parallel differential evolution," in Proc. Congr. Evol. Comput., 2004, pp. 2023-2029.
[4] http://coco.gforge.inria.fr/
[5] H.-Y. Fan and J. Lampinen, "A trigonometric mutation operation to differential evolution," J. Global Optimization, vol. 27, no. 1, pp. 105-129, 2003.
[6] I. Fajfar, J. Puhan, S. Tomažič, and Á. Bűrmen, "On selection in differential evolution," Elektrotehniški Vestnik, English ed., pp. 275-280, 2011.
[7] J. Zhang and A. C. Sanderson, "JADE: adaptive differential evolution with optional external archive," IEEE Trans. Evol. Comput., vol. 13, no. 5, pp. 945-958, 2009.
[8] J. Brest, S. Greiner, B. Bošković, M. Mernik, and V. Žumer, "Self-adapting control parameters in differential evolution: a comparative study on numerical benchmark problems," IEEE Trans. Evol. Comput., vol. 10, no. 6, pp. 646-657, Dec. 2006.
[9] J. Rönkkönen, S. Kukkonen, and K. V. Price, "Real-parameter optimization with differential evolution," in Proc. IEEE CEC, vol. 1, 2005, pp. 506-513.
[10] K. V. Price, "An introduction to differential evolution," in New Ideas in Optimization, D. Corne, M. Dorigo, and F. Glover, Eds. London, U.K.: McGraw-Hill, 1999, pp. 79-108.
[11] M. G. H. Omran, A. Salman, and A. P. Engelbrecht, "Self-adaptive differential evolution," in Proc. Comput. Intell. Security, Lecture Notes in Artificial Intelligence 3801, 2005, pp. 192-199.
[12] N. Noman and H. Iba, "Accelerating differential evolution using an adaptive local search," IEEE Trans. Evol. Comput., vol. 12, no. 1, pp. 107-125, Feb. 2008.
TABLE III
POSITION OF THE DSPDE VARIANT AMONG TEN STANDARD ALGORITHMS OVER THE BBOB BENCHMARK FUNCTIONS

Group                    Dimension  Proposed DE Variant  DE Variants  Other Optimization Algorithms
All functions (f1-f24)   2          1-DSPDE              2-DE         5-CMAES
                         3          1-DSPDE              2-DE-PAL     4-CMAES
                         10         1-DSPDE              2-JADEb      4-CMAES
                         20         1-DSPDE              2-DEAE       4-CMAES
                         40         1-DSPDE              2-DEAE      4-CMAES
[13] Q. Fan and X. Yan, "Self-adaptive differential evolution algorithm with discrete mutation control parameters," Expert Systems with Applications, vol. 42, no. 3, pp. 1551-1572, Feb. 2015.
[14] R. A. Sarker, S. M. Elsayed, and T. Ray, "Differential evolution with dynamic parameters selection for optimization problems," IEEE Trans. Evol. Comput., vol. 18, no. 5, pp. 689-707, Oct. 2014.
[15] R. Storn and K. Price, "Differential evolution: a simple and efficient adaptive scheme for global optimization over continuous spaces," International Computer Science Institute, Berkeley, CA, 1995.
[16] W. B. Langdon and R. Poli, "Evolving problems to learn about particle swarm optimizers and other search algorithms," IEEE Trans. Evol. Comput., vol. 11, no. 5, pp. 561-578, Oct. 2007.
[17] W. B. Langdon and R. Poli, Foundations of Genetic Programming. New York: Springer-Verlag, 2002.
[18] W. Yi, L. Gao, X. Li, and Y. Zhou, "A new differential evolution algorithm with a hybrid mutation operator and self-adapting control parameters for global optimization problems," Applied Intelligence, vol. 42, pp. 642-660, 2015.
[19] X. Zhou, Z. Wu, H. Wang, and S. Rahnamayan, "Enhancing differential evolution with role assignment scheme," Soft Computing, vol. 18, pp. 2209-2225, 2014.
[20] Z. Yang, J. He, and X. Yao, "Making a difference to differential evolution," in Advances in Metaheuristics for Hard Optimization. Berlin, Germany: Springer, 2007, pp. 415-432.
[21] P. Pošík and V. Klemš, "Benchmarking the differential evolution with adaptive encoding on noiseless functions," in GECCO '12, Philadelphia, PA, USA, July 2012.
[22] P. Pošík and V. Klemš, "JADE, an adaptive differential evolution algorithm, benchmarked on the BBOB noiseless testbed," in GECCO '12, Philadelphia, PA, USA, July 2012.
[23] L. Pál, "Benchmarking a hybrid multi-level single linkage algorithm on the BBOB noiseless testbed," in GECCO '13, Amsterdam, Netherlands, July 2013.
[24] V. Veloso de Melo, "Benchmarking the multi-view differential evolution on the noiseless BBOB-2012 function testbed," in GECCO '12, Philadelphia, PA, USA, July 2012.
[25] F. Hutter, H. Hoos, and K. Leyton-Brown, "An evaluation of sequential model-based optimization for expensive blackbox functions," in GECCO '13, Amsterdam, Netherlands, July 2013.
[26] R. Tanabe and A. Fukunaga, "Tuning differential evolution for cheap, medium, and expensive computational budgets," in CEC-BBOB-2015, 2015.
[27] K. Price, R. Storn, and J. Lampinen, Differential Evolution: A Practical Approach to Global Optimization. Berlin, Germany: Springer, 2005.
[28] R. A. Sarker, S. M. Elsayed, and T. Ray, "Differential evolution with dynamic parameters selection for optimization problems," IEEE Trans. Evol. Comput., vol. 18, no. 5, 2014.
[29] T. Bäck and H.-P. Schwefel, "An overview of evolutionary algorithms for parameter optimization," Evolutionary Computation, vol. 1, no. 1, pp. 1-23, 1993.
[30] J. Teo, "Exploring dynamic self-adaptive populations in differential evolution," Soft Computing, vol. 10, no. 8, pp. 673-686, 2005.
[31] R. Mallipeddi et al., "Differential evolution algorithm with an ensemble of parameters and mutation strategies," Applied Soft Computing, vol. 11, no. 2, pp. 1679-1696, 2011.
[32] R. Mendes and A. S. Mohais, "DynDE: a differential evolution for dynamic optimization problems," in Proc. 2005 IEEE Congress on Evolutionary Computation, vol. 3, IEEE, 2005.
[33] R. A. Sarker, S. M. Elsayed, and T. Ray, "Differential evolution with dynamic parameters selection for optimization problems," IEEE Trans. Evol. Comput., vol. 18, no. 5, pp. 689-707, 2014.
[34] A. K. Qin, V. L. Huang, and P. N. Suganthan, "Differential evolution algorithm with strategy adaptation for global numerical optimization," IEEE Trans. Evol. Comput., vol. 13, no. 2, pp. 398-417, Apr. 2009. doi: 10.1109/TEVC.2008.927706.
[35] R. Storn and K. Price, "Differential evolution: a simple and efficient heuristic for global optimization over continuous spaces," Journal of Global Optimization, vol. 11, no. 4, pp. 341-359, 1997.
[36] M. Zhang, W. Luo, and X. Wang, "Differential evolution with dynamic stochastic selection for constrained optimization," Information Sciences, vol. 178, no. 15, 2008.
[37] J. Brest et al., "High-dimensional real-parameter optimization using self-adaptive differential evolution algorithm with population size reduction," in Proc. 2008 IEEE Congress on Evolutionary Computation (CEC 2008), IEEE, 2008.
[38] S. M. Islam et al., "An adaptive differential evolution algorithm with novel mutation and crossover strategies for global numerical optimization," IEEE Trans. Syst., Man, Cybern. B (Cybernetics), vol. 42, no. 2, pp. 482-500, 2012.
[39] F. Neri and V. Tirronen, "Recent advances in differential evolution: a survey and experimental analysis," Artificial Intelligence Review, vol. 33, no. 1-2, pp. 61-106, 2010.
[40] A. R. Yildiz, "A new hybrid differential evolution algorithm for the selection of optimal machining parameters in milling operations," Applied Soft Computing, vol. 13, no. 3, pp. 1561-1566, 2013.
[41] Y. Lu et al., "An adaptive hybrid differential evolution algorithm for dynamic economic dispatch with valve-point effects," Expert Systems with Applications, vol. 37, no. 7, pp. 4842-4849, 2010.
[42] J. Brest et al., "Dynamic optimization using self-adaptive differential evolution," in Proc. 2009 IEEE Congress on Evolutionary Computation (CEC'09), IEEE, 2009.
[43] X. Yuan et al., "A hybrid differential evolution method for dynamic economic dispatch with valve-point effects," Expert Systems with Applications, vol. 36, no. 2, pp. 4042-4048, 2009.
[44] S. Das, A. Abraham, and A. Konar, "Particle swarm optimization and differential evolution algorithms: technical analysis, applications, and hybridization perspectives," in Advances in Computational Intelligence in Industrial Systems, 2008, pp. 1-38.
[45] Y. Lu et al., "Chaotic differential evolution methods for dynamic economic dispatch with valve-point effects," Engineering Applications of Artificial Intelligence, vol. 24, no. 2, pp. 378-387, 2011.
