
Genetic Algorithms and Genetic Programming Lecture 3

Gillian Hayes 28th September 2007


Lecture 3
Admin
Tutorial groups: look out for email from the ITO (Zoe Wyatt)
Preparation:
Read pp 1-27 of Mitchell
Read Whitley's GA tutorial, sections 1 and 2, in the lecture notes
(Notes and other reading are on the previous year's webpage)


Contents
Whitley's Canonical GA:
- representation
- evaluation/fitness
- selection
- crossover
- mutation
- an example

The Canonical GA
In the canonical GA, only 2 components are problem dependent:
- the problem encoding (representation)
- the evaluation function

Typical problem: parameter optimisation, i.e. find the parameters that maximise R.

Example problem: a mixing desk whose sliders set the inputs x1, x2, x3, x4 and whose output value is R. The output R = F(x1, x2, x3, x4) is a nonlinear function of the input values.

Interactions between parameters (epistasis) must be considered to maximise R.



Problem Encoding: Representation


In the canonical GA, solutions are encoded as binary integers (bit strings):
R = F(x1, x2, x3, x4)
Suppose
  x1: 0 to 31
  x2: 0 to 1023
  x3: 0 to 3
  x4: 0 to 3
Total bits required = ?
If the actual parameters are continuous, transform them into a discrete range 0 to (2^n - 1), e.g. 0 to 1023.
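Not from the slides, but as a minimal Python sketch of this encoding (the helper names encode/decode and the example values are mine), the four ranges can be packed into one bit string:

```python
# Pack the four parameters of R = F(x1, x2, x3, x4) into one bit string,
# using the ranges given on the slide (the parameter values are illustrative).
ranges = [31, 1023, 3, 3]                  # maximum of each parameter
bits = [v.bit_length() for v in ranges]    # bits needed per parameter

def encode(params):
    """Concatenate fixed-width binary representations of the parameters."""
    return "".join(format(p, "0{}b".format(b)) for p, b in zip(params, bits))

def decode(chrom):
    """Split the bit string back into integer parameters."""
    out, pos = [], 0
    for b in bits:
        out.append(int(chrom[pos:pos + b], 2))
        pos += b
    return out

chrom = encode([13, 500, 2, 1])
assert len(chrom) == sum(bits)             # the slide's "total bits required"
assert decode(chrom) == [13, 500, 2, 1]
```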

Problem Encoding: Representation

If we require a certain precision: suppose x is in the range [-1, 2] and we wish to have 6 decimal places of precision. Then we need 3000000 bins, hence a 22-bit chromosome:
  2^21 = 2097152 < 3000000 <= 4194304 = 2^22
The binary number for a given value of x in [-1, 2] is
  convert base 10 to binary((x + 1.0) x (2^22 - 1)/3)
What if the values form a finite set V where |V| != 2^n?
- map all 2^n binary values onto some value in V (look-up table), or
- give impossible binary values a low evaluation
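A sketch of this mapping in Python (the function names are mine, not from the slides):

```python
# Map a real x in [-1, 2] onto a 22-bit chromosome: 2^22 bins over a
# range of length 3 give better than 6-decimal-place precision.
N_BITS = 22
LO, HI = -1.0, 2.0

def real_to_chrom(x):
    step = (2**N_BITS - 1) / (HI - LO)       # (2^22 - 1) / 3, as on the slide
    return format(round((x - LO) * step), "0{}b".format(N_BITS))

def chrom_to_real(c):
    step = (2**N_BITS - 1) / (HI - LO)
    return LO + int(c, 2) / step

assert real_to_chrom(LO) == "0" * N_BITS     # -1 maps to all zeros
assert real_to_chrom(HI) == "1" * N_BITS     #  2 maps to all ones
assert abs(chrom_to_real(real_to_chrom(0.5)) - 0.5) < 1e-6
```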

Evaluation Function
The evaluation function gives a score to each set of parameters:
  f_i = F(x_1^i, x_2^i, x_3^i, ...)
for individual solution i. If f̄ is the average evaluation over the whole population of N individuals, then the fitness of i is f_i/f̄ (so it is a fitness relative to the average).
(Strictly, in the canonical GA, the fitness is this relative value. In GAs generally, you will find that "fitness" refers to f_i, f_i/f̄, f_i/sum_j f_j, or some other function of f_i, e.g. to turn it into a cost rather than a fitness.)

Evaluation Function

Example: f_i = x_i^2, x_i = 00000, ..., 11111 (i.e. a 5-bit string, 0 to 31).
So f_i for x_i = 01101 (13) is 169. If f̄ for the whole population is 142 (say), then:
  fitness(01101) = 169/142 = 1.19
So we put one copy of 01101 into the intermediate population (see later) and choose a second copy with probability 19%.
Now we get on to the other components of the canonical GA...
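The arithmetic on this slide can be checked directly in Python:

```python
# Check the slide's numbers: f_i = x_i^2 with x_i = 01101.
x = int("01101", 2)     # 13
f = x * x               # 169
f_bar = 142.0           # the population average assumed on the slide
rel = f / f_bar         # relative fitness
assert f == 169
assert round(rel, 2) == 1.19
```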

Selection
In the canonical GA, selection maps the current population of size N onto an intermediate population of size N:

  current population (N)  -> Selection ->  intermediate population (N)  -> Reproduction, Crossover, Mutation ->  next generation (N)

(The slide's diagram shows an example current population: 01011010, 11011011, 00110101, 10101110.)


Selection

Roulette wheel selection: each solution gets a chunk of the wheel proportional to its fitness (fitness-proportionate selection). Above-average solutions get more copies in the intermediate population than below-average ones.
The probability of selecting the solution with evaluation f_i is f_i / sum_j f_j.
Spin the wheel N times to get the N members of the intermediate population, each time selecting a chromosome with probability proportional to its fitness. This is selection with replacement, so one chromosome can be selected many times. Known as stochastic sampling with replacement.
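A sketch of stochastic sampling with replacement in Python (the function name is mine; the example population is taken from the worked example later in the lecture):

```python
import random

def roulette_select(pop, evals, n, rng=random):
    """Stochastic sampling with replacement: each spin picks individual i
    with probability f_i / sum_j f_j."""
    total = sum(evals)
    chosen = []
    for _ in range(n):
        spin = rng.uniform(0, total)      # where the wheel stops
        acc = 0.0
        for ind, f in zip(pop, evals):
            acc += f                      # chunk proportional to fitness
            if spin <= acc:
                chosen.append(ind)
                break
    return chosen

pop = ["01101", "11000", "01000", "10011"]
evals = [int(c, 2) ** 2 for c in pop]     # 169, 576, 64, 361
inter = roulette_select(pop, evals, 4)
assert len(inter) == 4 and set(inter) <= set(pop)
```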


Reproduction, Crossover and Mutation


The next generation is created by reproduction, crossover and mutation.
Select two parents at random from the intermediate population. With probability pc, apply crossover; with probability 1 - pc, copy the parents unchanged into the next generation (reproduction).
Crossover: from the 2 parents create 2 children using 1-point, 2-point or n-point crossover. Select the crossover point uniform-randomly:
  P1: 01101|011011101   ->   C1: 01101|101100100
  P2: 11001|101100100   ->   C2: 11001|011011101
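A Python sketch of 1-point crossover (the function name is mine); the fixed-cut lines reproduce the slide's example:

```python
import random

def one_point_crossover(p1, p2, rng=random):
    """Swap the tails of two equal-length parent bit strings at a random cut."""
    point = rng.randrange(1, len(p1))   # cut strictly inside the string
    return p1[:point] + p2[point:], p2[:point] + p1[point:]

# Reproduce the slide's example, with the cut fixed after bit 5:
p1 = "01101011011101"
p2 = "11001101100100"
c1, c2 = p1[:5] + p2[5:], p2[:5] + p1[5:]
assert c1 == "01101101100100"
assert c2 == "11001011011101"
```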

Reproduction, Crossover and Mutation


Mutation: take each bit in turn and, with Prob(mutation) = pm, flip it (0 -> 1, 1 -> 0). Usually pm < 0.01.
This is one generation. Repeat for many generations, until the solutions are optimal or good enough. So:
Repeat
  Evaluate fitness
  Select intermediate population
  Do crossover or reproduction
  Do mutation
Until solutions good enough
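The Repeat ... Until loop above can be sketched end-to-end in Python (all names and parameter values here are illustrative, not from the slides):

```python
import random

def canonical_ga(evaluate, n_bits, pop_size, pc, pm, generations,
                 rng=random):
    """Sketch of the loop on the slide: evaluate, select (roulette),
    crossover or copy, mutate; repeat for a fixed number of generations.
    pop_size is assumed even so parents pair up exactly."""
    pop = ["".join(rng.choice("01") for _ in range(n_bits))
           for _ in range(pop_size)]
    for _ in range(generations):
        evals = [evaluate(c) for c in pop]
        # Selection: stochastic sampling with replacement (a tiny epsilon
        # guards against a population whose evaluations are all zero).
        inter = rng.choices(pop, weights=[e + 1e-12 for e in evals],
                            k=pop_size)
        nxt = []
        for i in range(0, pop_size, 2):
            p1, p2 = inter[i], inter[i + 1]
            if rng.random() < pc:                 # crossover...
                pt = rng.randrange(1, n_bits)
                p1, p2 = p1[:pt] + p2[pt:], p2[:pt] + p1[pt:]
            nxt += [p1, p2]                       # ...else reproduction
        # Mutation: flip each bit independently with probability pm.
        pop = ["".join("10"[int(b)] if rng.random() < pm else b
                       for b in c) for c in nxt]
    return max(pop, key=evaluate)

random.seed(0)
best = canonical_ga(lambda c: int(c, 2) ** 2, n_bits=5, pop_size=20,
                    pc=0.7, pm=0.01, generations=50)
assert len(best) == 5 and set(best) <= {"0", "1"}
```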


An Example
Maximise y = x^2 for x in the range 0 to 31. (What is the answer?)
[Plot: y = x^2 for x from 0 to 31]

An Example


Represent x as 5 bits:
  00000   0    0
  00001   1    1
  00010   2    4
  ...
  11111   31   961
Use a population of size 4 (far too small!). Initial population:
  i   Value   Evaluation   % of Total (= prob. of being selected)
  1   01101   169          14.4
  2   11000   576          49.2
  3   01000   64           5.5
  4   10011   361          30.9
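These evaluations and selection probabilities can be reproduced in Python:

```python
# Reproduce the evaluation column and selection probabilities of the table.
pop = ["01101", "11000", "01000", "10011"]
evals = [int(c, 2) ** 2 for c in pop]       # f_i = x_i^2
total = sum(evals)                          # 1170
pct = [100 * f / total for f in evals]      # % of total
assert evals == [169, 576, 64, 361]
assert [round(p, 1) for p in pct] == [14.4, 49.2, 5.5, 30.9]
```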

An Example


Selection (roulette wheel shares):
  individual 1: 14.4%
  individual 2: 49.2%
  individual 3: 5.5%
  individual 4: 30.9%

So our intermediate population might be: 1, 2, 2, 4


An Example


Create the next generation: pc = 1.0, pr = 1 - pc = 0.0, so crossover is always applied here.
Pair parents randomly, choose crossover points randomly:
  Parents 1 and 2:   0110|1   ->   0110|0
                     1100|0   ->   1100|1
  Parents 2 and 4:   11|000   ->   11|011
                     10|011   ->   10|000

An Example


Mutation: pm = 0.001, 20 bits. No mutation here (20 x 0.001 = 0.02).
New population:
  i   Value   Evaluation   % of Total
  1   01100   144          8.21
  2   11001   625          35.63
  3   11011   729          41.6
  4   10000   256          14.6

What is the average evaluation of this population? How does it compare with the average evaluation of the previous generation?
Continue until there is no improvement in the best solution for k generations, or run for a fixed number of generations.
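The slide's question can be answered with a quick Python check on the two tables:

```python
# Average evaluation before and after one generation (data from the slides).
old = [169, 576, 64, 361]      # initial population
new = [144, 625, 729, 256]     # after crossover (no mutation occurred)
old_avg = sum(old) / len(old)  # 292.5
new_avg = sum(new) / len(new)  # 438.5
assert new_avg > old_avg       # the population improved on average
```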

An Example


How does it work? What is the best solution?


[Plot: y = x^2 for x from 0 to 31, with the maximum at x = 31]

Climbing up the fitness curve, putting together building blocks of good sub-solutions.
