
Simulated Annealing

Prepared by:
Engr. Ian Kent S. Bescoro
Annealing meaning in
Metallurgy
• At high temperature the movement of atoms increases (the metal is in a molten state), and vice versa.
• At low temperature the atoms become ordered and crystals develop with minimum energy.
• If the temperature is reduced slowly enough, the absolute minimum-energy state is reached.
Simulated Annealing
• Resembles the cooling of molten metal through annealing.
• The cooling phenomenon is simulated by controlling the temperature parameter in the Boltzmann probability distribution.
• For a system in thermal equilibrium at temperature T, the probability of being at energy state E can be represented as
P(E) = e^(-E/kT)
where k is the Boltzmann constant.
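The effect of the temperature parameter can be seen in a quick numerical sketch (with the Boltzmann constant taken as k = 1 purely for illustration): at a fixed energy, lowering T makes high-energy states sharply less probable.

```python
import math

def boltzmann_probability(E, T, k=1.0):
    """P(E) = e^(-E/kT): relative probability of a state with energy E."""
    return math.exp(-E / (k * T))

# At a fixed energy E = 10, the probability falls as the system cools:
for T in (100.0, 10.0, 1.0):
    print(f"T = {T:6.1f}  P(E) = {boltzmann_probability(10.0, T):.6f}")
```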
Motivation
• Heat the solid metal to a high temperature.
• Cool it down very slowly according to a specific schedule.
• If the heating temperature is high enough to ensure a random state, and the cooling is slow enough to maintain thermal equilibrium, the atoms will arrange themselves in the pattern that corresponds to the global energy minimum of a perfect crystal.
Metropolis method
• Metropolis introduced a method to implement the Boltzmann distribution.
• Suppose at some instant the current point is x(t) and the function value at that point is E(t) = f(x(t)).
• By the Metropolis algorithm, the probability of the next point being x(t+1) depends on the difference between the function values of the two points,
ΔE = E(t+1) - E(t)
and is calculated using the Boltzmann probability distribution:
P(E(t+1)) = min[1, e^(-ΔE/kT)]
Metropolis method
• If ΔE ≤ 0, the probability is 1 and the point x(t+1) is accepted.
• If ΔE > 0, the point x(t+1) is worse than x(t).
• In that case the point x(t+1) is accepted probabilistically:
• A random number r is drawn in the range (0, 1).
• If r ≤ e^(-ΔE/kT), the point is accepted and we set t = t+1. Otherwise the point x(t+1) is rejected and a new point is created for analysis.
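The acceptance rule above can be sketched as a small Python helper (a minimal sketch; k is folded into the temperature parameter T, and the optional r argument exists only to reproduce worked numbers):

```python
import math
import random

def metropolis_accept(delta_E, T, r=None):
    """Accept the candidate point with probability min(1, e^(-dE/T))."""
    if delta_E <= 0:
        return True              # improving (or equal) moves are always accepted
    if r is None:
        r = random.random()      # draw r in (0, 1)
    return r <= math.exp(-delta_E / T)

# A worsening move may still be accepted when r is small enough:
print(metropolis_accept(51.585, 202.5, r=0.649))  # True  (0.649 <= 0.775)
print(metropolis_accept(51.585, 202.5, r=0.900))  # False (0.900 >  0.775)
```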
Algorithm Procedure
1. Choose an initial point x0 and a termination criterion Є. Set T to a sufficiently high value, let n be the number of iterations performed at each temperature, and set t = 0.
2. Create a neighborhood point xt+1 = N(xt); usually a random point in the neighborhood is selected.
3. If ΔE = E(xt+1) - E(xt) < 0, set t = t+1;
Else create a random number r in the range (0, 1). If r ≤ e^(-ΔE/kT), set t = t+1;
Else go to step 2.
4. If |xt+1 - xt| < Є and T is small, terminate;
Else lower T according to the cooling schedule and go to step 2.
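The four steps above can be sketched as a one-dimensional loop (a minimal sketch, not the full procedure: k is folded into T, and the ±0.5 neighborhood, the halving cooling schedule, the fixed inner-loop length, and the demo function are all illustrative assumptions):

```python
import math
import random

def simulated_annealing(f, x0, T0, cooling=0.5, n_inner=50, T_min=1e-6, seed=0):
    """Minimize f starting from x0, cooling T0 geometrically down to T_min."""
    rng = random.Random(seed)
    x = x0
    T = T0
    while T > T_min:
        for _ in range(n_inner):                # n iterations at each temperature
            x_new = x + rng.uniform(-0.5, 0.5)  # step 2: random neighbour
            dE = f(x_new) - f(x)
            if dE < 0 or rng.random() <= math.exp(-dE / T):
                x = x_new                       # step 3: accept the point
        T *= cooling                            # step 4: lower the temperature
    return x

# Demo on a simple convex function with its minimum at x = 3:
x_best = simulated_annealing(lambda x: (x - 3.0) ** 2, 0.0, 100.0, seed=1)
print(round(x_best, 2))
```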
Cooling schedule
Convergence of simulated annealing
[Figure: cost function C versus number of iterations. At INIT_TEMP every move is accepted unconditionally; during cooling, hill-climbing moves are accepted with probability e^(-ΔC/T); at FINAL_TEMP essentially no worsening moves are accepted.]
Example
• Minimize the following problem using the simulated annealing method: minimize

f(x1, x2) = (x1^2 + x2 - 11)^2 + (x1 + x2^2 - 7)^2


• Step 1 - Iteration 1
– Choose an initial point x0 = (2.5, 2.5)^T and a termination factor Є = 10^-3.
– To find the initial temperature T, we take the average of the function values at the points (0,0), (0,5), (5,0), and (5,5), so T = 405. Set the initial iteration counter to t = 0.
• Step 2
– We create a point in the neighborhood of x0.
– We assume the neighborhood perturbations are Δx1 = 0.037 and Δx2 = -0.086.
– The new point is x1 = (2.537, 2.414)^T with function value f(x1) = 6.482. Initially f(x0) = 8.125.
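The quoted function values can be checked directly (a quick numerical sketch of the objective defined above, which is Himmelblau's function):

```python
def f(x1, x2):
    """f(x1, x2) = (x1^2 + x2 - 11)^2 + (x1 + x2^2 - 7)^2 (Himmelblau's function)."""
    return (x1 ** 2 + x2 - 11) ** 2 + (x1 + x2 ** 2 - 7) ** 2

print(f(2.5, 2.5))                # 8.125  -> f(x0)
print(round(f(2.537, 2.414), 3))  # 6.482  -> f(x1)
print(f(3.0, 2.0))                # 0.0    -> a global minimum at (3, 2)
```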
Step 3
Now ΔE = f(x1) - f(x0) = -1.643.
Since ΔE < 0, we accept the new point.
We increment the counter to t = 1 and proceed to step 4.
Step 4
Since x0 and x1 are not close enough, the termination criterion is not met and we cannot terminate.
One iteration is complete here.
To limit the number of iterations we halve the temperature, so the new T = 0.5 × 405 = 202.5.
• Step 2 - Iteration 2
– A new point is created in the neighborhood of x1.
– We assume the perturbations are Δx1 = -0.426 and Δx2 = -1.810.
– The new point is x2 = (2.072, 0.604)^T with f(x2) = 58.067.
• Step 3
– Now ΔE = f(x2) - f(x1) = 58.067 - 6.482 = 51.585.
– Since this quantity is positive, we use the Metropolis algorithm to decide whether to accept or reject the point.
– Assume a random number r = 0.649.
– The probability of accepting the new point is e^(-51.585/202.5) = 0.775.
– Since r < 0.775, the point is accepted.
– We set t = 2 and proceed.
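The acceptance probability used in this step is easy to verify (a quick check of the numbers quoted above):

```python
import math

dE, T, r = 51.585, 202.5, 0.649
p = math.exp(-dE / T)   # Boltzmann acceptance probability
print(round(p, 3))      # 0.775
print(r < p)            # True -> the point is accepted
```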
Step 4
The termination criterion is not met.
The iteration is complete.
So T = 0.5 × 202.5 = 101.25.

Step 2 - Iteration 3
The next point is found in the vicinity of the current point.
Here Δx1 = -0.103 and Δx2 = -2.812; x3 = (2.397, -0.312)^T and f(x3) = 51.287.
• Step 3
– Now ΔE = f(x3) - f(x2) = 51.287 - 58.067 = -6.780.
– We accept the point and set t = 3.
• Step 4
– The termination criterion is not satisfied.
– T = 0.5 × 101.25 = 50.625.
• Step 2 - Iteration 4
– Here Δx1 = -1.103 and Δx2 = -0.779.
– As before, x4 = (1.397, 1.721)^T and f(x4) = 60.666.
• Step 3
– ΔE = f(x4) - f(x3) = 9.379.
– Since this quantity is positive, we use the Metropolis algorithm to decide whether to accept or reject the point.
– Assume a random number r = 0.746 (drawn from the range 0-1).
– The probability of accepting the new point is e^(-9.379/50.625) = 0.831.
– Since r < 0.831, we accept the point and set t = 4.
• Step 4
– The termination criterion is not satisfied.
– Now T = 25.313. The iteration is complete.
• Step 2 - Iteration 5
– Here Δx1 = -1.707 and Δx2 = -0.550.
– Here x5 = (0.793, 1.950)^T and f(x5) = 76.697.
• Step 3
– Here ΔE = f(x5) - f(x4) = 16.031.
– Since this quantity is positive, we use the Metropolis algorithm.
– r = 0.793.
– e^(-16.031/25.313) = 0.531. Here r > 0.531, so we do not accept the point.
– We must now create a new point.
• Step 2 - Iteration 5 (new point)
– Here Δx1 = -0.809 and Δx2 = -0.411.
– Here x6 = (1.691, 2.089)^T and f(x6) = 37.514.
• Step 3
– Since x5 was rejected, the current point is still x4, so ΔE = f(x6) - f(x4) = -23.152.
– Since ΔE is negative, the point is accepted and t = 5.
• Step 4
– T = 12.656.
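Because x5 was rejected, the reference point for the second attempt in iteration 5 is still x4; the quoted ΔE follows directly (a quick check):

```python
f_x4, f_x6 = 60.666, 37.514
# x5 was rejected, so the current point is still x4:
dE = f_x6 - f_x4
print(round(dE, 3))  # -23.152
```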
CONCLUSION
• The quality of the final solution is not affected by the initial guess; a poor initial guess only increases the computational effort.
• The process continues until T is reduced to a small value.
• In the early stages of SA, almost any point is equally likely to be accepted.
• The search space is therefore well explored before the algorithm converges to an optimum solution.
• For sufficiently many iterations at each temperature and a sufficiently slow cooling rate, the algorithm guarantees convergence to the globally optimal solution.
Thank You!
