Prepared by:
Engr. Ian Kent S. Bescoro
Annealing in Metallurgy
• At high temperature (in the molten state), the movement of atoms is vigorous, and vice versa.
• At low temperature, the atoms become ordered and crystals form with minimum energy.
• If the temperature is reduced slowly enough, the absolute minimum-energy state is reached.
Simulated Annealing
• Resembles the cooling of molten metal through annealing.
• The cooling phenomenon is simulated by controlling the temperature parameter T in the Boltzmann probability distribution.
• For a system in thermal equilibrium at temperature T, the probability of a state with energy E can be represented as P(E) = e^(-E/kT), where k is the Boltzmann constant.
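The effect of the temperature parameter can be illustrated with a short sketch (the energy and temperature values below are illustrative assumptions, not from the slides):

```python
import math

def boltzmann_weight(E, T, k=1.0):
    """Unnormalized Boltzmann factor e^(-E/kT)."""
    return math.exp(-E / (k * T))

# Relative likelihood of a high-energy state (E = 5) versus a
# low-energy state (E = 1): nearly 1 at high T, tiny at low T.
for T in (100.0, 1.0):
    ratio = boltzmann_weight(5.0, T) / boltzmann_weight(1.0, T)
    print(T, round(ratio, 4))
```

At T = 100 the ratio is about 0.96, so both states are nearly equally likely; at T = 1 it drops to about 0.018, so the low-energy state dominates.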
Motivation
• Heat the solid metal to a high temperature.
• Cool it down very slowly according to a specific schedule.
• If the heating temperature is high enough to ensure a random state, and the cooling process is slow enough to maintain thermal equilibrium, then the atoms will arrange themselves in a pattern that corresponds to the global energy minimum of a perfect crystal.
Metropolis method
• Metropolis introduced a method to implement the Boltzmann distribution.
• Suppose the current point is x(t) and the function value at that point is E(t) = f(x(t)).
• By the Metropolis algorithm, the probability of the next point being x(t+1) depends on the difference between the function values at the two points,
ΔE = E(t+1) - E(t)
and is calculated using the Boltzmann probability distribution:
P(E(t+1)) = min[1, e^(-ΔE/kT)]
Metropolis method (cont.)
• If ΔE ≤ 0, the probability is 1 and the point x(t+1) is accepted.
• If ΔE > 0, the point x(t+1) is worse than x(t) and is accepted only probabilistically:
• A random number r is drawn in the range (0, 1).
• If r ≤ e^(-ΔE/kT), the point is accepted and t is set to t+1. Otherwise, the point x(t+1) is rejected and a new candidate point is generated.
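The acceptance rule above can be written as a small helper (a sketch; the rng parameter is an assumption added so the rule is testable, and k is folded into T):

```python
import math
import random

def metropolis_accept(delta_E, T, rng=random.random):
    """Metropolis criterion: accept improvements outright,
    and worse points with probability e^(-ΔE/T)."""
    if delta_E <= 0:
        return True                              # ΔE ≤ 0: probability 1
    return rng() <= math.exp(-delta_E / T)       # ΔE > 0: accept if r ≤ e^(-ΔE/T)
```

A larger T makes e^(-ΔE/T) closer to 1, so worse points are accepted more often early in the search.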
Algorithm Procedure
1. Choose an initial point x0 and a termination criterion ε. Set T to a sufficiently high value, fix the number of iterations n to be performed at each temperature, and set t = 0.
2. Generate a neighboring point x(t+1) = N(x(t)); usually a random point in the neighborhood of x(t) is selected.
3. If ΔE = E(x(t+1)) - E(x(t)) < 0, set t = t+1;
Else create a random number r in the range (0, 1). If r ≤ e^(-ΔE/kT), set t = t+1;
Else go to Step 2.
4. If |x(t+1) - x(t)| < ε and T is small, terminate;
Else, lower T according to the cooling schedule and go to Step 2.
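The four steps can be sketched end to end in Python (the parameter values, the uniform neighborhood move, and the geometric cooling factor are illustrative assumptions, not prescribed by the slides):

```python
import math
import random

def simulated_annealing(f, x0, T0=100.0, alpha=0.5, n_iter=20,
                        step=1.0, T_min=1e-3, seed=0):
    """Sketch of the procedure above: random neighbor (Step 2),
    Metropolis acceptance (Step 3), geometric cooling (Step 4)."""
    rng = random.Random(seed)
    x, fx = list(x0), f(x0)
    best, fbest = list(x), fx
    T = T0
    while T > T_min:
        for _ in range(n_iter):                   # n iterations per temperature
            cand = [xi + rng.uniform(-step, step) for xi in x]
            dE = f(cand) - fx                     # ΔE = E(t+1) - E(t)
            if dE < 0 or rng.random() <= math.exp(-dE / T):
                x, fx = cand, f(cand)
                if fx < fbest:
                    best, fbest = list(x), fx
        T *= alpha                                # lower T per the schedule
    return best, fbest

# Usage: minimize a simple bowl, f(x) = x1^2 + x2^2, from a distant start.
xmin, fmin = simulated_annealing(lambda x: x[0]**2 + x[1]**2, [8.0, -6.0])
```

With the termination test reduced here to "T has become small", the best point seen so far is returned as the answer.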
Cooling schedule
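The slide does not spell a schedule out, but the worked example that follows halves T at each temperature step, i.e. a geometric schedule T(k+1) = α·T(k) with α = 0.5. A minimal sketch:

```python
def geometric_schedule(T0, alpha, n_steps):
    """Geometric cooling: each temperature is alpha times the previous one."""
    temps = []
    T = T0
    for _ in range(n_steps):
        temps.append(T)
        T *= alpha
    return temps

# The temperature sequence seen in the example below:
# geometric_schedule(101.25, 0.5, 4) -> [101.25, 50.625, 25.3125, 12.65625]
```

Smaller α cools faster but risks freezing into a local minimum; α close to 1 explores longer at each temperature.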
Convergence of simulated annealing
[Figure: convergence of simulated annealing compared with hill climbing, plotted against the number of iterations at the final temperature]
Example
• Minimize the following problem using the simulated annealing method: minimize f(x1, x2) = (x1^2 + x2 - 11)^2 + (x1 + x2^2 - 7)^2
Step 2 – Iteration 3
– A next point is found in the vicinity of the current point.
– Here Δx1 = -0.103 and Δx2 = -2.812; x3 = (2.397, -0.312)^T and f(x3) = 51.287
• Step 3
– Now ΔE = f(x3) - f(x2) = 51.287 - 58.067 = -6.780
– Since ΔE is negative, we accept the point and t = 3
• Step 4
– Termination criteria not satisfied
– T = 0.5 x 101.25 = 50.625
• Step 2 – Iteration 4
– Here Δx1 = -1.103 and Δx2 = -0.779
– Similarly, x4 = (1.397, 1.721)^T and f(x4) = 60.666
• Step 3
– ΔE = f(x4) - f(x3) = 9.379
– Since ΔE is positive, we use the Metropolis algorithm to decide whether to accept or reject the point
– Draw a random number r = 0.746 in the range (0, 1)
– The probability of accepting the new point is e^(-9.379/50.625) = 0.831
– Since r < 0.831, we accept the point and t = 4
• Step 4
– Termination criteria not satisfied
– Now T = 25.313 and the iteration is complete
• Step 2 – Iteration 5
– Here Δx1 = -1.707 and Δx2 = -0.550
– Here x5 = (0.793, 1.950)^T and f(x5) = 76.697
• Step 3
– Here ΔE = f(x5) - f(x4) = 16.031
– Since ΔE is positive, we use the Metropolis algorithm
– r = 0.793
– e^(-16.031/25.313) = 0.531. Since r > 0.531, we do not accept the point
– A new candidate point must therefore be generated
• Step 2 – Iteration 5 (new candidate point)
– Here Δx1= - 0.809 & Δx2 = - 0.411
– Here x6 = (1.691, 2.089)^T and f(x6) = 37.514
• Step 3
– Here ΔE = f(x6) - f(x5) = -23.152
– Since ΔE is negative, the point is accepted and t = 5
• Step 4
– T = 12.656
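The arithmetic above can be checked numerically, assuming the objective is the Himmelblau function f(x1, x2) = (x1^2 + x2 - 11)^2 + (x1 + x2^2 - 7)^2, which reproduces the function values quoted in the iterations:

```python
import math

def f(x1, x2):
    """Himmelblau function; matches the f values quoted in the iterations."""
    return (x1**2 + x2 - 11)**2 + (x1 + x2**2 - 7)**2

# Function values at the visited points.
print(round(f(2.397, -0.312), 3))   # 51.286 (quoted as 51.287)
print(round(f(1.397, 1.721), 3))    # 60.666
print(round(f(0.793, 1.950), 3))    # 76.697
print(round(f(1.691, 2.089), 3))    # 37.514

# Metropolis acceptance probabilities in iterations 4 and 5.
p4 = math.exp(-9.379 / 50.625)      # 0.831: r = 0.746 < p4, accept
p5 = math.exp(-16.031 / 25.313)     # 0.531: r = 0.793 > p5, reject
```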
CONCLUSION
• The quality of the final solution is not affected by the initial guess; a poor initial guess only increases the computational effort.
• The process continues until T is reduced to a small value.
• In the early stages of SA, almost any point is equally likely to be accepted, so the search space is well investigated before the algorithm converges to an optimum solution.
• For a sufficiently large number of iterations at each temperature and a sufficiently small cooling rate, the algorithm guarantees convergence to the globally optimal solution.
Thank You!