Local Search
Reference:
Johnson, D. S. and McGeoch, L. A., "The Traveling Salesman
Problem: A Case Study in Local Optimization." In Local Search in
Combinatorial Optimization, E. H. L. Aarts and J. K. Lenstra
(eds.), John Wiley and Sons, 1997
Objectives:
To learn the basic local search paradigm for solving (NP-hard) optimization problems. Why?
To design LS-based meta-heuristics (tabu search, simulated
annealing, genetic algorithms, ACO, etc.)
Heuristics
Optimization problem:
min_{x in S} f(x), where S = the space of feasible solutions
(find a feasible x such that f(x) is minimized)
Heuristic algorithms do not guarantee to find an
optimal solution (as opposed to exact algorithms)
However, they are usually designed to find a
reasonably good solution quickly.
Solutions to optimization problems:
Global Optimum: Equal/better than all other solutions
Local Optimum: Equal/better than all solutions in a
certain neighborhood
Local Search
Definitions:
N (neighborhood): a function S -> 2^S, mapping each solution to a set of neighboring solutions
The size of a neighborhood is the number of solutions in
the neighborhood set
Algorithm:
1. Pick a starting solution s // e.g., by a construction algorithm
2. If f(s) <= min_{s' in N(s)} f(s'), stop // what is the time complexity?
3. Else set s = s' such that f(s') = min_{s' in N(s)} f(s')
4. Go to Step 2
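The loop above can be sketched in Python (a minimal best-improvement version; the toy objective and neighborhood are illustrative assumptions):

```python
def local_search(s, f, neighbors):
    """Best-improvement local search: repeatedly move to the best
    neighbor until no neighbor improves on the current solution."""
    while True:
        best = min(neighbors(s), key=f, default=None)  # Step 2: scan N(s)
        if best is None or f(best) >= f(s):
            return s          # s is a local optimum w.r.t. this neighborhood
        s = best              # Step 3: move to the best neighbor

# Toy run: minimize f(x) = (x - 7)^2 over the integers,
# with neighborhood N(x) = {x - 1, x + 1}.
f = lambda x: (x - 7) ** 2
print(local_search(0, f, lambda x: [x - 1, x + 1]))  # 7
```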
Local Search
An iteration in a local search:
Current Solution -> Generate Neighborhood -> Evaluate Neighborhood -> Choose Neighbor -> (new) Current Solution
3. Extensions
Escaping from locally optimal tours
Meta-heuristics
Insertion Heuristics
Basic Idea:
Starting with a subtour made of 2 arbitrarily chosen cities,
grow a tour by inserting cities that fulfill some criterion.
The criterion usually depends on the cities which are already
in the tour
Step 1: Form a subtour by choosing 2 cities at random
Step 2: Scan the remaining cities for the city that fulfills the
insertion criterion
Step 3: Insert the city into the subtour
Step 4: Repeat from Step 2 until all cities are in the tour
Farthest Insertion:
Insert the city whose shortest distance to a tour city is
maximized
Cheapest Insertion:
Choose the city whose insertion causes the least
increase in length of the resulting subtour
Random Insertion:
Select the city to be inserted at random
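Steps 1 to 4 can be sketched with the cheapest-insertion criterion; for reproducibility this sketch seeds the subtour with the first two cities instead of a random pair:

```python
import math

def cheapest_insertion(cities):
    """Grow a tour by repeatedly inserting the city (and position)
    whose insertion increases the subtour length the least.
    `cities` is a list of (x, y) points; returns a list of indices."""
    d = lambda a, b: math.dist(cities[a], cities[b])
    tour = [0, 1]                              # Step 1: initial 2-city subtour
    remaining = set(range(2, len(cities)))
    while remaining:                           # Step 4: until all cities inserted
        best = None
        for c in remaining:                    # Step 2: scan remaining cities
            for pos in range(len(tour)):
                i, j = tour[pos], tour[(pos + 1) % len(tour)]
                inc = d(i, c) + d(c, j) - d(i, j)   # increase in tour length
                if best is None or inc < best[0]:
                    best = (inc, c, pos + 1)
        _, c, pos = best
        tour.insert(pos, c)                    # Step 3: insert into the subtour
        remaining.remove(c)
    return tour
```

Farthest or random insertion differ only in how the city c is chosen in Step 2.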
Analysis
Number of steps:
n - 2 cities are to be inserted
For each insertion the set of remaining cities has to be
scanned: O(n)
Total: O(n^2)
2-opt
Delete 2 edges, add 2 edges to restore the tour
2-opt Analysis
How many possible moves are there?
Size of the 2-opt neighborhood:
Number of edge pairs = n(n-3)/2
Exactly 1 way to restore the tour for each pair
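A 2-opt improvement loop can be sketched as follows (the scan order is an assumption; the slides fix only the move itself). Deleting edges (a,b) and (c,e) and adding (a,c) and (b,e) is realized by reversing the segment between them:

```python
def two_opt(tour, d):
    """Repeat 2-opt moves until no pair of edges can be profitably
    swapped.  `d(a, b)` gives the distance between cities a and b."""
    n = len(tour)
    improved = True
    while improved:
        improved = False
        for i in range(n - 1):
            # Skip the pair (first edge, last edge): those are adjacent.
            for j in range(i + 2, n - (1 if i == 0 else 0)):
                a, b = tour[i], tour[(i + 1) % n]
                c, e = tour[j], tour[(j + 1) % n]
                if d(a, c) + d(b, e) < d(a, b) + d(c, e):
                    # Delete edges (a,b), (c,e); add (a,c), (b,e).
                    tour[i + 1:j + 1] = reversed(tour[i + 1:j + 1])
                    improved = True
    return tour
```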
3-opt
Delete 3 edges, add 3 edges to restore the tour
3-opt Analysis
Size of the 3-opt neighborhood:
C(n,3) = n(n-1)(n-2)/6 ways to choose 3 edges to delete,
with several distinct ways to reconnect the tour for each choice
k-opt
It may seem sensible to increase k even further.
However, once k edges of a tour have been
deleted there are O(n^k) candidate tours. Expensive!
Observe that some 3-opt moves are equivalent to
applying two 2-opt moves in sequence
Lin-Kernighan Heuristic
Introduced by Lin and Kernighan (1973)
One of the most successful heuristics for
combinatorial optimization problems
Adaptive k-opt:
Build k-opt moves (with variable k) from a sequence of
2-opt moves
Tabu lists for edges added and deleted
Multiple restarts
[Figure: a vehicle-routing example with a depot serving customers that have demands (e.g., 20 units, 10 units) and delivery time windows (e.g., 11:30~12:30, 10:30~12:00)]
[Figures: two vehicle-routing neighborhood moves, Distribute and Exchange]
Question:
What is the time complexity of each neighborhood?
Solution:
Apply Local Search to more than one tour
Search Diversification
Multi-start
Step 1: Create a feasible tour (randomly or by
using a construction heuristic)
Step 2: Apply Local Search until local optimality
Step 3: Repeat from Step 1 until some stopping
criterion is met
Step 4: Output best tour found during this process
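The four steps can be sketched on a toy objective (the function f, which has a local optimum at x = 0 and the global optimum at x = 10, and all parameter values are illustrative assumptions):

```python
import random

def f(x):
    # Two basins: a local optimum at x = 0 (value 5), global at x = 10 (value 0).
    return min(x ** 2 + 5, (x - 10) ** 2)

def hill_climb(x):
    """Step 2: plain local search with neighborhood {x - 1, x + 1}."""
    while True:
        best = min((x - 1, x + 1), key=f)
        if f(best) >= f(x):
            return x
        x = best

def multi_start(restarts=50, seed=0):
    rng = random.Random(seed)
    best = None
    for _ in range(restarts):                   # Step 3: repeat
        x = hill_climb(rng.randint(-20, 30))    # Steps 1-2: random start + LS
        if best is None or f(x) < f(best):
            best = x                            # Step 4: keep the best found
    return best

print(multi_start())  # 10: some restart lands in the global basin
```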
Construction Heuristics
Creating a new tour is slow (best case O(n^2))
Local Search is usually quite fast
Local Search usually performs better when started from
good tours
GSAT
Guess an initial assignment
While unsatisfied and not timed out do:
Flip the value assigned to the variable that yields the greatest number of
satisfied clauses. (Note: flip even if there is no improvement)
If a satisfying assignment is not found, repeat the entire process, starting from a
different initial random assignment.
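A sketch of the GSAT loop (the DIMACS-style clause encoding, where literal v means variable v is true and -v means it is false, and the parameter values are assumptions):

```python
import random

def gsat(clauses, n_vars, max_flips=1000, max_tries=10, seed=0):
    rng = random.Random(seed)

    def n_sat(assign):
        # Number of clauses satisfied under the assignment.
        return sum(any((lit > 0) == assign[abs(lit)] for lit in c) for c in clauses)

    for _ in range(max_tries):                      # restart from a new guess
        assign = {v: rng.random() < 0.5 for v in range(1, n_vars + 1)}
        for _ in range(max_flips):
            if n_sat(assign) == len(clauses):
                return assign                       # satisfying assignment found

            def score(v):
                # Satisfied clauses after flipping v (flip, count, flip back).
                assign[v] = not assign[v]
                s = n_sat(assign)
                assign[v] = not assign[v]
                return s

            # Greedy flip: taken even if it yields no improvement.
            best_v = max(range(1, n_vars + 1), key=score)
            assign[best_v] = not assign[best_v]
    return None                                     # timed out
```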
[Plot: solution time in seconds]
Effectiveness: the probability that a random initial
assignment leads to a solution.
Test instances up to 400 variables:
MixedWalkSat better than Simulated Annealing
MixedWalkSat better than basic GSAT
Population-based
Evolutionary (Genetic) Algorithms
Ant Colony Optimization
Particle Swarm
Cycling: the search may revisit the same solutions over and over, looping without progress
Tabu Search
Fred Glover (1986)
A tabu list is maintained for forbidden (tabu)
moves, to avoid cycling
Tabu moves are based on the short- and long-term
history of the search process.
The aspiration criterion is the condition that allows the
tabu status of a tabu move to be overridden so that
the move can be considered at the current iteration.
The next move is the best move among the
feasible moves from the neighborhood of the
current solution. A tabu move is taken only if it satisfies
the aspiration criterion.
Tabu search loop:
Step 1: Set the current solution
Step 2: Define the neighborhood
Step 3: Use the objective function to evaluate each of the neighbors
Step 4: Consult the tabu list and aspiration criteria
Step 5: Apply the move to the best-picked neighbor to generate a new solution
Step 6: Update the tabu list and trigger any related events; if the stopping conditions are met, stop, otherwise return to Step 2
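Steps 1 to 6 can be sketched on a toy problem: minimizing f over bit vectors with single-bit-flip moves, where a flipped position stays tabu for a fixed tenure (the tenure value and the bit-vector setting are illustrative assumptions):

```python
import random

def tabu_search(f, n, iters=100, tenure=3, seed=0):
    rng = random.Random(seed)
    s = [rng.randint(0, 1) for _ in range(n)]     # Step 1: current solution
    best, best_val = s[:], f(s)
    tabu = {}                                     # position -> iteration it expires
    for it in range(iters):
        candidates = []
        for i in range(n):                        # Step 2: single-flip neighborhood
            t = s[:]
            t[i] ^= 1
            v = f(t)                              # Step 3: evaluate the neighbor
            # Step 4: admit the move if it is not tabu, or if it beats the
            # best solution so far (aspiration criterion).
            if tabu.get(i, -1) <= it or v < best_val:
                candidates.append((v, i, t))
        if not candidates:
            continue
        v, i, t = min(candidates)                 # Step 5: best admissible move
        s = t
        tabu[i] = it + tenure                     # Step 6: update the tabu list
        if v < best_val:
            best, best_val = t[:], v
    return best, best_val

sol, val = tabu_search(sum, 8)   # minimize the number of 1-bits
print(val)  # 0
```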
[Figure: a constrained minimum-spanning-tree instance on edges x1 through x7 with edge weights including 6, 9, 12, and 18; two side constraints, (1) involving x1, x2, x6 and (2) involving x1, x3 (relational operators lost in extraction); penalty per constraint violation = 50]
Example
Neighborhood: standard edge swap
An edge is tabu if it was added within the last two
iterations
The aspiration criterion is satisfied if the tabu move
would create a tree that is better than the best tree
found so far
[Figures: four tabu search iterations on the example; each figure marks the dropped edge, the added edge, and the currently tabu edges.
Iteration 1 (initial MST): cost = 16 + 100 (tree weight plus penalties)
Iteration 2: cost = 28
Iteration 3: cost = 32; edge x3 is aspired
Iteration 4: cost = 23]
Intensification
Focus on solutions with good characteristics
The idea is to find better solutions around elite solutions
Usually applied around local optima
Diversification
Force the search to move away from specific characteristics
The idea is to explore new regions of the search space
Usually applied when the potential of finding a better solution nearby is low
Simulated Annealing
Kirkpatrick et al. (1983); Metropolis et al. (1953)
Let T be non-negative (the temperature)
Loop:
pick a random neighbor s' in N(s)
let D = f(s) - f(s') (the improvement)
if D > 0 (better), set s = s'
else set s = s' with probability e^(D/T)
As T tends to 0: hill climbing
As T tends to infinity: random walk
In general, start with a large T and decrease it slowly.
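The loop can be sketched with geometric cooling bolted on (the cooling rule and all parameter values are assumptions; the slides discuss schedules separately):

```python
import math
import random

def simulated_annealing(s, f, neighbor, T0=10.0, alpha=0.95, steps=2000, seed=0):
    """Minimize f starting from s; `neighbor(s, rng)` draws a random neighbor."""
    rng = random.Random(seed)
    best, T = s, T0
    for _ in range(steps):
        t = neighbor(s, rng)
        D = f(s) - f(t)                    # D > 0 means t is an improvement
        if D > 0 or rng.random() < math.exp(D / T):
            s = t                          # accept better moves always, worse
        if f(s) < f(best):                 # moves with probability e^(D/T)
            best = s
        T *= alpha                         # cool down slowly
    return best

# Toy run: minimize (x - 3)^2 over the integers with +/-1 moves.
print(simulated_annealing(50, lambda x: (x - 3) ** 2,
                          lambda x, rng: x + rng.choice((-1, 1))))
```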
Acceptance probability e^(dE/T) at T = 10:
dE = -43 -> e^(-4.3) = 0.01
dE = -13 -> e^(-1.3) = 0.27
dE = 0 -> 1.00
Effect of T for dE = -13, e^(-13/T):
T = 1 -> 0.000002
T = 50 -> 0.56
T = 10^10 -> 0.9999
[Plot: acceptance probability e^(dE/T) vs. dE over the range -20 to 20: nearly flat when T is very high, a sharp step at dE = 0 when T is near 0]
Annealing Schedule
After each step, how should T be updated?
Logarithmic: T_i = c / log(i + 2)
provable convergence properties, but slow
Geometric: T_i = T_0 * alpha^int(i/L)
lots of parameters to tune (T_0, alpha, L)
Adaptive:
change T_i dynamically during the search
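The first two schedules as code (the constants c, T0, alpha, L are tuning parameters; the values here are arbitrary placeholders):

```python
import math

def log_schedule(i, c=10.0):
    return c / math.log(i + 2)          # provably convergent, but very slow

def geometric_schedule(i, T0=10.0, alpha=0.95, L=100):
    return T0 * alpha ** (i // L)       # multiply T by alpha every L steps
```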
Next Week
Population-Based Meta-Heuristics
Hybrid Meta-Heuristics