Notes 4

Escaping local optima
• Accept nonimproving neighbors
– Tabu search and simulated annealing
• Iterating with different initial solutions
– Multi-start local search, greedy randomized adaptive search procedure
(GRASP), iterated local search (ILS)
• Changing the neighborhood
– Variable neighborhood search
• Changing the objective function or the input to the problem in
an effort to solve the original problem more effectively
– Guided local search
Iterating with different initial solutions
• Multi-start local search
– Several initial solutions
– Obtain several local optima using local search, tabu search, or SA
• ILS - Iterated local search
– Improves on the classical multi-start local search
– It does so by perturbing the local optima obtained by multi-start local
search and reconsidering them as initial solutions
– Perturbation
• Perturbation is a large random move of the current solution
• Keep some part of the current solution and significantly change the other
parts of the current solution.
• A complete change must be avoided, because the perturbed solution then
loses the properties of the local optimum (the current solution) derived
from the most recent search history. Such an extreme change amounts to a
random-restart approach.
• The strength of the perturbation can be fixed or can vary during the search.
– Acceptance criteria: simple deterministic conditions, e.g., accept only
if the solution is better than the previous best (as in tabu search), or
probabilistic acceptance (as in SA). A minimal ILS sketch follows this list.
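A minimal sketch of the ILS loop described above, in Python. The names here are illustrative assumptions, not from the text: solutions are TSP tours (lists of city indices), cost and local_search are supplied by the caller, and double_bridge is one common choice of perturbation that keeps most of the tour intact.

import random

def double_bridge(tour):
    # Perturbation: cut the tour into four segments and reconnect them,
    # keeping large parts of the current solution unchanged.
    a, b, c = sorted(random.sample(range(1, len(tour)), 3))
    return tour[:a] + tour[b:c] + tour[a:b] + tour[c:]

def iterated_local_search(initial, cost, local_search, perturb=double_bridge, n_iters=100):
    best = current = local_search(initial)          # first local optimum
    for _ in range(n_iters):
        candidate = local_search(perturb(current))  # perturb, then re-optimize
        if cost(candidate) < cost(current):         # deterministic acceptance
            current = candidate
        if cost(current) < cost(best):
            best = current
    return best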
GRASP - Greedy Randomized Adaptive Search Procedure
• A randomized greedy heuristic generates initial solutions
– Adaptive because the greedy heuristic takes into account the
preceding solutions
• Using the initial solution, perform a local search
• Repeat the above steps for several iterations (a skeleton follows this list)
• Greedy solution examples
– TSP - select the nearest neighbor
– Knapsack - choose objects that minimize wi/bi, where wi is the weight
and bi is the benefit
– Minimum spanning tree - choose the least costly edges
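A minimal GRASP skeleton in Python, shown for the knapsack example above. The alpha-based restricted candidate list is a standard GRASP device assumed here; cost and local_search are caller-supplied placeholders (for knapsack, cost can return the negated total benefit so that lower is better).

import random

def construct_knapsack(weights, benefits, capacity, alpha=0.3):
    # Randomized greedy construction: the greedy criterion is the
    # weight-to-benefit ratio w_i / b_i (lower is better).
    remaining = set(range(len(weights)))
    chosen, load = [], 0
    while True:
        feasible = [i for i in remaining if load + weights[i] <= capacity]
        if not feasible:
            break
        ratio = {i: weights[i] / benefits[i] for i in feasible}
        lo, hi = min(ratio.values()), max(ratio.values())
        # RCL: items whose ratio is within alpha of the best ratio.
        rcl = [i for i in feasible if ratio[i] <= lo + alpha * (hi - lo)]
        pick = random.choice(rcl)          # randomized greedy choice
        chosen.append(pick)
        load += weights[pick]
        remaining.discard(pick)
    return chosen

def grasp(construct, local_search, cost, n_iters=50):
    # Each iteration: randomized greedy construction, then local search.
    best = None
    for _ in range(n_iters):
        solution = local_search(construct())
        if best is None or cost(solution) < cost(best):
            best = solution
    return best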
GRASP for capacitated minimum spanning tree
• Minimum cost spanning tree
• There is a root (central node) r
• Initially, all nodes are connected directly to the root in a star
arrangement (these root links are called gates), and every node is its
own sub-tree. Gj is the cost of the gate linking node j to the root
• Objective: find the minimum-cost spanning tree such that every
sub-tree satisfies a capacity c, where c can be the maximum number of
nodes in a sub-tree or a range (minimum and maximum values)
– If every node has a weight, then c can be a maximum weight or a
weight range
• Exhaustive enumeration would find all possible spanning trees
and select the best, which takes exponential time
• Application: networking of computers, communication between cell
phone towers.
• The practical approach is to apply GRASP
GRASP for capacitated minimum spanning tree
• Start with different partial initial solutions and expand the
tree
• Create a restricted candidate list (RCL) of edges that meet
the capacity constraints
• At each iteration, randomly select from these edges to add to a
sub-tree (a sketch follows this list)
– For the edges (i,j) in the RCL from node i to candidate nodes j,
compute ti = Gj - cij and choose the edge with the largest ti
– Add the edge only if the capacity constraint is met
– If edge (i,j) is added, delete gate Gj (the link r-j) from the list of gates
– Continue until no pair of nodes with positive ti can be joined
without exceeding the capacity constraint
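A sketch of this construction in Python, under simplifying assumptions that are not from the text: unit node weights (so capacity = number of nodes in a sub-tree), gate costs in a dict gate, and edge costs in a dict edge_cost keyed by (i, j) (both directions should be present for an undirected edge, since the gate removed depends on the direction).

def capacitated_mst_construction(nodes, gate, edge_cost, max_cap):
    subtree = {v: {v} for v in nodes}   # each node starts as its own sub-tree
    gates = dict(gate)                  # remaining gate costs G_j (links r-j)
    edges = []                          # chosen tree edges
    while True:
        best_edge, best_t = None, 0
        for (i, j), c_ij in edge_cost.items():
            if subtree[i] is subtree[j] or j not in gates:
                continue                # same sub-tree, or gate r-j already gone
            if len(subtree[i]) + len(subtree[j]) > max_cap:
                continue                # merged sub-tree would exceed capacity
            t = gates[j] - c_ij         # saving t_i = G_j - c_ij
            if t > best_t:
                best_edge, best_t = (i, j), t
        if best_edge is None:           # no feasible edge with positive t_i left
            break
        i, j = best_edge
        edges.append(best_edge)
        del gates[j]                    # delete gate G_j (link r-j)
        merged = subtree[i] | subtree[j]
        for v in merged:                # merge the two sub-trees
            subtree[v] = merged
    return edges, gates                 # tree edges plus the surviving gates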
GRASP for capacitated minimum spanning tree
• Max Capacity = 5
• RCL: 1-2, 2-3, 1-3, 3-4 (capacity ≤ 5)
[Figure: example network with root r and nodes 1-4, showing node weights,
edge costs, and gate costs]
• Edges 2-4 and 1-4 exceed the capacity constraint: merging via 2-4 gives
capacity 7; merging via 1-4 gives capacity 6
GRASP for capacitated minimum spanning tree
[Table: ti = Gj - cij computed for each candidate pair (i,j); infeasible
entries marked x]
• RCL: 1-2, 2-3, 1-3, 3-4 (capacity ≤ 5)
• For the edges (i,j) in the RCL, compute ti = Gj - cij and choose the
edge with the largest positive ti
• Edges 2-4 and 1-4 exceed the capacity constraint (merged capacities 7
and 6, respectively)
GRASP for capacitated minimum spanning tree
• Connect 1-2 and remove the r-2 gate link
• Connect 3-4 and remove the r-4 gate link
• Connect 3-2 and remove the r-3 gate link
• All nodes are now connected. Choose the smallest gate among the
leftover gates: r-1
• Total cost of the tree r-1-2-3-4 = 6
• In a star connection the total cost is (r-1)+(r-2)+(r-3)+(r-4) = 13
• A regular spanning tree without the capacity constraint would be
r-1-4-3-2, with cost 5
Variable Neighborhood Search
• VNS exploits the idea of changing the neighborhood structure
• VNS uses several different neighborhood structures during the search
• A neighborhood is substituted by another as soon as local search can
no longer improve the current best solution (a sketch follows this list)
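A minimal VNS sketch in Python; neighborhoods is an assumed list of shake functions ordered from smallest to largest move, and cost and local_search are caller-supplied placeholders.

def variable_neighborhood_search(x0, cost, neighborhoods, local_search, n_iters=100):
    best, k = local_search(x0), 0
    for _ in range(n_iters):
        # Shake in the k-th neighborhood of the best solution, then descend.
        candidate = local_search(neighborhoods[k](best))
        if cost(candidate) < cost(best):
            best, k = candidate, 0              # improved: back to the first neighborhood
        else:
            k = (k + 1) % len(neighborhoods)    # no improvement: next neighborhood
    return best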
Guided Local Search
• Change the objective function with respect to an already
obtained local optimum
• f'(s) = f(s) + penalty function
• Example: TSP
– Create a penalty matrix P = [pij] with one entry per edge (i,j); initialize P to 0
– After a local optimum is found, all future solutions containing edges
of that local optimum are penalized
– A utility value is calculated for the edges in the new solution
– If an edge is not in the local optimum, its utility is 0; otherwise its
utility is uij = cij/(1 + pij)
– Penalize the edge with the highest utility: its penalty pij is
incremented by 1
– In TSP, this increases the objective function for solutions similar to
the local optimum, so those solutions are avoided during the search.
Thus, the search escapes the local optimum
• f'(s) = f(s) + λ Σ(i,j)∈s pij, where λ is a scaling parameter
(a sketch follows below)
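A sketch of the TSP penalty scheme above in Python. Assumptions not in the text: a tour is a list of city indices, c is a symmetric cost matrix, and local_search is a caller-supplied routine that minimizes a given objective.

from collections import defaultdict

def tour_edges(tour):
    # Undirected edges of a closed tour, endpoints sorted for use as dict keys.
    return [tuple(sorted(e)) for e in zip(tour, tour[1:] + tour[:1])]

def augmented_cost(tour, c, penalties, lam):
    # f'(s) = f(s) + lambda * sum of p_ij over the edges (i,j) used by s
    return sum(c[i][j] + lam * penalties[(i, j)] for i, j in tour_edges(tour))

def update_penalties(local_opt, c, penalties):
    # Utility u_ij = c_ij / (1 + p_ij); penalize the highest-utility edge.
    worst = max(tour_edges(local_opt),
                key=lambda e: c[e[0]][e[1]] / (1 + penalties[e]))
    penalties[worst] += 1

# Typical driver: alternate local search on f' with penalty updates.
# penalties = defaultdict(int)   # p_ij initialized to 0
# tour = local_search(tour, lambda t: augmented_cost(t, c, penalties, lam))
# update_penalties(tour, c, penalties)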
Dispatching Rules
• See page 442 of the text
• These rules are useful for obtaining initial solutions