Local Search
Foundations of Constraint Processing
CSCE 421/821, Fall 2015
www.cse.unl.edu/~choueiry/F15-421-821/
Berthe Y. Choueiry (Shu-we-ri)
Avery Hall, Room 360
Tel: +1(402)472-5444
Lecture sources
Required reading
1. Dechter: Chapter 7, Sections 7.1 and 7.2
Recommended
1. R. Bartak’s online guide:
http://kti.ms.mff.cuni.cz/~bartak/constraints/stochastic.html
2. AIMA: Section 4.4 (1st edition), Section 4.3 (2nd edition)
3. Paul Morris 93: The Breakout Method for Escaping From Local Minima, AAAI 1993, pages 40-45
Solving CSPs
CSPs are typically solved with a combination of
1. Constraint propagation (inference)
2. Search (conditioning)
• Backtrack search
• Local search
We focus on local search
Outline
• General principle
• Main types: greedy & stochastic
• When nothing works…
• Evaluation methods
Backtrack search
• Properties
– Systematic and exhaustive
– Deterministic or heuristic
– Sound and complete
• Shortcomings
– Worst-case time complexity is prohibitive
– Often unable to solve large problems; thus, theoretical soundness and completeness do not mean much in practice
• Idea
– Use approximations: sacrifice soundness and/or completeness
– Can quickly solve very large problems (that have many solutions)
Local search: the picture
• States are laid out on a surface
• A state’s quality/cost is its height
• The state space forms a landscape
• Optimum state:
– maximizes solution quality
– minimizes solution cost
• Move around from state to state, trying to reach the optimum state
• Exploration is restricted to neighboring states, hence ‘local’ search (ref. Hoos & Stützle)
Components of a local search
• State
– is a complete assignment of values to variables, a possibly
inconsistent solution
• Possible moves
– are modifications to the current state, typically by changing the
value of a single variable. Thus, ‘local’ repair (ref. Dechter)
– Examples:
• SAT: Flipping the value of a Boolean variable (GSAT),
• CSPs: Min-conflict heuristic (variations)
• Evaluation (cost) function
– rates the quality of a state, typically as the number of violated constraints
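To make these components concrete, here is a minimal Python sketch (not part of the original slides) of a toy binary CSP with not-equal constraints; the variable names, domains, and helper functions are illustrative choices, not a prescribed representation.

```python
import random

# Toy binary CSP: four variables, domain {0, 1, 2}, "not-equal" constraints on a cycle.
VARS = ["x1", "x2", "x3", "x4"]
DOMS = {v: [0, 1, 2] for v in VARS}
NEQ = [("x1", "x2"), ("x2", "x3"), ("x3", "x4"), ("x4", "x1")]

def rand_state():
    """State: a complete, possibly inconsistent assignment of values to variables."""
    return {v: random.choice(DOMS[v]) for v in VARS}

def cost(state):
    """Evaluation function: the number of violated constraints."""
    return sum(1 for a, b in NEQ if state[a] == state[b])

def moves(state, var):
    """Possible moves: reassign a single variable (a 'local repair')."""
    for val in DOMS[var]:
        if val != state[var]:
            yield {**state, var: val}
```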
Generic Mechanism
• Cost function: number of broken constraints
• General principle
– Start with a full but arbitrary assignment of values to variables
– Reduce inconsistencies by local repairs (heuristic)
– Repeat until
• A solution is found (global minimum)
• The solution cannot be repaired (local minimum)
• … you run out of patience (max-tries)
• A.k.a.
– Iterative repair (decision problems)
– Iterative improvement (optimization problems)
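A hedged sketch of this generic repair loop, reusing the toy CSP helpers (VARS, DOMS, NEQ, cost, rand_state) from the sketch above; the min-conflicts-style variable/value choice and the max_steps cutoff are illustrative assumptions, not details fixed by the slide.

```python
import random

def iterative_repair(max_steps=1000):
    """Start from a full random assignment; repair until a solution is found or patience runs out."""
    state = rand_state()
    for _ in range(max_steps):
        if cost(state) == 0:                 # global minimum: a solution
            return state
        # Local repair: pick a conflicted variable, give it the value that minimizes violations.
        conflicted = [v for a, b in NEQ if state[a] == state[b] for v in (a, b)]
        var = random.choice(conflicted)
        state[var] = min(DOMS[var], key=lambda val: cost({**state, var: val}))
    return None                              # ran out of patience (max-steps cutoff)
```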
Outline
• General principle
• Main types: greedy & stochastic
– Greedy: hill climbing, local beam
– Stochastic:
• Random walk (stochastic noise), Tabu search, Simulated annealing, Genetic algorithms, Breakout method (constraint weighting), ERA (multi-agent search)
• When nothing works…
• Evaluation methods
Main types of local search
1. Greedy:
– Use a heuristic to determine the best move
2. Stochastic (improvement)
– Sometimes (randomly) disobey the heuristic
Greedy local search
• At any given point,
– make the best decision you can given the
information you have and proceed.
– Typically, move to the state that minimizes the
number of broken constraints
• Examples:
– Hill climbing (a.k.a. gradient descent/ascent)
– Local beam search: keep track of k states
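A possible steepest-descent reading of hill climbing, again on the toy CSP from the earlier sketch; it evaluates every single-variable reassignment and stops as soon as no move improves the cost, i.e., at a local optimum (or a solution).

```python
def hill_climb(state):
    """Greedy descent: repeatedly take the single-variable move that lowers the cost the most."""
    while True:
        best_var, best_val, best_cost = None, None, cost(state)
        for var in VARS:
            for val in DOMS[var]:
                c = cost({**state, var: val})
                if c < best_cost:
                    best_var, best_val, best_cost = var, val, c
        if best_var is None:       # no improving neighbor: a solution or a local optimum
            return state
        state = {**state, best_var: best_val}

# Example call (illustrative usage): hill_climb(rand_state())
```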
Greedy local search
• Problems:
– local optima (search gets stuck),
– plateaus (search wanders),
– ridges (search oscillates from side to side, making slow progress)
Stochastic Local Search
• Sometimes (randomly) move to a state that is not the best: use randomization to escape local optima
• Examples:
– Random walk (stochastic noise)
– Tabu search
– Simulated annealing
– Genetic algorithms
– Breakout method (constraint weighting)
– ERA (multi-agent search)
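One common way to add such randomization is a noise parameter p: with probability p make a random move, otherwise a greedy (min-conflict) move. The sketch below illustrates this on the earlier toy CSP; the value of p and the step limit are arbitrary choices for the example.

```python
import random

def noisy_repair(p=0.2, max_steps=1000):
    """Min-conflicts with stochastic noise: with probability p make a random move,
    otherwise a greedy (min-conflict) one, so the search can escape local optima."""
    state = rand_state()
    for _ in range(max_steps):
        if cost(state) == 0:
            return state
        var = random.choice(VARS)
        if random.random() < p:          # random-walk step
            state[var] = random.choice(DOMS[var])
        else:                            # greedy step
            state[var] = min(DOMS[var], key=lambda val: cost({**state, var: val}))
    return None
```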
Simulated Annealing: idea
• Analogy to physics:
– Process of gradually cooling a liquid until it freezes
– If temperature is lowered sufficiently slowly, material
will attain lowest-energy configuration (perfect order)
• Basic idea:
– When stuck in a local optimum, allow a few steps toward worse neighbors in order to escape it
Simulated Annealing: Mechanism
• Start from a random state, start a countdown, and loop until time is over:
– Pick a neighbor at random
– Set d = quality of neighbor – quality of current state
– If d > 0 (there is improvement)
• then move to the neighbor & restart the countdown
• else, move to the neighbor with a transition probability p < 1
• Transition probability proportional to e^(d/t)
– d is negative, and t is the remaining time (countdown)
– As time passes, it becomes less and less likely to move toward unattractive neighbors
• Under some very restrictive assumptions, guaranteed to find the optimum
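A minimal sketch of this mechanism on the earlier toy CSP, with quality taken as the negated cost and a simple geometric cooling schedule standing in for the countdown; the starting temperature and cooling rate are illustrative values, not ones prescribed by the slide.

```python
import math
import random

def simulated_annealing(t0=10.0, cooling=0.995, max_steps=10000):
    """Accept improving moves always; accept worsening moves with probability e^(d/t)."""
    state, t = rand_state(), t0
    for _ in range(max_steps):
        if cost(state) == 0:
            return state
        var = random.choice(VARS)
        neighbor = {**state, var: random.choice(DOMS[var])}
        d = cost(state) - cost(neighbor)        # d > 0 means the neighbor is better
        if d > 0 or random.random() < math.exp(d / t):
            state = neighbor                    # worse moves accepted with probability e^(d/t) < 1
        t *= cooling                            # as time passes, bad moves become less likely
    return None
```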
Properties
• Non-systematic and non-exhaustive
• Liable to get stuck in local optima (maxima/minima)
• Non-deterministic:
– outcome may depend on where you start
• Typically heavy-tailed:
– the probability of improving the solution quickly becomes small as time goes by, but does not die out
Genetic Algorithms
• Basic step: combine two complete assignments (parents) to generate offspring
• Mechanism
– Start from an initial population
– Encode assignments in a compact manner (a string)
– Combine partial solutions to generate new solutions (the next generation)
Genetic Algorithm (2)
• Fitness function: ranks a state’s quality and assigns it a probability of selection
• Selection: randomly chooses pairs for combination, depending on fitness
• Crossover: a crossover point is randomly chosen for two individuals, and offspring are generated
• Mutation: randomly changes a state
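A compact sketch of these four components on the earlier toy CSP, encoding each state as a tuple of values in the order of VARS; truncation-style selection stands in here for fitness-proportional selection, and the population size, mutation rate, and generation count are arbitrary example values.

```python
import random

def fitness(chrom):
    """Higher is better: the negated number of violated constraints."""
    return -cost(dict(zip(VARS, chrom)))

def genetic_search(pop_size=20, generations=200, p_mut=0.1):
    pop = [tuple(random.choice(DOMS[v]) for v in VARS) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        if fitness(pop[0]) == 0:                        # zero violations: a solution
            return dict(zip(VARS, pop[0]))
        next_gen = pop[: pop_size // 4]                 # selection: keep the fittest quarter
        while len(next_gen) < pop_size:
            mom, dad = random.sample(pop[: pop_size // 2], 2)
            cut = random.randrange(1, len(VARS))        # single crossover point
            child = list(mom[:cut] + dad[cut:])
            if random.random() < p_mut:                 # mutation: perturb one gene
                i = random.randrange(len(VARS))
                child[i] = random.choice(DOMS[VARS[i]])
            next_gen.append(tuple(child))
        pop = next_gen
    return None
```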
Outline
• General principle
• Main types: greedy & stochastic
– Greedy: hill climbing, local beam
– Stochastic:
• Random walk (stochastic noise), Tabu search, Simulated annealing, Genetic algorithms, Breakout method (constraint weighting), ERA (multi-agent search)
• When nothing works…
• Evaluation methods
Breakout strategies
[Morris, AAAI 93]
• Increase the weights of the broken
constraints so that they are less likely to
be broken in subsequent iterations
• Quite effective for recovering from local
optima
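A sketch of the constraint-weighting idea on the earlier toy CSP: each constraint carries a weight, moves are chosen greedily under the weighted cost, and the weights of the violated constraints are increased whenever no move improves; the step limit is an arbitrary example value.

```python
def breakout_search(max_steps=2000):
    """Constraint weighting: greedy moves under a weighted cost; raise weights at local optima."""
    weights = {c: 1 for c in NEQ}

    def wcost(s):
        return sum(weights[c] for c in NEQ if s[c[0]] == s[c[1]])

    state = rand_state()
    for _ in range(max_steps):
        if cost(state) == 0:
            return state
        var, val = min(((v, x) for v in VARS for x in DOMS[v]),
                       key=lambda m: wcost({**state, m[0]: m[1]}))
        if wcost({**state, var: val}) < wcost(state):
            state = {**state, var: val}
        else:                               # local optimum: increase weights of broken constraints
            for c in NEQ:
                if state[c[0]] == state[c[1]]:
                    weights[c] += 1
    return None
```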
ERA: Environment, Rules, Agents
[Liu et al, AIJ 02]
• The environment is an n × a board (n agents, a possible values)
• Each variable is an agent
• Each position on the board is a value of a domain
• An agent moves within its own row of the board
• Each position records the number of constraint violations incurred given the positions currently occupied by the other agents
• Agents try to occupy positions where no constraints are broken (zero positions)
• Agents move according to reactive rules
Reactive rules
[Liu et al, AIJ 02]
Reactive rules:
– Least-move: choose a position with the min. violation value
– Better-move: choose a position with a smaller violation value
– Random-move: randomly choose a position
Combinations of these basic rules form different behaviors:

Behavior   Rule combination
LR         least-move with probability 1-p, random-move with probability p
BR         better-move with probability 1-p, random-move with probability p
BLR        first better-move; if it fails, apply LR
rBLR       first apply better-move r times; if it fails, apply LR
FrBLR      apply rBLR in the first r iterations, then apply LR
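As an illustration, here is a sketch of the BR behavior (better-move with probability 1-p, random-move with probability p) for one agent on the earlier toy CSP; the function names and the noise level p are made up for the example.

```python
import random

def violations(var, val, state):
    """Violation value of position `val` for agent `var`, given the other agents' positions."""
    trial = {**state, var: val}
    return sum(1 for a, b in NEQ if var in (a, b) and trial[a] == trial[b])

def br_move(var, state, p=0.1):
    """BR behavior: better-move with probability 1-p, random-move with probability p."""
    if random.random() < p:                      # random-move: any position in the agent's row
        return random.choice(DOMS[var])
    candidate = random.choice(DOMS[var])         # better-move: accept only if strictly better
    if violations(var, candidate, state) < violations(var, state[var], state):
        return candidate
    return state[var]
```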
Big picture
• Agents do not communicate but share a common context
• Agents keep kicking each other out of their comfortable positions until everyone is happy
• Characterization: [Hui Zou, 2003]
– Amazingly effective in solving very tight but solvable instances
– Unstable in over-constrained cases
• Agents keep kicking each other out (livelock)
• Livelocks may be exploited to identify bottlenecks
ERA performance
[Charts: number of agents in a zero position vs. iteration, for ERA on solvable instances (Spring 2001b (B), Fall 2001b, Spring 2003, Fall 2002 (B)) and on unsolvable instances (Spring 2001b (O), Fall 2002 (O))]
Observation, solvable vs. unsolvable instances:
• stable on solvable instances
• oscillates on unsolvable cases
Agent’s movement
[Charts: motion of agents (index of position vs. iteration) showing three movement patterns: variable, stable, constant]
Observations:
           Solvable   Unsolvable
Variable   None       Most
Stable     A few      A few
Constant   Most       None
Outline
• General principle
• Main types: greedy & stochastic
• When nothing works…
• Evaluation methods
Random restart
• Principle
– When no progress is made, restart from a new
randomly selected state
– Save best results found so far (anytime algorithm)
• Repeat random restarts
– For a fixed number of iterations, or
– Until the best result found has not improved for a certain number of iterations (e.g., with a restart schedule following a geometric law)
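A sketch of such a restart wrapper around any of the earlier searches; it assumes the wrapped search returns the best complete assignment it reached (the earlier sketches would need a one-line change to return their final state instead of None on failure), and the restart limit is arbitrary.

```python
def random_restart(search, restarts=50):
    """Repeat a local search from fresh random states, keeping the best result (anytime behavior)."""
    best = None
    for _ in range(restarts):
        candidate = search()
        if best is None or cost(candidate) < cost(best):
            best = candidate
        if cost(best) == 0:          # a solution was found: no need to keep restarting
            break
    return best
```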
Outline
• General principle
• Main types: greedy & stochastic
• When nothing works…
• Evaluation methods
Evaluation: empirical
• Test on
– a given problem instance
– an ensemble of problem instances
(representative of a population)
• Experiment
– Run the algorithm thousands of times
– Measure some metric as a random variable
(e.g., the time needed to find a solution)
Comparing techniques
• Provide
– The probability distribution function (approximated by
the number of runs that took a given time to find a
solution)
– The cumulative distribution function (approximated by
the number of runs that found a solution within a
given time)
Comparing techniques
• Compare algorithms’ performance using
statistical tests (for confidence levels)
– t-test: assumes that the measured metric is normally distributed
– Nonparametric tests make no such assumption; some use matched pairs, others do not
• Consult/work with a statistician
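A hedged sketch of such a comparison, assuming SciPy is available and that each list holds one CPU-time measurement per run; the numbers below are made up purely for illustration.

```python
from scipy import stats

# One CPU-time measurement per run for each method (made-up numbers, for illustration only).
times_a = [1.2, 0.8, 3.4, 2.1, 0.9, 5.6, 1.1, 2.7]
times_b = [2.5, 1.9, 4.2, 3.8, 2.2, 6.1, 2.9, 3.3]

# t-test: assumes the measured metric is (roughly) normally distributed.
t_stat, t_p = stats.ttest_ind(times_a, times_b)

# Mann-Whitney U: a nonparametric alternative that drops the normality assumption.
u_stat, u_p = stats.mannwhitneyu(times_a, times_b, alternative="two-sided")

print(f"t-test p = {t_p:.3f}, Mann-Whitney p = {u_p:.3f}")
```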
[Charts: comparing the distributions of CPU time for Method1 and Method2: a frequency histogram and the corresponding cumulative distribution curves]