
Topic 2
Solving Scheduling Problems
Lecture Notes for Planning and Scheduling
Prepared by Siggi Olafsson

Classic Scheduling Theory

- Look at a specific machine environment with a specific objective
- Analyze it to prove that a policy is optimal, or to show that no simple optimal policy exists
- Thousands of problems have been studied in detail, with mathematical proofs!

Example: single machine

Let's say we have
- a single machine (the environment denoted 1), where
- the total weighted completion time $\sum w_j C_j$ should be minimized

We denote this problem as $1 \,||\, \sum w_j C_j$.

Optimal Solution

Theorem: Weighted Shortest Processing Time first (the WSPT rule), which sequences the jobs in decreasing order of the ratio $w_j / p_j$, is optimal for $1 \,||\, \sum w_j C_j$.

Note: The (unweighted) SPT rule starts with the job that has the shortest processing time, moves on to the job with the second shortest processing time, etc.

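A minimal Python sketch of the WSPT rule and of the objective it optimizes (the data layout and function names are illustrative, not from the notes):

```python
def wspt_order(jobs):
    """Sequence jobs by the WSPT rule: decreasing ratio w_j / p_j.

    `jobs` is a list of (name, p_j, w_j) tuples; ties are broken arbitrarily.
    """
    return sorted(jobs, key=lambda job: job[2] / job[1], reverse=True)

def total_weighted_completion_time(sequence):
    """Compute sum of w_j * C_j for a given job sequence on a single machine."""
    t, total = 0, 0
    for name, p, w in sequence:
        t += p            # completion time C_j of this job
        total += w * t
    return total

# Example with three made-up jobs: (name, p_j, w_j)
jobs = [("A", 4, 2), ("B", 2, 4), ("C", 3, 3)]
seq = wspt_order(jobs)
print([j[0] for j in seq], total_weighted_completion_time(seq))
```
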
Proof (by contradiction)

- Suppose the theorem is not true and that some schedule S violating the WSPT order is optimal
- Then S contains two adjacent jobs, say job j followed by job k, such that
  $$\frac{w_j}{p_j} < \frac{w_k}{p_k}$$
- Do a pairwise interchange of jobs j and k to get schedule S'

[Diagram: under S, job j starts at time t and job k follows; under S', job k starts at time t and job j follows; both schedules finish the pair at time $t + p_j + p_k$.]

Proof (continued)
The weighted completion time of the two jobs under S is
$$(t + p_j) w_j + (t + p_j + p_k) w_k$$
The weighted completion time of the two jobs under S' is
$$(t + p_k) w_k + (t + p_k + p_j) w_j$$
Now, since $w_k / p_k > w_j / p_j$ implies $p_j w_k > p_k w_j$:
$$(t + p_j) w_j + (t + p_j + p_k) w_k = (t + p_j) w_j + p_j w_k + (t + p_k) w_k$$
$$> (t + p_j) w_j + p_k w_j + (t + p_k) w_k = (t + p_k) w_k + (t + p_k + p_j) w_j$$
so the interchange strictly reduces the objective, contradicting that S is optimal.

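A quick numeric check of the interchange argument, with made-up numbers (not from the notes):

```python
# Two adjacent jobs j and k starting at time t, with w_j/p_j < w_k/p_k.
t, p_j, w_j, p_k, w_k = 5, 4, 1, 2, 3     # w_j/p_j = 0.25 < w_k/p_k = 1.5

cost_S      = (t + p_j) * w_j + (t + p_j + p_k) * w_k   # j before k
cost_Sprime = (t + p_k) * w_k + (t + p_k + p_j) * w_j   # k before j, after the interchange
assert cost_Sprime < cost_S
print(cost_S, cost_Sprime)                # 42 versus 32: the interchange improves the schedule
```
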
Complexity Theory

- Classic scheduling theory draws heavily on complexity theory
- The complexity of an algorithm is its running time in terms of the input parameters (e.g., number of jobs and number of machines)
- Big-Oh notation, e.g., $O(n^2 m)$

Polynomial versus NP-Hard
[Figure: running time versus problem size n for algorithms of complexity $O(\log n)$, $O(n)$, $O(n^2)$, and $O(2^n)$.]

Scheduling in Practice

- Practical scheduling problems cannot be solved this easily!
- Need:
  - Heuristic algorithms
  - Knowledge-based systems
  - Integration with other enterprise functions
- However, classic scheduling results are useful as a building block

General Purpose Scheduling Procedures

- Some scheduling problems are easy
  - Simple priority rules
  - Complexity: polynomial time
- Most scheduling problems are hard
  - Complexity: NP-hard, strongly NP-hard
  - Finding an optimal solution is infeasible in practice → heuristic methods

Types of Heuristics

- Construction Methods
  - Simple Dispatching Rules
  - Composite Dispatching Rules
  - Branch and Bound
  - Beam Search
- Improvement Methods
  - Simulated Annealing
  - Tabu Search
  - Genetic Algorithms

Topic 3
Dispatching Rules

Dispatching Rules

- Prioritize all waiting jobs, based on
  - job attributes
  - machine attributes
  - the current time
- Whenever a machine becomes free: select the job with the highest priority (see the sketch below)
- Rules can be static or dynamic

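The pattern shared by all of these rules can be sketched in a few lines of Python; the `priority` function encodes whichever rule is in use (the data layout and names are illustrative, not from the notes):

```python
def dispatch(jobs, priority):
    """Simulate a single machine under a dispatching rule.

    `jobs` is a list of dicts with keys 'p' (processing time) and 'r' (release date);
    `priority(job, t)` returns a number, and the highest-priority available job is chosen
    every time the machine becomes free.
    """
    t, schedule, remaining = 0, [], list(jobs)
    while remaining:
        available = [j for j in remaining if j["r"] <= t]
        if not available:                        # machine idles until the next release
            t = min(j["r"] for j in remaining)
            continue
        job = max(available, key=lambda j: priority(j, t))
        schedule.append(job)
        t += job["p"]                            # machine is busy until the job finishes
        remaining.remove(job)
    return schedule

# Example: SPT as the priority rule (shorter processing time = higher priority)
jobs = [{"p": 4, "r": 0}, {"p": 2, "r": 1}, {"p": 6, "r": 0}]
print([j["p"] for j in dispatch(jobs, lambda j, t: -j["p"])])
```
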
Release/Due Date Related

- Earliest release date first (ERD) rule
  - variance in throughput times
- Earliest due date first (EDD) rule
  - maximum lateness
- Minimum slack first (MS) rule, where the slack at time t is $\max(d_j - p_j - t,\, 0)$
  - maximum lateness

[Timeline diagram: the slack is the gap between the current time plus the processing time and the deadline.]

Processing Time Related

- Longest Processing Time first (LPT) rule
  - balance load on parallel machines
  - makespan
- Shortest Processing Time first (SPT) rule
  - sum of completion times
  - WIP
- Weighted Shortest Processing Time first (WSPT) rule

Processing Time Related

- Critical Path (CP) rule
  - precedence constraints
  - makespan
- Largest Number of Successors (LNS) rule
  - precedence constraints
  - makespan

Other Dispatching Rules

- Service in Random Order (SIRO) rule
- Shortest Setup Time first (SST) rule
  - makespan and throughput
- Least Flexible Job first (LFJ) rule
  - makespan and throughput
- Shortest Queue at the Next Operation (SQNO) rule
  - machine idleness

Discussion

- Very simple to implement
- Optimal for special cases
- Only focus on one objective
- Limited use in practice
- Combine several dispatching rules → composite dispatching rules

Example

- Single machine with weighted total tardiness: $1 \,||\, \sum w_j T_j$
- No efficient algorithm (the problem is NP-hard)
- Branch and bound can only solve very small problems (< 30 jobs)
- Are there any special cases we can solve?

Case 1: Tight Deadlines

- Assume $d_j = 0$
- Then
  $$T_j = \max(0,\, C_j - d_j) = \max(0,\, C_j) = C_j$$
  and therefore
  $$\sum w_j T_j = \sum w_j C_j$$
- We know that WSPT is optimal for this problem!

Conclusion

- WSPT is optimal in this extreme case and should be a good heuristic whenever due dates are tight
- Now let's look at the opposite extreme

Case 2: "Easy" Deadlines

- Theorem: If the deadlines are sufficiently spread out, then the MS rule
  $$j_{\text{select}} = \arg\min_j \, \max(d_j - p_j - t,\, 0)$$
  is optimal (the proof is a bit harder)
- Conclusion: The MS rule should be a good heuristic whenever deadlines are widely spread out

Composite Rule

- Two good heuristics:
  - Weighted Shortest Processing Time (WSPT): optimal when all due dates are zero
  - Minimum Slack (MS): optimal when due dates are "spread out"
- Any real problem is somewhere in between
- Combine the characteristics of these rules into one composite dispatching rule

Apparent Tardiness Cost (ATC) Dispatching Rule

- New ranking index
  $$I_j(t) = \frac{w_j}{p_j} \exp\left( - \frac{\max(d_j - p_j - t,\, 0)}{K\, \bar{p}(t)} \right)$$
  where K is a scaling constant and $\bar{p}(t)$ is the average processing time of the remaining jobs
- When a machine becomes free:
  - Compute the index for all remaining jobs
  - Select the job with the highest value (see the sketch below)

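A small Python sketch of the ATC index and the resulting dispatching decision, using the job data of the weighted tardiness example that appears later in these notes (helper names are illustrative):

```python
import math

def atc_index(job, t, K, p_bar):
    """ATC priority index I_j(t) = (w_j/p_j) * exp(-max(d_j - p_j - t, 0) / (K * p_bar))."""
    slack = max(job["d"] - job["p"] - t, 0.0)
    return (job["w"] / job["p"]) * math.exp(-slack / (K * p_bar))

def atc_pick(remaining, t, K=2.0):
    """Pick the next job for a free machine: the highest ATC index among remaining jobs."""
    p_bar = sum(j["p"] for j in remaining) / len(remaining)  # average remaining processing time
    return max(remaining, key=lambda j: atc_index(j, t, K, p_bar))

jobs = [{"name": 1, "p": 10, "d": 4,  "w": 14},
        {"name": 2, "p": 10, "d": 2,  "w": 12},
        {"name": 3, "p": 13, "d": 1,  "w": 1},
        {"name": 4, "p": 4,  "d": 12, "w": 12}]
print(atc_pick(jobs, t=0)["name"])
```
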
Special Cases (Check)

- If K is very large: ATC reduces to WSPT
- If K is very small and there are no overdue jobs: ATC reduces to MS
- If K is very small and there are overdue jobs: ATC reduces to WSPT applied to the overdue jobs

Choosing K

- The value of K is determined empirically
- It is related to the due date tightness factor
  $$\tau = 1 - \frac{\bar{d}}{C_{\max}}$$
  and the due date range factor
  $$R = \frac{d_{\max} - d_{\min}}{C_{\max}}$$

Choosing K

- Usually $1.5 \le K \le 4.5$
- Rules of thumb:
  - Fix K = 2 for a single machine or flow shop
  - Fix K = 3 for dynamic job shops
  - Adjust K to reduce the weighted tardiness cost in extremely slack or congested job shops
  - Statistical analysis / empirical experience

Topic 4
Branch-and-Bound & Beam Search

Branch and Bound

- Enumerative method
- Guarantees finding the best schedule
- Basic idea:
  - Look at a set of schedules
  - Develop a bound on their performance
  - Discard (fathom) the set if the bound is worse than the best schedule found before

Classic Result

- The EDD rule is optimal for $1 \,||\, L_{\max}$
- If jobs have different release dates, which we denote $1 \,|\, r_j \,|\, L_{\max}$, then the problem is NP-hard
- What makes it so much more difficult?

Example: $1 \,|\, r_j \,|\, L_{\max}$

[Table and Gantt chart: a three-job instance with release dates $r_j = 0, 3, 5$ and due dates $d_j = 8, 14, 10$, sequenced by the non-delay EDD rule.]

$$L_{\max} = \max(L_1, L_2, L_3) = \max(C_1 - d_1,\, C_2 - d_2,\, C_3 - d_3) = \max(4 - 8,\, 10 - 14,\, 15 - 10) = \max(-4, -4, 5) = 5$$

Can we improve?

Delay Schedule

Add a delay:

[Gantt chart: job 1 runs from 0 to 4, the machine then idles until $r_3 = 5$, job 3 runs next, and job 2 runs last.]

$$L_{\max} = \max(L_1, L_2, L_3) = \max(C_1 - d_1,\, C_2 - d_2,\, C_3 - d_3) = \max(4 - 8,\, 16 - 14,\, 10 - 10) = \max(-4, 2, 0) = 2$$

What makes this problem hard is that the optimal schedule is not necessarily a non-delay schedule.

Final Classic Result

- The preemptive EDD rule is optimal for the preemptive (prmp) version of the problem, $1 \,|\, r_j, prmp \,|\, L_{\max}$
- Note that in the previous example, the preemptive EDD rule gives us the optimal schedule

Branch and Bound

- The problem $1 \,|\, r_j \,|\, L_{\max}$ cannot be solved using a simple dispatching rule, so we will try to solve it using branch and bound
- To develop a branch and bound procedure:
  - Determine how to branch
  - Determine how to bound

Data

Jobs:  1   2   3   4
p_j:   4   2   6   5
r_j:   0   1   3   5
d_j:   8  12  11  10

Branching

[Tree: the root node (•,•,•,•) branches into (1,•,•,•), (2,•,•,•), (3,•,•,•), (4,•,•,•), one node for each choice of first job.]

Branching

[Tree: root (•,•,•,•) with branches (1,•,•,•), (2,•,•,•), (3,•,•,•), (4,•,•,•).]

Discard (3,•,•,•) and (4,•,•,•) immediately because $r_3 = 3$ and $r_4 = 5$: before job 3 or job 4 could even start, job 2 (released at time 1, with $p_2 = 2$) could already have been completed, so these nodes are dominated.

Branching

[Tree: the surviving nodes (1,•,•,•) and (2,•,•,•).]

We need to develop lower bounds on these nodes and do further branching.

Bounding (in general)

- The typical way to develop bounds is to relax the original problem to an easily solvable problem
- Three cases:
  - If there is no solution to the relaxed problem, there is no solution to the original problem
  - If the optimal solution to the relaxed problem is feasible for the original problem, then it is also optimal for the original problem
  - If the optimal solution to the relaxed problem is not feasible for the original problem, it provides a bound on the original problem's performance

Relaxing the Problem

- The problem $1 \,|\, r_j, prmp \,|\, L_{\max}$ is a relaxation of the problem $1 \,|\, r_j \,|\, L_{\max}$
  - Not allowing preemption is a constraint in the original problem but not in the relaxed problem
  - We know how to solve the relaxed problem (preemptive EDD rule)

Bounding

- The preemptive EDD rule is optimal for the preemptive version of the problem
- Thus, the solution it obtains is a lower bound on the maximum lateness of the original problem
- If the preemptive EDD rule happens to produce a non-preemptive schedule, all nodes with higher lower bounds can be discarded

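The preemptive EDD bound is easy to compute. A sketch that simulates the preemptive EDD rule in unit time steps (this assumes integer processing times and release dates; the function name and data layout are illustrative):

```python
def preemptive_edd_lmax(jobs, start_time=0):
    """Value of L_max under preemptive EDD for 1 | r_j, prmp | L_max.

    `jobs` maps a job name to (p_j, r_j, d_j). The result is a lower bound on L_max
    for the non-preemptive problem over the same set of jobs.
    """
    remaining = {j: p for j, (p, r, d) in jobs.items()}      # remaining processing time
    t, lmax = start_time, float("-inf")
    while remaining:
        ready = [j for j in remaining if jobs[j][1] <= t]
        if not ready:
            t = min(jobs[j][1] for j in remaining)           # machine idles until next release
            continue
        j = min(ready, key=lambda j: jobs[j][2])             # earliest due date among ready jobs
        remaining[j] -= 1                                    # process job j for one time unit
        t += 1
        if remaining[j] == 0:
            lmax = max(lmax, t - jobs[j][2])                 # lateness C_j - d_j
            del remaining[j]
    return lmax

# Data from the branch-and-bound example: job -> (p_j, r_j, d_j)
jobs = {1: (4, 0, 8), 2: (2, 1, 12), 3: (6, 3, 11), 4: (5, 5, 10)}
print(preemptive_edd_lmax(jobs))
```
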
Lower Bounds

Start with node (1,•,•,•), so job 1 occupies the interval [0, 4]:
- The job with the earliest due date is job 4, but $r_4 = 5$, so it is not yet available at time 4
- The second earliest due date belongs to job 3, which therefore starts at time 4 (and is preempted when job 4 is released)
- The resulting preemptive EDD schedule gives the lower bound $L_{\max} \ge 5$

[Gantt chart of the preemptive EDD schedule for Jobs 1 to 4 on a time axis from 0 to 20.]

Branching

[Branch-and-bound tree with lower bounds:]
- (1,•,•,•): $L_{\max} \ge 5$
- (2,•,•,•): $L_{\max} \ge 7$
- (3,•,•,•) and (4,•,•,•): discarded earlier
- Branching from (1,•,•,•):
  - (1,2,•,•): $L_{\max} \ge 6$
  - (1,3,•,•): $L_{\max} \ge 5$, and exploring it yields the schedule (1,3,4,2) with $L_{\max} = 5$, which attains the lower bound and is therefore optimal

Beam Search

- Branch and bound:
  - Considers every node
  - Guarantees the optimum
  - Usually too slow
- Beam search:
  - Considers only the most promising nodes (the beam width)
  - Does not guarantee finding the optimum
  - Much faster (see the sketch below)

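A minimal beam search sketch for the weighted tardiness instance introduced on the next slides; each partial sequence is evaluated crudely by completing it with the ATC rule (K = 2), and only the best `beam_width` nodes survive at each level. The evaluation choice and helper names are illustrative:

```python
import math

JOBS = {1: (10, 4, 14), 2: (10, 2, 12), 3: (13, 1, 1), 4: (4, 12, 12)}  # job -> (p_j, d_j, w_j)

def weighted_tardiness(seq):
    """Total weighted tardiness of a complete sequence."""
    t = total = 0
    for j in seq:
        p, d, w = JOBS[j]
        t += p
        total += w * max(t - d, 0)
    return total

def atc_completion(prefix, K=2.0):
    """Crude node evaluation: complete a partial sequence greedily with the ATC rule."""
    seq, t = list(prefix), sum(JOBS[j][0] for j in prefix)
    remaining = [j for j in JOBS if j not in seq]
    while remaining:
        p_bar = sum(JOBS[j][0] for j in remaining) / len(remaining)
        nxt = max(remaining, key=lambda j: (JOBS[j][2] / JOBS[j][0]) *
                  math.exp(-max(JOBS[j][1] - JOBS[j][0] - t, 0) / (K * p_bar)))
        seq.append(nxt)
        t += JOBS[nxt][0]
        remaining.remove(nxt)
    return seq

def beam_search(beam_width=2):
    beam = [()]                                   # level 0: the empty partial sequence
    for _ in range(len(JOBS)):
        children = [prefix + (j,) for prefix in beam for j in JOBS if j not in prefix]
        children.sort(key=lambda c: weighted_tardiness(atc_completion(c)))
        beam = children[:beam_width]              # keep only the most promising nodes
    return min(beam, key=weighted_tardiness)

best = beam_search()
print(best, weighted_tardiness(best))
```
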
Single Machine Example (Total Weighted Tardiness)

Jobs:  1   2   3   4
p_j:  10  10  13   4
d_j:   4   2   1  12
w_j:  14  12   1  12

Branching
(Beam width = 2)

[Tree: root (•,•,•,•) with children (1,•,•,•), (2,•,•,•), (3,•,•,•), (4,•,•,•); each child is evaluated by completing its partial sequence with the ATC rule:]

- (1,•,•,•) → (1,4,2,3), objective 408
- (2,•,•,•) → (2,4,1,3), objective 436
- (3,•,•,•) → (3,4,1,2), objective 814
- (4,•,•,•) → (4,1,2,3), objective 440

Beam Search

[Beam search tree with beam width 2:]
- Level 1: keep (1,•,•,•) and (2,•,•,•)
- Level 2: expand to (1,2,•,•), (1,3,•,•), (1,4,•,•) and (2,1,•,•), (2,3,•,•), (2,4,•,•); keep (1,4,•,•) and (2,4,•,•)
- Level 3: the remaining candidate schedules are (1,4,2,3), (1,4,3,2) and (2,4,1,3), (2,4,3,1)

Discussion

- Implementation tradeoff:
  - Careful evaluation of nodes (accuracy)
  - Crude evaluation of nodes (speed)
- Two-stage procedure:
  - Filtering
    - crude evaluation
    - filter width > beam width
  - Careful evaluation of the remaining nodes

Topic 5
Random Search

Construction versus Improvement Heuristics

- Construction heuristics
  - Start without a schedule
  - Add one job at a time
  - Dispatching rules and beam search
- Improvement heuristics
  - Start with a schedule
  - Try to find a better 'similar' schedule
- The two can be combined

Local Search

0. Start with a schedule $s_0$ and set k = 0.
1. Select a candidate schedule $s' \in N(s_k)$ from the neighborhood of $s_k$.
2. If the acceptance criterion is met, let $s_{k+1} = s'$; otherwise let $s_{k+1} = s_k$.
3. Let k = k + 1 and go back to Step 1.

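These steps translate almost directly into code. A generic skeleton, assuming the caller supplies the `cost`, `neighborhood`, `candidate`, and `accept` functions (all names are illustrative):

```python
def local_search(s0, cost, neighborhood, candidate, accept, iterations=1000):
    """Generic local search following Steps 0-3 above.

    cost(s)              -> objective value of schedule s
    neighborhood(s)      -> list of schedules N(s)
    candidate(neighbors) -> one schedule chosen from the neighborhood
    accept(s, s_new, k)  -> True if the move to s_new should be accepted
    """
    s, best = s0, s0                            # Step 0
    for k in range(iterations):                 # Step 3 loops back to Step 1
        s_new = candidate(neighborhood(s))      # Step 1
        if accept(s, s_new, k):                 # Step 2: acceptance criterion
            s = s_new
        if cost(s) < cost(best):                # remember the best schedule seen so far
            best = s
    return best
```

Plain hill climbing corresponds to accepting only improving moves; simulated annealing and tabu search (Topics 6 and 7) differ mainly in how `accept` and the neighborhood restriction are defined.
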
Defining a Local Search Procedure

- Determine how to represent a schedule
- Define a neighborhood structure
- Determine a candidate selection procedure
- Determine an acceptance/rejection criterion

Neighborhood Structure

- Single machine (see the sketch below):
  - A pairwise interchange of adjacent jobs (n - 1 schedules in the neighborhood)
  - Inserting an arbitrary job in an arbitrary position (about n(n - 1) schedules in the neighborhood)
  - Allow more than one interchange

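A sketch of these two single machine neighborhoods for a sequence stored as a Python list (helper names are illustrative):

```python
def adjacent_interchanges(seq):
    """All schedules obtained by swapping two adjacent jobs: n - 1 neighbors."""
    neighbors = []
    for i in range(len(seq) - 1):
        s = list(seq)
        s[i], s[i + 1] = s[i + 1], s[i]
        neighbors.append(s)
    return neighbors

def insertions(seq):
    """All schedules obtained by removing one job and re-inserting it in another position."""
    neighbors = []
    for i in range(len(seq)):
        for j in range(len(seq)):
            if i == j:
                continue
            s = list(seq)
            job = s.pop(i)
            s.insert(j, job)
            neighbors.append(s)
    return neighbors

print(adjacent_interchanges([1, 2, 3, 4]))   # 3 neighbors
print(len(insertions([1, 2, 3, 4])))         # roughly n(n-1) neighbors, duplicates included
```
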
Neighborhood Structure

- Job shops with the makespan objective:
  - Neighborhood based on critical paths
  - A critical path is a set of operations that starts at t = 0 and ends at t = $C_{\max}$

[Gantt chart of a job shop schedule on Machines 1 to 4, illustrating a critical path.]

Neighborhood Structure

- The critical path(s) define the neighborhood
  - Interchange jobs on a critical path
  - Too few better schedules in the neighborhood
  - Too simple to be useful on its own

One-Step Look-Back Interchange

[Figure, three panels showing Machines h and i: (1) jobs on the critical path, (2) an interchange of jobs on the critical path, (3) a one-step look-back interchange.]

Neighborhood Search

- Find a candidate schedule from the neighborhood:
  - at random, or
  - the one that appears most promising (most improvement in the objective)

Acceptance/Rejection

- The acceptance/rejection criterion is the major difference between most methods
  - Always accept a better schedule?
  - Sometimes accept an inferior schedule?
- Two popular methods:
  - Simulated annealing
  - Tabu search
  - They are the same except for the acceptance criterion!

Topic 6
Simulated Annealing

Simulated Annealing (SA)

- Notation:
  - $S_0$ is the best schedule found so far
  - $S_k$ is the current schedule
  - $G(S_k)$ is the performance (objective value) of a schedule
  - Note that $G(S_k) \ge G(S_0)$; $G(S_0)$ serves as the aspiration criterion

Acceptance Criterion

- Let $S_c$ be the candidate schedule
- If $G(S_c) < G(S_k)$, accept $S_c$ and let $S_{k+1} = S_c$
- If $G(S_c) \ge G(S_k)$, move to $S_c$ with probability
  $$P(S_k, S_c) = \exp\left( \frac{G(S_k) - G(S_c)}{\beta_k} \right)$$
  where the exponent $G(S_k) - G(S_c) \le 0$ and the temperature $\beta_k > 0$

Cooling Schedule

- The temperatures should satisfy
  $$\beta_1 \ge \beta_2 \ge \beta_3 \ge \dots \ge 0$$
- Normally we let $\beta_k \to 0$, but we sometimes stop when some predetermined final temperature is reached
- If the temperature is decreased slowly enough, convergence is guaranteed

SA Algorithm
Step 1:
  Set k = 1 and select the initial temperature $\beta_1$.
  Select an initial schedule $S_1$ and set $S_0 = S_1$.
Step 2:
  Select a candidate schedule $S_c$ from $N(S_k)$.
  If $G(S_0) < G(S_c) < G(S_k)$, set $S_{k+1} = S_c$ and go to Step 3.
  If $G(S_c) < G(S_0)$, set $S_0 = S_{k+1} = S_c$ and go to Step 3.
  If $G(S_c) > G(S_k)$, generate $U_k$ from Uniform(0,1);
  if $U_k \le P(S_k, S_c)$, set $S_{k+1} = S_c$; otherwise set $S_{k+1} = S_k$; go to Step 3.
Step 3:
  Select $\beta_{k+1} \le \beta_k$.
  Let k = k + 1. If k = N, STOP; otherwise go to Step 2.

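A compact Python sketch of this algorithm (slightly simplified relative to the three-case statement above) for the weighted tardiness instance used in these notes, with adjacent pairwise interchanges as the neighborhood and a geometric cooling schedule; the temperature, cooling rate, and iteration count are illustrative choices:

```python
import math
import random

JOBS = {1: (10, 4, 14), 2: (10, 2, 12), 3: (13, 1, 1), 4: (4, 12, 12)}  # job -> (p_j, d_j, w_j)

def G(seq):
    """Total weighted tardiness of a sequence."""
    t = total = 0
    for j in seq:
        p, d, w = JOBS[j]
        t += p
        total += w * max(t - d, 0)
    return total

def random_adjacent_swap(seq):
    """Candidate from N(S_k): swap two adjacent jobs chosen at random."""
    i = random.randrange(len(seq) - 1)
    s = list(seq)
    s[i], s[i + 1] = s[i + 1], s[i]
    return s

def simulated_annealing(s1, beta=10.0, cooling=0.95, n_iter=500):
    s_best, s_k = list(s1), list(s1)                    # Step 1
    for _ in range(n_iter):
        s_c = random_adjacent_swap(s_k)                 # Step 2: candidate from N(S_k)
        if G(s_c) < G(s_k):
            s_k = s_c                                   # accept an improving move
            if G(s_c) < G(s_best):
                s_best = s_c                            # new best schedule found
        elif random.random() <= math.exp((G(s_k) - G(s_c)) / beta):
            s_k = s_c                                   # sometimes accept an inferior schedule
        beta *= cooling                                 # Step 3: lower the temperature
    return s_best

random.seed(0)
best = simulated_annealing([2, 1, 4, 3])
print(best, "objective:", G(best))
```
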
Discussion

- SA has been applied successfully to many industry problems
- It allows us to escape local optima
- Performance depends on
  - the construction of the neighborhood
  - the cooling schedule

Topic 7
Tabu Search

Tabu Search

- Similar to SA, but uses a deterministic acceptance/rejection criterion
- Maintains a tabu list of schedule changes (moves)
  - A move that is made is entered at the top of the tabu list
  - The list has a fixed length (typically 5-9 entries)
  - Neighbors are restricted to schedules that do not require a tabu move

Tabu List

- Rationale
  - Avoid returning to a local optimum
- Disadvantage
  - A tabu move could lead to a better schedule
- Length of the list
  - Too short → cycling ("stuck")
  - Too long → search too constrained

Tabu Search Algorithm
Step 1:
  Set k = 1. Select an initial schedule $S_1$ and set $S_0 = S_1$.
Step 2:
  Select a candidate schedule $S_c$ from $N(S_k)$.
  If the move $S_k \to S_c$ is on the tabu list, set $S_{k+1} = S_k$ and go to Step 3.
  Otherwise, set $S_{k+1} = S_c$ and enter the reverse move $S_c \to S_k$ at the top of the tabu list;
  push all the other entries down (and delete the last one).
  If $G(S_c) < G(S_0)$, set $S_0 = S_c$.
  Go to Step 3.
Step 3:
  Let k = k + 1. If k = N, STOP; otherwise go to Step 2.

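A sketch of the same idea in Python for the weighted tardiness instance that follows, with adjacent pairwise interchanges as the neighborhood and a tabu list of recently swapped job pairs (the list length and iteration count are illustrative):

```python
JOBS = {1: (10, 4, 14), 2: (10, 2, 12), 3: (13, 1, 1), 4: (4, 12, 12)}  # job -> (p_j, d_j, w_j)

def G(seq):
    """Total weighted tardiness of a sequence."""
    t = total = 0
    for j in seq:
        p, d, w = JOBS[j]
        t += p
        total += w * max(t - d, 0)
    return total

def tabu_search(s1, tabu_length=2, n_iter=20):
    s_best, s_k, tabu = list(s1), list(s1), []
    for _ in range(n_iter):
        best_move, best_neighbor = None, None
        for i in range(len(s_k) - 1):                   # all adjacent pairwise interchanges
            move = frozenset((s_k[i], s_k[i + 1]))      # the pair of jobs being swapped
            if move in tabu:
                continue                                # tabu move: not allowed
            s = list(s_k)
            s[i], s[i + 1] = s[i + 1], s[i]
            if best_neighbor is None or G(s) < G(best_neighbor):
                best_move, best_neighbor = move, s
        if best_neighbor is None:
            break                                       # every neighbor is tabu
        s_k = best_neighbor                             # accept the best non-tabu neighbor
        tabu = [best_move] + tabu[:tabu_length - 1]     # push the move onto the tabu list
        if G(s_k) < G(s_best):
            s_best = s_k
    return s_best

best = tabu_search([2, 1, 4, 3])
print(best, "objective:", G(best))
```

Starting from (2,1,4,3) with a tabu list of length 2, this sketch follows the iterations worked out on the next slides.
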
Single Machine Total Weighted Tardiness Example

Jobs:  1   2   3   4
p_j:  10  10  13   4
d_j:   4   2   1  12
w_j:  14  12   1  12

Initialization

- Tabu list empty: L = {}
- Initial schedule $S_1$ = (2,1,4,3)

[Gantt chart on a time axis from 0 to 40, showing each job's tardiness: $T_2 = 10 - 2 = 8$, $T_1 = 20 - 4 = 16$, $T_4 = 24 - 12 = 12$, $T_3 = 37 - 1 = 36$.]

$$\sum w_j T_j = 14 \cdot 16 + 12 \cdot 8 + 1 \cdot 36 + 12 \cdot 12 = 500$$

Neighborhood

- Define the neighborhood as all schedules obtained by adjacent pairwise interchanges
- The neighbors of $S_1$ = (2,1,4,3) are (1,2,4,3), (2,4,1,3), and (2,1,3,4)
- Their objective values are 480, 436, and 652
- Select the best non-tabu schedule

Second Iteration

- Let $S_0 = S_2$ = (2,4,1,3), with $G(S_0)$ = 436
- Let L = {(1,4)}
- New neighborhood:

  Sequence:        (4,2,1,3)  (2,1,4,3)  (2,4,3,1)
  Sum w_j T_j:       460        500        608

Third Iteration

- Let $S_3$ = (4,2,1,3), $S_0$ = (2,4,1,3)
- Let L = {(2,4), (1,4)}
- New neighborhood:

  Sequence:        (2,4,1,3)  (4,1,2,3)  (4,2,3,1)
  Sum w_j T_j:       436        440        632
                    (tabu!)

Fourth Iteration

- Let $S_4$ = (4,1,2,3), $S_0$ = (2,4,1,3)
- Let L = {(1,2), (2,4)}
- New neighborhood:

  Sequence:        (1,4,2,3)  (4,2,1,3)  (4,1,3,2)
  Sum w_j T_j:       408        460        586

Fifth Iteration

- Let $S_5$ = (1,4,2,3), $S_0$ = (1,4,2,3)
- Let L = {(1,4), (1,2)}
- This turns out to be the optimal schedule
- Tabu search still continues
- Notes:
  - The optimal schedule was only found because of the tabu list (the move back to (2,4,1,3) was tabu, forcing an uphill step)
  - We never know whether we have found the optimum!

Topic 8
Genetic Algorithm

Genetic Algorithms

- Schedules are individuals that form populations
- Each individual has an associated fitness
- Fit individuals reproduce and have children in the next generation
- Very fit individuals survive into the next generation
- Some individuals mutate (see the operator sketch below)

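A sketch of two reproduction operators on permutation-encoded schedules: an adjacent-swap mutation and a simple order-preserving cross-over (chosen so that children remain feasible permutations; the operator choices are illustrative, not prescribed by the notes):

```python
import random

def mutate(parent):
    """Mutation: swap two adjacent jobs in the sequence."""
    i = random.randrange(len(parent) - 1)
    child = list(parent)
    child[i], child[i + 1] = child[i + 1], child[i]
    return child

def crossover(parent1, parent2):
    """Order-preserving cross-over: keep a prefix of parent1, then append the remaining
    jobs in parent2's order. This avoids the infeasible (repeated-job) children that a
    naive cut-and-paste cross-over can produce."""
    cut = random.randrange(1, len(parent1))
    head = list(parent1[:cut])
    tail = [j for j in parent2 if j not in head]
    return head + tail

random.seed(1)
print(mutate([2, 1, 3, 4]))
print(crossover([2, 1, 3, 4], [4, 1, 3, 2]))
```
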
Total Weighted Tardiness Single Machine Example

- Initial population selected randomly
- First generation:

  Individual (sequence):   (2,1,3,4)  (3,4,1,2)  (4,1,3,2)
  Fitness (sum w_j T_j):     652        814        586

Mutation Operator

  Individual:  (2,1,3,4)  (3,4,1,2)  (4,1,3,2)
  Fitness:       652        814        586

- The fittest individual, (4,1,3,2), reproduces via mutation, producing the child (4,3,1,2)

Cross-Over Operator

  Individual:  (2,1,3,4)  (4,3,1,2)  (4,1,3,2)
  Fitness:       652        758        586

- The two fittest individuals, (4,1,3,2) and (2,1,3,4), reproduce via cross-over by exchanging their tails, producing the children (4,1,3,4) and (2,1,3,2), which are both infeasible!

Representing Schedules

- When using a GA we often represent individuals using binary strings:

  (0 1 1 0 1 0 1 0 1 0 1 0 0)
  (1 0 0 1 1 0 1 0 1 0 1 1 1)
  (0 1 1 0 1 0 1 0 1 0 1 1 1)

Discussion

- Genetic algorithms have been used very successfully
- Advantages:
  - Very generic
  - Easily programmed
- Disadvantages:
  - A method that exploits special structure (if one exists) is usually faster

Topic 9
Nested Partitions Method

The Nested Partitions Method

- Partitioning of all schedules
  - Similar to branch-and-bound and beam search
- Random sampling and local search are used to find which node is most promising
- Similar to beam search:
  - Retains only one node (beam width = 1)
  - But allows for possible backtracking

Single Machine Total Weighted Tardiness Example

Jobs:  1   2   3   4
p_j:  10  10  13   4
d_j:   4   2   1  12
w_j:  14  12   1  12

First Iteration

[Tree: root (•,•,•,•) partitioned into (1,•,•,•), (2,•,•,•), (3,•,•,•), (4,•,•,•); (1,•,•,•) is the most promising region.]

- Random samples (obtained by uniform sampling, the ATC rule, a genetic algorithm, etc.) from (1,•,•,•): (1,4,2,3) with value 408, (1,2,4,3) with 480, (1,4,3,2) with 554
- Promising index: I(1,•,•,•) = 408

Notation

We let
- s(k) be the most promising region in the k-th iteration
- s_j(k), j = 1, 2, ..., M, be its subregions
- s_{M+1}(k) be the surrounding region
- S denote the set of all possible regions (a fixed set)
- S_0 denote the set of all regions that contain a single schedule
- I(s) be the promising index of a region s ∈ S

Second Iteration

- Most promising region: s(2) = (1,•,•,•)
- Subregions: s_1(2) = (1,2,•,•), s_2(2) = (1,3,•,•), s_3(2) = (1,4,•,•)
- Surrounding region: s_4(2) = {(2,•,•,•), (3,•,•,•), (4,•,•,•)}
- Random samples from the surrounding region: (4,2,1,3) with value 460, (2,4,1,3) with 436, (4,1,3,2) with 586
- Best promising index: I(s_4(2)) = 436

Third Iteration

- Most promising region: s(3) = (1,3,•,•)
- Subregions: s_1(3) = (1,3,2,4), s_2(3) = (1,3,4,2)
- Surrounding region: s_3(3) = {(1,2,•,•), (1,4,•,•), (2,•,•,•), (3,•,•,•), (4,•,•,•)}
- Random samples from the surrounding region: (4,2,3,1) with value 632, (2,4,1,3) with 436, (1,4,2,3) with 408
- Best promising index: I(s_3(3)) = 408

Fourth Iteration

- Backtracking! The most promising region becomes s(4) = (1,•,•,•) again
- Subregions: s_1(4) = (1,2,•,•), s_2(4) = (1,3,•,•), s_3(4) = (1,4,•,•)
- Surrounding region: s_4(4) = {(2,•,•,•), (3,•,•,•), (4,•,•,•)}

Finding the Optimal Schedule

- The sequence $\{s(k)\}_{k \ge 1}$ is a Markov chain
- Eventually the singleton region $s_{opt} \in S_0$ containing the best schedule is visited
  - It is an absorbing state (the chain never leaves $s_{opt}$)
  - The state space is finite
  - → it is visited after finitely many iterations

Discussion

- Finds the best schedule in a finite number of iterations
  - Only branch-and-bound can also guarantee this
- If we can calculate/estimate/guarantee the probability of moving in the right direction, we obtain:
  - the expected number of iterations
  - the probability of having found the optimal schedule
  - stopping rules that guarantee performance (unique among these heuristics)
- Works best if it incorporates dispatching rules and local search (genetic algorithms, etc.)

Topic 10
Mathematical Programming

Review of Mathematical Programming

- Many scheduling problems can be formulated as mathematical programs:
  - Linear programming
  - Nonlinear programming
  - Integer programming
- See Appendix A in the book

Linear Programs
Minimize
$$c_1 x_1 + c_2 x_2 + \dots + c_n x_n$$
subject to
$$a_{11} x_1 + a_{12} x_2 + \dots + a_{1n} x_n \ge b_1$$
$$a_{21} x_1 + a_{22} x_2 + \dots + a_{2n} x_n \ge b_2$$
$$\vdots$$
$$a_{m1} x_1 + a_{m2} x_2 + \dots + a_{mn} x_n \ge b_m$$
$$x_j \ge 0 \quad \text{for } j = 1, \dots, n$$

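As a small illustration, an LP of this form can be solved with SciPy's `linprog` (assuming SciPy is installed; the two-variable instance is made up):

```python
from scipy.optimize import linprog

# minimize 2 x1 + 3 x2  subject to  x1 + x2 >= 4,  x1 + 3 x2 >= 6,  x1, x2 >= 0.
# linprog expects "<=" constraints, so each ">=" row is multiplied by -1.
c = [2, 3]
A_ub = [[-1, -1],
        [-1, -3]]
b_ub = [-4, -6]
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(res.x, res.fun)   # optimal solution and objective value
```
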
Solving LPs

- LPs can be solved efficiently
  - Simplex method (1950s)
  - Interior point methods (1970s)
  - Polynomial time
- LP has been used in practice to solve huge problems

Nonlinear Programs
Minimize
$$f(x_1, x_2, \dots, x_n)$$
subject to
$$g_1(x_1, x_2, \dots, x_n) \le 0$$
$$g_2(x_1, x_2, \dots, x_n) \le 0$$
$$\vdots$$
$$g_m(x_1, x_2, \dots, x_n) \le 0$$

Solving Nonlinear Programs

- Optimality conditions
  - Karush-Kuhn-Tucker conditions
  - Convex objective, convex constraints
- Solution methods
  - Gradient methods (steepest descent, Newton's method)
  - Penalty and barrier function methods
  - Lagrangian relaxation

Integer Programming

- An LP where all variables must be integer
- Mixed-integer programming (MIP): only some of the variables must be integer
- Much more difficult than LP
- Most useful for scheduling

Example: Single Machine

- One machine and n jobs
- Minimize $\sum_{j=1}^{n} w_j C_j$
- Define the decision variables
  $$x_{jt} = \begin{cases} 1 & \text{if job } j \text{ starts at time } t \\ 0 & \text{otherwise} \end{cases}$$

IP Formulation
Minimize
$$\sum_{j=1}^{n} \sum_{t=0}^{C_{\max}-1} w_j (t + p_j)\, x_{jt}$$
subject to
$$\sum_{t=0}^{C_{\max}-1} x_{jt} = 1 \qquad \forall j$$
$$\sum_{j=1}^{n} \; \sum_{s=\max\{t-p_j,\,0\}}^{t-1} x_{js} \le 1 \qquad \forall t$$
$$x_{jt} \in \{0, 1\} \qquad \forall j, t$$

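As an illustration, this time-indexed formulation can be written out with an off-the-shelf MIP modeling library. A sketch assuming the PuLP package (with its bundled CBC solver) is installed; the three-job instance is made up:

```python
import pulp

# Time-indexed IP for 1 || sum w_j C_j on a small illustrative instance.
p = {1: 4, 2: 2, 3: 3}            # processing times
w = {1: 2, 2: 4, 3: 3}            # weights
C_max = sum(p.values())           # horizon: every job can fit in [0, C_max)
T = range(C_max)                  # possible start times 0, ..., C_max - 1

prob = pulp.LpProblem("single_machine_weighted_completion", pulp.LpMinimize)
x = pulp.LpVariable.dicts("x", [(j, t) for j in p for t in T], cat="Binary")

# Objective: sum_j sum_t w_j (t + p_j) x_jt
prob += pulp.lpSum(w[j] * (t + p[j]) * x[j, t] for j in p for t in T)

# Each job starts exactly once
for j in p:
    prob += pulp.lpSum(x[j, t] for t in T) == 1

# At most one job occupies the machine at any time t
for t in range(1, C_max + 1):
    prob += pulp.lpSum(x[j, s] for j in p
                       for s in range(max(t - p[j], 0), t)) <= 1

prob.solve(pulp.PULP_CBC_CMD(msg=0))
starts = {j: t for j in p for t in T if x[j, t].value() > 0.5}
print(starts)   # the optimal start times should follow the WSPT order
```
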
Solving IPs

- Solution methods
  - Cutting plane methods
    - Linear programming relaxation
    - Additional constraints (cuts) to ensure integer solutions
  - Branch-and-bound methods
    - Branch on the decision variables
    - The linear programming relaxation provides bounds
- Generally difficult

Disjunctive Programming

- Conjunctive
  - All constraints must be satisfied
- Disjunctive
  - At least one of the constraints must be satisfied
  - Can be written as zero-one integer programs
  - Used for job-shop scheduling
