IEM (GRAD)
Production Scheduling
2012-02
References:
1. Pinedo, M., Scheduling: Theory, Algorithms, and Systems, 3rd edition,
Prentice-Hall, 2008.
2. Sule, D.R., Industrial Scheduling, PWS Publishing Company, 1997.
3. Morton T.E., and Pentico, D.W., Heuristic Scheduling Systems, Wiley,
1993.
4. Blazewicz, J., Ecker, K.H., Schmidt, G., and Weglarz, J., Scheduling
in Computer and Manufacturing Systems, Springer, 1994.
5. Parker, R.G., Deterministic Scheduling Theory, Chapman & Hall,
1995.
6. Brucker, P., Scheduling Algorithms, 5th ed., Springer, 2007.
7. Baker, K.R., Elements of Sequencing and Scheduling, 1998.
Topics:
1. Classes of Scheduling Problems
2. Computational Complexity
3. General Solution Methods:
(1) Optimization (Integer Programming, Network Programming, Dynamic Programming, Branch-and-Bound)
(2) Approximation (Dispatching Rules, Local Search, Lagrangian Relaxation)
4. Single Machine Models
5. Parallel Machine Models
6. Shop Models: Flow Shops, Job Shops, and Open Shops
7. Other Scheduling Problems
Grading:
Homework 40%, Midterm 40%, Project and Presentation 20%
Introduction
(Diagram: Operations Research ⊃ Mathematical Programming ⊃ Combinatorial Optimization ⊃ Scheduling Theory — scheduling theory as a subfield of combinatorial optimization, which in turn is part of mathematical programming within operations research.)
“Scheduling” handles the problems of optimal assignment, arrangement,
sequencing and timetabling.
Schedule: A plan for the timing of certain activities.
Scheduling: The process of generating schedules.
Scheduling problems: There is a set of tasks to be carried out and a set of
resources available to perform those tasks. The scheduling problem is to
determine the detailed timing of the tasks within the capacity of the resources.
Here,
Tasks: operations in a production process
take offs and landings in an airport
stages in a construction project
executions of a computer program.
Resources: machines in a workshop.
runways in an airport.
crews at a construction site.
CPUs in a computer.
The objective of a scheduling problem:
Ex: Determine a schedule that minimizes the completion time of the last
task.
Ex: Determine a schedule that minimizes the # of tasks completed after
their due dates.
Scheduling exists in: manufacturing
information processing
transportation
distribution
service (ex: nurse scheduling, examination scheduling).
The place of scheduling within an organization:
Figure 1
Framework and Notations:
Here, resources are called machines.
tasks are called jobs.
A feasible schedule requires that:
· a job cannot be processed by two or more machines at a time,
· a machine cannot process two or more jobs at the same time.
Sometimes, jobs may consist of several elementary tasks called operations.
We shall focus on deterministic scheduling models: all information about
machines and jobs are known and fixed. (ex: pij =5)
(Stochastic scheduling): pij ~ N (10,3)
Notations:
n: # of jobs
m: # of machines
pij : processing time of job j on machine i
( p j : single-machine case)
rj : release time( or ready time) of job j
d j : due date of job j
Completing job j after d j is allowed, but a penalty is incurred.
When the due date absolutely must be met, it is called a deadline ( d̄ j ).
w j : weight of job j (relative importance of job j)
A scheduling problem is denoted by a triplet α / β / γ:
α: machine environment
β: processing characteristics and constraints
γ: performance measure
α (machine environment):
Single machine (1)
Identical Machines in parallel ( Pm ): m identical machines in parallel; each
job j requires a single operation on any one of the machines.
Machines in parallel with different speeds (uniform machines) ( Qm ): the
processing time p ij of job j on machine i is p ij = p j / v i, where v i = speed of
machine i ( v i does not depend on the jobs).
Unrelated machines in parallel ( Rm ): the speeds of machines depend on
jobs.( pij is specified for each job j on each machine i)
Flow shop ( Fm ): there are m machines in series. Each job has to be
processed on each one of the m machines. All jobs have the same routing.
Ex: machine1→machine 2→machine 3…→machine m.
(Figure 2: a flow shop with machines m1–m4; jobs 1, 2, 3 visit the machines in the same order.)
What is the total # of feasible schedules in this case?
Permutation Flow Shop ( Fm / prmu /? ): All machines have the same
processing order.
What is the total # of feasible schedules in this case?
Flexible Flow Shop (FFs): Flow shop with parallel machines. There are s
stages in series with a number of machines in parallel at each stage. (Each
job at each stage can be processed by any one of the available machines.)
Job Shop ( J m ): Each job has its own route to follow.
What is the total # of feasible schedules in this case?
Open Shop ( Om ):There are no restrictions with regard to the routing of
each job through the machines.
What is the total # of feasible schedules in this case?
Ex: maintenance, inspection, testing.
:
rj : release time. (if rj does not appear, rj =0 for all j)
s jk : ( sequence dependant setup times) setup time required between jobs j
and k (j→k).
14
(If s jk doesn’t appear, all setup times are zero or sequence independent, in
which case they can be added to their processing times).
(Figure 3: a sequence-dependent setup inserted between jobs j and k.)
(Figure 4: a sequence-independent, fixed setup time between jobs i and k.)
Prmp or Pmtn: (preemption) The schedule is allowed to interrupt the
processing of a job at any time. If prmp does not appear, preemption is not
allowed.
(Figure 5: preemption; job j is interrupted by job k and resumed later.)
Prec: (precedence constraints) One or more jobs may have to
be completed before another job can start its processing.
If each job has at most one immediate predecessor and at most one immediate
successor, the constraints are referred to as chains. If each job has at most one
successor, the constraints are referred to as intrees. If each job has at most one
predecessor, then the constraints are referred to as outtrees.
ex: chains (Figure 6)
ex: intree (Figure 7)
ex: outtree (Figure 8)
M j : machine eligibility constraints.
M j =set of machines that can process job j.
block: blocking (occurs in shops with a limited buffer between 2
successive machines).
(Figure 9: blocking; the buffer between machines m1 and m2 is full, so m1 cannot release its completed job.)
nwt: (no wait) jobs are not allowed to wait between 2 successive machines.
This implies that the starting time of a job at the first machine has to be
delayed.
(Figure 10: no-wait; job 1's start on m1 is delayed (m1 idles) so job 1 can go directly to m2.)
recre: (recirculation or reentry) Jobs may visit a machine more than once.
Ex: Semiconductor industry
p j = p implies that all processing times are equal.
Similarly, d j = d implies that all due dates are equal.
The performance measure of a scheduling problem is always a function of
the completion times of jobs.
Define:
C ij = completion time of job j on machine i
C j = completion time of job j = max_{1≤i≤m} C ij
d j = due date of job j
L j = lateness of job j = C j − d j (may be < 0)
T j = tardiness of job j = max(0, C j − d j) ≥ 0
E j = earliness of job j = max(0, d j − C j) ≥ 0
U j = unit penalty of job j = 1 if C j > d j, 0 otherwise
Performance measure f(C1, C2, …, Cn): we are minimizing f(C1, C2, …, Cn).
γ (performance measure):
(1) Cmax = max(C1, C2, …, Cn) = makespan
min Cmax ⇔ max machine utilization ⇔ max throughput.
(2) Lmax = max(L1, L2, …, Ln)
- min Lmax ⇔ min worst violation of a due date.
Ex:
Schedule A: L1 = 10, L2 = 4, L3 = 1 ⇒ Lmax = 10 (✓)
Schedule B: L1 = 16, L2 = 0, L3 = 0 ⇒ Lmax = 16
(3) ΣWjCj: weighted completion time
ΣWjFj: weighted flowtime
Fj = Cj − rj = flowtime of job j = time job j spent in the system.
※ If rj = 0, then ΣWjCj = ΣWjFj.
Let C̄ = (Σ_{j=1}^{n} Cj)/n and F̄ = (Σ_{j=1}^{n} Fj)/n (average completion time and flowtime).
Then ΣCj = n·C̄ and ΣFj = n·F̄ ∝ holding or inventory cost.
Consider 1 // ΣCj:
Let J(t) = # of jobs in the system at time t.
Assume the jobs are processed in the order 1, 2, …, n.
Note: Cmax = P1 + P2 + … + Pn (constant).
Let J̄ = (∫_0^{Cmax} J(t) dt) / Cmax = time average of J(t) = Area / Cmax.
(Figure: J(t) starts at n and drops by 1 at each completion time C1 = P1, C2 = P1+P2, …, Cn = Cmax.)
Area = C1 + C2 + … + Cn = ΣCj = J̄ · Cmax.
Since Cmax is constant: min ΣCj ⇔ min J̄.
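The identity above can be checked numerically. The following is a small sketch (not from the notes, instance chosen arbitrarily): with all r_j = 0 and integer data, integrating J(t) in unit steps gives exactly ΣCj.

```python
# A small numeric check (a sketch, not from the notes): with all r_j = 0, the
# area under J(t), the number of unfinished jobs at time t, equals sum(Cj).
P = [2, 3, 5]                      # processing times of a tiny 1 // sum(Cj) instance

C, t = [], 0                       # completion times for the given order
for p in P:
    t += p
    C.append(t)

Cmax = C[-1]
# J(t) is a step function; with integer data, unit steps integrate it exactly
area = sum(sum(1 for c in C if c > u) for u in range(Cmax))

print(C, area, sum(C))
```

Here Area = 17 = ΣCj, as the derivation predicts.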
(4) ΣWjTj: weighted tardiness (measures how late the tardy jobs are)
(5) ΣWjUj: weighted number of tardy jobs (measures how many jobs are late)
Ex: Uj ∈ {0, 1}
Schedule A: T1 = 2, T2 = 1, T3 = 1 ⇒ ΣWjTj = 4 (if Wj = 1) (✓), ΣWjUj = 3
Schedule B: T1 = 5, T2 = 0, T3 = 0 ⇒ ΣWjTj = 5 (if Wj = 1), ΣWjUj = 1 (✓)
Ex: 1 / ri, prmp / ΣWiCi
1 / sjk / Cmax
1 / rj / Cmax
Qm / prec / ΣWjTj
O2 / no-wait / ΣCj
We shall focus on static scheduling models:
=> Set of jobs available for scheduling does not change.
Dynamic models: new jobs arrive probabilistically over time.
(Approximation: e.g., a rolling horizon that repeatedly re-solves a static model as new jobs arrive.)
Regular performance measure:
Functions that are non-decreasing in (C1 ,C2 ,…, Cn).
=> wish to complete jobs as early as possible.
That is, if Cj ≤ Cj' for all j, then f(C1, C2, …, Cn) ≤ f(C1', C2', …, Cn'):
the measure can only increase when some completion time increases.
Ex: Cmax, ΣCj, ΣTj, ΣUj.
Non-regular performance measure: Ex:
Σ(Ej + Tj) → JIT (just-in-time)
⇒ early completion of jobs may not be advantageous.
Classes of schedules:
Def:A feasible schedule is called non-delay ( or dense ) if no
machine is kept idle when there is an operation available for processing.
Ex (2 jobs, 2 machines): J1: M1 → M2; J2: M2 → M1.
(Figure: a schedule that keeps a machine idle while an operation is available for it is not non-delay.)
Note: 1. An optimal schedule may NOT be a non-delay schedule.
Ex: job 2, released later than job 3, is more important; it can pay to keep the machine idle until job 2 arrives.
2. Non-delay schedules may lead to unexpected anomalies.
Ex: P2 / prec / Cmax
i   1  2  3  4  5  6  7  8  9  10
Pi  8  7  7  2  3  2  2  8  8  15
Precedence constraints: (figure: precedence graph on jobs 1–10)
1. Optimal schedule: Cmax = 31.
2. If Pj ← Pj − 1, j = 1, 2, …, 10 (every job shortened), the non-delay schedule gives Cmax = 32.
3. If one more machine is added (with the original processing times), the non-delay schedule gives Cmax = 36.
Def: A feasible schedule is called active if no operation can be
completed earlier by altering the processing sequence on the machines
without delaying any other operation.
Ex: J3 // Cmax
Pij   M1  M2  M3
J1     1   3   0
J2     0   3   2
Routes: J1: M1 → M2; J2: M3 → M2.
(Figure: the schedule in which M2 waits for J2 is active but not non-delay.)
What if P12 = 1?
Def: A feasible schedule is called semi-active if no operation can be
completed earlier without altering the processing sequence on any machine.
Ex: J3 // Cmax
(Figure: a schedule with unforced idle time before an operation is not semi-active; the operation can be shifted earlier without changing any sequence.)
- For regular performance measures, an optimal schedule must be an active
schedule. (It may not be non-delay.)
Non-delay ⊆ Active ⊆ Semi-active ⊆ Set of all feasible schedules.
(Venn diagram: for regular measures an optimal schedule (*) lies in the active set, but not necessarily in the non-delay set.)
Complexity:
The complexity of an algorithm is its running time expressed as a function of the
input parameters (e.g., the number of jobs n and the number of machines m).
1. Polynomially Solvable Problems (P): polynomial-time algorithms exist.
Ex: O(n²), O(n log n), O(n³m⁴)
Ex: Bubble sort (in descending order of Pi):
For i = 1 to n−1
    For j = i+1 to n
        If Pi < Pj then swap Pi and Pj    (temp = Pi; Pi = Pj; Pj = temp: 4 operations)
Comparisons: (n−1) + (n−2) + … + 1 = n(n−1)/2, each with at most 4 operations
⇒ about 2(n² − n) operations ⇒ O(n²).
Ex: Quick Sort => O(nlogn)
2. NP-hard Problems: no polynomial-time algorithm is known, and none exists unless P = NP.
Ex: O(2ⁿ), O(nᵐ)
→ n or m appears in the exponent.
→ optimal solution methods: branch-and-bound, dynamic programming
→ NP-hard in the ordinary sense: pseudopolynomial-time algorithms
exist. (Ex: O(m·Σpj))
→ NP-hard in the strong sense: no pseudopolynomial-time
algorithm exists (unless P = NP).
What is the difference?
n      O(n)   O(n log n)   O(n²)       O(2ⁿ)
1      1      0            1           2
10     10     10           100         1024
20     20     26           400         1,048,576
50     50     85           2,500       1,125,899,906,842,624
100    100    200          10,000      1.268 × 10³⁰
1000   1000   3000         1,000,000   1.072 × 10³⁰¹
Note: Age of universe~ 1018 seconds.
Fastest computer today: 1014 operations/second
(Graph: running time versus problem size n; O(2ⁿ) grows far faster than O(n²), O(n), and O(log n).)
Are exponential-time algorithms useless?
→ Maybe we can solve the problem in a reasonable time in most cases?
→ Maybe we can get a good (but not necessarily best possible) solution
in a reasonable amount of time?
Research topics for Scheduling:
→ determine the borderline between polynomially solvable and NP-hard
models
 for polynomially solvable models find the most efficient solution
method (low complexity)
 for NP-hard models
 develop enumerative methods (DP, branch and bound, branch
and cut, ...)
 develop heuristic approaches (priority based, local search, ...)
 consider approximation methods (with quality guarantee)
Problem Solving:
1. Define the problem: real problem → (assumptions, approximation) → problem to solve
2. Mathematical modeling
3. Complexity analysis: P, NP-hard, or open? (NP-hard → approximation)
4. Develop solution methods
5. Performance evaluation: average-case or worst-case
Single Machine Models ( SMM ):
SMM is important:
1. Easy:Optimal or efficient algorithms can be developed.
2. Special cases of other systems:
Results obtained provide a basis for analyzing other complex
systems.
3. More complex systems are often decomposed into simple submodels.
Ex: the "bottleneck" machine.
(Figure: a line of machines in which one machine is the bottleneck; that machine can be studied as a single-machine model.)
In SMM, preemptions are not necessary if all ri = 0, i = 1, …, n.
(Figure: S' splits job a around job b; S runs b and then a contiguously.
Then Ca(S) = Ca(S') and Cb(S) < Cb(S') ⇒ S dominates S'.)
1. 1 // ΣCi ( 1 // C̄ )
Ex: n = 2, P1 = 3, P2 = 9.
Sequence 1→2: C1 = 3, C2 = 12 ⇒ C1 + C2 = 3 + 12 = 15 (✓)
Sequence 2→1: C2 = 9, C1 = 12 ⇒ C2 + C1 = 9 + 12 = 21
Def: SPT (Shortest Processing Time) rule
Jobs are ordered in non-decreasing order of Pi.
Theorem: The SPT rule is optimal for 1 // ΣCi.
pf: By contradiction!
Suppose there exists an optimal schedule S that is NOT in SPT order.
Then there are 2 adjacent jobs j → k in S such that Pj > Pk.
Construct a schedule S' by interchanging j and k only:
S : … j (completes at t + Pj) | k (completes at t + Pj + Pk) …
S': … k (completes at t + Pk) | j (completes at t + Pk + Pj) …
Ci = Ci' for all i ≠ j, k.
S : Cj + Ck = (t + Pj) + (t + Pj + Pk)
S': Ck' + Cj' = (t + Pk) + (t + Pk + Pj)
Pj > Pk ⇒ S' is better than S ⇒ contradiction.
2. 1 // ΣWiCi
Ex: n = 2, P1 = 3, P2 = 9, W1 = 1, W2 = 5.
Sequence 1→2: ΣWiCi = W1C1 + W2C2 = 1·3 + 5·12 = 63
Sequence 2→1: ΣWiCi = W2C2 + W1C1 = 5·9 + 1·12 = 57 (✓)
⇒ weights make a difference.
Def: WSPT (Weighted Shortest Processing Time) rule
Jobs are ordered in non-increasing order of Wi / Pi.
Theorem: The WSPT rule is optimal for 1 // ΣWiCi.
pf: HW#1-1
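The WSPT rule can be checked against brute force on the 2-job instance above. This is an illustrative sketch, not part of the notes:

```python
# A minimal sketch (not from the notes): check the WSPT rule for 1 // sum(WiCi)
# against brute force on the tiny instance P = (3, 9), W = (1, 5).
from itertools import permutations

P = [3, 9]          # processing times
W = [1, 5]          # weights

def weighted_completion(seq):
    t = total = 0
    for j in seq:
        t += P[j]
        total += W[j] * t
    return total

# WSPT: non-increasing Wi / Pi
wspt = sorted(range(len(P)), key=lambda j: W[j] / P[j], reverse=True)
best = min(weighted_completion(list(s)) for s in permutations(range(len(P))))

print(wspt, weighted_completion(wspt), best)
```

WSPT orders job 2 first (W/P = 5/9 > 1/3) and attains the brute-force optimum 57.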
3. 1 /prec/ ∑WiCi
For arbitrary precedence constraints,the problem is NP-hard.
However, the case where precedence constraints take the form of
chains in parallel is easy to solve .
Ex: 1→4→5
2→6
3→8→7→9
(If the precedence graph is not a set of parallel chains, the problem is NP-hard.)
Consider 2 parallel chains:
1 → 2 → … → k            (1)
(k+1) → (k+2) → … → n     (2)
Suppose jobs in one chain have to be completed before switching over
to the other chain. Which chain should go first?
Note: In this case, a chain can be viewed as a single 'huge' job.
Theorem: If (Σ_{i=1}^{k} Wi) / (Σ_{i=1}^{k} Pi) > (Σ_{i=k+1}^{n} Wi) / (Σ_{i=k+1}^{n} Pi), then chain (1) precedes chain (2).
Consider a chain 1 → 2 → … → k, and let l* satisfy
(Σ_{i=1}^{l*} Wi) / (Σ_{i=1}^{l*} Pi) = max_{1≤l≤k} (Σ_{i=1}^{l} Wi) / (Σ_{i=1}^{l} Pi) = ρ-factor of chain 1 → 2 → … → k.
Ex:
i    1  2  3  4
Wi   1  1  2  5
Pi   3  1  4  8
l          1    2    3    4
ΣWi/ΣPi   1/3  2/4  4/8  9/16
l* = 4, ρ-factor = 9/16.
Assume now we do not need to complete all jobs in one chain before
switching to another chain .
Theorem:If l * determines the ρ-factor of one chain then there exists
an opt. schedule that processes jobs 1,2,..,l * continuously without
interruption by jobs from other chains .
Algorithm for 1/chains in parallel / ∑WiCi :
Whenever the machine is freed , select among the remaining chains,
the one with the highest ρ-factor,and process the corresponding jobs
from first job up to the one determining the ρ-factor (l *) continuously .
Ex:
i    1   2   3   4   5   6   7
Wi   6  18  12   8   8  17  18
Pi   3   6   6   5   4   8  10
Precedence constraints:
1→2→3→4
5→6→7
Chain (1): prefix ratios 6/3 = 2, 24/9 ≈ 2.67, 36/15 = 2.4, 44/20 = 2.2 ⇒ l1* = 2, ρ-factor = 2.67.
Chain (2): prefix ratios 8/4 = 2, 25/12 ≈ 2.08, 43/22 ≈ 1.95 ⇒ l2* = 6, ρ-factor = 2.08.
2.67 > 2.08 ⇒ Seq: 1 → 2.
Chain (1) (jobs 3, 4 left): ratios 12/6 = 2, 20/11 ≈ 1.82 ⇒ l1* = 3, ρ-factor = 2.
2.08 > 2 ⇒ Seq: 1 → 2 → 5 → 6.
Chain (2) (job 7 left): ρ-factor = 1.8 (18/10).
2 > 1.8 ⇒ Seq: 1 → 2 → 5 → 6 → 3.
Chain (1) (job 4 left): ρ-factor = 1.6 (8/5).
1.8 > 1.6 ⇒ Seq: 1 → 2 → 5 → 6 → 3 → 7 → 4.
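The ρ-factor algorithm can be sketched in code. The implementation below is an assumed one (not from the notes), run on the worked 7-job example; ties in the ρ-factor prefix are broken toward the longer prefix:

```python
# A sketch (assumed implementation) of the rho-factor algorithm for
# 1 / chains in parallel / sum(WiCi), on the 7-job worked example.
W = {1: 6, 2: 18, 3: 12, 4: 8, 5: 8, 6: 17, 7: 18}
P = {1: 3, 2: 6, 3: 6, 4: 5, 5: 4, 6: 8, 7: 10}
chains = [[1, 2, 3, 4], [5, 6, 7]]          # parallel precedence chains

def rho_prefix(chain):
    """Longest prefix maximizing sum(W)/sum(P), and that maximum ratio."""
    best_l, best_rho, sw, sp = 1, -1.0, 0, 0
    for l, job in enumerate(chain, start=1):
        sw, sp = sw + W[job], sp + P[job]
        if sw / sp >= best_rho:              # >= : take the longer prefix (larger l*)
            best_l, best_rho = l, sw / sp
    return best_l, best_rho

seq = []
while any(chains):
    # pick the remaining chain with the highest rho-factor,
    # then schedule its jobs up to the one determining the rho-factor (l*)
    c = max((c for c in chains if c), key=lambda c: rho_prefix(c)[1])
    l, _ = rho_prefix(c)
    seq.extend(c[:l])
    del c[:l]

print(seq)
```

On this instance the code reproduces the sequence 1 → 2 → 5 → 6 → 3 → 7 → 4 derived above.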
HW#1-2
i
1 2 3 4 5 6 7 8 9 10
Wi 2 6 5 1 7 2 1 4 3 4
Pi 10 8 8 4 5 3 3 9 10 8
Precedence constraints :
1→2→3→4
5→6→7
8→9→10
4. 1 / prec / hmax
hmax = max{h1(C1), h2(C2), …, hn(Cn)}, where hi(Ci) can be any
non-decreasing function of Ci.
ex: hi(Ci) = Ci → 1/prec/Cmax
hi(Ci) = Ci − di → L1, L2, …, Ln → 1/prec/Lmax
hi(Ci) = max(0, Ci − di) → T1, T2, …, Tn → 1/prec/Tmax
Lawler's Backward Dynamic Programming Algorithm for 1 / prec / hmax:
Let J = set of scheduled jobs (they occupy the last positions).
Jc = {1, 2, …, n} \ J = complement of J = set of unscheduled jobs.
J' ⊆ Jc = set of jobs that can be scheduled immediately before J
= set of schedulable jobs (jobs in Jc with no successors in Jc).
The job placed immediately before J completes at time Σ_{i∈Jc} Pi.
Step 1: Set J = ∅, Jc = {1, 2, …, n}, and J' = set of jobs with no successors.
Step 2: Let i* be such that h_{i*}(Σ_{i∈Jc} Pi) = min_{i∈J'} hi(Σ_{i∈Jc} Pi).
Add i* to (the front of) J, delete i* from Jc, and modify J'.
Step 3: If Jc = ∅, STOP; otherwise, go to Step 2.
Ex:
i        1     2     3
Pi       2     3     5
hi(Ci)  1+C1  1.3C2  10
precedence: 1 → 3
Σ_{i∈Jc} Pi = 2 + 3 + 5 = 10
J' = {2, 3}; h3(10) = 10 (ˇ), h2(10) = 1.3 × 10 = 13
⇒ job 3 is scheduled last (completes at 10).
Now Σ_{i∈Jc} Pi = 2 + 3 = 5.
∴ J' = {1, 2}; h1(5) = 1 + 5 = 6 (ˇ), h2(5) = 1.3 × 5 = 6.5
⇒ Opt. schedule: 2 → 1 → 3.
What if there is no precedence constraint? (This algorithm still applies.)
→ Lawler's algorithm has a complexity of O(n²). Why?
→ For 1 // Lmax, Lawler's algorithm reduces to the EDD rule. Why?
With hi(Ci) = Ci − di = Li, i.e., hi(t) = t − di, Step 2 picks
h_{i*}(Σ_{i∈Jc} Pi) = min_{i∈J'} hi(Σ_{i∈Jc} Pi)
⇒ i* ∈ J' is the job with the largest due date (scheduled last).
Theorem:
For 1 // Lmax → EDD is optimal.
For 1 // ΣLi → the SPT rule is optimal.
∵ Σ_{i=1}^{n} Li = Σ_{i=1}^{n} (Ci − di) = Σ_{i=1}^{n} Ci − Σ_{i=1}^{n} di = Σ_{i=1}^{n} Ci − constant.
For 1 // Tmax → the EDD rule is optimal.
∵ Tmax = max(T1, T2, …, Tn);
if Lmax ≤ 0, Tmax = 0; if Lmax > 0, Tmax = Lmax → min Lmax ⇔ min Tmax.
For 1 // ΣUi:Moore’s algorithm
Step1:Index jobs in EDD sequence and place them all in B. Set
A=  . (A and B are 2 sets.)
Step2:Compute completion times (in EDD sequence) of jobs in B. If
no job in B is late,STOP. B is in optimal sequence.
Otherwise, identify the first job that is late in B. Suppose this
is the kth job in B.
Step3:Identify the longest job among the first k jobs in B. Move this
job from B to A and return to step2.
At the end: A = set of late jobs, ΣUi = |A|.
The schedule is obtained by placing the jobs in B (in EDD sequence) first,
followed by the jobs in A (in any sequence).
Ex:
i    1  2  3  4  5
Pi   1  7  6  3  4
di   2  8  9 12 10
→ EDD reindexing:
i'   1' 2' 3' 4' 5'
Pi   1  7  6  4  3
di   2  8  9 10 12
Step 1: B = {1', 2', 3', 4', 5'}, A = ∅.
Step 2:
i'   1' 2' 3' 4' 5'
Ci   1  8 14 18 21    (first late job: 3'; longest among 1', 2', 3' is 2')
di   2  8  9 10 12    ⇒ B = {1', 3', 4', 5'}, A = {2'}
i'   1' 3' 4' 5'
Ci   1  7 11 14       (first late job: 4'; longest among 1', 3', 4' is 3')
di   2  9 10 12       ⇒ B = {1', 4', 5'}, A = {2', 3'}
i'   1' 4' 5'
Ci   1  5  8          ⇒ no job in B is late ⇒ optimal schedule:
di   2 10 12          1' → 4' → 5' → 2' → 3', ΣUi = |A| = 2.
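Moore's algorithm translates directly into code. The sketch below is an assumed implementation (not from the notes), run on the EDD-reindexed instance above (jobs 0..4 standing for 1'..5'):

```python
# A sketch (assumed implementation) of Moore's algorithm for 1 // sum(Ui),
# on the EDD-sorted instance P = (1, 7, 6, 4, 3), d = (2, 8, 9, 10, 12).
P = [1, 7, 6, 4, 3]
d = [2, 8, 9, 10, 12]

B = list(range(len(P)))   # jobs, already in EDD order
A = []                    # late jobs, appended at the end in any order

while True:
    # completion times of the jobs currently in B
    C, t = [], 0
    for j in B:
        t += P[j]
        C.append(t)
    late = [k for k, j in enumerate(B) if C[k] > d[j]]
    if not late:
        break
    k = late[0]                                   # position of the first late job
    longest = max(B[:k + 1], key=lambda j: P[j])  # longest among the first k+1 jobs
    B.remove(longest)
    A.append(longest)

print(B, A, len(A))    # on-time jobs, late jobs, number of tardy jobs
```

On this instance the on-time set is {1', 4', 5'} and two jobs are tardy, matching the worked example.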
HW# 2-1:1 // ΣUi
i 1 2 3 4 5 6 7
Pi 7 8 4 6 5 9 6
di 9 17 18 16 19 21 30
Note:1 // ΣWiUi is NP-hard.
1 // Σ Ti: (classical single machine model)
1. Dominance rules
2. DP algorithm
3. B&B algorithm
1. Dominance Rules:
(1) Adjacent pairwise interchange:
Let S be a schedule where jobs i and j are adjacent, and let S' be the
same as S except that jobs i and j are interchanged.
Let B be the set of jobs processed before jobs i and j, and A the set
of jobs processed after them, with P(B) = Σ_{k∈B} Pk.
S : B | i | j | A    (i completes at P(B)+Pi; j at P(B)+Pi+Pj)
S': B | j | i | A    (j completes at P(B)+Pj; i at P(B)+Pj+Pi)
Note: the contribution of the jobs in B and A is the same in S and S'.
Let Tij = Ti(S) + Tj(S) = max(0, P(B)+Pi−di) + max(0, P(B)+Pi+Pj−dj)
Tji = Tj(S') + Ti(S') = max(0, P(B)+Pj−dj) + max(0, P(B)+Pj+Pi−di)
Ci(S’)
Case 1: agreeable case — Pi ≥ Pj and di ≥ dj.
Case 1.1: P(B) + Pi ≥ di.
Tij = (P(B)+Pi−di) + max(0, P(B)+Pi+Pj−dj)
Tji = max(0, P(B)+Pj−dj) + max(0, P(B)+Pj+Pi−di) = X + Y
Note: Tij ≥ X and Tij ≥ Y (since di ≥ dj).
If X = 0 or Y = 0, then Tij ≥ Tji ⇒ j → i.
Otherwise (X ≠ 0 and Y ≠ 0):
Tji = (P(B)+Pj−dj) + (P(B)+Pj+Pi−di)
Tij = (P(B)+Pi−di) + (P(B)+Pi+Pj−dj)
⇒ Tij − Tji = Pi − Pj ≥ 0 ⇒ j → i.
Case 1.2: P(B) + Pi < di: one can show Tij ≥ Tji ⇒ j → i. (HW#2-2)
Theorem: If jobs i and j are adjacent and Pi ≥ Pj, di ≥ dj ⇒ j → i.
(Global dominance rule)
For any pair of jobs, suppose processing times and due dates are
agreeable; then ΣTi is minimized by the SPT or, equivalently, the EDD rule.
Case 2: not agreeable case; suppose Pi ≥ Pj and di ≤ dj.
Case 2.1: P(B) + Pi ≤ di.
Tij = 0 + max(0, P(B)+Pi+Pj−dj)
Tji = 0 + max(0, P(B)+Pj+Pi−di)
∵ dj ≥ di ⇒ Tij ≤ Tji ⇒ i → j (same as EDD seq.)
Case 2.2: P(B) + Pi > di.
Case 2.2.1: P(B) + Pi + Pj ≤ dj.
Tij = (P(B)+Pi−di) + 0
Tji = 0 + (P(B)+Pj+Pi−di)
⇒ Tji ≥ Tij ⇒ i → j (same as EDD seq.)
Case 2.2.2: P(B) + Pi + Pj > dj ≥ P(B) + Pj, di ≤ dj.
Tij = (P(B)+Pi−di) + (P(B)+Pi+Pj−dj)
Tji = 0 + (P(B)+Pj+Pi−di)
⇒ Tij − Tji = P(B) + Pi − dj
⇒ i → j (EDD seq.) unless P(B) + Pi > dj, in which case j → i (SPT seq.)  (*)
Case 2.2.3: P(B) + Pj > dj.
Tij = (P(B)+Pi−di) + (P(B)+Pi+Pj−dj)
Tji = (P(B)+Pj−dj) + (P(B)+Pj+Pi−di)
⇒ Tij − Tji = Pi − Pj ≥ 0 ⇒ j → i (SPT seq.)
∵ Pj ≤ Pi: P(B) + Pj > dj ⇒ P(B) + Pi > dj.
So the condition P(B) + Pi > dj in case 2.2.2 (*) can be replaced by P(B) + Pj > dj.
※ Summary of NOT agreeable cases:
Use the EDD sequence except when P(B) + Pj > dj, in which case use the SPT sequence.
(Note: job j is the shorter job with the larger due date.)
※ Combining agreeable and NOT agreeable cases,we have the
following theorem:
Theorem:If jobs i and j are candidates to start at time t,the job with
earlier due date should go first,except when
t + min(Pi ,Pj ) > max( di , dj ),in which case the shorter job should go
first. (Local dominance rule)
Agreeable case: Pi ≤ Pj, di ≤ dj ⇒ i → j.
NOT agreeable case: Pi > Pj, di < dj.
Suppose we define the Modified Due Date (MDD) as di' = max(t + Pi, di).
→ agreeable case: Pi ≤ Pj, di ≤ dj ⇒ i → j and di' ≤ dj'.
For the not agreeable case (Pi > Pj, di < dj, dj' = max(t + Pj, dj)):
if t + Pj > dj, then di' = t + Pi > t + Pj = dj' ⇒ j → i
⇒ consistent with the MDD rule.
Note: The theorem (and the MDD rule) gives only a necessary condition:
optimal sequence ⇒ every pair of jobs satisfies the theorem (or the MDD rule).
(There are also non-optimal schedules in which every pair of jobs
satisfies this theorem.)
Hence, it is used as a dominance rule to eliminate non-optimal sequences.
* Constructing a schedule by applying the MDD rule as a dispatching
heuristic may not lead to an optimal sequence.
If we give priority to the job with the earlier modified due date, the
resulting schedule will be consistent with the theorem.
Remark: Global dominance rule versus Local dominance rule
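As an illustration, the MDD dispatching heuristic can be sketched in code. The implementation below is an assumption (not from the notes); the instance is the same 5-job data used in the branch-and-bound example later, where the MDD rule yields ΣTi = 11:

```python
# A sketch (assumed implementation) of the MDD dispatching heuristic for
# 1 // sum(Ti): at each decision time t, pick the unscheduled job with the
# smallest modified due date max(t + Pi, di); ties broken by SPT.
P = [4, 3, 7, 2, 2]
d = [5, 6, 8, 8, 17]

t, seq, total_T, todo = 0, [], 0, set(range(len(P)))
while todo:
    j = min(todo, key=lambda i: (max(t + P[i], d[i]), P[i]))
    todo.remove(j)
    t += P[j]
    seq.append(j + 1)                 # report 1-based job numbers
    total_T += max(0, t - d[j])

print(seq, total_T)
```

On this instance the heuristic produces 1 → 2 → 4 → 3 → 5 with ΣTi = 11 (which happens to be optimal here, though MDD is not optimal in general).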
Dynamic Programming Algorithms for 1 // ΣTi:
DP: iterative decomposition — keep decomposing the problem until the
subproblems are trivially solvable.
Backward DP: decide the last job first.
Classical DP (Held and Karp):
Let J = set of jobs that precede the other jobs, with P(J) = Σ_{i∈J} Pi.
G(J) = minimum total tardiness of the jobs in J (in an optimal sequence of J).
If job j is the last job in J, it is completed at time P(J).
Recursive equation:
G(J) = min_{j∈J} { max(P(J) − dj, 0) + G(J \ {j}) } = min_{j∈J} { Tj + G(J \ {j}) }
Start with |J| = 1, then |J| = 2, …, up to |J| = n.
Ex:
i    1  2  3  4
Pi   5  6  9  8
di   9  7 11 13

Stage 1 (|J| = 1):
J        {1}  {2}  {3}  {4}
P(J)      5    6    9    8
j         1    2    3    4
Tj        0    0    0    0
G(J\j)    0    0    0    0
G(J)      0    0    0    0

Stage 2 (|J| = 2):
J        {1,2} {1,3} {1,4} {2,3} {2,4} {3,4}
P(J)      11    14    13    15    14    17
j         1,2   1,3   1,4   2,3   2,4   3,4
Tj        2,4   5,3   4,0   8,4   7,1   6,4
G(J\j)    0,0   0,0   0,0   0,0   0,0   0,0
G(J)      2     3     0     4     1     4

Stage 3 (|J| = 3):
J        {1,2,3}  {1,2,4}  {1,3,4}  {2,3,4}
P(J)      20       19       22       23
j         1,2,3    1,2,4    1,3,4    2,3,4
Tj        11,13,9  10,12,6  13,11,9  16,12,10
G(J\j)    4,3,2    1,0,2    4,0,3    4,1,4
G(J)      11       8        11       13

Stage 4 (|J| = 4): J = {1,2,3,4}, P(J) = 28
j         1   2   3   4
Tj        19  21  17  15
G(J\j)    13  11  8   11
G(J) = min(32, 32, 25, 26) = 25

min tardiness = 25
Opt. seq = 2 → 1 → 4 → 3
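The subset recursion above is easy to implement with bitmask states. The following is an assumed implementation (not from the notes) of the classical DP, run on the same 4-job example:

```python
# A sketch (assumed implementation) of the classical subset DP (Held-Karp
# style) for 1 // sum(Ti), on the 4-job example P = (5,6,9,8), d = (9,7,11,13).
from itertools import combinations

P = [5, 6, 9, 8]
d = [9, 7, 11, 13]
n = len(P)

G = {0: 0}                                   # G over job subsets, encoded as bitmasks
for size in range(1, n + 1):
    for jobs in combinations(range(n), size):
        S = sum(1 << j for j in jobs)
        t = sum(P[j] for j in jobs)          # P(J): completion time of the last job in J
        # G(J) = min over the choice of last job j in J
        G[S] = min(max(t - d[j], 0) + G[S ^ (1 << j)] for j in jobs)

print(G[(1 << n) - 1])                       # minimum total tardiness
```

The DP returns 25, matching the stage-by-stage tables.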
Note: If a job j satisfies dj ≥ Σ_{i=1}^{n} Pi, then there exists an optimal
sequence in which job j is the last job.
DP Algorithm (Lawler):
Renumber the jobs in EDD sequence (d1 ≤ d2 ≤ … ≤ dk ≤ … ≤ dn).
Assume that Pk = max_{1≤i≤n} Pi.
By the agreeable result: jobs 1, 2, …, k−1 precede job k; some of the jobs in
{k+1, k+2, …, n} may also precede job k.
Note: It can be shown that the agreeable result (Pi ≤ Pj, di ≤ dj ⇒ i → j)
applies not only to adjacent jobs but also to non-adjacent jobs.
Theorem: There exists an integer δ, 0 ≤ δ ≤ n − k, such that there exists an
optimal schedule in which job k is preceded by exactly the jobs j with j ≤ k + δ (j ≠ k).
Ex: δ = 2 ⇒ job k is preceded by jobs k+1 and k+2 (as well as jobs 1, 2, …, k−1).
So at time t, the set of jobs can be decomposed into:
(a) jobs {1, 2, …, k−1, k+1, k+2, …, k+δ}, starting at time t, followed by
(b) job k, followed by
(c) jobs {k+δ+1, k+δ+2, …, n}.
Also, the completion time of job k is t + Σ_{i=1}^{k+δ} Pi.
(Figure: [1, 2, …, k−1, k+1, …, k+δ] | k | [k+δ+1, k+δ+2, …, n], starting at t.)
Let J(j, l, k) = set of jobs {j, j+1, …, l} with processing time ≤ Pk, but
excluding job k even if j ≤ k ≤ l.
Ex: J(2, 5, 3) = {2, 4, 5} if P2 ≤ P3, P4 ≤ P3, P5 ≤ P3.
Ex:
i    1  2  3  4  5
Pi   4 15  9 12 10
⇒ J(1,5,2) = {1, 3, 4, 5}
J(1,2,2) = {1}
J(3,5,2) = {3, 4, 5}
Let V(J(j, l, k), t) = minimum tardiness of J(j, l, k) if it starts at time t.
Initialization:
V(∅, t) = 0
V({j}, t) = max(t + Pj − dj, 0)
Recursive equation:
V(J(j, l, k), t) = min_{0≤δ≤n−k'} { V(J(j, k'+δ, k'), t) + max(0, t + Σ_{i=j}^{k'+δ} Pi − d_{k'}) + V(J(k'+δ+1, l, k'), t + Σ_{i=j}^{k'+δ} Pi) }
where k' is such that P_{k'} = max_{i∈J(j,l,k)} Pi.
Optimal value = V({1, 2, …, n}, 0).
Ex:
i    1    2    3    4    5
Pi   121  79   147  83   130
di   260  266  266  336  337
Step 1: Renumber by the EDD rule. Pk = Pmax = P3; 0 ≤ δ ≤ n − k = 2.
V({1,2,3,4,5}, 0) = min of:
(δ=0): V(J(1,3,3), 0) + max(347 − 266, 0) + V(J(4,5,3), 347)
(δ=1): V(J(1,4,3), 0) + max(430 − 266, 0) + V(J(5,5,3), 430)
(δ=2): V(J(1,5,3), 0) + max(560 − 266, 0) + V(∅, 560)
V(J(1,3,3), 0): 1 → 2 gives C1 = 121 (T1 = 0), C2 = 200 (T2 = 0)
⇒ V(J(1,3,3), 0) = 0 (by the agreeable result).
V(J(4,5,3), 347): 4 → 5 gives C4 = 430, C5 = 560
⇒ V(J(4,5,3), 347) = (430 − 336) + (560 − 337) = 317.
⇒ (δ=0): 0 + 81 + 317 = 398.
(δ=1): V(J(1,4,3), 0): 1→2→4: ΣTi = 0; 2→1→4: ΣTi = 0; 2→4→1: ΣTi = 23 ⇒ V = 0.
V(J(5,5,3), 430) = 560 − 337 = 223.
⇒ (δ=1): 0 + 164 + 223 = 387.
(δ=2): V(J(1,5,3), 0) = 76, V(∅, 560) = 0 ⇒ 76 + 294 + 0 = 370.
⇒ min{398, 387, 370} = 370
⇒ opt. seq = 1 → 2 → 4 → 5 → 3 or 2 → 1 → 4 → 5 → 3.
Note: Lawler's algorithm is more efficient than the previous classical one.
Dominance rules (elimination rules) for branch and bound:
→ Dynamic rules (local): depend on the time t (e.g., the MDD rule).
→ Static rules (global): e.g., the agreeable result (Pi ≤ Pj, di ≤ dj ⇒ i → j).
Emmons' rules for 1 // ΣTi:
There is an optimal schedule in which job i precedes job j if one of the
following conditions holds:
(1) Pi ≤ Pj, and di ≤ max(dj, P(Bj) + Pj)
(2) di ≤ dj, and dj ≥ P(Ai') − Pj
(3) dj ≥ P(Ai')
where Ai = set of jobs that have been shown to follow job i in an optimal
schedule (after set),
Bi = set of jobs that have been shown to precede job i in an optimal
schedule (before set),
Ai' = complement of Ai = {1, 2, …, n} \ Ai, and P(Ai') = Σ_{j∈Ai'} Pj.
Ex:
i    1  2  3  4
Pi   2  4  3  7
di   5  7  8  9
Agreeable result: 1 → 2, 1 → 3, 1 → 4, 2 → 4, 3 → 4 ⇒ 1 ? ? 4.
By the theorem at t = 2: 2 + min(4, 3) = 5 ≤ max(7, 8) ⇒ EDD seq. (2 before 3)
⇒ 1 → 2 → 3 → 4.
By Emmons' rule (2):
d2 = 7 ≤ d3 = 8, A2 = {4}, A2' = {1, 2, 3}, P(A2') = P1 + P2 + P3 = 9,
and d3 = 8 ≥ P(A2') − P3 = 9 − 3 = 6 ⇒ 2 → 3
⇒ 1 → 2 → 3 → 4.
Suppose jobs are numbered in SPT order, with ties broken by the EDD rule.
So, i < j ⇒ Pi < Pj, or (Pi = Pj and di ≤ dj).
Theorem 1: If job 1 satisfies d1 ≤ max(Pi, di) for all i ≠ 1, then job 1 is first in an
optimal schedule.
pf: by the MDD rule, or Emmons' rule (1) with Bj = ∅.
Theorem 2: If job n satisfies max(Pn, dn) ≥ di for all i ≠ n, then job n is last in an
optimal schedule.
Theorem 3: If dj = max_{1≤i≤n} di and dj ≥ (Σ_{k=1}^{n} Pk) − Pj, then job j is last in an
optimal schedule.
pf: (1) for i < j: Pi ≤ Pj and di ≤ dj ⇒ i → j <SPT rule>
⇒ j is last!
(2) for i > j: di ≤ dj and dj ≥ (Σ_{k=1}^{n} Pk) − Pj ⇒ i → j <Emmons' rule (2) with
Ai = ∅>
HW# 3-1: 1//ΣTi
i 1 2 3 4 5 6 7 8 9 10
Pi 6 12 16 23 32 49 61 66 80 97
di 25 73 31 67 32 31 57 64 15 55
Use all the theorems and results to determine an optimal schedule.
Emmons' rules for 1 // ΣWiTi:
There is an optimal schedule in which job i precedes job j if one of the
following conditions holds:
(1) Pi ≤ Pj, Wi ≥ Wj, and di ≤ max(dj, P(Bj) + Pj)
(2) di ≤ dj, Wi ≥ Wj, and dj ≥ P(Ai') − Pj
(3) dj ≥ P(Ai')
Agreeable result: if Pi ≤ Pj, Wi ≥ Wj, and di ≤ dj, then i → j.
Note: For agreeable cases, condition (1) holds.
Branch and Bound Method (divide-and-conquer):
→ Branching: partition a large problem into 2 or more subproblems,
mutually exclusive and exhaustive.
→ Bounding: calculate a lower bound on the optimal solution of a given
subproblem, to guide the search and cut down the search space.
(min problem → lower bounds; max problem → upper bounds)
Branch-and-Bound for 1 // ΣTi:
- determine a sequence ( _ _ _ … _ )
- Backward branching: level k fixes the job in the k-th position from the
end; the full tree has n! leaves.
- Bounding: let S = set of scheduled jobs (occupying the last positions)
and S' its complement; the jobs of S' occupy the interval [0, Σ_{k∈S'} Pk].
Let T(S) = total tardiness of the jobs in S.
Lower bounds:
(1) LB = 0 + T(S)
(2) LB = min_{i∈S'} { max(Σ_{k∈S'} Pk − di, 0) } + T(S)
UB: initial upper bound
- Can use the MDD rule. (It is replaced whenever a better complete
solution is found.)
Ex:
i    1  2  3  4  5
Pi   4  3  7  2  2
di   5  6  8  8 17
Agreeable result: 1 → 3, 2 → 3, 4 → 3, 4 → 5 ⇒ jobs 1, 2, 4 cannot be the last job.
MDD rule ⇒ ΣTi = 11 (UB).
(Branching tree, last position first:)
(· 3): T3 = 18 − 8 = 10 ⇒ LB = 10 + 0 = 10
(· 5): T5 = 1 ⇒ LB = 1 + 8 = 9
(· 5 3): LB = 10 + 0 + 1 = 11 ⇒ fathomed (LB ≥ UB)
(· 4 5): LB = 1 + 8 + 6 = 15 ⇒ fathomed; (· 1 5): LB = 12 ⇒ fathomed; (· 2 5): fathomed
(· 1 3 5): LB = 13 ⇒ fathomed; (· 2 3 5): LB = 12 ⇒ fathomed; (· 4 3 5): LB = 11
Completing (· 4 3 5): 1 → 2 → 4 → 3 → 5 gives ΣTi = 11; 2 → 1 → 4 → 3 → 5 gives 12.
⇒ the optimal schedule is 1 → 2 → 4 → 3 → 5 with ΣTi = 11.
Now consider job release times ( 1 / ri / ΣTi ):
Note: If all jobs have the same release time, there is no need to consider idle
time or preemption.
Ex:
i    ri  Pi  di
1    0   5   7
2    1   2   4
Case 1 (no idle time, no preemption): 1 on [0,5], 2 on [5,7]
⇒ T1 = 0, T2 = 3 ⇒ ΣTi = 3.
Case 2 (with idle time): idle on [0,1], 2 on [1,3], 1 on [3,8]
⇒ T2 = 0, T1 = 1 ⇒ ΣTi = 1.
Case 3 (with preemption): 1 on [0,1], 2 on [1,3], 1 resumes on [3,7]
⇒ C2 = 3, C1 = 7 ⇒ T1 = 0, T2 = 0 ⇒ ΣTi = 0.
1 / ri, prmp / Lmax or Tmax:
Note: The EDD rule is optimal for 1 // Lmax or Tmax.
Preemptive EDD rule: At each decision point, given by a release time or a
job completion time, select the unfinished available job with the earliest due date.
Ex:
i    1  2  3  4
ri   0  1  5  4
di   5  4  8  9
Pi   3  2  3  4
Theorem: The preemptive EDD rule is optimal for 1 / ri, prmp / Lmax or Tmax.
Similarly, the preemptive SPT rule is optimal for 1 / ri, prmp / ΣCi.
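The preemptive EDD rule can be simulated one time unit at a time (valid here because all data are integers). The sketch below is an assumed implementation, using the example data as reconstructed above:

```python
# A sketch (assumed implementation) of the preemptive EDD rule for
# 1 / ri, prmp / Lmax, simulated in unit time steps on the example above.
r = {1: 0, 2: 1, 3: 5, 4: 4}
d = {1: 5, 2: 4, 3: 8, 4: 9}
P = {1: 3, 2: 2, 3: 3, 4: 4}

rem = dict(P)                 # remaining processing time of each job
C, t = {}, 0
while rem:
    avail = [j for j in rem if r[j] <= t]
    if not avail:
        t += 1                # forced idle until the next release
        continue
    j = min(avail, key=lambda i: d[i])   # earliest due date among available jobs
    rem[j] -= 1
    t += 1
    if rem[j] == 0:
        C[j] = t
        del rem[j]

Lmax = max(C[j] - d[j] for j in C)
print(C, Lmax)
```

Job 1 is preempted by job 2 at t = 1 and resumed at t = 3; the completion times are C2 = 3, C1 = 5, C3 = 8, C4 = 12, giving Lmax = 3.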
Earliest Release Time (ERT) rule: Select the job with the earliest release time
(i.e., sort the jobs in non-decreasing order of release times).
Note: The ERT rule is optimal for 1 / ri / Cmax.
Note: The schedule generated by the ERT rule consists of one or more blocks,
separated by idle time.
Ex:
i    1  2  3  4
ri   0  1  9  8
Pi   2  3  3  4
Sort by release times: 1 → 2 → 4 → 3.
Block 1: job 1 on [0,2], job 2 on [2,5]; idle on [5,8];
Block 2: job 4 on [8,12], job 3 on [12,15] ⇒ Cmax = 15.