
CH 15 Production Scheduling


Production Scheduling

Ch 15.1-2

ISE 251

Supplements:
Alharkan, Algorithms for Sequencing and Scheduling
Pinedo, Scheduling, 3rd Ed., Ch 4
The Hierarchy of Production Decisions

 The logical sequence of operations in factory planning corresponds to the sequencing of chapters in the book.
 All planning starts with the demand forecast.
 Demand forecasts are the basis for the top-level (aggregate) planning.
 The Master Production Schedule (MPS) is the result of disaggregating aggregate plans down to the individual item level.
 Based on the MPS, MRP is used to determine the size and timing of component and subassembly production.
 Detailed shop floor schedules are required to meet production plans resulting from the MRP.

(Hierarchy diagram: Demand Forecast → MPS → MRP → Job Scheduling)
Sequencing and Scheduling

 Basic Problem: develop a plan to guide the release of work into the system and coordinate it with the needed resources (e.g., machines, staffing, materials).

 Methods:
– Sequencing:
» Gives the order of releases but not times.
» Adequate for simple CONWIP lines where FISFO (first-in-system, first-out) order is maintained.
» The “CONWIP backlog.”
– Scheduling:
» Gives detailed release times.
» Attractive where complex routings make a simple sequence impractical.
» MRP/Capacity Planning.
Types of Shops

 Job Shop: jobs may visit the processes (Process 1, 2, 3, 4) in different orders, following job-specific routings.

 Flow Shop: every job follows the same routing, Process 1 → Process 2 → Process 3.
Sequencing for Single Machine

 Variables: for each job i

ti : processing time
di : due date
Wi : waiting time
Fi : flow time, Fi = Wi + ti
Li : lateness, Li = Fi − di
Ti : tardiness Ti = max[Li,0]
Ei : earliness Ei = max[-Li,0]

Additional Reference: Alharkan
Objectives in Job Shop Scheduling

 Meet due dates

 Minimize work-in-process (WIP) inventory

 Minimize average flow time

 Maximize machine/worker utilization

 Reduce set-up times for changeovers

 Minimize direct production and labor costs

(Note that these objectives can be conflicting.)
Common Sequencing Rules

 FCFS. First Come First Served.


– Jobs processed in the order they come
to the shop.
 SPT. Shortest Processing Time.
– Jobs with the shortest processing time
are scheduled first.
 EDD. Earliest Due Date.
– Jobs are sequenced according to their
due dates.
 CR. Critical Ratio.
– Compute the ratio of the job’s processing time to the time remaining until its due date.
– Schedule the job with the largest CR value next.
Dispatching Rules

 Prioritize all waiting jobs


– job attributes
– machine attributes
– current time

 Whenever a machine becomes free: select


the job with the highest priority

 Static or dynamic

Example

 Fine grooves with different lengths and widths need to be made in four workpieces with an ultrasonic machine:

Job   Process Time   Due Date
(1)        37            49
(2)        27            36
(3)         1             1
(4)        28            37

Single machine: n! possible sequences.
Sequencing Solutions

 First Come First Served:
– Assuming jobs are numbered in arrival sequence.
– ORDER: 1, 2, 3, 4
– Completion times: 37, 64, 65, 93
– AVG Flow Time: 64.7
 Shortest Processing Time:
– ORDER: 3, 2, 4, 1
– AVG Flow Time: 44.5
 Earliest Due Date:
– ORDER: 3, 2, 1, 4
– AVG Flow Time: 46.8

Job   Process Time   Due Date
(1)        37            29
(2)        27            26
(3)         1             1
(4)        28            37
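For readers who want to check these averages, here is a minimal Python sketch (not part of the original slides) that evaluates the average flow time of each listed order; only the processing times are needed, since flow time does not depend on the due dates.

# Sketch: average flow time of a given single-machine sequence.
proc = {1: 37, 2: 27, 3: 1, 4: 28}

def avg_flow_time(sequence, p):
    """Average flow time when jobs run back-to-back starting at t = 0."""
    t, total = 0, 0
    for job in sequence:
        t += p[job]        # completion (= flow) time of this job
        total += t
    return total / len(sequence)

for name, seq in [("FCFS", [1, 2, 3, 4]),
                  ("SPT",  [3, 2, 4, 1]),
                  ("EDD",  [3, 2, 1, 4])]:
    print(name, avg_flow_time(seq, proc))
# -> FCFS 64.75, SPT 44.5, EDD 46.75 (the slide rounds these to 64.7, 44.5, 46.8)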
Sequencing Solutions

 Critical Ratio: This is dynamic.


 t=0
Job Time Date CR
1 37 49 37/49 = .755
2 27 36 .750
3 1 1 1.0
4 28 37 .757

Sequencing Solutions

 Assume 3 is completed, forward to t=1.


Job   Time   Due Date   CR
 1     37       49      37/(49−1) = .7708
 2     27       36      27/(36−1) = .7714
 4     28       37      28/(37−1) = .7778

 Assume 4 is completed, forward to t=29.

Job   Time   Due Date   CR
 1     37       49      37/(49−29) = 1.85
 2     27       36      27/(36−29) = 3.86

ORDER: 3, 4, 2, 1    AVG Flow Time: 44.8
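A minimal Python sketch (not part of the slides) of the dynamic critical-ratio dispatching traced above; treating a past-due job as highest priority is an assumption made for robustness.

# Sketch: CR dispatching -- at each decision point, start the waiting job
# with the largest (processing time) / (time remaining until due date).
jobs = {1: (37, 49), 2: (27, 36), 3: (1, 1), 4: (28, 37)}   # job: (p, d)

def critical_ratio(job, now):
    p, d = jobs[job]
    slack = d - now
    return p / slack if slack > 0 else float("inf")   # past-due jobs go first

t = 0
order, completions = [], []
remaining = set(jobs)
while remaining:
    nxt = max(remaining, key=lambda j: critical_ratio(j, t))   # largest CR next
    t += jobs[nxt][0]
    order.append(nxt)
    completions.append(t)
    remaining.remove(nxt)

print(order, sum(completions) / len(completions))   # -> [3, 4, 2, 1] 44.75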

Alharkin, Ch 2
Shortest Processing Time (SPT)
Example
 If your goal is to:
Minimize mean flow time
Minimize waiting time
Minimize mean lateness
 Use: SPT
 Fine grooves with different lengths and widths need to be made in five workpieces with an ultrasonic machine:

Job   Process Time   Due Date
(1)         8            17
(2)        14            53
(3)        27            49
(4)        19            32
(5)         3             8
Alharkin, Ch 2
SPT Solution

 Sequence: 5, 1, 2, 4, 3
 Processing Times:
5 1 2 4 3
Time: 3 11 25 44 71

 Average Flow Time: 30.8
 Average Waiting Time: 16.6
 Average Lateness: −1.0    Max Lateness: 22
 Late Jobs: 2

Job   Process Time   Due Date
(1)         8            17
(2)        14            53
(3)        27            49
(4)        19            32
(5)         3             8
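For reference, a short Python sketch (not from the slides) that recomputes these measures directly from the definitions Fi = Wi + ti, Li = Fi − di, Ti = max[Li, 0].

p = {1: 8, 2: 14, 3: 27, 4: 19, 5: 3}
d = {1: 17, 2: 53, 3: 49, 4: 32, 5: 8}
seq = sorted(p, key=p.get)               # SPT order -> [5, 1, 2, 4, 3]

t = 0
flow, wait, late = [], [], []
for j in seq:
    wait.append(t)                       # waiting time before job j starts
    t += p[j]
    flow.append(t)                       # flow time = completion time here
    late.append(t - d[j])                # lateness (negative = early)

n = len(seq)
print(sum(flow) / n, sum(wait) / n)      # -> 30.8 16.6
print(sum(late) / n, max(late))          # -> -1.0 22
print(sum(1 for L in late if L > 0))     # -> 2 tardy jobs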

Release/Due Date Related

 Earliest release date first (ERD) rule


– variance in throughput times
 Earliest due date first (EDD) rule
– maximum lateness
 Minimum slack first (MS) rule
– maximum lateness

Slack of job j at the current time t:  max[ dj − pj − t , 0 ]
(dj = deadline/due date, pj = processing time, t = current time)
Alharkin, Ch 2
EDD Solution

 If your goal is to: Minimize Maximum Lateness

 Use: EDD

 Sequence: 5, 1, 4, 3, 2
 Processing Times:
        5   1   4   3   2
Time:   3  11  30  57  71
 Average Flow Time: 34.4
 Average Waiting Time: 20.2
 Average Lateness: 2.6    Max Lateness: 18
 Late Jobs: 2

Job   Process Time   Due Date
(1)         8            17
(2)        14            53
(3)        27            49
(4)        19            32
(5)         3             8
Minimize Maximum Lateness
with arbitrary job release times, rj

 Lmax mixed-integer optimization model:

min  Lmax
s.t.
  tj ≥ rj                              for all j
  ti ≥ tj + pj − M·yi,j                for all (i, j) pairs
  tj ≥ ti + pi − M·(1 − yi,j)          [ n(n−1)/2 pairs ]
  tj + pj − dj ≤ Lmax                  for all j
  tj ≥ 0,   yi,j ∈ {0, 1},   Lmax unrestricted,   M ≥ Σj pj
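A sketch of this model in Python using the open-source PuLP modeling library; the library choice and the small data set below are assumptions, not part of the course materials.

# Sketch of the Lmax model above with PuLP (any MIP modeler would do).
import itertools
import pulp

p = {1: 8, 2: 14, 3: 27}                 # processing times (hypothetical data)
r = {1: 0, 2: 5, 3: 0}                   # release dates
d = {1: 17, 2: 53, 3: 49}                # due dates
M = sum(p.values()) + max(r.values())    # "big M", large enough here

prob = pulp.LpProblem("Lmax", pulp.LpMinimize)
t = {j: pulp.LpVariable(f"t_{j}", lowBound=0) for j in p}     # start times
Lmax = pulp.LpVariable("Lmax")                                # free variable
y = {(i, j): pulp.LpVariable(f"y_{i}_{j}", cat="Binary")
     for i, j in itertools.combinations(p, 2)}                # n(n-1)/2 pairs

prob += Lmax                                     # objective
for j in p:
    prob += t[j] >= r[j]                         # release dates
    prob += t[j] + p[j] - d[j] <= Lmax           # lateness bounded by Lmax
for i, j in y:                                   # either i before j or j before i
    prob += t[i] + p[i] <= t[j] + M * y[i, j]
    prob += t[j] + p[j] <= t[i] + M * (1 - y[i, j])

prob.solve()
print(pulp.value(Lmax), {j: pulp.value(t[j]) for j in p})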
Alharkin, Ch 2
Sequencing for Single Machine

 If your goal is to:


Minimize number of tardy jobs

 Use: Moore’s (Hodgson’s) Algorithm


1. Sequence jobs by EDD.
2. Find first tardy job in sequence. If none, go to 4.
3. Consider jobs up to and including job found in (2),
reject job with largest processing time. Go to 2.
4. Form schedule with order of sequence in (2) then
add rejected jobs found in (3).
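A minimal Python sketch (not from the slides) of these steps, applied to the five-job example used on the following slides.

def moores(p, d):
    """Moore's (Hodgson's) algorithm: minimize the number of tardy jobs."""
    seq = sorted(p, key=lambda j: d[j])        # step 1: EDD order
    rejected = []
    while True:
        t, first_tardy = 0, None
        for idx, j in enumerate(seq):          # step 2: find the first tardy job
            t += p[j]
            if t > d[j]:
                first_tardy = idx
                break
        if first_tardy is None:                # step 4: append rejected jobs
            return seq + rejected
        # step 3: among jobs up to and including the tardy one,
        # reject the one with the largest processing time
        drop = max(seq[:first_tardy + 1], key=lambda j: p[j])
        seq.remove(drop)
        rejected.append(drop)

p = {1: 8, 2: 14, 3: 27, 4: 19, 5: 3}
d = {1: 17, 2: 53, 3: 49, 4: 32, 5: 8}
print(moores(p, d))        # -> [5, 1, 4, 2, 3]  (only job 3 is tardy)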

Alharkin, Ch 2

Moore’s Algorithm

 Step 1: EDD Sequence: 5, 1, 4, 3, 2


 Processing Times:
5 1 4 3 2
Time: 3 11 30 57 71
 Step 2: First Late Job is Job 3
 Take Job 3 out of the sequence (put at end).
(Note that it will be late regardless now.)
Job   Process Time   Due Date
(1)         8            17
(2)        14            53
(3)        27            49
(4)        19            32
(5)         3             8
Alharkin, Ch 2
Moore’s Algorithm

 New Sequence: 5, 1, 4, 2
 Processing Times:
5 1 4 2
Time:
3 11 30 44
 No late jobs in this sequence.
 Job 3 placed at end. Only one tardy job.

Job   Process Time   Due Date
(1)         8            17
(2)        14            53
(3)        27            49
(4)        19            32
(5)         3             8
Pinedo, Ch 4
Total Earliness and Tardiness

 Definition: Earliness of job i:  Ei = max{ di − Ci , 0 }

 Objective is to minimize total earliness and tardiness:

Σi=1..n Ei + Σi=1..n Ti

 Not a “regular” measure of performance:
– if Ci decreases, earliness could increase
– leads to the possibility of deliberately inserting idle time
– minimizing earliness tends to minimize work-in-process inventory, a core objective in Just-in-Time production scheduling
Total Earliness and Tardiness Pinedo, Ch 4
Common Due Date

 all di = d
 no inserted idle time
– once processing begins, jobs are continually scheduled

 Two sets of jobs can be defined:
– J1 : all jobs with completion times ≤ d
– J2 : all jobs with completion times > d

 Optimal sequencing rule:
– Sequence J1 using WLPT:  w1/p1 ≤ w2/p2 ≤ …
– Sequence J2 using WSPT:  w1/p1 ≥ w2/p2 ≥ …
Total Earliness and Tardiness Pinedo, Ch 4
Common “Loose” Due Date

 If d is large enough such that no job begins at t < 0,


– an optimal schedule exists with a job i with Ci = d
 If all earliness and tardiness penalties = 1,
– order all jobs using LPT
– assign first job from list to J1, next job to J2, and
alternate thereafter until all jobs have been
assigned
– J1 already in LPT order; reverse the order of jobs
in J2 to achieve SPT ordering
 Example (with all weights = 1):

job i         1   2   3   4   5
pi            5   7   4   8  12
LPT order     4   3   5   2   1
set k: Jk     2   1   1   2   1

LPT-ordered J1 : 5-2-3        SPT-ordered J2 : 1-4
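A short Python sketch (not from the slides) of the alternate-assignment step for the unit-weight case, reproducing the example above.

# Sketch: LPT-order the jobs, alternate them between the early set J1 and the
# tardy set J2, then run J1 as given (LPT) followed by J2 reversed (SPT).
p = {1: 5, 2: 7, 3: 4, 4: 8, 5: 12}

lpt = sorted(p, key=p.get, reverse=True)       # -> [5, 4, 2, 1, 3]
J1 = lpt[0::2]                                 # 1st, 3rd, 5th, ... -> [5, 2, 3]
J2 = lpt[1::2][::-1]                           # remaining jobs, SPT -> [1, 4]
print("J1 (LPT):", J1, " J2 (SPT):", J2)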
Total Earliness and Tardiness Pinedo, Ch 4
Common “Loose” Due Date

 Job b in the optimal sequence is completed at the due date, where b is the smallest integer satisfying (with jobs indexed in the order described below)

Σi=1..b ( wi’ + wi’’ )  ≥  Σi=1..n wi’’

 Example:  d = 50,   Σ wi’’ = 53

job i      1    2    3    4    5
pi         5    7    4    8   12
wi’        4    8    8   12   15
wi’’       7    7    4   20   15

• Order jobs by non-increasing wi’ + wi’’ :  4, 5, 2, 3, 1
• b = 1: 32 < 53;   b = 2: 32 + 30 = 62 ≥ 53   →   b = 2, so J1 = {4, 5}
• WLPT J1 : 5-4
• WSPT J2 : 1-2-3 or 1-3-2
Total Earliness and Tardiness Pinedo, Ch 4
Common “Tight” Due Date

 “tight” due date implies a job must begin at t = 0


– an optimal sequence no longer has a job
completing at t = d and no optimal rule-based
procedure exists

 Heuristic procedure for constructing the sequence (equal earliness and tardiness penalties):

0. Order the jobs by LPT.
1. Set t1 ← d,  t2 ← Σ pj − d,  k ← 1.
2. If t1 > t2, assign job k to the earliest available slot and set t1 ← t1 − pk;
   else assign job k to the latest available slot and set t2 ← t2 − pk.
3. If k < n, set k ← k + 1 and go to (2); else STOP.
Total Earliness and Tardiness Pinedo, Ch 4
Common “Tight” Due Date

 Example: d = 15

job i         1   2   3   4   5
pi            5   7   4   8  12
LPT order     4   3   5   2   1

k    t1         t2           decision                       partial seq.
1    15         36−15=21     15<21 → job 5 pushed late      {*,*,*,*,5}
2    15         21−12=9      15>9  → job 4 pushed early     {4,*,*,*,5}
3    15−8=7     9            7<9   → job 2 pushed late      {4,*,*,2,5}
4    7          9−7=2        7>2   → job 1 pushed early     {4,1,*,2,5}
5    2          2            2=2   → job 3 pushed late      {4,1,3,2,5}
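A minimal Python sketch (not from the slides) of the heuristic above, reproducing the d = 15 trace.

p = {1: 5, 2: 7, 3: 4, 4: 8, 5: 12}
d = 15

lpt = sorted(p, key=p.get, reverse=True)      # LPT order: [5, 4, 2, 1, 3]
n = len(lpt)
slots = [None] * n
front, back = 0, n - 1
t1, t2 = d, sum(p.values()) - d               # room before / after the due date
for job in lpt:
    if t1 > t2:
        slots[front] = job                    # push toward the start (early)
        front += 1
        t1 -= p[job]
    else:
        slots[back] = job                     # push toward the end (late)
        back -= 1
        t2 -= p[job]

print(slots)                                  # -> [4, 1, 3, 2, 5]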
Alharkin, Ch 4
Multiple Machines

 General case: n jobs on m machines


 For one machine:
– n jobs can be sequenced in n! ways
 For m machines:
– n jobs can be sequenced in (n!)^m ways
 Due to the large number of solutions, we
often resort to heuristics for solutions
(50!)^5 ≈ 3.05(10)^322

Assuming sequence evaluations on a computer at a rate of 1,000,000,000/sec:
# seconds ≈ 3.05(10)^313
# seconds / yr ≈ 3.154(10)^7
# years to evaluate (50!)^5 ≈ 9.7(10)^305

[Age of universe ≈ 13.8(10)^9 years!!]
Alharkin, Ch 4
Results for Multiple Machines

 The optimal solution for scheduling n jobs on


two machines is always a permutation schedule
– that is, jobs are done in the same order on
both machines.
– This is the basis for Johnson’s algorithm.

 For three machines, a permutation schedule is still optimal if we restrict attention to makespan only.
– Under rare circumstances, the two-machine algorithm can be used to solve the three-machine case.
Gantt Charts

 Way to visualize machine schedules.


 Assume two jobs must be processed on two successive machines. Data:

          Machine 1   Machine 2
Job 1         2           4
Job 2         3           2

Gantt chart (sequence 1, 2):
Machine 1:  job 1 [0–2], job 2 [2–5]
Machine 2:  job 1 [2–6], job 2 [6–8]
Alharkin, Ch 4
Classic Multi Machine Results

 Minimizing “Makespan” on Two Machines: given a set of


jobs that must go through a sequence of two machines,
what sequence will yield the minimum makespan?
– Makespan is sequence dependent.
– Simple algorithm (Johnson 1954):
1. Sort the times of the jobs on the two machines into two lists.
2. Find the shortest time in either list and remove that job from both lists.
 If the time came from the first list, place the job in the first available position.
 If the time came from the second list, place the job in the last available position in the sequence.
3. Repeat until the lists are exhausted.
Johnson’s Algorithm Example

 Data:

Job   Time on M1   Time on M2        List 1 (M1)     List 2 (M2)
 1        4             9             4 (job 1)       5 (job 3)
 2        7            10             6 (job 3)       9 (job 1)
 3        6             5             7 (job 2)      10 (job 2)

 Iteration 1: min time is 4 (job 1 on M1); place this job first and remove it from the lists.
Johnson’s Algorithm Example (cont.)

List 1 (M1)     List 2 (M2)
 6 (job 3)       5 (job 3)
 7 (job 2)      10 (job 2)

 Iteration 2: min time is 5 (job 3 on M2); place this job last and remove it from the lists.

 Iteration 3: the only job left is job 2; place it in the remaining (middle) position.

 Final Sequence: 1-2-3

 Makespan: 28
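A minimal Python sketch (not from the slides) of Johnson's rule, applied to this three-job example.

def johnson(times):
    """times: {job: (time on M1, time on M2)} -> sequence minimizing makespan."""
    front, back = [], []
    for job, (a, b) in sorted(times.items(), key=lambda kv: min(kv[1])):
        if a <= b:
            front.append(job)     # shortest time on M1 -> as early as possible
        else:
            back.insert(0, job)   # shortest time on M2 -> as late as possible
    return front + back

def makespan(seq, times):
    m1 = m2 = 0
    for job in seq:
        a, b = times[job]
        m1 += a
        m2 = max(m2, m1) + b
    return m2

times = {1: (4, 9), 2: (7, 10), 3: (6, 5)}
seq = johnson(times)
print(seq, makespan(seq, times))   # -> [1, 2, 3] 28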

Gantt Chart for Johnson’s Algorithm
Example

Machine 1:  job 1 [0–4], job 2 [4–11], job 3 [11–17]
Machine 2:  job 1 [4–13], job 2 [13–23], job 3 [23–28]

Short task on M1 to “load up” quickly; short task on M2 to “clear out” quickly.
Alternative Algorithmic Form
 Define set U as all jobs with processing time on
machine 1 (t1) < processing time on machine 2
(t2)
– U = {jobs | t1 < t2}

 Define set V as remaining jobs


– V = {jobs | t1 ≥ t2}

 Sequence jobs in U by non-decreasing t1.

 Sequence jobs in V by non-increasing t2.

 The order {U, V} is the optimal sequence.
Example

Job   A = Milling Machine   B = NC Grinder
 1            15                   3
 2             8                  20
 3            18                   5
 4            25                   8
 5            17                  20
 6            22                  30
Solution

Job   Milling Machine   NC Grinder
 1          15                3
 2           8               20
 3          18                5
 4          25                8
 5          17               20
 6          22               30

Order:  U = {2, 5, 6}   V = {4, 3, 1}
Sequence:  2 5 6 4 3 1
Another Example

Job   Milling Machine   NC Grinder
 1          15                3
 2           8               20
 3          18                5
 4          25                8
 5          17               17
 6          17               30

Ties…
Solution

Job   Milling Machine   NC Grinder
 1          15                3
 2           8               20
 3          18                5
 4          25                8
 5          17               17
 6          17               30

Order:  U = {2, 6}   V = {5, 4, 3, 1}
Sequence:  2 6 5 4 3 1
Alharkin, Ch 4
Three Machines

 Permutation schedule optimal for minimizing


total makespan

 True if either:
min Ai ≥ max Bi   or   min Ci ≥ max Bi
– (i.e., if the second machine is not a bottleneck)

 Reduces to a two-machine problem.
Example

Job   Milling   Grinder   Polishing
 1       15         2          1
 2        8         6         14
 3       18         2          3
 4       25         5          5
 5       17        10         20
 6       22        10         20
Machine 2 Bottleneck?

Min Ai = 8
Max Bi = 10
Min Ci = 1

Our solution may not be optimal, as machine 2 may be a bottleneck, but we will proceed anyway…
Solution
Milling Grinder Polishing
1 15 2 1
2 8 6 14
3 18 2 3
4 25 5 5
5 17 10 20
6 22 10 20
Add t1 and t2 and treat as single machine.

Solution
Milling Grinder Polishing
1 15 2 1
2 8 6 14
3 18 2 3
4 25 5 5
5 17 10 20
6 22 10 20
Add t2 and t3 and treat as single machine.

Solution

Job   M1’   M2’
 1     17     3
 2     14    20
 3     20     5
 4     30    10
 5     27    30
 6     32    30

Now treat as typical two machine problem.

Solution

Job   M1’   M2’
 1     17     3
 2     14    20
 3     20     5
 4     30    10
 5     27    30
 6     32    30

Order:  U = {2, 5}   V = {6, 4, 3, 1}
Sequence:  2 5 6 4 3 1
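A Python sketch (not from the slides) of the reduction: build the two pseudo-machines M1’ = t1 + t2 and M2’ = t2 + t3, apply Johnson's rule, then time the sequence on the original three machines.

t = {1: (15, 2, 1), 2: (8, 6, 14), 3: (18, 2, 3),
     4: (25, 5, 5), 5: (17, 10, 20), 6: (22, 10, 20)}

pseudo = {j: (a + b, b + c) for j, (a, b, c) in t.items()}

def johnson(times):
    front, back = [], []
    for job, (a, b) in sorted(times.items(), key=lambda kv: min(kv[1])):
        if a <= b:
            front.append(job)
        else:
            back.insert(0, job)
    return front + back

seq = johnson(pseudo)                  # -> [2, 5, 6, 4, 3, 1]

# makespan of that sequence on the real three machines
done = [0, 0, 0]
for job in seq:
    for m, dur in enumerate(t[job]):
        done[m] = max(done[m], done[m - 1] if m else 0) + dur
print(seq, done[2])                    # -> [2, 5, 6, 4, 3, 1] 108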

Gantt Chart Solution

(Gantt chart: sequence 2, 5, 6, 4, 3, 1 on all three machines; machine 1 completions at 8, 25, 47, 72, 90, 105; the last job finishes on the polisher at 108.)

 No waiting for machine 2, so it is not a bottleneck.
 Makespan = 105 + 2 + 1 = 108 (optimal schedule)
Alharkin, Ch 4
Flowshop: n-Job/m-Machine

 Various heuristics, including:


– Minimize Machine Idle Time
– Palmer’s Method
– The Nawaz Heuristic

– CDS (Campbell, Dudek and Smith)


Procedure

 Not guaranteed to be optimal, but very good solutions, achieved quickly.
CDS Procedure

 Reduce the m-machine problem to (m−1) two-machine problems. For l = 1, 2, …, m−1:

Ai(l) = Σj=1..l ti,j      and      Bi(l) = Σj=1..l ti,(m−j+1)

 For each l, find a sequence by applying Johnson’s rule to (Ai(l), Bi(l)); choose the best of the m−1 sequences.

(For m = 4: Trial 1 pairs machine 1 with machine 4; Trial 2 pairs machines 1+2 with machines 3+4; Trial 3 pairs machines 1+2+3 with machines 2+3+4.)
Example

 Trial 1 (l = 1):

         Machines
Job    1    2    3    4
 1    25   45   52   40
 2     7   41   22   66
 3    41   55   33   21
 4    74   12   24   48
 5     7   15   72   52
 6    12   14   22   32
Example

 Trial 2 (l = 2):

         Machines
Job    1    2    3    4
 1    25   45   52   40
 2     7   41   22   66
 3    41   55   33   21
 4    74   12   24   48
 5     7   15   72   52
 6    12   14   22   32
Example

 Trial 3 (l = 3):

         Machines
Job    1    2    3    4
 1    25   45   52   40
 2     7   41   22   66
 3    41   55   33   21
 4    74   12   24   48
 5     7   15   72   52
 6    12   14   22   32
Solution
         l = 1            l = 2            l = 3
Job   Ai,1   Bi,1      Ai,2   Bi,2      Ai,3   Bi,3
 1     25     40        70     92       122    137
 2      7     66        48     88        70    129
 3     41     21        96     54       129    109
 4     74     48        86     72       110     84
 5      7     52        22    124        94    139
 6     12     32        26     54        48     68

Sequences:   2-5-6-1-4-3     5-6-2-1-4-3     6-2-5-1-3-4
Makespans:       335             353             322
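A Python sketch (not from the slides) of the CDS procedure for this 6-job / 4-machine example: for each l, aggregate the first l and last l machines into a two-machine problem, solve it with Johnson's rule, and keep the sequence with the best true makespan.

t = {1: (25, 45, 52, 40), 2: (7, 41, 22, 66), 3: (41, 55, 33, 21),
     4: (74, 12, 24, 48), 5: (7, 15, 72, 52), 6: (12, 14, 22, 32)}
m = 4

def johnson(times):
    front, back = [], []
    for job, (a, b) in sorted(times.items(), key=lambda kv: min(kv[1])):
        if a <= b:
            front.append(job)
        else:
            back.insert(0, job)
    return front + back

def makespan(seq, times):
    done = [0] * m
    for job in seq:
        for k in range(m):
            done[k] = max(done[k], done[k - 1] if k else 0) + times[job][k]
    return done[-1]

best = None
for l in range(1, m):
    two = {j: (sum(d[:l]), sum(d[m - l:])) for j, d in t.items()}
    seq = johnson(two)
    cmax = makespan(seq, t)
    print(l, seq, cmax)            # trials give 335, 353, 322 as on the slide
    if best is None or cmax < best[1]:
        best = (seq, cmax)
print("best:", best)               # -> ([6, 2, 5, 1, 3, 4], 322)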
Flow Shop Scheduling
 Calculating permutation schedule job completion
times
– tabular format
– let rows be ordered set of machines
– let columns be the jobs ordered as in the solution
sequence

– first row: add job processing time to completion


time of preceding job on first machine (no
inserted idle time possible)
– first column: add processing time of first job on
a machine to the completion time of the first job
on the previous machine
– all other entries: add the processing time of job i on machine k to the maximum of
» completion time of job i on machine k−1
» completion time of the previous job on machine k
Flow Shop Scheduling

Job data:
Job   p1,i   p2,i
 A      6      8
 B     11      6
 C      7      3
 D      9      7
 E      5     10

 First row:       C1,[i] = Σj=1..i p1,[j]       i = 1, …, n
 First column:    Ck,[1] = Σh=1..k ph,[1]       k = 1, …, m
 Other entries:   Ck,[i] = max( Ck−1,[i] , Ck,[i−1] ) + pk,[i]

Opt Seq     E          A           D            B            C
m1          5       5+6=11     11+9=20     20+11=31      31+7=38
m2      5+10=15    15+8=23     23+7=30      31+6=37      38+3=41  ← makespan
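A short Python sketch (not from the slides) of the tabular completion-time calculation for the sequence E, A, D, B, C on two machines.

p = {"A": (6, 8), "B": (11, 6), "C": (7, 3), "D": (9, 7), "E": (5, 10)}
seq = ["E", "A", "D", "B", "C"]

m = 2
C = [[0] * len(seq) for _ in range(m)]     # C[k][i]: job i's finish on machine k
for i, job in enumerate(seq):
    for k in range(m):
        prev_machine = C[k - 1][i] if k else 0     # same job, previous machine
        prev_job = C[k][i - 1] if i else 0         # previous job, same machine
        C[k][i] = max(prev_machine, prev_job) + p[job][k]

for k in range(m):
    print(f"m{k + 1}:", C[k])
# m1: [5, 11, 20, 31, 38]
# m2: [15, 23, 30, 37, 41]   <- makespan 41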
Alharkin, Ch 3
Parallel Machine Scheduling

Situation: n jobs to be processed on m parallel machines, which may be identical or nonidentical.

How do we allocate and sequence the n jobs to optimize a given performance measure?
Parallel Machine Scheduling
Minimizing makespan
o Identical machines and independent jobs
o With preemption

M* = max{ (1/m) Σj=1..n tj ,  maxj tj }

Job              1  2  3  4  5  6  7  8
Processing time  1  2  3  4  5  6  7  8

m = 3:   M* = max{ 36/3 , 8 } = 12

(Wrap-around schedule: machine 1 runs jobs 1–4 and part of job 5; machine 2 finishes job 5, then runs job 6 and part of job 7; machine 3 finishes job 7 and runs job 8; each machine is busy from 0 to 12.)
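The preemptive bound above is attained by the classic wrap-around construction (McNaughton's rule); a minimal Python sketch, not part of the slides.

t = {j: j for j in range(1, 9)}        # processing times 1..8
m = 3
M_star = max(sum(t.values()) / m, max(t.values()))   # -> 12.0

machine, load = 1, 0.0
schedule = {k: [] for k in range(1, m + 1)}
for j, pj in t.items():
    remaining = pj
    while remaining > 1e-9:
        piece = min(remaining, M_star - load)        # fill current machine to M*
        schedule[machine].append((j, load, load + piece))
        load += piece
        remaining -= piece
        if load >= M_star - 1e-9 and machine < m:    # wrap to the next machine
            machine, load = machine + 1, 0.0

print(M_star)
for k, pieces in schedule.items():
    print(k, pieces)
# machine 1: jobs 1-4 plus part of 5; machine 2: rest of 5, 6, part of 7; ...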
Parallel Machine Scheduling
Minimizing makespan (heuristic)

o Identical machines and independent Jobs


o Without preemption

Step 1. Order the jobs by LPT.

Step 2. Assign the jobs, in that order, to the machine with the least current workload.

Job              1  2  3  4  5  6  7  8
Processing time  1  2  3  4  5  6  7  8

LPT order: 8 7 6 5 4 3 2 1    (m = 3)

Machine 1: jobs 8, 3, 2   (load 13)
Machine 2: jobs 7, 4, 1   (load 12)
Machine 3: jobs 6, 5      (load 11)

Makespan = 13
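A minimal Python sketch (not from the slides) of this LPT list-scheduling heuristic.

import heapq

t = {j: j for j in range(1, 9)}        # processing times 1..8
m = 3

loads = [(0, k) for k in range(m)]     # (current load, machine index) min-heap
heapq.heapify(loads)
assignment = {k: [] for k in range(m)}
for j in sorted(t, key=t.get, reverse=True):     # LPT order: 8, 7, 6, ...
    load, k = heapq.heappop(loads)               # least-loaded machine
    assignment[k].append(j)
    heapq.heappush(loads, (load + t[j], k))

print(assignment, max(load for load, _ in loads))
# -> {0: [8, 3, 2], 1: [7, 4, 1], 2: [6, 5]}  makespan 13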
Alharkin, Ch 3
Parallel Machine Scheduling
Minimizing mean flow time

o Identical machines and independent Jobs


o Without preemption

Step 1. Construct the SPT ordering.

Step 2. Assign the next job on the ordered list to the machine with the least amount of processing already allocated.

Job              1  2  3  4  5  6  7  8
Processing time  4  7  6  8  1  4  3  6

SPT order: 5 7 1 6 3 8 2 4    (m = 2)

Machine 1: jobs 5, 1, 3, 2   (completions 1, 5, 11, 18)
Machine 2: jobs 7, 6, 8, 4   (completions 3, 7, 13, 21)
Static Stochastic Scheduling
Practical Worst Case Assumption

Single and Parallel Machine cases.


 Suppose that processing times are random
variables.
 If the objective is to minimize average
weighted flow time, jobs are sequenced
according to expected weighted SPT.
– That is, if job times are t1, t2, . . ., and the
respective weights are w1, w2, . . . then job i
precedes job i+1 if

wi / E(ti) > wi+1 / E(ti+1)


Static Stochastic Scheduling
Practical Worst Case Assumption
 Multiple Parallel Machines
– Requires the assumption that the distribution of job times is exponential (memoryless property).
– Assume parallel processing of n jobs on m machines.
– Then the optimal sequence is to schedule the jobs according to LEPT (longest expected processing time first).

 Flow Shop
– Johnson’s algorithm for scheduling n jobs on two machines in the deterministic case has a natural extension to the stochastic case as long as the job times are exponentially distributed.
– Job i before i+1 if  min(Ai, Bi+1) < min(Ai+1, Bi)
Johnson’s Stochastic Case

 Expected value of the minimum of two exponential random variables (with rates a and b):

E[ min(Ai, Bi+1) ] = 1 / (ai + bi+1)
E[ min(Ai+1, Bi) ] = 1 / (ai+1 + bi)

Job i before i+1 if  E[ min(Ai, Bi+1) ] ≤ E[ min(Ai+1, Bi) ]

 Decision Rule
– Schedule in order of decreasing values of the difference in rates on the two machines; that is, i before i+1 if:

ai − bi ≥ ai+1 − bi+1
Example for Deterministic Data

Job   Milling Machine   NC Grinder
 1          15                3
 2           8               20
 3          18                5
 4          25                8
 5          17               20
 6          22               30

Deterministic (Johnson) sequence from before: 2, 5, 6, 4, 3, 1
Example

Job   Rate on A         Rate on B
 1    1/15 = .0667      1/3  = .333
 2    1/8  = .125       1/20 = .05
 3    1/18 = .056       1/5  = .20
 4    1/25 = .04        1/8  = .125
 5    1/17 = .059       1/20 = .05
 6    1/22 = .045       1/30 = .033
Example

Job      A         B        Diff (A − B)
 1     .0667     .333        −.2667
 2     .125      .05          .075
 3     .056      .20         −.1444
 4     .04       .125        −.085
 5     .059      .05          .00882
 6     .045      .033         .01212

PWC Schedule: 2, 6, 5, 4, 3, 1
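A short Python sketch (not from the slides) of the rate-difference rule: with exponential processing times, sequence jobs by decreasing (rate on A − rate on B).

times = {1: (15, 3), 2: (8, 20), 3: (18, 5),
         4: (25, 8), 5: (17, 20), 6: (22, 30)}   # mean processing times (A, B)

diff = {j: 1 / a - 1 / b for j, (a, b) in times.items()}
order = sorted(diff, key=diff.get, reverse=True)
print({j: round(v, 4) for j, v in diff.items()})
print(order)        # -> [2, 6, 5, 4, 3, 1], the PWC schedule above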
