A Markov Decision Model for
Determining Optimal
Outpatient Scheduling
Jonathan Patrick
Telfer School of Management
University of Ottawa
Motivation
• The unwarranted skeptic and the uncritical enthusiast
• Outpatient clinics in Canada are receiving strong encouragement to switch to open access
• Basic operations research would claim that there is a cost to providing same-day access
• Does the benefit outweigh the costs?
Trade-off
• Any schedule needs to balance system-related benefits and costs (revenue, overtime, idle time, …) against patient-related benefits (access, continuity of care, …).
• Available levers include the decision as to how many new requests to serve today and how many requests to book in advance into each day.
[Diagram: new demand arrives each day and the scheduling decisions allocate it between same-day service and advance bookings into Day 1 through Day 5.]
Literature
• Plenty of evidence that overbooking is advantageous in the presence of no-shows (work by Lawley et al. and by Lawrence et al.)
• Also evidence that a two-day booking window outperforms open access (work by Liu et al. and by Lawrence and Chen)
• The old trade-off between model tractability and model complexity
Model Aims
• To create a model that
  • incorporates a show rate that is dependent on the appointment lead time
  • gives managers the ability to determine
    • the number of new requests to serve today
    • the number of requests to book into each future day (called the Advanced Booking Policy, ABP)
  • allows the policy to depend on the current booking slate and demand
Markov Decision Process Model
• Decision Epochs
  • Made once a day, after today's demand has arrived but before any appointments
• State
  • Current ABP (w), queue size (x) and demand (y)
• Actions
  • How many of today's demand to serve today (b)
  • Whether to change the current ABP (a)
Markov Decision Process Model
• Transitions
  • The stochastic element is new demand
  • The new queue size equals the current queue size (x), minus today's slate (min(x, w)), plus any new demand not serviced today (y − b)
  • New demand is represented by the random variable D
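As a rough sketch of these dynamics (not the paper's exact formulation), the Python fragment below advances the booking system by one day. The Poisson demand distribution, its mean, and the function name are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def next_state(w, x, y, a, b, mean_demand=10.0):
    """Illustrative one-day transition of the booking system.

    w : current ABP (daily advance-booking target)
    x : current queue size (requests already booked ahead)
    y : today's demand
    a : ABP chosen for tomorrow (the ABP-change action)
    b : number of today's requests served today (0 <= b <= y)
    """
    served_from_queue = min(x, w)               # today's slate
    x_next = x - served_from_queue + (y - b)    # unserved demand joins the queue
    y_next = int(rng.poisson(mean_demand))      # stochastic element: new demand D
    return a, x_next, y_next

# Example: ABP w = 11, empty queue, 12 new requests, serve 10 today, keep the ABP.
print(next_state(w=11, x=0, y=12, a=11, b=10))
```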
Markov Decision Process Model
• Costs/Rewards
  • System-related: revenue, overtime, idle time
  • Patient-related: lead time
  • A cost for switching the ABP
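A minimal sketch of how a per-day reward could combine these terms, using the weights f_R, f_OT, f_IT and f_LT that label the clinic types below; the linear form and the ABP-switching weight f_SW are assumptions rather than the paper's exact cost structure.

```python
def daily_reward(patients_seen, overtime_slots, idle_slots, total_lead_time_days,
                 abp_changed, f_R=20.0, f_OT=10.0, f_IT=5.0, f_LT=1.0, f_SW=1.0):
    """Illustrative daily objective: revenue minus overtime, idle-time,
    lead-time and ABP-switching penalties (weights as in the clinic types)."""
    return (f_R * patients_seen              # system: revenue
            - f_OT * overtime_slots          # system: slots worked beyond regular capacity
            - f_IT * idle_slots              # system: unused regular slots
            - f_LT * total_lead_time_days    # patient: days waited for appointments
            - f_SW * (1.0 if abp_changed else 0.0))  # cost of switching the ABP (assumed weight)
```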
Bellman Equation
• Used a discounted (discount factor of 0.99), infinite-horizon model to avoid arbitrary terminal rewards
• Can be solved to optimality
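With state s = (w, x, y), action (a, b), per-day reward r, discount factor λ = 0.99 and random new demand D, the Bellman equation takes roughly the following discounted infinite-horizon form; this is a sketch consistent with the transition described above, and the paper's exact reward decomposition may differ.

```latex
v(w, x, y) = \max_{a,\; 0 \le b \le y} \Big\{ r\big((w,x,y),(a,b)\big)
    + \lambda\, \mathbb{E}_{D}\!\left[ v\big(a,\; x - \min(x,w) + (y - b),\; D\big) \right] \Big\},
\qquad \lambda = 0.99 .
```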
Assumptions/Limitations
• Advance bookings are done on a FCFS basis
• Today's demand arrives before any booking decisions need to be made
• Service times are deterministic
• The show rate depends on the size of the queue at the time of service rather than at the time of booking
• Immediate changes to the ABP may mean that previous bookings need to be shifted
• Does not account for the fact that some bookings have to be made in advance
Clinic Types Considered
Clinic #1: f_R = 20, f_OT = 10, f_IT = 0, f_LT = 0
Clinic #2: f_R = 20, f_OT = 10, f_IT = 0, f_LT = 1
Clinic #3: f_R = 20, f_OT = 10, f_IT = 0, f_LT = 5
Clinic #4: f_R = 20, f_OT = 10, f_IT = 5, f_LT = 0
Clinic #5: f_R = 20, f_OT = 10, f_IT = 5, f_LT = 1
Clinic #6: f_R = 20, f_OT = 10, f_IT = 5, f_LT = 5
Clinic #7: f_R = 0, f_OT = 10, f_IT = 5, f_LT = 0
Clinic #8: f_R = 0, f_OT = 10, f_IT = 5, f_LT = 1
Clinic #9: f_R = 0, f_OT = 10, f_IT = 5, f_LT = 5
Six Scenarios for each Clinic Type
1. Base scenario
   • Demand equal to capacity
   • Show rate based on research by Gallucci
   • All requests can be serviced the same day
2. Demand > Capacity
3. Demand < Capacity
4. Some requests must be booked in advance
5. Same-day bookings given a show probability of 1
6. Show probability with a steeper decline (see the sketch after this list)
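The experiments use show rates estimated from Gallucci's research, which are not reproduced in these slides; the sketch below only illustrates the kind of lead-time-dependent show probability the scenarios vary (a baseline decline, a "same-day patients always show" variant, and a steeper decline). All parameter values are placeholders.

```python
import math

def show_probability(lead_time_days, base=0.85, decay=0.03,
                     same_day_certain=False, steep=False):
    """Illustrative show probability as a function of appointment lead time.
    Parameter values are placeholders, not the estimates used in the paper."""
    if same_day_certain and lead_time_days == 0:
        return 1.0                              # scenario 5: same-day bookings always show
    rate = decay * (3.0 if steep else 1.0)      # scenario 6: steeper decline with lead time
    return base * math.exp(-rate * lead_time_days)

# Example: baseline vs. steep decline for a 3-day lead time.
print(show_probability(3), show_probability(3, steep=True))
```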
Performance Results
• Clinics #1, 2, 3:
  • OA and the MDP policy result in almost identical profits
  • Same-day access ranges from 89% to 100% (max lead time 1 day)
• Clinics #4, 5, 6:
  • MDP slightly outperforms OA (by less than 2%)
  • Same-day access ranges from 84% to 100% (max lead time 2 days)
• Clinics #7, 8, 9:
  • MDP vastly outperforms OA in all scenarios (by as much as 70%)
  • Same-day access ranges from 28% to 98% (max lead time 4 days)
• For all clinics, MDP provides a significant reduction in throughput variation and peak workload
Optimal Policy (base scenario, w = 11, x = 0)
[Charts omitted: the optimal allocation of bookings to Day 1 and Day 2 for clinics with f_R = 0, f_OT = 10, f_IT = 5 and lead-time costs f_LT = 0 and f_LT = 5.]
Performance Trends
• MDP performed best when demand was high (e.g., when demand > capacity and when the same-day show rate was guaranteed)
• MDP approaches OA as the lead-time cost increases
• The presence of revenue makes OA much more attractive
• The maximum booking window in any scenario tested was 4 days
• Even when revenue is present, MDP manages to perform as well as OA by sacrificing some throughput in order to reduce overtime and idle-time costs
Conclusion
• The model provides a booking policy that takes no-shows into account and reacts to congestion in the system
• Simulation results suggest that it achieves better results (same or higher objective, more predictable throughput) than open access, with minimal cost to the patient in terms of lead times
• Enhancements to the model are certainly possible, including stochastic service times, a transition to a continuous-time setting, and the possibility of a multi-doctor clinic
• Currently in discussion with a local clinic to build an enhanced model and test it
Thank You!
Optimal Policy (base scenario, w = 11, x = 0)
f_R = 0, f_OT = 10, f_IT = 5, f_LT = 5
[Table omitted: "Number of New Requests Given Same Day Service" for each level of today's demand y = 0, ..., 20.]
Optimal Policy (base scenario, w = 11, x = 0)
f_R = 0, f_OT = 10, f_IT = 5, f_LT = 0
[Table omitted: "Number of New Requests Given Same Day Service" for each level of today's demand y = 0, ..., 20.]
Average Daily Cost/Profit and Appointment Lead Times
(TH = throughput, OT = overtime, IT = idle time; "Actual" is the average daily cost/profit; the lead-time columns give the percentage of appointments served 0-4 days after the request.)

Scenario | Policy | Lead-time cost | TH | OT | IT | Actual | % diff from OA | 0 days | 1 day | 2 days | 3 days | 4 days
Show Rate with Same Day = 100% | OA | | 100.0% | 12.5% | 12.5% | -18.75 | | | | | |
 | MDP | 0 | 91.6% | 1.0% | 9.4% | -5.67 | 69.8% | 63.89% | 34.76% | 1.34% | 0.01% | 0.00%
 | MDP | 1 | 93.4% | 1.8% | 8.4% | -8.92 | 52.4% | 71.51% | 28.19% | 0.30% | 0.00% | 0.00%
 | MDP | 5 | 97.1% | 6.1% | 9.0% | -16.93 | 9.7% | 87.37% | 12.63% | 0.00% | 0.00% | 0.00%
Increased Demand (Demand = 12) | OA | | 88.0% | 15.7% | 10.1% | -20.81 | | | | | |
 | MDP | 0 | 78.0% | 2.9% | 9.2% | -7.50 | 64.0% | 27.76% | 44.56% | 23.06% | 4.37% | 0.25%
 | MDP | 1 | 83.9% | 6.9% | 6.2% | -14.29 | 31.3% | 64.82% | 34.43% | 0.76% | 0.00% | 0.00%
 | MDP | 5 | 87.8% | 14.7% | 9.4% | -20.67 | 0.7% | 97.91% | 2.09% | 0.00% | 0.00% | 0.00%
Base Case | OA | | 88.0% | 6.9% | 18.9% | -16.34 | | | | | |
 | MDP | 0 | 84.1% | 0.7% | 16.5% | -8.92 | 45.4% | 66.44% | 32.29% | 1.26% | 0.01% | 0.00%
 | MDP | 1 | 85.9% | 1.7% | 15.8% | -11.45 | 26.8% | 81.52% | 18.36% | 0.12% | 0.00% | 0.00%
 | MDP | 5 | 87.4% | 4.5% | 17.1% | -15.64 | -59.5% | 94.89% | 5.11% | 0.00% | 0.00% | 0.00%
Steep Decline | OA | | 88.0% | 6.9% | 18.9% | -16.34 | | | | | |
 | MDP | 0 | 82.7% | 0.8% | 18.1% | -9.81 | 40.0% | 78.01% | 21.83% | 0.16% | 0.00% | 0.00%
 | MDP | 1 | 84.3% | 1.5% | 17.2% | -11.60 | 29.0% | 85.06% | 14.92% | 0.03% | 0.00% | 0.00%
 | MDP | 5 | 86.6% | 4.0% | 17.4% | -15.51 | 5.1% | 94.35% | 5.65% | 0.00% | 0.00% | 0.00%
Advanced Bookings | OA | | 84.6% | 5.6% | 21.0% | -16.16 | | | | | |
 | MDP | 0 | 81.5% | 0.6% | 19.1% | -10.11 | 37.4% | 43.64% | 52.99% | 3.32% | 0.05% | 0.00%
 | MDP | 1 | 82.9% | 1.4% | 18.5% | -12.03 | 25.5% | 55.14% | 44.36% | 0.50% | 0.00% | 0.00%
 | MDP | 5 | 84.2% | 3.7% | 19.6% | -13.89 | 14.0% | 65.98% | 34.01% | 0.00% | 0.00% | 0.00%
Decreased Demand (Demand = 8) | OA | | 88.0% | 2.1% | 31.7% | -17.89 | | | | | |
 | MDP | 0 | 86.9% | 0.0% | 30.5% | -15.28 | 14.6% | 90.39% | 9.60% | 0.01% | 0.00% | 0.00%
 | MDP | 1 | 87.2% | 0.2% | 30.4% | -15.93 | 11.0% | 92.63% | 7.37% | 0.00% | 0.00% | 0.00%
 | MDP | 5 | 87.6% | 0.8% | 30.7% | -17.32 | 3.2% | 97.09% | 2.91% | 0.00% | 0.00% | 0.00%
Average Daily Cost/Profit and Appointment Lead Times
(TH = throughput, OT = overtime, IT = idle time; "Actual" is the average daily cost/profit; the lead-time columns give the percentage of appointments served 0-4 days after the request.)

Scenario | Policy | Lead-time cost | TH | OT | IT | Actual | % diff from OA | 0 days | 1 day | 2 days | 3 days | 4 days
Increased Demand (Demand = 12) | OA | | 88.0% | 15.7% | 10.2% | 190.33 | | | | | |
 | MDP | 0 | 86.2% | 10.3% | 6.8% | 193.21 | 1.5% | 83.88% | 16.12% | 0.00% | 0.00% | 0.00%
 | MDP | 1 | 87.1% | 12.3% | 7.8% | 191.70 | 0.7% | 91.33% | 8.67% | 0.00% | 0.00% | 0.00%
 | MDP | 5 | 88.0% | 15.7% | 10.2% | 190.33 | 0.0% | 100.00% | 0.00% | 0.00% | 0.00% | 0.00%
Show Rate with Same Day = 100% | OA | | 100.0% | 12.5% | 12.5% | 181.25 | | | | | |
 | MDP | 0 | 97.0% | 6.0% | 9.0% | 183.44 | 1.2% | 87.10% | 12.90% | 0.00% | 0.00% | 0.00%
 | MDP | 1 | 98.1% | 8.1% | 10.0% | 182.38 | 0.6% | 91.96% | 8.04% | 0.00% | 0.00% | 0.00%
 | MDP | 5 | 100.0% | 12.5% | 12.5% | 181.25 | 0.0% | 100.00% | 0.00% | 0.00% | 0.00% | 0.00%
Base Case | OA | | 88.0% | 6.9% | 18.9% | 159.63 | | | | | |
 | MDP | 0 | 86.6% | 2.6% | 16.0% | 162.49 | 1.8% | 87.43% | 12.56% | 0.02% | 0.00% | 0.00%
 | MDP | 1 | 86.9% | 3.5% | 16.5% | 161.29 | 1.0% | 91.54% | 8.46% | 0.00% | 0.00% | 0.00%
 | MDP | 5 | 87.8% | 6.2% | 18.3% | 159.61 | 0.0% | 98.64% | 1.36% | 0.00% | 0.00% | 0.00%
Show Rate with Steep Decline | OA | | 88.0% | 6.9% | 18.9% | 159.63 | | | | | |
 | MDP | 0 | 86.6% | 4.0% | 17.4% | 160.46 | 0.5% | 94.35% | 5.65% | 0.00% | 0.00% | 0.00%
 | MDP | 1 | 87.0% | 4.7% | 17.7% | 160.11 | 0.3% | 95.98% | 4.02% | 0.00% | 0.00% | 0.00%
 | MDP | 5 | 88.0% | 6.9% | 18.9% | 159.63 | 0.0% | 100.00% | 0.00% | 0.00% | 0.00% | 0.00%
Advanced Bookings | OA | | 84.6% | 5.6% | 21.0% | 153.03 | | | | | |
 | MDP | 0 | 83.3% | 1.9% | 18.6% | 155.46 | 1.6% | 61.19% | 34.48% | 4.19% | 0.14% | 0.00%
 | MDP | 1 | 83.8% | 2.8% | 19.0% | 154.69 | 1.1% | 63.14% | 36.82% | 0.05% | 0.00% | 0.00%
 | MDP | 5 | 84.4% | 4.8% | 20.4% | 153.05 | 0.0% | 68.61% | 31.39% | 0.00% | 0.00% | 0.00%
Decreased Demand (Demand = 8) | OA | | 88.0% | 2.1% | 31.7% | 122.90 | | | | | |
 | MDP | 0 | 87.5% | 0.5% | 30.5% | 124.21 | 1.1% | 95.87% | 4.13% | 0.00% | 0.00% | 0.00%
 | MDP | 1 | 87.6% | 0.6% | 30.5% | 123.95 | 0.9% | 96.27% | 3.73% | 0.00% | 0.00% | 0.00%
 | MDP | 5 | 87.8% | 1.3% | 31.0% | 123.16 | 0.2% | 98.52% | 1.48% | 0.00% | 0.00% | 0.00%
Average Daily Cost/Profit and Appointment Lead Times
(TH = throughput, OT = overtime, IT = idle time; "Actual" is the average daily cost/profit; the lead-time columns give the percentage of appointments served 0-2 days after the request.)

Scenario | Policy | Lead-time cost | TH | OT | IT | Actual | % diff from OA | 0 days | 1 day | 2 days
Increased Demand (Demand = 12) | OA | | 88.0% | 15.7% | 10.2% | 195.40 | | | |
 | MDP | 0 | 86.7% | 11.4% | 7.4% | 196.69 | 0.7% | 88.59% | 11.41% | 0.00%
 | MDP | 1 | 87.5% | 13.7% | 8.7% | 195.69 | 0.1% | 95.46% | 4.54% | 0.00%
 | MDP | 5 | 88.0% | 15.7% | 10.2% | 195.40 | 0.0% | 100.00% | 0.00% | 0.00%
Show Rate with Same Day = 100% | OA | | 100.0% | 12.5% | 12.5% | 187.50 | | | |
 | MDP | 0 | 98.2% | 8.3% | 10.1% | 188.06 | 0.3% | 92.43% | 7.57% | 0.00%
 | MDP | 1 | 99.1% | 10.1% | 11.1% | 187.58 | 0.0% | 95.82% | 4.18% | 0.00%
 | MDP | 5 | 100.0% | 12.5% | 12.5% | 187.50 | 0.0% | 100.00% | 0.00% | 0.00%
Base Case | OA | | 88.0% | 6.9% | 18.9% | 169.08 | | | |
 | MDP | 0 | 86.8% | 3.1% | 16.3% | 170.51 | 0.8% | 89.82% | 10.18% | 0.00%
 | MDP | 1 | 88.0% | 6.9% | 18.9% | 169.08 | 0.0% | 100.00% | 0.00% | 0.00%
 | MDP | 5 | 88.0% | 6.9% | 18.9% | 169.08 | 0.0% | 100.00% | 0.00% | 0.00%
Show Rate with Steep Decline | OA | | 88.0% | 6.9% | 18.9% | 169.08 | | | |
 | MDP | 0 | 87.2% | 4.9% | 17.8% | 169.39 | 0.2% | 96.48% | 3.52% | 0.00%
 | MDP | 1 | 87.8% | 6.4% | 18.6% | 169.11 | 0.0% | 99.11% | 0.89% | 0.00%
 | MDP | 5 | 88.0% | 6.9% | 18.9% | 169.08 | 0.0% | 100.00% | 0.00% | 0.00%
Advanced Bookings | OA | | 84.6% | 5.6% | 21.0% | 160.60 | | 70.00% | 30.00% | 0.00%
 | MDP | 0 | 83.7% | 2.6% | 18.9% | 164.80 | 2.6% | 62.27% | 37.64% | 0.09%
 | MDP | 1 | 84.0% | 3.4% | 19.4% | 161.16 | 0.3% | 65.12% | 34.87% | 0.01%
 | MDP | 5 | 84.6% | 5.6% | 21.0% | 160.60 | 0.0% | 70.00% | 30.00% | 0.00%
Decreased Demand (Demand = 8) | OA | | 88.0% | 2.1% | 31.7% | 138.73 | | 100.00% | 0.00% | 0.00%
 | MDP | 0 | 87.5% | 0.5% | 30.5% | 139.45 | 0.5% | 96.04% | 3.96% | 0.00%
 | MDP | 1 | 87.6% | 0.7% | 30.7% | 139.12 | 0.3% | 96.74% | 3.26% | 0.00%
 | MDP | 5 | 88.0% | 1.9% | 31.5% | 138.79 | 0.0% | 99.78% | 0.22% | 0.00%