Queuing models

Queuing models
Basic definitions, assumptions, and identities
Operational laws

Little’s law
Queuing networks and Jackson’s theorem
The importance of think time
Non-linear results as saturation approaches
Warnings
References
What problem are we solving?
Describe a system’s mean
Throughput
Response time
Capacity
Predict
Changes in these quantities when system
characteristics change
See Stallings, Figure 1.
[Figure 1: a single queue and server; arriving jobs wait in the queue, are served, and then depart.]
See Stallings, Figure 2 and Table 2.
Basic assumptions
One kind of job
Inter-arrival times are independent of system
state
Service times are independent of system state
No jobs lost because of buffer overflow
Stability: λ < 1 / Ts (a quick numeric check is sketched below)
In a network:
no parallel processing of a given job
visit ratios are independent of system state
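The stability condition can be checked directly; a minimal sketch (the arrival rate and service time below are illustrative, not taken from the slides):

def is_stable(arrival_rate, service_time):
    # Stable only when the arrival rate is below the maximum service rate 1/Ts.
    return arrival_rate < 1.0 / service_time

# Illustrative values: 20 jobs/sec arriving, 10 ms of service per job.
print(is_stable(arrival_rate=20.0, service_time=0.010))  # True, since 20 < 100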
Queuing definitions
A / S / m / B / K / SD
A = Inter-arrival time distribution
S = Service time distribution
m = Number of servers
B = Number of buffers (system capacity)
K = Population size
SD = service discipline
Usually specify just the first three: M/M/1
Usual assumptions
A is often the Exponential distribution
S is often Exponential or constant
B is often infinite (all the buffer space you
need)
K is often infinite
SD is often FCFS (first come, first served)
Exponential distribution
Also known as “memoryless”
Expected time to the next arrival is always
the same, regardless of previous arrivals
When the interarrival times are
independent, identically-distributed, and
the distribution is exponential, the arrival
process is called a “Poisson” process
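As a rough illustration (not from the Stallings text), exponential interarrival times can be sampled directly; their average approaches 1/λ, and the memoryless property means the expected wait to the next arrival does not depend on how long it has been since the last one:

import random

lam = 5.0  # arrival rate, arrivals/sec (an assumed value)
# Exponential interarrival gaps with rate lam; the mean gap approaches 1/lam.
gaps = [random.expovariate(lam) for _ in range(100_000)]
print(sum(gaps) / len(gaps))  # close to 1/lam = 0.2 sec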
Poisson processes
Popular because they are tractable to analyze
You can merge several Poisson streams and get
a Poisson stream
You can split a Poisson stream and get Poisson
streams
Poisson arrivals to a single queue with
exponential service times => departures are
Poisson with same rate
The same is true of departures from an M/M/m queue
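A small sketch of the merge property (the rates of 3 and 7 arrivals/sec are assumed for illustration): superposing two independent Poisson streams yields a Poisson stream whose rate is the sum of the two rates.

import random

def poisson_arrival_times(lam, horizon):
    # Arrival times of a Poisson process with rate lam over [0, horizon).
    t, times = 0.0, []
    while True:
        t += random.expovariate(lam)
        if t >= horizon:
            return times
        times.append(t)

# Merge two independent streams; the observed rate of the merged stream
# is close to 3 + 7 = 10 arrivals/sec.
merged = sorted(poisson_arrival_times(3.0, 1000.0) +
                poisson_arrival_times(7.0, 1000.0))
print(len(merged) / 1000.0)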
Basic formulas: Stallings, Table 3b
Assumptions
Poisson arrivals
No dispatching preference based on
service times
FIFO dispatching
No items discarded from queue
Basic multi-server formulas
Expected response time
For a single M/M/1 queue, expected residence (response) time is 1/(μ − λ), where
μ is the server’s maximum output rate (1/Ts)
λ is the mean arrival rate
Example:
Disk can process 100 accesses/sec
Access requests arrive at 20/sec
Expected response time is 1/(100-20) = 0.0125 sec/access
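The disk example can be reproduced directly from the formula above (a minimal sketch):

def mm1_response_time(lam, mu):
    # Expected residence time for an M/M/1 queue: 1 / (mu - lam).
    assert lam < mu, "unstable: arrival rate must be below the service rate"
    return 1.0 / (mu - lam)

print(mm1_response_time(lam=20.0, mu=100.0))  # 0.0125 sec/access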
Example: near saturation
80% utilization
Disk can do 100 accesses/sec
Access requests arrive at 80/sec
Expected response time: 1/(100-80) = 0.05 sec
90% utilization
Expected response time: 1/(100-90) = 0.1 sec
95% utilization
Expected response time: 1/(100-95) = 0.2 sec
99% utilization
Expected response time: 1/(100-99) = 1 sec
Nearing saturation (M/M/1)
[Chart: expected response time (sec) vs. utilization (%) for the disk example; from 80% to ~99% utilization, response time climbs steeply as the queue nears saturation.]
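The curve can be regenerated from the same 1/(μ − λ) formula (a sketch; the utilization points below match the example above):

mu = 100.0  # disk capacity, accesses/sec
for util in (0.80, 0.90, 0.95, 0.99):
    lam = util * mu
    # Response time blows up as the idle capacity (mu - lam) shrinks.
    print(f"{util:.0%} utilization -> {1.0 / (mu - lam):.3f} sec")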
Operational law
Little’s Law
r = λ Tr (mean number in the system = arrival rate × mean residence time)
Example:
Tr = 0.3 sec average residence in the
system
λ = 10 transactions / sec
r = 3 average transactions in the system
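The same example, computed (a minimal sketch):

def jobs_in_system(arrival_rate, residence_time):
    # Little's Law: mean number in the system r = lambda * Tr.
    return arrival_rate * residence_time

print(jobs_in_system(arrival_rate=10.0, residence_time=0.3))  # 3.0 transactions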
Queuing network
[Diagram: a network of connected queues (Queue 1, Queue 2, Queue 3) through which jobs flow.]
Jackson’s theorem
Assuming
Each node in the network provides an
independent service
Poisson arrivals
Once served at a node, an item goes
immediately to another node, or out of the
system
Then
Each node is an independent queuing system
Each node’s input is Poisson
Mean delays at each node may be added to
compute system delays
Queuing network example
[Diagram: jobs arrive at a Servlet node (Tr = 50 msec) at λ = 12 jobs/sec; with probability P = 0.4 a job then visits the History data node (Tr = 60 msec), and with probability P = 0.6 it visits the Order data node (Tr = 80 msec), then leaves the system.]
Average system residence time
(Jackson’s theorem)
= 0.4 * (50+60) + 0.6 * (50+80)
= 122 msec
Average jobs in system
(Little’s Law)
= 12 * 0.122 = 1.464
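The arithmetic above can be checked with a short calculation (a sketch; node names follow the diagram):

# Per-node residence times in seconds, and branch probabilities after the servlet.
servlet, history, order = 0.050, 0.060, 0.080
p_history, p_order = 0.4, 0.6
lam = 12.0  # jobs/sec entering the system

# Jackson's theorem: delays along each path add; weight paths by their probability.
mean_residence = p_history * (servlet + history) + p_order * (servlet + order)
print(round(mean_residence, 3))        # 0.122 sec

# Little's Law applied to the whole network.
print(round(lam * mean_residence, 3))  # 1.464 jobs in the system on average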
An example in a spreadsheet
Think time
[Diagram: each user alternates between requesting a transaction and a “think time” pause between transactions.]
Think time can have a huge effect on the arrival rate for a system.
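One way to quantify this is the interactive response time law, a consequence of Little's law for a closed system (the user count and think times below are assumed for illustration): with N users, response time R, and think time Z, the arrival rate offered to the system is about N / (R + Z), so longer think times sharply reduce the load.

def offered_arrival_rate(users, response_time, think_time):
    # Closed interactive system: throughput (and hence arrival rate) ~ N / (R + Z).
    return users / (response_time + think_time)

# 100 users, 0.5 sec response time: the offered rate falls as think time grows.
for think in (1.0, 5.0, 30.0):
    print(think, round(offered_arrival_rate(100, 0.5, think), 2))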
Main Reference
Queuing Analysis, William Stallings
Cached copy on the resource page:
http://cs.franklin.edu/~swartoud/650/QueuingAnalysis.pdf