CS61C Midterm #2 Review Session
A little Cache goes a long way
The Ideal Memory System
• Fast
• Cheap (Large)

Actual Memory Systems
• Fast, Expensive (Small)
• Slow, Cheap (Large)

Idea: Multilevel Memory (cache)
• Fast, expensive (small) memory + slow, cheap (large) memory = the appearance of a memory that is both fast and large
The Cache
[Diagram: the CPU accesses a cache holding (Tag, Data) pairs — e.g. (A, Mem[A]) and (B, Mem[B]) — backed by Main Memory]
• Store recently used data in fast memory
• Cache Hit
– Address we’re looking for is in cache
• Cache Miss
– Not found… read memory and insert into cache
• This works because…
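The hit/miss flow above can be sketched as a toy fully-associative cache (a sketch only; the addresses and data values are illustrative, not from the slides):

```python
# Toy cache: a map of tag -> data pairs, backed by slower "main memory".
main_memory = {0x100: "Mem[A]", 0x200: "Mem[B]"}
cache = {}

def read(addr):
    if addr in cache:            # cache hit: address found in fast memory
        return cache[addr], "hit"
    data = main_memory[addr]     # cache miss: read main memory...
    cache[addr] = data           # ...and insert the data into the cache
    return data, "miss"

print(read(0x100))  # ('Mem[A]', 'miss') -- first access misses
print(read(0x100))  # ('Mem[A]', 'hit')  -- recently used, so it hits
```

The second read hits because the first one inserted the data, which is exactly why this scheme pays off only when programs re-reference recent data.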
Locality
Just referenced x:
• Spatial Locality – reference to data near x likely
• Temporal Locality – likely to reference x again soon
[Figure: memory address vs. time, with references clustered in the data, stack, and code regions]
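A simple array-summing loop (an illustrative example, not from the slides) exhibits both kinds of locality at once:

```python
# Summing an array demonstrates both kinds of locality.
data = list(range(1000))

total = 0
for i in range(len(data)):
    # data[i], data[i+1], ... are adjacent in memory -> spatial locality
    # 'total' and 'i' are touched on every iteration -> temporal locality
    total += data[i]

print(total)  # 499500
```

Sequential array elements land in the same cache block (spatial locality), while the loop variables stay cached across iterations (temporal locality).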
Computing Average Access Time
Q: Suppose we have a cache with a 5 ns access time, main memory with a 60 ns access time, and a cache hit rate of 95%. What is the average access time?
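One common convention treats a miss as paying the cache probe plus the main-memory access, giving AMAT = hit time + miss rate × miss penalty. Under that assumption the question works out as:

```python
# Average memory access time (AMAT), assuming a miss pays the cache
# probe plus the main-memory access (one common convention).
def amat(hit_time_ns, mem_time_ns, hit_rate):
    miss_rate = 1.0 - hit_rate
    return hit_time_ns + miss_rate * mem_time_ns

# 5 ns + 5% * 60 ns = 8 ns
print(round(amat(5, 60, 0.95), 3))
```

Other conventions fold the cache probe into the miss penalty (0.95 × 5 + 0.05 × 65), which yields the same 8 ns here; check which formula your course uses.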
Cache Design Issues
• Associativity
– Fully associative, direct-mapped, n-way set associative
• Block Size
• Replacement Strategy
– LRU, etc.
• Write Strategy
– Write-through, write-back
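How associativity and block size interact can be seen in a minimal direct-mapped cache model (a sketch; the sizes and names are illustrative). Each address is split into tag | index | block offset:

```python
# Minimal direct-mapped cache model (illustrative parameters).
BLOCK_SIZE = 16     # bytes per block (block offset = addr % 16)
NUM_LINES  = 8      # lines in the cache

lines = [None] * NUM_LINES   # each line remembers the tag it holds

def access(addr):
    block = addr // BLOCK_SIZE   # which memory block this byte is in
    index = block % NUM_LINES    # which cache line the block maps to
    tag   = block // NUM_LINES   # distinguishes blocks sharing a line
    if lines[index] == tag:
        return "hit"
    lines[index] = tag           # miss: fetch the whole block
    return "miss"

print(access(0x00))  # miss -- cold cache
print(access(0x04))  # hit  -- same 16-byte block (spatial locality)
print(access(0x80))  # miss -- maps to line 0 too, evicting the first block
```

The last access shows why direct mapping can thrash: two addresses that collide on the same line evict each other, which is exactly what higher associativity mitigates.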
An Example
Multiple Choice (1)
• LRU is an effective cache replacement policy primarily because programs
a) exhibit locality of reference
b) usually have small working sets
c) read data much more frequently than they write data
d) can generate addresses that collide in the cache
Multiple Choice (2)
• Increasing the associativity of a cache improves performance primarily because programs
a) exhibit locality of reference
b) usually have small working sets
c) read data much more frequently than they write data
d) can generate addresses that collide in the cache
Multiple Choice (3)
• Increasing the block size of a cache improves performance primarily because programs
a) exhibit locality of reference
b) usually have small working sets
c) read data much more frequently than they write data
d) can generate addresses that collide in the cache