
Associativity

Finally, consider Figure 19, which shows how associativity affects the access and cycle times of a 16KB and a 64KB cache. As can be seen, there is a significant step between a direct-mapped and a 2-way set-associative cache, but a much smaller jump between a 2-way and a 4-way cache (this is especially true for the larger cache). As the associativity increases further, the access and cycle times begin to rise more steeply.

 



Figure 19: Access/cycle time as a function of associativity

The real cost of associativity can be seen by looking at the tag path curve in either graph. For a direct-mapped cache, this is simply the time to output the valid signal, while in a set-associative cache it is the time to drive the multiplexor select signals. Also, in a direct-mapped cache, the output driver time is hidden by the time to read and compare the tag; in a set-associative cache, the tag array access, comparison, and multiplexor select drive must all complete before the output drivers can begin to send the result out of the cache.
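
To make this ordering concrete, the following C sketch shows one way the two access times could be composed from per-stage delays. It follows the description above, but the structure and function names are illustrative assumptions, not the model's actual interface.

#include <math.h>

/* Illustrative per-stage delays; the field names are assumptions,
 * not the model's actual interface. */
struct delays {
    double data_array;   /* data decoder + wordline + bitline + sense amp */
    double tag_array;    /* tag decoder + wordline + bitline + sense amp  */
    double compare;      /* tag comparator                                */
    double mux_drive;    /* multiplexor select driver (set-associative)   */
    double valid_drive;  /* valid-signal output driver (direct-mapped)    */
    double out_drive;    /* data output driver                            */
};

/* Direct-mapped: the data output drivers start as soon as the data
 * array read finishes, so the output drive overlaps with the tag read
 * and compare; the tag side only has to produce the valid signal. */
double access_direct_mapped(const struct delays *d)
{
    double data_side = d->data_array + d->out_drive;
    double tag_side  = d->tag_array + d->compare + d->valid_drive;
    return fmax(data_side, tag_side);
}

/* Set-associative: the output drivers cannot start until the tag array
 * access, compare, and multiplexor select drive have all completed, so
 * the output drive is added after the slower of the two sides. */
double access_set_associative(const struct delays *d)
{
    double data_side = d->data_array;
    double tag_side  = d->tag_array + d->compare + d->mux_drive;
    return fmax(data_side, tag_side) + d->out_drive;
}

The extra multiplexor-drive term, together with the output drive being pushed past the tag compare, is what produces the step between the direct-mapped and set-associative curves.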

Looking at the 16KB cache results, there appears to be an anomaly in the data array decoder at A=2. This is due to a larger value of one of the array-organization parameters at this point. It does not affect the overall access time, however, since the tag access is the critical path.
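
In terms of the sketch above, such an anomaly only raises the data-side term; because the access time is the maximum of the data and tag sides, a local bump on the data side is masked whenever the tag side is longer. The delay values below are arbitrary illustrative numbers, not taken from the figure, chosen only so that the tag side dominates.

#include <stdio.h>

int main(void)
{
    /* Reuses struct delays and access_set_associative() from the sketch
     * above; the numbers are arbitrary and purely illustrative. */
    struct delays d = { .data_array = 2.0, .tag_array = 2.2, .compare = 0.8,
                        .mux_drive = 0.6, .valid_drive = 0.4, .out_drive = 1.0 };

    double before = access_set_associative(&d);  /* tag side is critical      */
    d.data_array += 0.3;                         /* bump in data array decoder */
    double after  = access_set_associative(&d);  /* unchanged: still tag-limited */

    printf("%.1f -> %.1f\n", before, after);     /* prints 4.6 -> 4.6 */
    return 0;
}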


