
Cache Decay: Exploiting Generational Behavior to Reduce Cache Leakage Power

01 January 2001


Power dissipation is increasingly important in CPUs ranging from those intended for mobile use all the way up to high-performance processors for high-end servers. While the bulk of the power dissipated is dynamic switching power, leakage power is also beginning to be a concern. In upcoming chip generations, the proportion of total chip power due to leakage is expected to increase significantly. This paper examines methods for reducing leakage power within the cache memories of the CPU. Because cache memories comprise a large proportion of chip area and chip transistor counts, they are a reasonable target for attacking leakage power. We discuss policies, mechanisms, and implementations for reducing cache leakage by invalidating and "turning off" cache lines during periods when they hold data unlikely to be reused. In particular, our approach targets the generational nature of cache line usage: cache lines typically see a flurry of frequent use when first brought into the cache, followed by a period of "dead time" before they are evicted. By devising effective, low-power ways of deducing dead time, our results show that we can reduce cache leakage energy by a factor of two in many SPEC and MediaBench applications. Because our decay-based techniques have notions of competitive on-line algorithms at their roots, their energy usage can be theoretically bounded to within a factor of two of the optimal oracle-based policy. We also examine adaptive decay-based policies that make energy-minimizing policy choices on a per-application basis. Overall, these policies are effective at reducing leakage power with only negligible degradations in miss rate or performance.
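The decay idea above can be illustrated with a minimal simulation sketch. This is not the paper's hardware implementation (which uses coarse-grained global ticks driving small per-line saturating counters); it is a hypothetical software model of the same policy, where `DECAY_TICKS` is an assumed decay interval and the trace format is invented for illustration:

```python
# Hypothetical sketch of a cache-decay policy: each cache line keeps a
# small saturating counter that advances on every coarse global tick and
# resets on every access. When the counter saturates, the line is treated
# as "dead": invalidated and its supply gated off to cut leakage.

DECAY_TICKS = 4  # assumed: ticks of idleness before a line is switched off


class DecayLine:
    def __init__(self, tag):
        self.tag = tag
        self.counter = 0
        self.powered = True

    def access(self):
        # Any access resets the idle counter and (re)powers the line.
        self.counter = 0
        self.powered = True

    def tick(self):
        # Called once per coarse-grained global tick.
        if self.powered:
            self.counter += 1
            if self.counter >= DECAY_TICKS:
                # Idle for the full decay interval: invalidate and
                # power-gate the line.
                self.powered = False


def simulate(accesses, total_ticks):
    """Replay a trace of (tick, tag) pairs; return per-tick fraction of
    lines that are powered off."""
    lines = {tag: DecayLine(tag) for _, tag in accesses}
    by_tick = {}
    for t, tag in accesses:
        by_tick.setdefault(t, []).append(tag)
    off_fraction = []
    for t in range(total_ticks):
        for tag in by_tick.get(t, []):
            lines[tag].access()
        for line in lines.values():
            line.tick()
        off = sum(1 for line in lines.values() if not line.powered)
        off_fraction.append(off / len(lines))
    return off_fraction
```

For example, a line accessed at ticks 0-2 and then never again is switched off after its decay interval expires, while a line touched only once decays earlier; the model makes visible how dead time translates into powered-off (leakage-free) capacity. The competitive on-line argument mentioned in the abstract bounds the cost of this "wait, then switch off" rule relative to an oracle that knows each line's last use in advance.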