Thursday, November 30, 2006
In this talk I will discuss recent research in which my students and I have successfully applied data access prediction to improve cache performance and reduce device energy consumption. One such algorithm reduced average response times by approximately 50% compared to a basic LRU cache, while issuing fewer than half the I/O operations that traditional predictive prefetchers would require to achieve similar hit rates. This is particularly significant given the energy savings that can be achieved by avoiding excessive and unnecessary device activity. That result assumed disks that could enter low-power states when inactive, but we also found a mechanism to reduce energy consumption in a disk without exploiting periods of inactivity, by using predictive grouping to improve data layout. In this application it is interesting to note that increased power savings go hand-in-hand with reduced data access latencies.
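To illustrate the general idea of prediction-driven prefetching (not the specific algorithm from this research), here is a minimal sketch: an LRU cache augmented with a first-order successor model, which prefetches the block that most often followed the current one in the past. All class and variable names here are hypothetical.

```python
from collections import OrderedDict, defaultdict

class PredictivePrefetchCache:
    """Toy LRU cache with first-order successor prediction.

    Illustrative sketch only: after each access to block b, the block
    that most often followed b in past accesses is prefetched. Real
    predictive prefetchers are more sophisticated, but the principle
    is the same: learned access patterns drive speculative fetches.
    """

    def __init__(self, capacity):
        self.capacity = capacity
        self.cache = OrderedDict()  # block -> None, kept in LRU order
        self.successors = defaultdict(lambda: defaultdict(int))
        self.prev = None
        self.hits = self.misses = 0

    def _insert(self, block):
        # Bring an existing block to the MRU end, or add it, evicting
        # the least-recently-used block if the cache is full.
        if block in self.cache:
            self.cache.move_to_end(block)
            return
        if len(self.cache) >= self.capacity:
            self.cache.popitem(last=False)
        self.cache[block] = None

    def access(self, block):
        if block in self.cache:
            self.hits += 1
            self.cache.move_to_end(block)
        else:
            self.misses += 1
            self._insert(block)
        # Learn the observed transition prev -> block.
        if self.prev is not None:
            self.successors[self.prev][block] += 1
        # Prefetch the most frequent successor of the current block.
        nxt = self.successors[block]
        if nxt:
            self._insert(max(nxt, key=nxt.get))
        self.prev = block
```

On a repeating access trace, this sketch turns what plain LRU would treat as a stream of capacity misses into hits once the successor table has been learned, which is the effect that makes prediction attractive for both latency and energy.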