The research paper Developing a High Performance Cache System describes cache memory as a key mechanism for improving overall system performance. The paper aims at developing a high-performance cache system.
What cache does: A cache exploits the reference stream of a typical application. The paper describes how a cache takes advantage of locality of reference. Two types of locality are discussed in the paper:
- Temporal locality
- Spatial locality
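As an illustration (not taken from the paper), the address trace of a simple summation loop shows both kinds of locality at once: the accumulator is referenced on every iteration (temporal locality) while the array elements sit at consecutive addresses (spatial locality). The addresses below are assumed for the sketch.

```python
def address_trace(n_elems, elem_size=4, base=0x1000):
    """Addresses touched by a loop like: for i in range(n): total += a[i]."""
    trace = []
    total_addr = 0x2000                     # the accumulator variable (assumed address)
    for i in range(n_elems):
        trace.append(base + i * elem_size)  # spatial: consecutive array addresses
        trace.append(total_addr)            # temporal: same address every iteration
    return trace

trace = address_trace(4)
# The trace alternates a[0], total, a[1], total, ... so 'total' is reused
# (temporal locality) while the a[i] addresses are adjacent (spatial locality).
```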
The research paper also discusses a ‘prefetching mechanism’ that reduces cache misses. Hardware-based prefetching requires some modification to the cache but almost no modification to the processor core; its main advantage is that prefetches are handled dynamically at run time, without compiler intervention. In contrast, software-based approaches rely on compiler technology to perform static program analysis and to selectively insert prefetch instructions.
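A minimal sketch of the hardware-prefetching idea, assuming a simple next-block (sequential) prefetcher and an unbounded set of cached blocks; the parameters and policy are illustrative, not the paper's exact mechanism:

```python
BLOCK = 16  # bytes per cache block (assumed)

def run(trace, prefetch=False):
    """Count demand misses over an address trace, optionally with
    next-block prefetching triggered on every access."""
    cache, misses = set(), 0
    for addr in trace:
        blk = addr // BLOCK
        if blk not in cache:
            misses += 1
            cache.add(blk)
        if prefetch:
            cache.add(blk + 1)  # fetch the following block ahead of demand
    return misses

seq = list(range(0, 256, 4))  # a sequential scan of 256 bytes
# run(seq) -> 16 demand misses; run(seq, prefetch=True) -> 1, since every
# block after the first is already resident when the scan reaches it.
```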
An SMI cache is constructed in three parts: a conventional direct-mapped cache with a small block size, a fully associative buffer with a large block size at the same cache level, and a hardware prefetching unit. The improvement in performance is achieved by exploiting the basic characteristics of locality.
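The two-structure lookup described above can be sketched as follows. The sizes, the LRU policy in the buffer, and the miss handling are all assumptions for illustration, not the paper's exact design (the prefetching unit is omitted):

```python
from collections import OrderedDict

SMALL_BLK, DM_SETS = 8, 64     # direct-mapped side: small blocks (assumed sizes)
LARGE_BLK, SB_ENTRIES = 64, 8  # spatial buffer: large blocks, LRU (assumed sizes)

class SMICacheSketch:
    def __init__(self):
        self.dm = {}             # set index -> small-block tag
        self.sb = OrderedDict()  # large-block tag, kept in LRU order

    def access(self, addr):
        """Return True on a hit in either structure."""
        s_tag, idx = divmod(addr // SMALL_BLK, DM_SETS)
        l_tag = addr // LARGE_BLK
        if self.dm.get(idx) == s_tag:   # probe the direct-mapped cache
            return True
        if l_tag in self.sb:            # probe the spatial buffer
            self.sb.move_to_end(l_tag)  # refresh LRU position
            return True
        # Miss: install the large block in the spatial buffer and the
        # demanded small block in the direct-mapped cache (simplified).
        self.sb[l_tag] = True
        if len(self.sb) > SB_ENTRIES:
            self.sb.popitem(last=False)  # evict the LRU large block
        self.dm[idx] = s_tag
        return False
```

A neighbouring address such as `addr + SMALL_BLK` hits in the spatial buffer after the first miss, which is how the large blocks capture spatial locality that the small direct-mapped blocks would miss.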
Benefits of cache: A common design objective for a cache is to improve utilization of the temporal and spatial locality inherent in applications. However, no single cache organization exploits both temporal and spatial locality optimally, because their characteristics are contradictory.
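The contradiction can be made concrete with a small fully associative LRU model of fixed capacity (all parameters assumed): at a fixed size, larger blocks serve a sequential sweep in fewer misses, but fewer distinct blocks fit, so a handful of temporally reused addresses start thrashing.

```python
CAPACITY = 64  # total cache data in bytes (assumed)

def misses(trace, block):
    """Demand misses for a fully associative LRU cache of fixed byte
    capacity, parameterized by block size."""
    cache, n_blocks, miss = [], CAPACITY // block, 0
    for addr in trace:
        blk = addr // block
        if blk in cache:
            cache.remove(blk)
            cache.append(blk)   # refresh LRU position
        else:
            miss += 1
            cache.append(blk)
            if len(cache) > n_blocks:
                cache.pop(0)    # evict least recently used block
    return miss

spatial = list(range(0, 64, 4))   # one sequential sweep
temporal = [0, 100, 200, 300] * 4 # four hot addresses, each reused four times
# With 32-byte blocks the sweep needs only 2 misses but the four hot
# addresses map to four large blocks that no longer fit and always miss;
# with 4-byte blocks the tradeoff reverses.
```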
The research paper essentially aims at designing a simple but high-performance cache system at low cost. To that end, a new caching mechanism is designed to exploit the two types of locality effectively and adaptively: a direct-mapped cache with a small block size for exploiting temporal locality, paired with a fully associative spatial buffer with a large block size for exploiting spatial locality.