🤖 AI Summary
To address the low cache set utilization and suboptimal hit rates of conventional cache replacement policies under complex memory access patterns, this paper proposes an adaptive cache replacement algorithm that, for the first time, integrates a randomized allocation mechanism into an extensible V-Way architecture, enabling dynamic co-optimization of tag management and data storage. Evaluated in ChampSim with a 16-way set-associative cache (2,048 sets), the design achieves cache hit rates of up to 80.82% across four major benchmarks, including SPEC CPU2017, significantly reducing memory access latency while yielding moderate IPC improvements. The key contributions are: (i) deep integration of randomized allocation with the V-Way architecture; (ii) a dynamic set utilization control mechanism; and (iii) a co-designed tag/data path that balances scalability and efficiency.
📝 Abstract
This paper presents a new hybrid cache replacement algorithm, RAC, that combines random allocation with a modified V-Way cache implementation. Unlike traditional cache replacement policies, RAC adapts to complex cache access patterns and optimizes cache usage by improving the utilization of cache sets. The algorithm uses a 16-way set-associative cache with 2048 sets, incorporating dynamic allocation and flexible tag management, and extends the V-Way cache design and its variants by optimizing tag and data storage for enhanced efficiency. We evaluated the algorithm in the ChampSim simulator on four diverse benchmark traces and observed significant improvements in cache hit rates, reaching up to 80.82%. Although the improvements in instructions per cycle (IPC) were moderate, our findings emphasize the algorithm's potential to enhance cache utilization and reduce memory access times.
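To make the randomized-allocation idea concrete, the following is a minimal, illustrative sketch of a set-associative cache that selects eviction victims at random. It is an assumption-laden toy model, not the paper's RAC implementation: the class name, block size, and interface are invented here, and the V-Way-style decoupling of tag and data storage that RAC builds on is deliberately omitted.

```python
import random

BLOCK_SIZE = 64  # assumed bytes per cache line; not specified in the paper


class RandomAllocCache:
    """Toy set-associative cache with randomized victim selection.

    Illustrative only: RAC additionally manages tags and data storage in a
    V-Way-style decoupled organization, which this sketch does not model.
    """

    def __init__(self, num_sets=2048, ways=16, seed=0):
        self.num_sets = num_sets
        self.ways = ways
        self.rng = random.Random(seed)       # seeded for reproducibility
        self.sets = [set() for _ in range(num_sets)]  # resident tags per set
        self.hits = 0
        self.accesses = 0

    def access(self, addr):
        """Return True on a hit, False on a miss (filling the line on miss)."""
        block = addr // BLOCK_SIZE
        idx = block % self.num_sets          # set index
        tag = block // self.num_sets         # remaining high bits as the tag
        lines = self.sets[idx]
        self.accesses += 1
        if tag in lines:
            self.hits += 1
            return True
        if len(lines) >= self.ways:          # set full: evict a random way
            lines.remove(self.rng.choice(sorted(lines)))
        lines.add(tag)
        return False

    @property
    def hit_rate(self):
        return self.hits / self.accesses if self.accesses else 0.0
```

A random policy needs no per-line recency metadata, which is one reason randomized allocation composes cheaply with a flexible tag organization; the adaptivity and set-utilization control described above are what distinguish RAC from this plain random-replacement baseline.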