Caches: LRU vs. Random
- #performance optimization
- #computer architecture
- #cache eviction policies
- A random eviction policy for caches is not as bad as commonly thought, especially in comparison to LRU (Least Recently Used).
- LRU performs well when a tight loop fits in cache, but thrashes (misses on nearly every access) once the loop slightly exceeds cache size, whereas random eviction degrades more gracefully.
- A 2-random eviction policy (pick two cache lines at random and evict the less recently used of the two) shows competitive performance, especially in larger caches.
- In SPEC CPU benchmarks, 2-random performs closely to LRU, with LRU slightly better in smaller caches and 2-random better in larger ones.
- Hierarchical cache structures complicate eviction policy effectiveness: L1/L2 hits filter the access stream that an L3 sees, stripping out much of its temporal locality, so LRU may not be optimal for higher-level caches.
- Pseudo-LRU and pseudo-2-random policies offer practical implementations with performance close to true LRU and 2-random, respectively.
- Increasing cache associativity tends to widen the performance gap between LRU and 2-random, with LRU favored in smaller caches.
- The mathematical principle behind 2-random comes from 'The Power of Two Random Choices': adding a second random choice drops the expected maximum load in balls-into-bins from O(log n / log log n) to O(log log n), an exponential improvement.
- Future exploration could include different eviction policies per cache level or applying 2-random to other types of caches beyond CPU.
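The thrashing behavior described above can be reproduced with a toy fully associative cache simulator. This is a minimal sketch for intuition, not the benchmark methodology from the article; the function names and the 17-line-loop-in-a-16-line-cache scenario are illustrative choices.

```python
import random
from collections import OrderedDict

def misses(policy, cache_size, accesses):
    """Count misses for a fully associative cache under a given eviction policy."""
    cache = OrderedDict()  # keys kept in recency order, oldest first
    miss_count = 0
    for addr in accesses:
        if addr in cache:
            cache.move_to_end(addr)  # hit: refresh recency
            continue
        miss_count += 1
        if len(cache) >= cache_size:
            if policy == "lru":
                cache.popitem(last=False)         # evict least recently used
            elif policy == "random":
                del cache[random.choice(list(cache))]
            elif policy == "2random":
                keys = list(cache)                # index = recency rank
                a, b = random.sample(keys, 2)
                # evict the less recently used of the two samples
                del cache[a if keys.index(a) < keys.index(b) else b]
        cache[addr] = None
    return miss_count

# A loop over 17 lines cycling through a 16-line cache: LRU always evicts
# exactly the line the loop needs next, so every access misses.
random.seed(0)
accesses = list(range(17)) * 100
print(misses("lru", 16, accesses))      # 1700: 100% miss rate
print(misses("random", 16, accesses))   # fewer misses: degrades gracefully
print(misses("2random", 16, accesses))  # fewer misses than pure LRU here
```

The LRU pathology is worst-case by construction: the loop's reuse distance is exactly one line larger than the cache, so LRU's "oldest is least useful" heuristic is always wrong, while randomness lets some needed lines survive.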
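The power-of-two-choices result behind 2-random can be demonstrated with the classic balls-into-bins experiment. A sketch under illustrative parameters (the function name and n = 10,000 are assumptions, not from the source):

```python
import random

def max_load(n_balls, n_bins, choices):
    """Throw balls into bins; each ball goes to the least loaded of
    `choices` uniformly sampled bins. Return the maximum bin load."""
    bins = [0] * n_bins
    for _ in range(n_balls):
        candidates = random.sample(range(n_bins), choices)
        bins[min(candidates, key=lambda b: bins[b])] += 1
    return max(bins)

random.seed(1)
n = 10_000
print(max_load(n, n, 1))  # one choice: max load grows like log n / log log n
print(max_load(n, n, 2))  # two choices: max load grows like log log n
```

The same idea carries over to eviction: one random sample can pick a hot line, but comparing two samples and keeping the more recently used one rarely evicts anything truly hot.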