Introducing PaperCache, an in-memory cache that can switch its eviction policy dynamically (papercache.io)
from papercache@programming.dev to programming@programming.dev on 08 Aug 05:17
https://programming.dev/post/35292297

cross-posted from: programming.dev/post/35278281

Hi everyone, I recently created an in-memory cache in Rust called PaperCache. It’s the first in-memory cache that can switch between eviction policies at runtime, which lets it reduce its miss ratio by adapting to changing workloads (there’s a rough sketch of the idea after the list below). It currently supports the following eviction policies:

  • LFU
  • FIFO
  • CLOCK
  • SIEVE
  • LRU
  • MRU
  • 2Q
  • ARC
  • S3-FIFO
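
To give a rough idea of what runtime policy switching means, here’s a simplified, self-contained Rust sketch: a cache that owns a boxed eviction-policy trait object and can replace it without dropping its data. This is purely illustrative and is not how PaperCache is actually implemented (the real internals are linked below); the trait, the toy FIFO/LRU policies, and the naive metadata replay in `set_policy` are all simplifications for the sake of the example.

```rust
use std::collections::{HashMap, VecDeque};

// The cache reports inserts/accesses to its current policy and asks it for a victim when full.
trait EvictionPolicy {
    fn on_access(&mut self, key: u64);
    fn on_insert(&mut self, key: u64);
    fn evict(&mut self) -> Option<u64>;
}

// FIFO: evict in insertion order, ignore accesses.
#[derive(Default)]
struct Fifo { queue: VecDeque<u64> }

impl EvictionPolicy for Fifo {
    fn on_access(&mut self, _key: u64) {}
    fn on_insert(&mut self, key: u64) { self.queue.push_back(key); }
    fn evict(&mut self) -> Option<u64> { self.queue.pop_front() }
}

// LRU: move a key to the back on access, evict from the front.
#[derive(Default)]
struct Lru { order: VecDeque<u64> }

impl EvictionPolicy for Lru {
    fn on_access(&mut self, key: u64) {
        self.order.retain(|&k| k != key);
        self.order.push_back(key);
    }
    fn on_insert(&mut self, key: u64) { self.order.push_back(key); }
    fn evict(&mut self) -> Option<u64> { self.order.pop_front() }
}

// A cache that owns a boxed policy and can swap it while running.
struct Cache {
    capacity: usize,
    data: HashMap<u64, String>,
    policy: Box<dyn EvictionPolicy>,
}

impl Cache {
    fn new(capacity: usize, policy: Box<dyn EvictionPolicy>) -> Self {
        Cache { capacity, data: HashMap::new(), policy }
    }

    fn get(&mut self, key: u64) -> Option<&String> {
        if self.data.contains_key(&key) {
            self.policy.on_access(key);
        }
        self.data.get(&key)
    }

    fn set(&mut self, key: u64, value: String) {
        // Evict one entry if inserting a new key into a full cache.
        if !self.data.contains_key(&key) && self.data.len() >= self.capacity {
            if let Some(victim) = self.policy.evict() {
                self.data.remove(&victim);
            }
        }
        if self.data.insert(key, value).is_none() {
            self.policy.on_insert(key);
        }
    }

    // Swap the eviction policy without dropping any cached data; a real cache
    // migrates per-key metadata far more efficiently than this naive replay.
    fn set_policy(&mut self, mut policy: Box<dyn EvictionPolicy>) {
        for key in self.data.keys() {
            policy.on_insert(*key);
        }
        self.policy = policy;
    }
}

fn main() {
    let mut cache = Cache::new(2, Box::new(Fifo::default()));
    cache.set(1, "a".into());
    cache.set(2, "b".into());

    // Switch from FIFO to LRU at runtime; cached entries are kept.
    cache.set_policy(Box::new(Lru::default()));

    let _ = cache.get(1);     // key 1 is now the most recently used
    cache.set(3, "c".into()); // a new key forces an eviction: key 2 goes under LRU
    assert!(cache.get(1).is_some());
    assert!(cache.get(2).is_none());
}
```

Presumably the extra metadata overhead mentioned below is what lets the real cache change policies cheaply instead of replaying every key like this toy version does.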

It typically has lower tail latencies than Redis (with the trade-off of higher memory overhead, since it has to maintain extra metadata to support switching policies at runtime).

Feel free to check out the website (papercache.io), which has documentation; a high-level article I wrote on Kudos (link.growkudos.com/1f039cqaqrk); or the paper from HotStorage '25 (dl.acm.org/doi/abs/10.1145/3736548.3737836).

Here’s a direct link to the cache internals: github.com/PaperCache/paper-cache

In case you want to test it out, you can find installation instructions here: papercache.io/guide/getting-started/installation

There are clients for most of the popular programming languages (papercache.io/guide/usage/clients), though some may be a little unpolished (I mainly use the Rust client for my own work, so that one is kept up-to-date).
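
To give a flavour of the intended usage, here’s a minimal sketch with the Rust client. Treat the crate, type, and method names as placeholders rather than the real API; the client docs linked above are the authoritative reference.

```rust
// Placeholder names throughout: the crate, constructor, methods and address
// below are assumptions for illustration only; consult
// papercache.io/guide/usage/clients for the real API.
use paper_client::PaperClient;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Assumed constructor and address.
    let mut client = PaperClient::connect("127.0.0.1:3145")?;

    client.set("greeting", "hello")?;            // assumed: store a key/value pair
    let value: String = client.get("greeting")?; // assumed: read it back
    println!("{value}");

    // The interesting bit: changing the eviction policy on a live cache.
    client.set_policy("lru")?;                   // assumed method and policy name

    Ok(())
}
```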

If you have any feedback, please let me know!

#programming
