Key Value Cache

A cache is a component that makes access to data faster. A KV cache is usually an in-memory data store that uses an associative array (think of a map or dictionary) as its fundamental data model.
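As a rough illustration, such a store can be sketched as a dictionary with an optional expiration per key. The class and method names below are hypothetical, not tied to any particular library:

```python
import time

# Minimal sketch of an in-memory KV cache: a dict mapping key -> (value, expire_at).
class KVCache:
    def __init__(self):
        self._store = {}

    def set(self, key, value, ttl=None):
        expire_at = time.time() + ttl if ttl else None
        self._store[key] = (value, expire_at)

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        value, expire_at = entry
        if expire_at is not None and time.time() > expire_at:
            del self._store[key]   # lazily evict expired entries on read
            return None
        return value
```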

Caching strategies

Cache Aside

In concurrent scenarios, especially when there are frequent writes, dirty data can end up in the cache. We can mitigate this in two ways (a sketch follows the list):

  1. Acquire a distributed lock when writing data to the cache.
  2. Set a short cache expiration time.
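The sketch below combines both mitigations on top of redis-py: writes take a per-key distributed lock, and cached entries get a short TTL. The `fake_db` dict and the helper names are illustrative stand-ins for a real data layer, not a definitive implementation:

```python
import redis

r = redis.Redis()
CACHE_TTL = 30          # short expiration bounds how long dirty data can survive
fake_db = {}            # stand-in for the backing database

def read(key):
    value = r.get(key)
    if value is None:                      # cache miss: load from DB, repopulate cache
        value = fake_db.get(key)
        if value is not None:
            r.setex(key, CACHE_TTL, value)
    return value

def write(key, value):
    # The distributed lock serializes concurrent writers to the same key,
    # so the DB update and the cache invalidation happen as one unit.
    with r.lock(f"lock:{key}", timeout=5):
        fake_db[key] = value
        r.delete(key)                      # invalidate; the next read repopulates
```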

Read Through

Write Through

We seldom use read/write through because popular distributed KV systems like memcached and Redis do not support write-through natively. Also, write-through is synchronous, which may hurt write performance.
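Because the KV store does not provide it, read-through is usually emulated by a thin wrapper in the application layer, where the cache itself calls a loader on a miss. A minimal sketch, assuming a `loader` callable that fetches from the source of truth:

```python
# Read-through sketch: callers only talk to the cache; on a miss the cache
# invokes the loader and stores the result before returning it.
class ReadThroughCache:
    def __init__(self, loader):
        self._data = {}
        self._loader = loader

    def get(self, key):
        if key not in self._data:
            self._data[key] = self._loader(key)   # cache fills itself on a miss
        return self._data[key]
```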

Write Back

This strategy is commonly used in OS designs such as the page cache, or for buffering shuffle logs. However, it is seldom used in database-cache scenarios because data loss can occur if the hardware fails before the buffered writes reach the backing store.
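A minimal sketch of the idea, with a plain dict standing in for the slow backing store: writes land only in the cache and are marked dirty, and a later flush persists them in a batch. A crash before the flush loses exactly those dirty entries, which is the risk noted above.

```python
# Write-back sketch: put() touches only the in-memory cache; flush() would
# normally run on a timer or when the dirty set grows large.
class WriteBackCache:
    def __init__(self, store):
        self._cache = {}
        self._dirty = set()
        self._store = store        # stand-in for the slow backing store

    def put(self, key, value):
        self._cache[key] = value
        self._dirty.add(key)       # mark as not yet persisted

    def flush(self):
        for key in self._dirty:
            self._store[key] = self._cache[key]
        self._dirty.clear()
```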

Write Around

Author: huadonghu
