Caching is a technique that stores a copy of a given resource and serves it back when requested. Here is a list of coding interview questions on Caching to help you get ready for your next data structures interview in 2021.
What is Caching?
In computing, a cache is a high-speed data storage layer which stores a subset of data, typically transient in nature, so that future requests for that data are served up faster than is possible by accessing the data’s primary storage location. Caching allows you to efficiently reuse previously retrieved or computed data. Source: medium.com
Is Redis just a cache?
Like a cache Redis offers:
- in memory key-value storage
But unlike a cache, Redis:
- Supports multiple datatypes (strings, hashes, lists, sets, sorted sets, bitmaps, and hyperloglogs)
- Can persist cached data to physical storage (if needed)
- Supports the pub/sub messaging model
- Provides replication for high availability (master/slave)
- Supports ultra-fast Lua scripts, which execute at speeds comparable to native commands
- Can be shared across multiple instances of the application (instead of a separate in-memory cache per app instance)
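The baseline that Redis improves on, a plain per-process in-memory key-value cache, can be sketched as follows (a minimal illustration; the class name and keys are hypothetical):

```python
class InMemoryCache:
    """Minimal per-process key-value cache. Unlike Redis, it is not
    shared across application instances, has no persistence, no TTLs,
    and is lost whenever the process restarts."""

    def __init__(self):
        self._store = {}

    def set(self, key, value):
        self._store[key] = value

    def get(self, key, default=None):
        return self._store.get(key, default)


cache = InMemoryCache()
cache.set("user:42", {"name": "Ada"})
```

Every feature in the list above (shared access, persistence, pub/sub, replication) is something this naive cache lacks and Redis provides out of the box.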
What is Resultset Caching?
Resultset caching is storing the results of a database query, along with the query itself, in the application. Every time a web page generates a query, the application checks whether the results are already cached, and if they are, pulls them from an in-memory data set instead. The application still has to render the page.
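The idea can be sketched with a dictionary keyed by the query text (the `FakeDb` stand-in and query string below are hypothetical):

```python
# Hypothetical resultset cache: query text maps to its cached rows.
resultset_cache = {}

def run_query(db, sql):
    """Return cached rows for sql if present; otherwise execute the
    query, store the resultset alongside the query, and return it."""
    if sql in resultset_cache:
        return resultset_cache[sql]   # cache hit: skip the database
    rows = db.execute(sql)            # slow path: real query
    resultset_cache[sql] = rows       # store resultset with its query
    return rows


class FakeDb:
    """Stand-in database that counts how often it is actually queried."""
    def __init__(self):
        self.calls = 0

    def execute(self, sql):
        self.calls += 1
        return [("alice",), ("bob",)]


db = FakeDb()
first = run_query(db, "SELECT name FROM users")
second = run_query(db, "SELECT name FROM users")  # served from cache
```

After the second call the database has still only been queried once; the page itself is rendered on every request either way.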
What is Cache Invalidation?
There are only two hard things in Computer Science: cache invalidation and naming things.
– Phil Karlton
HTTP caching is a solution for improving the performance of your web application. To reduce the load on the application and achieve the fastest response times, you want to cache content for a long period (a long TTL). But at the same time, you want your clients to see fresh, validated content as soon as there is an update.
Cache invalidation gives you the best of both worlds: you can have very long TTLs, so content that changes rarely can be served from the cache with no requests reaching your application. At the same time, when data does change, that change is reflected without delay in the web representations. Source: foshttpcache.readthedocs.io
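A minimal sketch of the combination, long TTLs plus explicit invalidation when the underlying data changes (class name and keys are hypothetical, not from any particular library):

```python
import time

class TTLCache:
    """Cache with per-entry expiry plus explicit invalidation, so a
    long TTL can coexist with immediately-visible updates."""

    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self._store = {}  # key -> (value, expiry timestamp)

    def set(self, key, value):
        self._store[key] = (value, time.time() + self.ttl)

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        value, expires = entry
        if time.time() >= expires:  # entry outlived its TTL
            del self._store[key]
            return None
        return value

    def invalidate(self, key):
        """Drop an entry as soon as the underlying data changes."""
        self._store.pop(key, None)


cache = TTLCache(ttl_seconds=3600)          # long TTL for rarely-changing content
cache.set("page:/home", "<html>v1</html>")
cache.invalidate("page:/home")              # content updated: drop the stale copy
```

Without `invalidate`, clients could see stale content for up to an hour; with it, the next request misses the cache and picks up the fresh version.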
What usually should be cached?
The results for the following processes are good candidates for caching:
- long-running database queries,
- high-latency network requests (e.g., to external APIs),
- computation-intensive processing.
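For the computation-intensive case, Python's standard-library memoization decorator is a one-line cache (the naive Fibonacci here is just an illustrative stand-in for an expensive computation):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def expensive_fib(n):
    """Naive recursive Fibonacci: exponential without caching, linear
    with it, because each result is computed once and then reused."""
    if n < 2:
        return n
    return expensive_fib(n - 1) + expensive_fib(n - 2)

result = expensive_fib(35)  # completes instantly thanks to memoization
```

The same pattern applies to the other two candidates: cache the response of a slow query or API call keyed by its inputs, and serve repeats from memory.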
Name some Cache Writing Strategies
There are two common strategies to write data in a cache:
- Pre-caching: writing small pieces of data to the cache ahead of time, usually during application initialization, before any request arrives.
- On-demand: first checking whether the requested data is already in the cache (if it is found, this is called a cache hit) and using it, which improves the performance of the application. Whenever the requested data has not been written to the cache (a cache miss), the application retrieves it from the slower source and then writes the result to the cache, saving time on subsequent requests for the same data.
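The on-demand strategy (often called cache-aside) can be sketched as follows; `slow_source` is a hypothetical stand-in for a database or external API:

```python
cache = {}

def slow_source(key):
    """Stand-in for a slow database query or external API call."""
    return key.upper()

def get_value(key):
    """Cache-aside read: check the cache, fall back to the slow
    source on a miss, then write the result back for next time."""
    if key in cache:
        return cache[key]         # cache hit
    value = slow_source(key)      # cache miss: go to the slow source
    cache[key] = value            # write the result into the cache
    return value


hit_before = "greeting" in cache  # nothing cached yet
value = get_value("greeting")     # miss: fetched and cached
hit_after = "greeting" in cache   # subsequent requests now hit the cache
```

Pre-caching would instead call `get_value` (or fill `cache` directly) for the known hot keys at startup, so the first real request already hits.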