A cache is a component that stores data so that future requests for that data can be served faster. The data stored in a cache might be the result of an earlier computation or a duplicate of data stored elsewhere. A cache hit occurs when the requested data is found in the cache, while a cache miss occurs when it is not. Cache hits are served by reading data from the cache, which is faster than recomputing a result or reading from a slower data store. Thus, the more requests that can be served from the cache, the faster the system performs.
One popular cache replacement policy is "least recently used" (LRU), which discards the least recently used items first.
For example, suppose a cache with a capacity of 5 keys has the following state (arranged from the most recently used key to the least recently used key) -
5 3 2 1 4
Now, if the next key requested is 1 (a cache hit), the cache state in the same order becomes -
1 5 3 2 4
Now, if the next key requested is 6 (a cache miss), the cache state in the same order becomes -
6 1 5 3 2
Observe that 4 has been discarded: it was the least recently used key, and since the capacity of the cache is 5, it could not be retained any longer.
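One way to reproduce this walkthrough is with a short sketch in Python; the class name LRUCache and its access method are illustrative choices, not anything mandated by the problem. The sketch keeps keys in a collections.OrderedDict, ordered from most recently used to least recently used, and evicts from the tail once the capacity is exceeded.

```python
from collections import OrderedDict

class LRUCache:
    """Keeps at most `capacity` keys, evicting the least recently used one."""

    def __init__(self, capacity):
        self.capacity = capacity
        # Most recently used key is kept at the front (left end).
        self.store = OrderedDict()

    def access(self, key, value=None):
        if key in self.store:                  # cache hit: refresh recency
            self.store.move_to_end(key, last=False)
            if value is not None:
                self.store[key] = value
        else:                                  # cache miss: insert, maybe evict
            self.store[key] = value
            self.store.move_to_end(key, last=False)
            if len(self.store) > self.capacity:
                self.store.popitem(last=True)  # drop the least recently used key

    def keys_most_to_least_recent(self):
        return list(self.store)

cache = LRUCache(5)
for key in [4, 1, 2, 3, 5]:                    # builds the state 5 3 2 1 4
    cache.access(key)
print(cache.keys_most_to_least_recent())       # [5, 3, 2, 1, 4]

cache.access(1)                                # hit: 1 moves to the front
print(cache.keys_most_to_least_recent())       # [1, 5, 3, 2, 4]

cache.access(6)                                # miss: 6 inserted, 4 evicted
print(cache.keys_most_to_least_recent())       # [6, 1, 5, 3, 2]
```

In practice this behaviour is usually built from a hash map plus a doubly linked list, which gives O(1) lookups and O(1) recency updates; OrderedDict wraps exactly that combination.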
Sample input:
3 1
set 1 2
get 1
get 2

Sample output:
2
-1
Since the capacity of the cache is 1, the first set stores key 1 with its value 2. The first get is a cache hit on key 1, so 2 is printed as its value. The second get is a cache miss, so -1 is printed.
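A minimal sketch of a program that could process this sample is shown below, assuming (based on the explanation above) that the first line carries the number of queries followed by the cache capacity, and that each subsequent line is either "set key value" or "get key". The function name run and the in-memory sample list are illustrative choices.

```python
from collections import OrderedDict

def run(lines):
    """Process queries against a fixed-capacity LRU cache of key/value pairs."""
    it = iter(lines)
    num_queries, capacity = map(int, next(it).split())
    cache = OrderedDict()                       # most recently used key at the end
    out = []
    for _ in range(num_queries):
        parts = next(it).split()
        if parts[0] == "set":
            key, value = int(parts[1]), int(parts[2])
            if key in cache:
                cache.move_to_end(key)          # updating a key refreshes its recency
            cache[key] = value
            if len(cache) > capacity:
                cache.popitem(last=False)       # evict the least recently used key
        else:                                   # "get"
            key = int(parts[1])
            if key in cache:
                cache.move_to_end(key)          # a hit refreshes recency
                out.append(cache[key])
            else:
                out.append(-1)                  # cache miss
    return out

sample = ["3 1", "set 1 2", "get 1", "get 2"]
print("\n".join(map(str, run(sample))))         # prints 2 then -1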