Description
General discussion about an async caching API.
Current state and issues
At the moment the methods Cache.loadAll and Cache.reloadAll can be used in a read-through (with cache loader) caching configuration. The interface is ugly and does not return the loaded values.
In the presence of a CacheWriter the Cache.put operation will potentially have I/O latency as well, which means Cache.put should have an async counterpart, too.
If cache2k supports hierarchical caching, e.g. with persistence to a file, the cache operation Cache.containsKey would potentially have I/O latency as well.
There are potentially long-running life cycle operations which should run without blocking the caller, e.g. Cache.clear, Cache.close, or cache resize.
Requirements
I see two different requirements at the moment:
- A better interface specifically for caches operated in read-through mode, which is quite common
- The ability to run every cache operation in async mode
Solutions / ideas
New async cache interface supporting every operation
Obvious approach. Should we use CompletableFuture everywhere or support different async models?
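To make the idea concrete, here is a sketch of what such an interface could look like, with every operation returning a CompletableFuture. The names (AsyncCache, getAsync, etc.) and the trivial in-memory implementation are hypothetical illustrations, not the actual cache2k API:

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;

// Hypothetical shape of a fully asynchronous cache interface: every
// operation returns a CompletableFuture instead of blocking the caller.
interface AsyncCache<K, V> {
    CompletableFuture<V> getAsync(K key);
    CompletableFuture<Void> putAsync(K key, V value);
    CompletableFuture<Boolean> containsKeyAsync(K key);
    CompletableFuture<Void> clearAsync();
}

// Minimal in-memory implementation, only to illustrate the contract.
class InMemoryAsyncCache<K, V> implements AsyncCache<K, V> {
    private final ConcurrentMap<K, V> map = new ConcurrentHashMap<>();

    public CompletableFuture<V> getAsync(K key) {
        return CompletableFuture.supplyAsync(() -> map.get(key));
    }
    public CompletableFuture<Void> putAsync(K key, V value) {
        return CompletableFuture.runAsync(() -> map.put(key, value));
    }
    public CompletableFuture<Boolean> containsKeyAsync(K key) {
        return CompletableFuture.supplyAsync(() -> map.containsKey(key));
    }
    public CompletableFuture<Void> clearAsync() {
        return CompletableFuture.runAsync(map::clear);
    }
}
```

Pinning the interface to CompletableFuture keeps it simple, at the cost of tying the API to one async model; supporting other models (callbacks, reactive streams) would require a different or additional surface.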
Subset of cache methods returning CompletableFuture for loading/read-through caches
The loader returns a CompletableFuture which is held by the cache directly.
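A minimal sketch of that idea, assuming a hypothetical loader represented as a Function returning a CompletableFuture: the cache maps keys to futures, so concurrent readers of the same key share a single in-flight load.

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;
import java.util.function.Function;

// Sketch: the loader returns a CompletableFuture and the cache stores the
// future itself, so a second get() for the same key does not trigger a
// second load while the first one is still running.
class FutureHoldingCache<K, V> {
    private final ConcurrentMap<K, CompletableFuture<V>> entries =
        new ConcurrentHashMap<>();
    private final Function<K, CompletableFuture<V>> loader;

    FutureHoldingCache(Function<K, CompletableFuture<V>> loader) {
        this.loader = loader;
    }

    // Read-through get: returns the cached future or starts one load.
    CompletableFuture<V> get(K key) {
        return entries.computeIfAbsent(key, loader);
    }
}
```

A real implementation would also have to evict or retry entries whose futures completed exceptionally; this sketch omits that.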
Async entry processor
Maybe a good first start. Add the following two methods to Cache:
<X> CompletableFuture<X> invokeAsync(K key, EntryProcessor<K, V, X> p)
<X> CompletableFuture<X> invokeAllAsync(Iterable<K> keys, EntryProcessor<K, V, X> p)
This way every cache operation could be executed in an asynchronous processing model.
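To illustrate how invokeAsync could work, here is a simplified sketch. The MutableEntry and EntryProcessor interfaces below are stand-ins for the real cache2k types, and the implementation skips entry locking, which a real cache would need:

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;

// Simplified stand-ins for the entry processor types.
interface MutableEntry<K, V> {
    V getValue();
    void setValue(V value);
}

interface EntryProcessor<K, V, X> {
    X process(MutableEntry<K, V> entry);
}

// Sketch of invokeAsync: the processor runs on an executor and its
// result is delivered through a CompletableFuture.
class ProcessorCache<K, V> {
    private final ConcurrentMap<K, V> map = new ConcurrentHashMap<>();

    <X> CompletableFuture<X> invokeAsync(K key, EntryProcessor<K, V, X> p) {
        return CompletableFuture.supplyAsync(() -> {
            // Note: a real cache would lock the entry while processing.
            MutableEntry<K, V> e = new MutableEntry<K, V>() {
                public V getValue() { return map.get(key); }
                public void setValue(V v) { map.put(key, v); }
            };
            return p.process(e);
        });
    }
}
```

Since an entry processor can express get, put, remove, and compound operations, these two methods would indeed cover every cache operation in an asynchronous processing model.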