- Functional Caching
- Functional Caching Part 2: Concurrency
- Functional Caching Part 3: Decoration
- Functional Caching Part 4: Dependency Injection
Last time, we defined caching in terms of functions and extended the Func<,> type with the ability to create cached versions. This is powerful because it can cache the results of any operation based on a key.
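For reference, a minimal sketch of what that Cached extension method might look like; this is an assumed shape based on the description above (a dictionary-backed memoizer keyed by the function's argument), not the exact code from Part 1.

```csharp
using System;
using System.Collections.Generic;

public static class FuncExtensions
{
    // Sketch of the Cached extension described in Part 1 (assumed shape):
    // memoize results in a dictionary keyed by the function's argument.
    public static Func<TKey, TValue> Cached<TKey, TValue>(this Func<TKey, TValue> func)
    {
        var cache = new Dictionary<TKey, TValue>();

        return key =>
        {
            TValue value;
            if (!cache.TryGetValue(key, out value))
            {
                value = func(key);
                cache.Add(key, value);
            }
            return value;
        };
    }
}
```

With an extension like this in place, any `Func<TKey, TValue>` gains a `Cached()` method that returns a memoized version of itself.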
However, the Dictionary<,>-based caching mechanism is rather simplistic. It doesn't handle core caching scenarios such as expiration, dependencies, and notifications. Even more fundamental, though, is the ability to be read and written safely by multiple threads at the same time.
Thread safety is the art of not crossing the streams. Here, a stream is a single thread of execution, and crossing means two or more threads operating on the same data at the same time. Data being read while it is also being written is inherently unstable; safety means being able to accurately predict the outcome of multithreaded code.

This mental model focuses on safety from the effects of threads and is heavy on mechanism. Threads are really an implementation detail of the intent to perform simultaneous actions, known as concurrency. Our goal, then, is to create cached functions which support concurrent usage.
Here is another extension method in the same style as the original Cached method. This one simply locks the cache variable during each lookup:
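A sketch of such a method, assuming it wraps the same dictionary lookup as Cached inside a lock; the name CachedConcurrently is an assumption for illustration.

```csharp
using System;
using System.Collections.Generic;

public static class ConcurrentFuncExtensions
{
    // Same algorithm as the Cached sketch, but every lookup and insert
    // happens while holding a lock on the private cache dictionary.
    public static Func<TKey, TValue> CachedConcurrently<TKey, TValue>(this Func<TKey, TValue> func)
    {
        var cache = new Dictionary<TKey, TValue>();

        return key =>
        {
            lock (cache)
            {
                TValue value;
                if (!cache.TryGetValue(key, out value))
                {
                    value = func(key);
                    cache.Add(key, value);
                }
                return value;
            }
        };
    }
}
```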
The cache variable is never referenced outside of this method and thus serves as the perfect synchronization object.
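As a usage sketch (the extension is repeated inline so the example is self-contained; its name and shape are assumptions as above), concurrent callers can now safely share one cached function:

```csharp
using System;
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;

public static class Demo
{
    // Assumed shape of the locking extension described in the text.
    public static Func<TKey, TValue> CachedConcurrently<TKey, TValue>(this Func<TKey, TValue> func)
    {
        var cache = new Dictionary<TKey, TValue>();
        return key =>
        {
            lock (cache)
            {
                TValue value;
                if (!cache.TryGetValue(key, out value))
                {
                    value = func(key);
                    cache.Add(key, value);
                }
                return value;
            }
        };
    }

    public static void Main()
    {
        int calls = 0;
        Func<int, int> square = x => { Interlocked.Increment(ref calls); return x * x; };
        var cachedSquare = square.CachedConcurrently();

        // Many threads requesting the same ten keys concurrently.
        Parallel.For(0, 1000, i => cachedSquare(i % 10));

        // Because the check, computation, and insert all happen inside
        // the lock, square ran exactly once per distinct key.
        Console.WriteLine(calls); // prints 10
    }
}
```

Note the trade-off in this design: holding the lock while invoking the underlying function guarantees each key is computed only once, at the cost of serializing all cache access, including cache hits.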
We identified the need for cached functions which may be invoked concurrently. We then created a new method which adds concurrency support to the algorithm established by the Cached method. We are starting to see the power of the function composition pattern and its implementation in C#.