Caching frequently used objects that are expensive to fetch from their source makes an application perform faster under high load and helps it scale to handle concurrent requests. But a few hard-to-notice mistakes can make an application suffer under high load instead of performing better, especially with distributed caching, where a separate cache server or cache process stores the items. Moreover, code that works fine with an in-memory cache can fail once the cache is moved out-of-process. Here I will show you some common distributed caching mistakes that will help you make better decisions about when to cache and when not to cache.
Here are the common mistakes that make performance worse:
* Relying on .NET’s default serializer.
* Storing large objects in a single cache item.
* Using the cache to share objects between threads.
* Assuming items will be in the cache immediately after storing them.
* Storing an entire collection with nested objects.
* Storing parent-child objects together and also separately.
* Caching Configuration settings.
* Caching live objects that hold open handles to streams, files, the registry, or the network.
* Storing the same item using multiple keys.
* Not updating or deleting items in the cache after updating or deleting them in persistent storage.
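To illustrate one of these mistakes, here is a minimal sketch of a defensive cache read: with an out-of-process cache, you can never assume an item is still there just because you stored it earlier, so every read should fall back to the persistent store on a miss. The `Dictionary` stand-in, the key format, and the `LoadCustomerNameFromDb` helper are all illustrative, not part of any real caching API:

```csharp
using System;
using System.Collections.Generic;

// Stand-in for a distributed cache; in a real deployment, items can vanish
// at any time (eviction, cache server restart, replication lag).
var cache = new Dictionary<string, object>();

// Pretend this is an expensive database call.
string LoadCustomerNameFromDb(int id) => "Customer#" + id;

string GetCustomerName(int id)
{
    string key = "customer:" + id;

    // Defensive read: never assume the item is present just because it
    // was stored earlier.
    if (cache.TryGetValue(key, out var cached))
        return (string)cached;

    string name = LoadCustomerNameFromDb(id); // fall back to the source
    cache[key] = name;                        // best-effort repopulation
    return name;
}

Console.WriteLine(GetCustomerName(42)); // cache miss -> loaded from "DB"
cache.Clear();                          // simulate eviction by the cache server
Console.WriteLine(GetCustomerName(42)); // still works, thanks to the fallback
```

The same shape of fallback is what makes code survive the move from an in-memory cache to a distributed one: the cache is treated as an optimization, never as the source of truth.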
For detailed explanations of these mistakes, read the full article: Ten Caching Mistakes that Break your App.