Caching frequently used objects that are expensive to fetch from their source makes an application perform faster under high load and helps it scale under concurrent requests. But some hard-to-notice mistakes can make the application suffer under high load instead of performing better, especially when you are using distributed caching, where a separate cache server or cache process stores the items. Moreover, code that works fine with an in-memory cache can fail once the cache is moved out-of-process. Here I will show you some common distributed caching mistakes that will help you make better decisions about when to cache and when not to cache.
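To make the in-process versus out-of-process difference concrete, here is a minimal C# sketch. The `distributedCache` client at the end is hypothetical (every out-of-process cache product has its own API); the point is that an in-memory cache such as `MemoryCache` only stores a reference to your object, while a distributed cache has to serialize it to send it to another process, so the same code can start failing:

```csharp
using System;
using System.Runtime.Caching;

// Not marked [Serializable]: fine for an in-process cache,
// a problem the moment the object has to cross a process boundary.
class UserProfile
{
    public string Name;
}

class Demo
{
    static void Main()
    {
        var profile = new UserProfile { Name = "Alice" };

        // In-process cache: only the object reference is stored, no serialization happens.
        MemoryCache.Default.Set("user:1", profile, DateTimeOffset.Now.AddMinutes(10));

        // Out-of-process cache (hypothetical client): the object must be serialized
        // to reach the cache server, so a non-serializable type fails here
        // even though the in-memory version above works.
        // distributedCache.Put("user:1", profile);
    }
}
```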
Here are the top 10 mistakes I have seen:
- Relying on .NET’s default serializer.
- Storing large objects in a single cache item.
- Using cache to share objects between threads.
- Assuming items will be in the cache immediately after storing them (see the sketch after this list).
- Storing an entire collection with nested objects.
- Storing parent-child objects together and also separately.
- Caching configuration settings.
- Caching live objects that have open handles to streams, files, the registry, or the network.
- Storing the same item using multiple keys.
- Not updating or deleting items in cache after updating or deleting them on persistent storage.
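As a quick taste of the fourth mistake above, here is a minimal sketch of a defensive cache read. The `ICache` interface and `LoadFromDatabase` method are placeholders for whatever cache client and data access code you actually use; the pattern is what matters. Even right after a Set, the item may have been evicted, expired, or not yet reached the cache server, so every read should be prepared for a miss:

```csharp
using System;

// Hypothetical cache abstraction; substitute the API of your cache product.
interface ICache
{
    object Get(string key);
    void Set(string key, object value);
}

class Customer
{
    public int Id;
    public string Name;
}

class CustomerRepository
{
    private readonly ICache _cache;

    public CustomerRepository(ICache cache)
    {
        _cache = cache;
    }

    public Customer GetCustomer(int id)
    {
        string key = "customer:" + id;

        // Never assume the item is present just because it was stored earlier:
        // it may have been evicted, expired, or not yet replicated.
        var customer = _cache.Get(key) as Customer;
        if (customer == null)
        {
            customer = LoadFromDatabase(id);   // fall back to the persistent store
            _cache.Set(key, customer);         // repopulate for later readers
        }
        return customer;
    }

    // Placeholder for the real data access code.
    private Customer LoadFromDatabase(int id)
    {
        return new Customer { Id = id, Name = "Customer " + id };
    }
}
```

The null check is cheap insurance; treating the cache as authoritative storage is what turns a harmless miss into a bug.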
Let’s see what they are and how to avoid them.
http://www.codeproject.com/KB/web-cache/cachingmistakes.aspx
Please vote if you find this useful.
Very insightful article. We used to run into issues with storing large objects in memcache.
Very informative article. Thanks for sharing such useful information.