Last post Aug 26, 2012 03:56 PM by sharepoint99
Aug 26, 2012 01:36 PM | sharepoint99
I am using HttpRuntime.Cache for inserting items into the cache in an HTTP module. What I am not sure about is whether I need to use a lock object when inserting items into the cache.
Aug 26, 2012 01:48 PM | shashank_mehta
If inserting the data into the cache is a long-running process, then you should apply a lock.
Check the link below on cache locking.
Aug 26, 2012 01:55 PM | BrockAllen
Are you doing per-request caching or cross-request (app-wide) caching?
If you're doing per-request caching, then you can cache data in the HttpContext.Current.Items collection and you don't have any threading contention, since HttpContext is per-request and thus per-thread (unless you're doing something odd like sharing it across threads).
If you're doing cross-request (app-wide) caching, then you can use HttpRuntime.Cache (which is the same as HttpContext.Cache), and it does all the locking necessary -- it's thread-safe.
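The two approaches above could look something like this -- a minimal sketch, assuming a hypothetical LoadData() helper standing in for whatever loads your data:

```csharp
using System;
using System.Web;

public static class CacheExamples
{
    // Per-request: HttpContext.Items is scoped to the current request
    // (and thus the current thread), so no locking is needed.
    public static object GetPerRequest(HttpContext context)
    {
        if (context.Items["myData"] == null)
        {
            context.Items["myData"] = LoadData();
        }
        return context.Items["myData"];
    }

    // Cross-request (app-wide): HttpRuntime.Cache handles its own
    // locking, so individual reads and writes are thread-safe.
    public static object GetAppWide()
    {
        var data = HttpRuntime.Cache["myData"];
        if (data == null)
        {
            data = LoadData();
            HttpRuntime.Cache.Insert("myData", data);
        }
        return data;
    }

    // Hypothetical loader -- replace with your actual data access.
    static object LoadData() { return "some value"; }
}
```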
Aug 26, 2012 02:19 PM | sharepoint99
Thanks for replying. The caching I am doing is a very quick process: I am just retrieving some string values from the SharePoint farm property bag and then storing them in the cache. The HTTP module is applied at the web application level, so there can be many requests at the same time. I was wondering what happens when the module fires at the same time for two requests, which is why I was asking about the lock. I am using HttpRuntime.Cache, and the caching is for all requests, not per request. The cache is applied for 24 hours.
Aug 26, 2012 02:54 PM | BrockAllen
Ok, so the cache is shared across requests/users -- then the HttpRuntime.Cache is fine.
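The scenario described (an HTTP module caching a quick lookup for 24 hours) could be sketched roughly like this -- the module name, cache key, and LoadFarmProperty() helper are all hypothetical placeholders:

```csharp
using System;
using System.Web;
using System.Web.Caching;

public class FarmPropertyModule : IHttpModule
{
    public void Init(HttpApplication app)
    {
        app.BeginRequest += (sender, e) =>
        {
            if (HttpRuntime.Cache["farmProps"] == null)
            {
                // Quick lookup, e.g. string values from the
                // SharePoint farm property bag.
                string value = LoadFarmProperty();
                HttpRuntime.Cache.Insert(
                    "farmProps",
                    value,
                    null,                          // no cache dependency
                    DateTime.UtcNow.AddHours(24),  // absolute 24-hour expiration
                    Cache.NoSlidingExpiration);
            }
        };
    }

    public void Dispose() { }

    // Hypothetical -- stands in for the property bag read.
    static string LoadFarmProperty() { return "value"; }
}
```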
As for the locking suggestions on Stack Overflow -- if you need it, then those locking approaches are fine. Personally, I don't do that sort of locking (usually). It really depends on how expensive the operation is to load the data. Yes, there is a chance that multiple requests will come in at the same time and see an empty cache, so multiple loads will happen, but if the cache duration is long then that cost is negligible. If the operation to load the data is expensive, then it's worth the locking.
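For the expensive case, the locking approach being referred to is typically a double-checked pattern around the cache lookup -- a sketch, with LoadExpensiveData() as a hypothetical stand-in:

```csharp
using System;
using System.Web;

public static class ExpensiveCache
{
    static readonly object _sync = new object();

    public static object Get()
    {
        var data = HttpRuntime.Cache["expensive"];
        if (data == null)
        {
            lock (_sync)
            {
                // Re-check inside the lock: another thread may have
                // populated the cache while we were waiting.
                data = HttpRuntime.Cache["expensive"];
                if (data == null)
                {
                    data = LoadExpensiveData();
                    HttpRuntime.Cache.Insert("expensive", data);
                }
            }
        }
        return data;
    }

    // Hypothetical expensive load (database call, web service, etc.).
    static object LoadExpensiveData() { return new object(); }
}
```

The lock guarantees only one thread performs the expensive load; all others wait and then find the value already cached on the re-check.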
Aug 26, 2012 02:58 PM | sharepoint99
So the only thing that will happen if two requests come in at the same time is that one is going to see an empty cache and the other is going to create the cache entry, and since it's a thread-safe process, nothing disastrous is going to happen. The operation is very quick, so I'll leave out the locking for now.
Thanks for the quick reply and help.
Aug 26, 2012 03:29 PM | BrockAllen
Both will see an empty cache so both will make the call to load your data, then they'll both put the data into the cache and the last writer will win (in a thread-safe way).
Aug 26, 2012 03:56 PM | sharepoint99
Got it now :). Thanks for the explanation.