You should cache as aggressively as is practical, and do so properly at every layer of your application. When caching data, implement a strategy that keeps the cache in sync with the underlying data store. You can take advantage of a distributed cache manager like Memcached so that your caching strategy scales well and delivers considerable performance gains; Memcached is also well suited to storing large data sets. Cache only relatively static data -- there is little point in caching data that changes frequently, and data that is unlikely to be reused should not be cached at all. Finally, avoid over-using SqlDependency or SqlCacheDependency, as each active dependency adds monitoring overhead on the database server.
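The cache-aside strategy described above (check the cache first, fall back to the data store on a miss, and expire entries so stale data is not served indefinitely) can be sketched as follows. This is a minimal illustration in Python rather than .NET; the `TTLCache` class and `load_from_db` loader are hypothetical stand-ins for a real cache tier and database call.

```python
import time

class TTLCache:
    """Minimal cache-aside sketch: entries expire after ttl_seconds,
    so frequently changing data is never served stale for long."""

    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self._store = {}  # key -> (value, expiry timestamp)

    def get_or_load(self, key, loader):
        entry = self._store.get(key)
        now = time.monotonic()
        if entry is not None and entry[1] > now:
            return entry[0]            # cache hit, entry still fresh
        value = loader(key)            # cache miss: load from the data store
        self._store[key] = (value, now + self.ttl)
        return value

# Usage: the loader stands in for a database query; we count how
# often it actually runs.
db_calls = []

def load_from_db(key):
    db_calls.append(key)
    return key.upper()

cache = TTLCache(ttl_seconds=60)
first = cache.get_or_load("customer:42", load_from_db)   # loads from "database"
second = cache.get_or_load("customer:42", load_from_db)  # served from cache
```

Both lookups return the same value, but the loader runs only once; a short TTL bounds how long out-of-date data can be served, which is why this pattern suits relatively static data best.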
Now let's look at the downsides of caching. The cache object is available only to the current application domain, so if you want cached data to be accessible across a web farm, an in-process cache cannot do it. You would have to leverage a distributed cache such as Windows Server AppFabric Caching or another distributed caching framework to make cached data globally accessible across a web farm.
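The web-farm limitation above can be demonstrated with a small simulation. Here two plain dictionaries model the in-process caches of two farm nodes, and a third models a shared distributed tier (standing in for Memcached or AppFabric); the key names are made up for illustration.

```python
# Each web-farm node has its own in-process cache; entries written on
# one node are invisible to the others. A shared tier fixes that.
node_a_cache = {}  # in-process cache on node A
node_b_cache = {}  # in-process cache on node B
shared_cache = {}  # stand-in for a distributed cache (Memcached/AppFabric)

# Node A handles a request and caches the result locally and in the
# shared tier.
node_a_cache["session:7"] = "alice"
shared_cache["session:7"] = "alice"

# A later request for the same user is routed to node B: the local
# lookup misses, but the shared tier still serves the entry.
local_hit = node_b_cache.get("session:7")    # None: not visible on node B
shared_hit = shared_cache.get("session:7")   # found in the shared tier
```

In a real farm the shared tier is a network service rather than a dictionary, so each lookup also costs a network round trip; the trade-off is consistency across nodes in exchange for slightly slower individual hits.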
Caching is a powerful mechanism for boosting an application's performance: relatively static data is stored in memory so it can be retrieved from the cache rather than the data store on subsequent requests. I'll discuss this topic in more depth, with real-life code examples, in future posts here.