Today’s applications need to scale and handle extreme transaction loads. But databases do not scale as easily and therefore become a bottleneck. To resolve this, many people are turning to an in-memory distributed cache because it scales linearly and removes the database bottleneck.
A typical distributed cache contains two types of data: transactional data and reference data. Transactional data changes very frequently and is therefore cached for a very short time. But caching it still provides a considerable boost to performance and scalability.
Reference data, on the other hand, does not change very frequently. It may be static data, or dynamic data that changes perhaps every hour or every day. At times, this data can be huge (gigabytes in size).
It would be really nice if this reference data could be preloaded into a distributed cache upon cache startup, because then your applications would not need to load it at runtime. Loading reference data at runtime slows down your application, especially when there is a lot of data.
How Should Reference Data be Preloaded into a Distributed Cache?
One approach is to design your application in such a way that during application startup, it fetches all the required reference data from the database and puts it in the distributed cache.
However, this approach raises some other issues. First, it slows down your application startup because your application is now involved in preloading the cache. Second, if you have multiple applications sharing a common distributed cache, then you either have code duplication in each application or all your applications depend on one application to preload the distributed cache. Finally, embedding cache-preloading code inside your application corrupts your application design because you are adding code that does not belong there. None of these situations is desirable.
What if we give this preloading responsibility to the distributed cache itself? In this case, preloading becomes part of the cache startup process and therefore does not involve your application at all. You can configure the cache to preload all the required data upon startup so it is available to all the applications from the beginning. This simplifies your applications because they no longer have to worry about preloading logic.
NCache provides a very powerful and flexible cache-preloading capability. You develop a cache loader and register it with NCache; NCache then calls this custom code upon cache startup. Let me demonstrate how to do this below:
- Implement a simple interface named ICacheLoader. The cache calls it to answer the question "how, and which, data to load?"

```csharp
public class CacheLoader : ICacheLoader
{
    public void Init(System.Collections.IDictionary parameters)
    {
        // Initialize the data source connection, read parameters, etc.
    }

    public bool LoadNext(
        ref System.Collections.Specialized.OrderedDictionary data,
        ref object index)
    {
        // Fill the ref objects with the data that should be loaded into the cache.
        // NCache calls this method repeatedly; "index" lets you keep track of
        // where the previous call left off.
    }

    public void Dispose()
    {
        // Dispose of connections and other acquired resources.
    }
}
```
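To make this more concrete, here is a minimal sketch of what a database-backed loader might look like. The `ProductLoader` and `Product` class names, the `Products` table, the `Product:<id>` key convention, and the batch size are all hypothetical illustrations, not part of NCache itself; the `ICacheLoader` members follow the signatures shown above, and the exact return-value convention of `LoadNext` should be confirmed against the NCache documentation for your version.

```csharp
using System;
using System.Collections;
using System.Collections.Specialized;
using System.Data.SqlClient;

// Hypothetical reference-data item; must be serializable to live in the cache.
[Serializable]
public class Product
{
    public int Id;
    public string Name;
    public decimal Price;
}

// Hypothetical loader that streams a large "Products" table into the
// cache in batches instead of loading everything in one call.
public class ProductLoader : ICacheLoader
{
    private SqlConnection _connection;
    private const int BatchSize = 1000; // assumed batch size

    public void Init(IDictionary parameters)
    {
        // The connection string comes from the cache configuration (see below).
        _connection = new SqlConnection((string)parameters["connectionString"]);
        _connection.Open();
    }

    public bool LoadNext(ref OrderedDictionary data, ref object index)
    {
        // "index" carries our position between calls; null on the first call.
        int startRow = (index == null) ? 0 : (int)index;

        string sql = "SELECT ProductId, Name, Price FROM Products " +
                     "ORDER BY ProductId " +
                     "OFFSET @start ROWS FETCH NEXT @count ROWS ONLY";

        int rows = 0;
        using (var command = new SqlCommand(sql, _connection))
        {
            command.Parameters.AddWithValue("@start", startRow);
            command.Parameters.AddWithValue("@count", BatchSize);
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    var product = new Product
                    {
                        Id = reader.GetInt32(0),
                        Name = reader.GetString(1),
                        Price = reader.GetDecimal(2)
                    };
                    // "Product:<id>" is an assumed cache-key convention.
                    data.Add("Product:" + product.Id, product);
                    rows++;
                }
            }
        }

        index = startRow + rows;
        // Signal whether another batch remains to be loaded.
        return rows == BatchSize;
    }

    public void Dispose()
    {
        if (_connection != null)
            _connection.Close();
    }
}
```

Batching keeps memory usage bounded while gigabytes of reference data are loaded, and it lets the cache report progress between calls.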
- The next step is to configure the startup loader implemented above with the cache. You can do this using the NCache Manager that comes with NCache, or simply by adding the following configuration to the cache config file.
<cache -loader="" retries="3" retry-interval="0" enable-loader="True">
<provider assembly-name="TestCacheLoader, Version=220.127.116.11, Culture=neutral,
PublicKeyToken=null" class-name="TestCacheLoader.CacheLoader" full-name="TestCacheLoader.dll"></provider>
&amp;lt; !—parameters that will be passed to Init method of the loader--&amp;gt;
<parameters name="connectionString" value="Data Source= SQLEXPRESS;
Initial Catalog=testdb;Integrated Security=True;Pooling=False"></parameters>
Any exception that occurs during startup-loader processing is logged without causing any problem for your application. Simple and effective!
As you can see, NCache provides a powerful mechanism to preload your distributed cache and keep your application performance consistently high.
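Once the cache is preloaded, your applications simply read the reference data without ever touching the database at startup. A minimal sketch of the consuming side is shown below; the cache name "myCache" and the "Product:1" key are assumptions carried over from the loader example, and the exact client namespace depends on your NCache version.

```csharp
using Alachisoft.NCache.Web.Caching;

class Program
{
    static void Main()
    {
        // Connect to the already-running cache that the loader preloaded.
        Cache cache = NCache.InitializeCache("myCache");

        // The reference data is already in the cache; no database trip needed.
        object product = cache.Get("Product:1");
        System.Console.WriteLine(product != null
            ? "Found in cache"
            : "Not in cache");

        cache.Dispose();
    }
}
```

Because every application sharing this cache sees the same preloaded data, the preloading logic lives in exactly one place: the cache loader.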