In this article, we will go through how a distributed caching solution can drastically improve the overall performance and throughput of your microservices-based application.
In a typical microservices-based application, multiple microservices work together while remaining loosely coupled and scalable. Some services satisfy core business requirements, such as tracking and processing critical business data, while additional dedicated microservices handle identity and authentication, health and load monitoring, and API gateway duties.
A key feature of such an application is that each microservice can be designed, developed, and deployed independently using any technology stack you want. Because each microservice is a standalone autonomous app in its own right, it also keeps its own persistent storage, be it a relational database, a NoSQL DB, or even a legacy file storage system. This allows the individual microservices to scale independently and makes real-time infrastructure changes much more manageable.
Why does your microservice need NCache?
There are cases where bottlenecks still arise as transaction volumes on the application increase. This is particularly common in architectures where microservices store data in relational databases that do not allow scaling out. In such situations, scaling out the microservice by deploying more instances of it does not resolve the matter.
To counter these issues, you can seamlessly introduce NCache as your distributed cache at the caching layer between your microservices and the datastores. NCache also serves as an in-memory, scalable publisher/subscriber messaging broker that allows asynchronous communication between microservices.
Scalability Through Pub/Sub
Microservices communication is frequently implemented using the Publisher/Subscriber model, which allows messaging between microservices while keeping them loosely coupled. In that regard, NCache serves as an in-memory, scalable Pub/Sub messaging broker through which all the microservices making up the application can publish and subscribe to events. The scalability and reliability inherent in NCache clustering carry over automatically to Pub/Sub. Find out more about NCache as a message broker in the microservices environment through our blog on Scaling .NET Microservices Communication with In-Memory Pub/Sub.
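As a rough illustration, the sketch below shows how one microservice might publish an event while another subscribes to it through NCache. The cache name, topic name, and Order payload type are placeholders, and the calls assume the Pub/Sub API of recent NCache versions (MessagingService, ITopic, ITopicSubscription); refer to the Pub/Sub documentation for the exact shapes in your version.

// Illustrative Pub/Sub sketch (cache name, topic name, and Order type are placeholders)
ICache cache = CacheManager.GetCache("DemoCache");

// Get the topic if it already exists, otherwise create it
ITopic topic = cache.MessagingService.GetTopic("OrderPlaced")
               ?? cache.MessagingService.CreateTopic("OrderPlaced");

// Publisher side: wrap the payload in a Message and deliver it to all subscribers
var message = new Message(order);
topic.Publish(message, DeliveryOption.All);

// Subscriber side: register a callback that fires whenever a message arrives on the topic
ITopicSubscription subscription = topic.CreateSubscription((sender, args) =>
{
    var placedOrder = args.Message.Payload as Order;
    // React to the event here
});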
Scalability Through Caching
NCache offers real-time scalability, letting you add as many server nodes as you want to your running cache cluster without incurring any application downtime. Using NCache not only improves the performance of the individual microservices by acting as a fast in-memory store but also brings a marked improvement in application response times and availability through its clustering architecture. This is especially true for workflows involving dozens of microservices spread across multiple hosts.
How to Use NCache for Data Caching?
With NCache in place, if a microservice requires data, it first checks the cache instead of directly accessing the database every time. Given that the most frequently accessed data usually makes up a small portion of all the data in the datastore, having that data already cached greatly cuts down on database-related latency and eases the load on the database, since most data requests are served by the cache itself.
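To make this concrete, here is a minimal cache-aside sketch. The cache name, key format, and CatalogItem type are illustrative; the controller shown later in this article applies the same pattern inside an ASP.NET Core service.

// Minimal cache-aside sketch (cache name and key format are illustrative)
ICache cache = CacheManager.GetCache("CatalogCache");

string key = $"CatalogItem:{id}";
CatalogItem item = cache.Get<CatalogItem>(key);

if (item == null)
{
    // Cache miss: load from the database and cache it for subsequent requests
    item = await _catalogContext.CatalogItems.SingleOrDefaultAsync(ci => ci.Id == id);
    if (item != null)
        cache.Insert(key, item);
}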
Since a microservices-based application carries more communication overhead than an equivalent monolithic design, the speed gained by introducing NCache is all the more valuable. At the individual microservice level, NCache helps you leverage the power of a microservices architecture while cutting the overall latency of long transactions that span multiple services working sequentially.
NCache has several out-of-the-box features that provide fine-grained control over caching operations. These include expiration and database synchronization for enforcing cache consistency, rich APIs with backing source providers for implementing cache-aside and cache-through patterns, SQL-like queries on the cached data, and a caching provider for Object Relational Mappers (ORMs) such as EF Core.
To get started with NCache in your microservices-based application, the first thing you need to do is configure the services. This provides the information your microservices need to start using NCache. An overview of how you can wire NCache into your microservices-based application is shown below:
public IServiceProvider ConfigureServices(IServiceCollection services)
{
    // Add additional code here
    services.AddDbContext<CatalogContext>(options =>
    {
        var cacheID = configuration["CatalogCache"];
        if (string.IsNullOrEmpty(cacheID))
            cacheID = "CatalogCache";

        NCacheConfiguration.Configure(cacheID, DependencyType.Other);

        // Changing default behavior when client evaluation occurs to throw.
        // Default in EF Core would be to log a warning when client evaluation is performed.
        options.ConfigureWarnings(warnings => warnings.Throw(RelationalEventId.QueryClientEvaluationWarning));
        // Check Client vs. Server evaluation: https://docs.microsoft.com/en-us/ef/core/querying/client-eval
    });

    var container = new ContainerBuilder();
    container.Populate(services);
    return new AutofacServiceProvider(container.Build());
}
Next, deploy a controller that first tries to get an item from the cache and, if it is not found there, fetches the item from the database and stores it in the cache. The implementation of such a controller is shown below:
[Route("api/v1/[controller]")] [ApiController] public class CatalogController : ControllerBase { private readonly CatalogContext _catalogContext; private readonly CatalogSettings _settings; private readonly ICatalogIntegrationEventService _catalogIntegrationEventService; public CatalogController(CatalogContext context, IOptionsSnapshot<CatalogSettings> settings, ICatalogIntegrationEventService catalogIntegrationEventService) { _catalogContext = context ?? throw new ArgumentNullException(nameof(context)); _catalogIntegrationEventService = catalogIntegrationEventService ?? throw new ArgumentNullException(nameof(catalogIntegrationEventService)); _settings = settings.Value; } [HttpGet] [Route("items/{id:int}")] [ProducesResponseType((int)HttpStatusCode.NotFound)] [ProducesResponseType((int)HttpStatusCode.BadRequest)] [ProducesResponseType(typeof(CatalogItem), (int)HttpStatusCode.OK)] public async Task<ActionResult<CatalogItem>> ItemByIdAsync(int id) { if (id <= 0) { return BadRequest(); } CatalogItem item = null; var cache = _catalogContext.GetCache(); string catalogItemKey = "CatalogItem:" + id; //Getting item from cache item = cache.Get<CatalogItem>(catalogItemKey); if (item == null) { item = await _catalogContext.CatalogItems.SingleOrDefaultAsync(ci => ci.Id == id); cache.Insert(catalogItemKey, item); } // Your logic here if (item != null) return item; return NotFound(); } } |
Let us go through these features in more detail to see what NCache brings to a microservices environment.
Keep Cache Fresh, Always
An important caveat to using a cache is that the cached data can become “stale” relative to the contents of the underlying primary datastore. To make sure that any given microservice receives fresh data from its cache, you need to update the cached data regularly. Fortunately, NCache provides features such as Database Synchronization and Expiration to ensure that the data remains consistent with that in the primary datastore.
You can maintain a level of synchronization between the cache and the datastore by simply adding an expiration time interval to the cached items. Once an item expires, NCache removes it so that subsequent requests for the same information result in the updated data being fetched and cached. NCache offers both Absolute and Sliding expiration strategies, and you can use either depending on the transient nature of the data in question.
As an example, the following code snippet shows how easily you can introduce absolute expiration on a particular cache item:
var cacheItem = new CacheItem(product);
var expiration = new Expiration(ExpirationType.Absolute, TimeSpan.FromMinutes(5));
cacheItem.Expiration = expiration;
cache.Insert(key, cacheItem);
To use Sliding expiration, all you have to do is change the ExpirationType as such:
var expiration = new Expiration(ExpirationType.Sliding, TimeSpan.FromMinutes(5)); |
When using expiration to maintain cache consistency, the expiration intervals you set must reflect how quickly the specific piece of data changes on the datastore side. If the expiration times are too short, data may be removed needlessly, resulting in unnecessary and expensive datastore trips; if they are too long, stale data may be served.
Finding optimal expiration times thus requires deep knowledge of how the data changes, which is usually not feasible. When cache consistency requirements become more stringent, database synchronization strategies are the recommended approach, and NCache provides several of them.
Using these, you can synchronize the cache with the datastore without having to analyze the access and change patterns of each piece of information, as expiration requires. Whenever an item changes on the datastore side, the cache removes the corresponding cached item automatically and without delay.
To see this in action, the following code snippet demonstrates how you can synchronize NCache with a SQL Server database by adding an NCache SQL Dependency to the cached items.
// Creating SQL Dependency
string query = "SELECT ProductName, UnitPrice FROM dbo.Products WHERE CategoryID = 'Dairy';";
SqlCacheDependency sqlDependency = new SqlCacheDependency(connectionString, query);

// Get orders that contain products with given category ID
Order[] orders = FetchOrdersByProductCategoryID("Dairy");

foreach (var order in orders)
{
    // Generate a unique cache key for this order
    string key = $"Order:ProductCategory-Dairy:{order.OrderID}";

    // Create a new cache item and add the SQL dependency to it
    CacheItem item = new CacheItem(order);
    item.Dependency = sqlDependency;

    // Add the cache item to the cache with the SQL Dependency
    cache.Insert(key, item);
}
SQL Query on Cache
NCache offers your microservices the ability to query indexed cache data through an SQL-like querying mechanism. This feature proves valuable in cases where the keys against which the required information is stored are unknown. It also abstracts away many of the lower-level cache API calls, making your application code easier to understand and maintain, and it is especially convenient if you are already comfortable with SQL-like commands.
An example code snippet demonstrating the use of the NCache SQL Query feature is given below:
string query = "SELECT * FROM FQN.Product WHERE ProductID > ?"; // Use QueryCommand for query execution var queryCommand = new QueryCommand(query); // Providing parameters for query queryCommand.Parameters.Add("ProductID",50000); // Executing QueryCommand through ICacheReader ICacheReader reader = cache.SearchService.ExecuteReader(queryCommand); // Check if the result set is not empty if (reader.FieldCount > 0) { while (reader.Read()) { string result = reader.GetValue<string>(1); // Perform operations using the retrieved keys } } else { // Null query result set retrieved } |
SQL queries can work with query indexes, NCache distributed data structures, as well as cache tags. Follow the link for more information on how to use the NCache SQL Query feature.
Read-Thru and Write-Thru
Using the NCache Data Source Providers feature, you can set NCache up as the single point of entry to the data access layer from the perspective of the microservice; if a microservice requires data, it only has to ask the cache. The cache furnishes the data if it is available; if it isn't, the cache retrieves the data from the datastore on the client's behalf using a read-thru handler, caches it, and returns it to the microservice.
Similarly, by utilizing a write-thru handler, a microservice only has to execute a write operation (Add, Update, Delete) on the cache and the cache then performs the relevant write operation on the datastore automatically.
What's more, you can even force the cache to retrieve data directly from the datastore, irrespective of whether the cache already holds a possibly stale version of it. This is critical when the microservice requires up-to-date information, and it complements the cache consistency strategies mentioned previously; a sketch of such a forced read follows the read-through snippet below.
Not only does the Data Source Provider feature streamline your application code, but when used together with the NCache database synchronization features, it keeps the cache automatically reloaded with fresh data, ready for computation.
The following code snippet will help you start using Read-Thru in your microservices:
// Specify the readThruOptions for read-through operations
var readThruOptions = new ReadThruOptions();
readThruOptions.Mode = ReadMode.ReadThru;

// Retrieve the corresponding item with read-through enabled
Product data = cache.Get<Product>(key, readThruOptions);
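Building on the snippet above, a forced read, where the cache goes to the datastore even if the key is already cached, might look like the following sketch; it assumes the ReadThruForced read mode, so check the Read-Through documentation for your NCache version.

// Force a fresh read from the datastore even if the key is already cached
// (assumes ReadMode.ReadThruForced is available in your NCache version)
var forcedReadOptions = new ReadThruOptions();
forcedReadOptions.Mode = ReadMode.ReadThruForced;

Product freshData = cache.Get<Product>(key, forcedReadOptions);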
Similarly, you can implement Write-Through, shown here in its asynchronous Write-Behind mode, by using:
// Specify the writeThruOptions for the cache item; WriteBehind writes to the datastore asynchronously
var writeThruOptions = new WriteThruOptions();
writeThruOptions.Mode = WriteMode.WriteBehind;

// Add the item to the cache with write-behind
cache.Insert(key, cacheItem, writeThruOptions);
For more detailed information on how to use these providers, refer to our documentation at Read-Through Caching and Write-Through Caching.
EF Core Caching
Entity Framework (EF) Core is a powerful Object Relational Mapper (O/RM) frequently used in enterprise .NET applications. Because it is so popular, NCache offers an EF Core caching provider that lets you seamlessly add caching to EF Core code using extension methods such as FromCache. This allows EF Core developers who are not intimately familiar with the NCache APIs to still leverage the power of NCache.
The following code demonstrates the ease of use of the NCache EF Core caching provider to introduce caching in your existing microservice application logic.
var options = new CachingOptions
{
    // To store the result as a collection in the cache
    StoreAs = StoreAs.Collection
};
options.SetAbsoluteExpiration(DateTime.Now.AddMinutes(_settings.NCacheAbsoluteExpirationTime));

// Get items from cache. If not found, fetch from database and store in cache.
item = await _catalogContext.CatalogItems.DeferredSingleOrDefault(ci => ci.Id == id).FromCacheAsync(options);
You can find more information about the EF Core Caching Provider API and how it can help your business case at NCache EF Core Provider.
Summing it all up
Microservices are built with the specific intent that they be autonomous: you can develop, test, and deploy them independently of the other microservices. This makes the whole application highly scalable and well suited to fast Continuous Integration/Continuous Deployment (CI/CD) processes.
However, for all the advantages microservices offer in terms of scalability and fast development lifecycles, certain parts of the application stack can still cause bottlenecks. Chief among these are relational databases, which do not allow the scale-out needed to contend with increased load, and this is where a distributed caching solution like NCache shines.
NCache has numerous out-of-the-box features to help make data caching a painless and intuitive addition to your microservices application. These include database synchronization, expiration, EF Core caching, SQL queries, and much more.