Distributed Caching Makes Cents

By Iqbal Khan

The old adage 'time is money' is especially true for today's retailers, who rely on advanced information systems and server farms. As system response times improve, so do a retailer's productivity and revenue. Unfortunately, the rapidly growing number of users and transactions works against these goals.

For example, consider payment processing and POS systems. With a payment processing system, retailers have a short window of time during the night to process customer payments and transfer funds. When they have tens of millions of customers, processing all of these payments within that limited window becomes a major challenge.

To alleviate the problem, retailers try to add more payment processing servers, but they are unable to add more database servers proportionately due to architectural constraints in their system. Therefore, retailers hit a scalability bottleneck at the database, and adding more payment processing servers only makes matters worse.

In similar fashion, POS systems are expected to process customer purchases quickly. As the number of POS systems increases, retailers try to add more back-end servers to handle more requests. But they are not able to add database servers proportionately due to architectural constraints in their system. Very soon, they can no longer scale, and the entire system grinds to a halt during peak hours.

Ideally, retailers want to be able to scale a retail system simply by adding more servers. To do this, they need to incorporate a distributed cache into their application's architecture.

Payment processing, POS systems, and other retail applications can speed up data access by fetching information from a distributed cache rather than going to the database every time. Caching is the process of storing frequently used data close to the application, in memory, as objects. Retrieving data from memory is faster and more efficient than retrieving it from a database, so augmenting the database with a cache is considerably faster than relying on the database alone. The net result is that payment processing, POS systems, and other retail applications become faster and handle considerably more transactions.
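
As a rough sketch of this read path, the following Java example shows the common cache-aside pattern: check the distributed cache first and fall back to the database only on a miss. The DistributedCache, ProductDatabase, and Product types are hypothetical placeholders, not the API of any particular caching product.

import java.time.Duration;

// Minimal cache-aside sketch. DistributedCache, ProductDatabase, and Product
// are hypothetical stand-ins for a real caching product and data access layer.
interface DistributedCache {
    Object get(String key);
    void put(String key, Object value, Duration ttl);
}

interface ProductDatabase {
    Product loadProduct(String productId);
}

record Product(String id, String name, double price) {}

public class ProductService {
    private final DistributedCache cache;    // in-memory store shared by all app servers
    private final ProductDatabase database;  // the scalability bottleneck being protected

    public ProductService(DistributedCache cache, ProductDatabase database) {
        this.cache = cache;
        this.database = database;
    }

    public Product getProduct(String productId) {
        String key = "product:" + productId;

        // 1. Check the distributed cache first (in-memory, typically sub-millisecond).
        Product cached = (Product) cache.get(key);
        if (cached != null) {
            return cached;                               // cache hit: no database trip
        }

        // 2. Cache miss: read from the database, then populate the cache so
        //    subsequent reads from any application server are served from memory.
        Product fromDb = database.loadProduct(productId);
        if (fromDb != null) {
            cache.put(key, fromDb, Duration.ofMinutes(10)); // expire to limit staleness
        }
        return fromDb;
    }
}

Because every application server reads from the same shared cache, adding application servers increases throughput without adding load on the database for data that is already cached.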

Distributed caching provides a major performance and scalability boost by reducing expensive database trips. Even with an efficient database, a typical database trip is 10-100 times slower than accessing an in-memory cache; a distributed cache usually provides sub-millisecond response times. Hence, by dramatically cutting down on database trips, a retailer achieves substantially quicker response times and can handle more customers.
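
To make the arithmetic concrete, the short sketch below uses assumed latencies and an assumed cache hit ratio (illustrative numbers, not measurements) to show how serving most reads from an in-memory cache lowers the average read latency.

// Back-of-the-envelope estimate with assumed numbers, not measurements.
public class LatencyEstimate {
    public static void main(String[] args) {
        double dbLatencyMs = 10.0;    // assumed database round trip
        double cacheLatencyMs = 0.5;  // assumed in-memory cache access
        double hitRatio = 0.90;       // assumed fraction of reads served by the cache

        double withoutCache = dbLatencyMs;
        double withCache = hitRatio * cacheLatencyMs + (1 - hitRatio) * dbLatencyMs;

        System.out.printf("Average read latency without cache: %.2f ms%n", withoutCache);
        System.out.printf("Average read latency with cache:    %.2f ms%n", withCache);
        // With these assumptions: 10.00 ms vs 1.45 ms, roughly a 7x improvement.
    }
}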
