Our 5-minute guide to distributed caching

What is distributed caching and when is it used?

Today's web, mobile and IoT applications need to operate at web scale: they must anticipate millions of users, terabytes of data and sub-millisecond response times, and serve devices all over the world.

For applications that run in clustered environments, distributed caching is a vital requirement. It solves many common problems with data access, improving performance, manageability and scalability, but what exactly is it and how can it benefit businesses?

What is distributed caching?

Caching is a commonly used technique for boosting application performance as well as reducing costs. Its primary goal is to alleviate the bottlenecks that come with traditional databases: by keeping frequently used data in memory rather than making repeated database round trips, application response times can be dramatically improved.
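To make the idea concrete, here is a minimal cache-aside sketch in Python. It assumes a Redis-compatible cache reached through the redis-py client; the hostname and the query_database() helper are illustrative stand-ins, not part of any particular product.

    import redis

    cache = redis.Redis(host="cache.example.internal", port=6379)

    def query_database(product_id):
        # Stand-in for a real SQL query against the backend database
        return f"row-for-{product_id}"

    def get_product(product_id):
        key = f"product:{product_id}"
        cached = cache.get(key)            # 1. Try the in-memory cache first
        if cached is not None:
            return cached                  # Cache hit: no database round trip
        row = query_database(product_id)   # 2. Cache miss: fall back to the database
        cache.set(key, row, ex=300)        # 3. Populate the cache with a five-minute expiry
        return row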


Distributed caching is simply an extension of this concept, but with the cache configured to span multiple servers. It's commonly used in cloud computing and virtualised environments, where different servers contribute a portion of their memory to a shared pool that can then be accessed by virtual machines. This also makes it a much more scalable option.
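As a rough illustration of how a pool of servers can share one cache, the sketch below hashes each key onto one of three hypothetical nodes; production systems typically use consistent hashing so that adding or removing a node relocates as few keys as possible.

    import hashlib

    # Hypothetical cache servers, each contributing memory to the shared pool
    NODES = ["10.0.0.11:6379", "10.0.0.12:6379", "10.0.0.13:6379"]

    def node_for(key):
        # Hash the key and map it onto one of the pooled servers,
        # so every client sends a given key to the same node
        digest = hashlib.md5(key.encode()).hexdigest()
        return NODES[int(digest, 16) % len(NODES)]

    print(node_for("session:42"))
    print(node_for("product:1001"))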

The data stored in a distributed cache is quite simply whatever is accessed most often, and its contents change over time: entries that haven't been requested in a while are evicted to make room for newer data.
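That behaviour usually comes from an eviction policy such as least-recently-used (LRU). A toy, in-process version of the idea, just to show how entries that stop being requested fall out of the cache:

    from collections import OrderedDict

    class LRUCache:
        def __init__(self, capacity):
            self.capacity = capacity
            self.items = OrderedDict()

        def get(self, key):
            if key not in self.items:
                return None
            self.items.move_to_end(key)        # Mark as recently used
            return self.items[key]

        def put(self, key, value):
            self.items[key] = value
            self.items.move_to_end(key)
            if len(self.items) > self.capacity:
                self.items.popitem(last=False)  # Evict the least recently used entry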


By serving frequently accessed data from memory rather than from the backend database, applications can deliver highly responsive experiences.

Distributed caching can also substantially lower capital and operating costs by reducing workloads on backend systems and reducing network usage. In particular, if the application runs on a relational database such as Oracle, which requires high-end, costly hardware in order to scale, distributed caching that runs on low-cost commodity servers can reduce the need to add expensive resources.

Common distributed caching use cases

Due to clear performance and cost benefits, distributed caching is used across numerous applications. Common use cases include:


Speeding up RDBMS: Many web and mobile applications need to access data from a backend relational database management system (RDBMS), for example inventory data for an online product catalogue. However, relational systems were not designed to operate at internet scale and can easily be overwhelmed by the volume of requests from web and mobile applications. Caching data from the RDBMS in memory is a widely used, cost-effective technique to speed up access and take load off the backend.

Managing usage spikes: Web and mobile applications often experience spikes in usage. In these cases, caching can prevent the application from being overwhelmed and can help avoid the need to add expensive backend resources.

Mainframe offloading: Mainframes are still used widely in many industries. A cache is used to offload workloads from a backend mainframe, thereby reducing costs as well as enabling completely new services that wouldn't be possible using just the mainframe.

Web session store: Session data and web history are kept in memory, for example the contents of a shopping cart or the inputs to a real-time recommendation engine on an ecommerce site, or player history in a game (a short sketch follows this list).
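To illustrate the session store case, here is a sketch that keeps a shopping cart in the cache with a sliding 30-minute expiry. It again assumes a Redis-compatible cache and the redis-py client; the hostname and session identifier are illustrative only.

    import json
    import redis

    cache = redis.Redis(host="cache.example.internal", port=6379)
    SESSION_TTL = 1800  # Keep idle sessions for 30 minutes

    def save_cart(session_id, cart):
        # Store the cart as JSON and (re)start the expiry clock
        cache.setex(f"session:{session_id}", SESSION_TTL, json.dumps(cart))

    def load_cart(session_id):
        data = cache.get(f"session:{session_id}")
        return json.loads(data) if data else {}

    save_cart("abc123", {"sku-42": 2})
    print(load_cart("abc123"))  # {'sku-42': 2}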

What makes distributed caching effective?

The requirements for effective distributed caching are fairly straightforward. Six key criteria are listed here, but their importance depends on an organisation's specific situation.


Performance: For a given workload, the cache must meet and sustain the application's required steady-state performance targets for latency and throughput. Efficiency of performance is a related factor that impacts cost, complexity and manageability.

Scalability: As more users, more data requests and more operations increase the workload, the cache should still deliver the same performance. It should also be able to scale easily and affordably without impacting availability.

Availability: The cache must keep data available around the clock, including during planned and unplanned downtime.

Manageability: Using a cache should be quick to deploy. It should also be easy to monitor and manage, without adding unnecessary extra work for the operations team.

Simplicity: If done properly, adding a cache to a deployment shouldn't make things unnecessarily complex, or add additional work for developers.

Affordability: Upfront implementation costs should always be considered with any IT decision, as well as ongoing costs. An evaluation should consider total cost of ownership, including license fees as well as hardware, services, maintenance and support.
