If you are performing expensive computations or fetching lots of data across the network, it not only takes time, it costs money. In this article we give you some background on the different types of caching available to you, and a few tips on how best to think about implementing good caching strategies.
Joel Burch
COO
Caching is widely considered a linchpin of performance: it reduces server load and network response times and improves responsiveness across all types of applications.
A cache is a copy of data that is quicker and easier to obtain than the source data.
By retrieving cached data instead of recomputing or re-fetching it, applications can load faster and reduce the number of expensive computations and database queries.
There are two types of operation that can be expensive on any machine: IO and CPU.
IO-bound operations read data from, or write data to, disk or the network.
CPU-bound operations perform calculations on the processor, potentially blocking or slowing down other CPU-bound work.
Caching works best for data that seldom changes, or for computations that are functional in nature, meaning that the same input will always result in the same output.
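That "same input, same output" property is exactly what makes memoisation safe. As a minimal sketch in Python, `functools.lru_cache` keeps the results of a pure function in memory so each input is only ever computed once:

```python
# Memoising a pure ("functional") computation: the same input always
# yields the same output, so cached results can never go stale.
from functools import lru_cache

@lru_cache(maxsize=256)
def expensive_fib(n: int) -> int:
    """Naive Fibonacci; with the cache, each n is computed only once."""
    if n < 2:
        return n
    return expensive_fib(n - 1) + expensive_fib(n - 2)

print(expensive_fib(30))           # 832040, computed in linear time
print(expensive_fib.cache_info())  # hit/miss counts show the cache at work
```

Without the decorator this naive recursion takes exponential time; with it, repeated sub-calls become cheap dictionary lookups.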
Well, the primary reason for caring about caching data is speed. Put simply, caching makes things quicker. The importance of speed may of course differ between different applications, but quicker is pretty much always better than slower if you can manage it.
How the benefits of increased speed are perceived can vary depending on the type of application.
Users don't like slow systems. If they have to wait even a couple of seconds for a response, it can be frustrating. Mozilla has some guidelines here, but in general "under a second" is a good rule of thumb. If your site or application is consistently slow, users are less likely to come back, or even to stay and browse. And if your application is used in regions where mobile data is metered, your users will thank you for not wasting their monthly allowance.
As well as speed, users may be concerned about the battery life on their mobile devices, which can be impacted if your application is constantly downloading data or performing heavy calculations.
Performing expensive computations or fetching lots of data across the network not only takes time, it costs money. A CPU straining under an expensive computation, or a complex SQL query running on a database, increases the running cost of the server(s) performing the work and shortens the lifespan of the hardware. On top of that, if you constantly run expensive operations, you will likely need to purchase more powerful servers in the first place.
If these actions run in the cloud on transient resources (e.g. AWS Lambda), and caching results lets you avoid calling them at all, the savings on your cloud bill could be even greater.
There is a lot of discussion these days about the environmental impact of IT systems. By reducing power consumption, being able to run smaller servers and generally offloading a lot of the expensive operations to a cache lookup, you can reduce the environmental footprint of your application.
You may come across different ways of categorising caching strategies. For most use cases, though, there are three strategies you should focus on.
This involves storing data in the server's memory. In-memory caching is exceptionally fast because it eliminates the need to fetch information from disk (an IO operation).
Common tools like Redis and Memcached offer efficient in-memory caching and are often used to store frequently accessed data, session information or database query results.
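The core idea behind tools like Redis and Memcached can be sketched as a small in-memory key-value store with per-entry expiry. The class below is illustrative only, not a real Redis API:

```python
import time

class TTLCache:
    """A toy in-memory key-value cache with per-entry time-to-live."""

    def __init__(self):
        self._store = {}  # key -> (value, expires_at)

    def set(self, key, value, ttl_seconds):
        self._store[key] = (value, time.monotonic() + ttl_seconds)

    def get(self, key, default=None):
        entry = self._store.get(key)
        if entry is None:
            return default
        value, expires_at = entry
        if time.monotonic() >= expires_at:
            del self._store[key]  # lazily evict the stale entry
            return default
        return value

cache = TTLCache()
cache.set("user:42:profile", {"name": "Ada"}, ttl_seconds=300)
print(cache.get("user:42:profile"))  # served from memory, no database query
```

Real systems add eviction policies, serialisation and network access on top, but the lookup-before-source pattern is the same.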
The downside of in-memory caches is that they are volatile: data stored in them is not persistent and is lost if the server restarts. For the same reason they are not especially suitable for transient environments such as AWS Lambda, where the process (and its memory) disappears between invocations.
Caching information in the browser is aimed at speeding up the perceived performance of web pages and applications. It focuses on storing resources - HTML pages, CSS files, images and JavaScript files - locally on the user's device. When the user revisits a site, the browser then loads the local copy of these resources rather than re-fetching them from the server, saving both time and bandwidth.
Browser caches will frequently be used to store session information such as authentication tokens, session IDs and user specific information. This reduces the need for repetitive authentication checks and improves login responsiveness and overall interaction speed.
Browser caching can be controlled by developers using HTTP headers and specifying caching policies for different types of content.
CDN stands for Content Delivery Network. This is a separate system holding content which is usually distributed across multiple servers globally. They are used for static content like images, videos, scripts etc. When a user requests content, the CDN will deliver it from the server closest to their geographical location, thus reducing latency and enhancing overall performance.
Modern CDNs also offer functionality like edge caching where content is cached closer to end-users at network edges. Companies like Netflix make heavy use of this concept.
Divio's partnership with Cloudflare enhances your application with application security and gives you effective caching with minimal effort.
Sounds great! But it can't be all good.
What's the catch?
There's no real catch as such. A good caching strategy is usually a mix of in-memory caching on the server, use of a CDN and browser/application caching.
There is one aspect that needs to be considered though, and that is cache invalidation and maintenance.
Ensuring data consistency and accuracy is vital for effective caching. Addressing cache expiration, staleness, and the challenge of keeping information up to date is a crucial part of cache management.
Cache Invalidation
Cache invalidation refers to the process of removing or updating cached data to ensure that only the most current information is served. Outdated or stale data within the cache can lead to inaccuracies and undermine the purpose of caching. Implementing robust cache invalidation mechanisms is pivotal for maintaining data integrity.
Cache Maintenance
Several strategies ensure that cached data remains accurate and consistent. Time-based expiration, where cached items have a predefined lifespan, is a common approach. Additionally, event-based invalidation triggers cache updates when specific events occur, such as data modifications or updates.
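The two approaches can be combined. The sketch below (all names are hypothetical) uses time-based expiry on reads and event-based invalidation on writes:

```python
import time

# Illustrative sketch: time-based expiry plus event-based invalidation.
_cache = {}          # key -> (value, expires_at)
TTL_SECONDS = 60     # time-based expiration: entries live for one minute

def get_article(article_id, fetch_from_db):
    key = f"article:{article_id}"
    entry = _cache.get(key)
    if entry is not None and time.monotonic() < entry[1]:
        return entry[0]                        # fresh cached copy
    value = fetch_from_db(article_id)          # cache miss or expired entry
    _cache[key] = (value, time.monotonic() + TTL_SECONDS)
    return value

def update_article(article_id, save_to_db, new_body):
    save_to_db(article_id, new_body)
    # Event-based invalidation: drop the stale entry the moment data changes.
    _cache.pop(f"article:{article_id}", None)
```

The TTL caps how stale an entry can ever get, while the invalidation on write means most readers never see stale data at all.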
Challenges and Mitigation
Cache expiration and staleness definitely present challenges, especially when dealing with rapidly changing data. For example:
Keeping the cache lifespan short enough to maintain data freshness while avoiding excessive re-caching that impacts performance.
Managing dependencies - when one piece of data changes, it might necessitate invalidating multiple cached items.
Addressing these challenges often involves a combination of efficient cache eviction policies, employing cache invalidation strategies aligned with the application's data volatility, and implementing robust mechanisms to handle dependencies between cached items.
Versioning or tagging mechanisms can help with precise cache invalidation, allowing removal or update of specific cached items without affecting the entire cache. Additionally, using cache-control headers and proper HTTP cache directives ensures effective communication between servers and clients regarding cache management.
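Versioned keys are simple to sketch: embed a version number for each group of related entries in the cache key, and bump it to invalidate the whole group at once. All names below are illustrative:

```python
# Hypothetical sketch of versioned cache keys: bumping a group's version
# invalidates every key in that group without touching the rest of the cache.
_cache = {}
_versions = {}   # group -> current version number

def _make_key(group, name):
    return f"{group}:v{_versions.get(group, 1)}:{name}"

def cache_set(group, name, value):
    _cache[_make_key(group, name)] = value

def cache_get(group, name):
    return _cache.get(_make_key(group, name))

def invalidate_group(group):
    # Old entries become unreachable; a real cache would evict them later.
    _versions[group] = _versions.get(group, 1) + 1

cache_set("products", "list-page-1", ["widget", "gadget"])
invalidate_group("products")                 # e.g. a product was updated
print(cache_get("products", "list-page-1"))  # None: the stale entry is gone
```

Cached entries under the old version are never served again, while keys in other groups remain valid.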
If you are new to caching and aren't sure where to start, think about the following:
Begin by comprehensively understanding the application's data access patterns, user interactions, and performance bottlenecks. Identify components that would benefit most from caching, such as frequently accessed data or resource-intensive processes.
Choose caching strategies aligned with specific application needs. For instance, in-memory caching might be suitable for frequently accessed data, while browser caching works well for static content. Consider a combination of caching mechanisms to cover various aspects of the application effectively.
Employ robust cache invalidation mechanisms to ensure data consistency and accuracy. Define strategies to refresh or remove cached items when data changes occur, preventing the serving of outdated information to users.
Define appropriate cache expiration and eviction policies to balance data freshness and performance. Avoid excessively short cache lifespans that could lead to frequent re-caching, which impacts performance, and prioritise cache items based on their importance and frequency of access.
Monitor caching performance metrics to identify inefficiencies or bottlenecks. Use tools to track cache hit rates, latency, and memory usage. Adjust caching strategies based on these insights to optimise performance, and remember that the needs may change as your application and user behaviour evolve.
Prepare your application to handle cache failures or unavailability gracefully. Implement fallback mechanisms or default behaviours to ensure continued functionality even if caching systems experience issues. It's better to have a slow app than one that doesn't work at all!
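That fallback pattern can be sketched in a few lines. Here `backend` and `load_from_db` are hypothetical stand-ins for your cache client and data source:

```python
import logging

def get_with_fallback(backend, key, load_from_db):
    """Serve from cache when possible; never let a cache failure break the app."""
    try:
        value = backend.get(key)
        if value is not None:
            return value
    except Exception:
        logging.warning("Cache unavailable, falling back to the database")
    value = load_from_db(key)   # slow path: go to the source of truth
    try:
        backend.set(key, value)
    except Exception:
        pass  # best effort: serving the data matters more than caching it
    return value
```

If the cache is down, every request simply takes the slow path: the application degrades rather than failing outright.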
Document the caching strategies in use, configurations, and any specific implementation details. This knowledge sharing ensures consistency across development teams and aids troubleshooting or future optimisations.
Divio provides built-in solutions for caching when deploying applications onto our platform. Check out the documentation to get started or get in touch if you have more specific or complex needs and let us help you.
Stay up to date! Connect with us on LinkedIn and X/Twitter to get exclusive insights and be the first to discover our latest blog posts.