Sitefinity comes with an advanced set of tools to help you keep your web apps on their best behavior. CDN, geo-distribution, load balancing, caching: there are no greater and lesser tasks when the best possible performance is the ultimate goal. Keeping those tools sharp is our responsibility; using them properly is where you come in.
Your application’s speed, performance, and availability have a direct bearing on how your users engage and how likely they are to return. You know it and we know it. In a series of recent posts, we took a close look at various features, tweaks, and settings in Sitefinity that help you deliver always-on services and compelling user experiences. This one picks up exactly where we left off with webhooks and CI/CD pipelines.
Hits and misses are a key cache performance metric, but when it comes to your website’s performance and user experience, the last thing you want them to be is hit-or-miss. More often than not, proper use of caching can prevent or resolve multiple issues with an application’s general speed and responsiveness.
In this post, we’re looking at the behind-the-scenes workings that lead up to a page being requested, compiled, rendered, and served in a way that feels fast and seamless to your end users, regardless of the platform on which they engage with your content, whether desktop or handheld.
The Output cache is one of the most important mechanisms to help you achieve this. Being the first caching layer that handles incoming requests, it’s the one that makes sure your up-to-date content is ready to be served instantly instead of being processed server-side every time.
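The idea behind an output cache can be illustrated with a short, language-agnostic sketch (Python here for brevity; Sitefinity itself is a .NET product and its real implementation differs): the first request for a page pays the full server-side rendering cost, and every subsequent request is served straight from the cache.

```python
# Conceptual sketch of an output cache, not Sitefinity's actual API.
import time

CACHE = {}  # request key -> rendered HTML

def render_page(url):
    """Stand-in for the expensive server-side compile/render step."""
    time.sleep(0.01)  # simulate work
    return f"<html>content of {url}</html>"

def handle_request(url):
    if url in CACHE:              # cache hit: serve instantly
        return CACHE[url], "HIT"
    html = render_page(url)       # cache miss: render server-side
    CACHE[url] = html             # store for subsequent requests
    return html, "MISS"

print(handle_request("/pricing")[1])  # MISS
print(handle_request("/pricing")[1])  # HIT
```

The cache key in practice is richer than a bare URL (it can vary by host, query string, user agent and so on), but the hit-or-miss decision works the same way.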
In Sitefinity, the output cache can be configured to use the web server’s in-process memory or a decoupled alternative: distributed storage. Distributed output cache is your best bet if you need to ensure your application’s high availability and a readily scalable infrastructure.
Now, this take on the subject of caching and website performance has an obvious cloud angle to it, but note that the environment does not actually limit or widen the caching tools at your disposal. They are pretty much uniform, whether your application is hosted on-premises or in the cloud.
When you aim for the best possible performance in a production context, you should always consider distributed output cache. In Sitefinity Cloud, we use Redis—conveniently providing a single point for all the application instances to share the output cache for each and every page they’re working with.
If you’re scaling out and have deployed a new production instance, the new server node can start retrieving items from Redis instead of having to compile and save the pages to its own output cache storage. Which is to say, adding new production instances to your setup doesn’t need to involve significant downtime.
Although a cold start is where distributed cache would make the most dramatic difference, it comes in handy after node restarts as well. The only wait is for Sitefinity itself to start up; users don’t have to wait for every page to be compiled and cached, because the latest version of a page is automatically served from the shared output cache.
It’s quite useful too when a distributed application is dealing with high traffic. Whenever an app node compiles a page and caches it, all the other nodes will directly read it from the cache, instead of having to compile it from scratch and store it each in their local memory.
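Conceptually, the shared cache works like this hypothetical Python sketch, with a plain in-memory object standing in for Redis (node names and render logic are made up for illustration): the first node to serve a page compiles and stores it, and every other node reads that cached copy instead of compiling its own.

```python
# Illustrative model of a distributed output cache shared by app nodes.
# A simple object stands in for Redis; this is not Sitefinity code.

class SharedStore:
    def __init__(self):
        self.data = {}      # url -> rendered HTML, shared by all nodes
        self.renders = 0    # how many times any node had to compile a page

class AppNode:
    def __init__(self, name, store):
        self.name = name
        self.store = store

    def serve(self, url):
        if url not in self.store.data:
            # Only the first node to see the page pays the compile cost.
            self.store.renders += 1
            self.store.data[url] = f"<html>{url} rendered by {self.name}</html>"
        # Every node, including the one that compiled it, reads from the store.
        return self.store.data[url]

store = SharedStore()
nodes = [AppNode(f"node-{i}", store) for i in range(3)]
for node in nodes:
    node.serve("/home")   # all three nodes serve the page...
print(store.renders)      # ...but it was compiled exactly once
```

This is also why a freshly scaled-out node can start serving cached pages immediately: the store already holds the work done by its peers.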
Now, here’s a fairly new feature that’s not yet used widely enough. The cache warmup proves quite helpful when a new page is published – and has its output cache invalidated as a result.
Without the output cache warmup feature, the first time a newly published page is requested, it may take a while for its latest version to compile and render. The user who drew the short straw will have to wait longer than they are used to or care to put up with.
Depending on the website and page complexity, delays can potentially be significant, and this is where the Sitefinity cache warmup more than earns its keep. It’s a matter of seconds, of course, but even a fraction of a second can be decisive for the user experience and, ultimately, your bottom line.
Now take this a level up: a change to a page template can invalidate the output cache for tens or even hundreds of pages that use it. That leaves just as many pages to recompile, and it might take a while, an agonizingly long while if it’s a popular template shared by many pages, before your website is back to its normal speed and responsiveness.
If you have enabled and configured cache warmup, Sitefinity makes the first request to itself instead of waiting for a visitor to hit the page in question. Moreover, while that request is in flight, Sitefinity keeps serving the last cached version of the page from the output cache until the new version is compiled and cached.
An important disclaimer: this is in no way related to the Warmup module; it’s a separate functionality specific to the output cache. Which is good news really: you get a backup on more than one front, and you can use the cache warmup without even having the Warmup module enabled, and vice versa.
Bottom line, the front-end experience will always be fast and uninterrupted. For a short while after publishing a new page, users will continue to see the previous version of the site until the warmup request is complete and the new content is properly compiled and cached. Once that’s done, the page will immediately switch to the latest version. For the customer it will look instant and seamless, and the UX they’re used to isn’t compromised. Which is essentially why we’re here in the first place.
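The pattern at work here, often called serve-stale-while-revalidate, can be sketched in a few lines of Python (an illustrative model, not Sitefinity code: the cache, page versions, and timings are all assumptions): publishing does not evict the cached page; the new version is warmed in the background and swapped in only once it’s ready.

```python
# Hypothetical model of output cache warmup after a page is republished.
import threading
import time

# url -> last cached render; starts out holding the old published version
cache = {"/news": "<html>/news v1</html>"}

def compile_page(url, version):
    time.sleep(0.05)  # simulate a slow server-side compile
    return f"<html>{url} v{version}</html>"

def publish_with_warmup(url, version):
    # Publishing does NOT drop the cached entry; the new version is
    # compiled by a background "warmup request" and swapped in when ready.
    def warm():
        cache[url] = compile_page(url, version)
    t = threading.Thread(target=warm)
    t.start()
    return t

warmup = publish_with_warmup("/news", 2)
print(cache["/news"])   # still the v1 page: visitors never wait
warmup.join()
print(cache["/news"])   # the freshly compiled v2 page
```

Without the warmup step, the entry would simply be deleted on publish, and the next visitor would block on the compile instead.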
A lot is going on behind the scenes when you update a page or a template. Sitefinity is a complex system and getting all the configurations and settings right can often seem a tall task. Pulling strings does not necessarily make good music. It takes practice and a finely tuned instrument among other things.
Watch this space for more tips and tricks for optimal performance, but don’t forget that practice is the best way to sharpen your Sitefinity skills. And remember you can count on us to help with anything. Talk to a Sitefinity expert today or read more about the best administration practices. While the subject is still fresh, it makes sense to look into configuring distributed output cache and the cache invalidation settings, or to put in-memory and distributed output cache side by side. Whatever it is, we’re with you all the way.
A new addition to the Sitefinity Product Marketing team, Anton has a mixed background of software and writing for the web. He has spent the last 7 years in software development, on the project management and product ownership side, all the while writing about technology, gadgets and their use and usability. Always trying to get to the bottom of it without missing the bigger picture.
Copyright © 2020 Progress Software Corporation and/or its subsidiaries or affiliates. All Rights Reserved.
Progress, Telerik, Ipswitch, and certain product names used herein are trademarks or registered trademarks of Progress Software Corporation and/or one of its subsidiaries or affiliates in the U.S. and/or other countries. See Trademarks for appropriate markings.