Performance Optimizations - Why Output Cache is Important

March 09, 2009


I’m starting a series of articles on performance optimizations in Sitefinity. In this first article I will explain what output cache is, how it works, what options Sitefinity offers, and why it matters.

ASP.NET is a great technology that lets us easily create dynamic pages. Dynamic, as opposed to static HTML, means we can read data from a database or another data source and display it as it changes. We can also use controls that render different HTML based on settings or user input. ASP.NET, and systems built on top of it such as Sitefinity, let us quickly build web applications and deliver business solutions. However, all of this comes at a price: every component in the execution chain adds, to a greater or lesser degree, to the CPU cost. A shorter execution chain usually means less CPU usage.

Request / Response lifecycle

I know most of you are very familiar with the request / response lifecycle, but let me describe it briefly to emphasize the importance of proper output caching. When an .aspx page is requested, IIS passes the request to ASP.NET. ASP.NET creates a new HttpContext object and invokes all HttpModules. Once the HttpModules have completed, the context is passed to an HttpHandler. For an .aspx page request, the handler creates a new instance of the Page object and calls its ProcessRequestMain method, providing the current HttpContext. From that point on the page lifecycle begins: the control hierarchy is instantiated, control state is loaded, events are fired, and then the rendering process begins. As each control is created and added to the page’s Controls collection, it starts a similar lifecycle. Some controls make database calls, retrieve and process data, or perform business logic before rendering their HTML output. At the render stage, the entire control hierarchy is traversed in the order it was created, and each control writes its rendered HTML to the output stream, forming the response. After all controls render, the page and its control hierarchy are destroyed and the response is complete.

As you can see, a lot of work happens on every single request. Very often the content of a page doesn’t change from one request to another, or changes only rarely, and all this work is simply wasted. If your site is not very busy, going through the entire lifecycle on each request might be perfectly OK. For busy sites, however, you definitely have to take action, because all this work is multiplied by the number of simultaneous requests. Note that by simultaneous requests I mean simultaneous for the entire site, not just for the same page. By caching as many pages as possible, you leave more CPU time for the ones that really need it.

Make your website snappy with output cache

How does output cache work? When a URL is requested for the first time, the entire request / response lifecycle described above runs, but right before the response is sent to the client, it is stored in memory (Sitefinity also provides database cache storage). The next time the same URL is requested, instead of going through the same process, the response is retrieved from the cache and sent directly to the client.
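For plain ASP.NET pages (outside of Sitefinity’s own cache settings), the same behavior can be enabled declaratively with the OutputCache page directive; the duration below is just an illustrative value:

```aspx
<%-- Cache this page's rendered output for 60 seconds and serve one
     shared copy regardless of query-string or form parameters. --%>
<%@ OutputCache Duration="60" VaryByParam="None" %>
```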

It is almost perfect. Once a page is cached it behaves like static HTML: very fast response, almost no server resources consumed. But there is a catch.

Cached pages are not dynamic anymore?

What happens if data changes after the pages are cached? Or what if we want to display information that is specific to the user making the request? There are different techniques, such as expiration periods, cache dependencies and cache substitutions, that allow you to control caching behavior and overcome these problems. Which techniques you choose depends on what matters most to you: data freshness, performance, ease of use or the effort involved.
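To make the substitution idea concrete, here is a minimal sketch using the standard ASP.NET Substitution control; the GetGreeting method name is my own invention for the example:

```aspx
<%@ Page Language="C#" %>
<%@ OutputCache Duration="120" VaryByParam="None" %>
<script runat="server">
    // The callback must be static, accept an HttpContext and return a
    // string; it runs on every request, even when the page is cached.
    public static string GetGreeting(HttpContext context)
    {
        return "Hello, " + context.User.Identity.Name;
    }
</script>
<%-- Punches a per-request "hole" in the otherwise cached output. --%>
<asp:Substitution ID="Greeting" runat="server" MethodName="GetGreeting" />
```

The rest of the page is served straight from the cache; only the substitution callback executes per request.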

What is next?

Expiration periods and cache dependencies let you control data freshness, while cache substitution lets you inject content into already cached output and thus display per-request information dynamically without losing much performance. Since Sitefinity handles cache dependencies automatically in most cases, and in Sitefinity 3.6 expiration is configured through the GUI (making it quite straightforward), the next posts will focus on creating and using cache substitution controls. I will explain how Sitefinity extends the ASP.NET cache substitution mechanism and how easy it is to turn any user control into a cache substitution.

I will also demonstrate how to create custom cache dependencies, and will explain expiration settings and their effect on performance and memory usage.
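As a taste of what is coming, the declarative settings also have programmatic counterparts on HttpResponse; this sketch shows a cache policy plus a simple file dependency (the file path here is hypothetical):

```csharp
// Inside a page or handler: the programmatic equivalent of the
// OutputCache directive, plus a file dependency that evicts the
// cached copy whenever the file changes.
Response.Cache.SetCacheability(HttpCacheability.Public);
Response.Cache.SetExpires(DateTime.UtcNow.AddMinutes(5));
Response.AddFileDependency(Server.MapPath("~/App_Data/products.xml"));
```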

The idea of this series is to help you make the right decisions and pull the optimal performance from your system by taking advantage of Sitefinity performance components.


The Progress Team

