Sitefinity Streaming API: Introduction

March 15, 2010


This post is part of the blog post series that explains the new streaming API for Sitefinity 3.7 SP3. You can view the TOC in the first blog post.

Until now, Sitefinity had just one type of binary content: it was saved as a byte array, which means it was stored in memory. While this makes the data layer simpler to explain and grasp, it has a serious disadvantage: the whole content must be loaded in memory at once.

In the past we tried dealing with the problem by implementing "lazy loading" for IContent.Content. Put simply, this means the actual content is loaded only when accessed. However, even a simple type check, such as whether IContent.Content is a byte[] or a string, meant loading the entire content into memory. While this doesn't present a problem with small images and documents, it becomes an issue with large video files, for example.

Here we hit a major impediment: the web technology itself. There is a maximum request length, which in ASP.NET is 2 GB. Even if we used a plugin (like Silverlight) to upload files larger than that (since JavaScript cannot access the local file system, with a few exceptions, for security reasons), we would still need a custom solution for downloading such files (we can't use HTTP handlers for files larger than the ASP.NET limit). Going into even more detail, WCF services have a maximum message length of their own.
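For reference, the ASP.NET request-size ceiling mentioned above is configured through the httpRuntime element in web.config. A minimal sketch (note that maxRequestLength is specified in kilobytes, so 2097151 KB is roughly the 2 GB ceiling; the executionTimeout value here is just an illustrative assumption for long uploads):

```xml
<configuration>
  <system.web>
    <!-- maxRequestLength is in KB; 2097151 KB is approximately 2 GB -->
    <!-- executionTimeout (seconds) raised so large uploads aren't cut off -->
    <httpRuntime maxRequestLength="2097151" executionTimeout="3600" />
  </system.web>
</configuration>
```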

Since we strive to provide a seamless upgrade process, we had to make sure that old content would continue to work without any problems. All this meant that we had to compromise. Here is what we came up with:
  • We will have a limit of 2 GB per file
  • Streaming will be available for Generic Content-based modules only
  • Streaming is used for newly uploaded content only
  • There are some corner cases where the whole binary data is loaded into memory
This list of limitations also hints at the solution we came up with: streaming. Since we don't want to keep the whole file in memory, only a portion of it is loaded at a time; once we no longer need that portion, it gets discarded and cleaned up by the garbage collector. And this change meant that we could implement file storage with streaming as well!
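The core idea can be illustrated with plain .NET I/O, independent of any Sitefinity-specific API: copy a stream in fixed-size chunks, so that only the buffer, never the whole file, is held in memory at any moment. A minimal sketch:

```csharp
using System.IO;

public static class StreamUtils
{
    // Copies 'source' to 'destination' in fixed-size chunks.
    // Only 'bufferSize' bytes are ever held in memory at once,
    // regardless of how large the source stream is.
    public static void CopyInChunks(Stream source, Stream destination, int bufferSize)
    {
        byte[] buffer = new byte[bufferSize];
        int bytesRead;
        while ((bytesRead = source.Read(buffer, 0, buffer.Length)) > 0)
        {
            destination.Write(buffer, 0, bytesRead);
        }
    }
}
```

Once the loop moves past a chunk, nothing references the old data, so the garbage collector can reclaim it; that is exactly what makes multi-gigabyte files workable where a single byte[] would not be.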

While the term "streaming" might be a little obscure to non-developers, this is the de facto way of working with I/O in .NET. Almost any resource you can get uses a subclass of Stream. Therefore, our course of action was obvious. The new streaming API is essentially about three things: getting a stream for your IContent's content, checking whether that content supports streaming, and some helper methods that make using streams easier.
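To make that concrete, here is a hypothetical sketch of what such an API surface could look like. The member names below are illustrative assumptions, not the actual Sitefinity 3.7 SP3 signatures:

```csharp
using System.IO;

// Hypothetical shape of a streaming-aware content item; these member
// names are illustrative only, not the actual Sitefinity API.
public interface IStreamableContent
{
    // True when the item was stored with streaming support
    // (per the limitations above: newly uploaded content in
    // Generic Content-based modules).
    bool SupportsStreaming { get; }

    // Returns a read-only stream over the binary content, so the
    // caller never has to materialize the whole byte[] in memory.
    Stream GetContentStream();
}
```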

In the next blog post of the series, I will explain the changes we made to the data layer to enable streaming.
The Progress Guys
