Munq is for web, Unity is for Enterprise

The Unity Application Block (Unity) is a lightweight, extensible dependency injection container with support for constructor, property, and method call injection. It's a great library for facilitating Inversion of Control, and the recent version supports AOP as well. However, when it comes to performance, it's CPU hungry. In fact, it's so CPU hungry that it's impractical to use at Internet scale. I was investigating a CPU issue on a portal that gets around 3MM hits per day and found unusually high CPU usage. Here's why:

Unity_High_CPU

I did some CPU profiling on my open source project Dropthings and found that the most CPU is consumed by Unity's Resolve<>(). There's no funky use of Unity in the project, just straightforward Register<>() and Resolve<>(). But as you can see, Resolve<>() is consuming significant CPU even after the site is warm and has been running for a while.

Then I tried Munq, which is a basic dependency injection container. It has everything you will usually need in a regular project, and it boasts of being the fastest DI container out there. So, I converted all the Unity code in Dropthings to Munq, did a CPU profile, and voila!

MunqMuchBetter

 

There's no trace of any Munq calls anywhere near the top of the profile, which goes to show Munq is a lot faster than Unity.
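For reference, there is nothing fancy about the container usage in Dropthings, so the conversion was mostly mechanical. Here's a minimal sketch of the kind of registration and resolution involved; the type names are made up for illustration, and the Munq call shown is the lambda-based Register of the later Munq versions, so check it against the version you actually use:

using Microsoft.Practices.Unity;   // Unity Application Block
using Munq;                        // Munq IoC container (V2+ API assumed)

public interface IWidgetService { }
public class WidgetService : IWidgetService { }

public static class ContainerDemo
{
    public static void Main()
    {
        // Unity: straightforward RegisterType<>() and Resolve<>()
        var unity = new UnityContainer();
        unity.RegisterType<IWidgetService, WidgetService>();
        var fromUnity = unity.Resolve<IWidgetService>();

        // Munq: equivalent registration via a factory lambda
        var munq = new IocContainer();
        munq.Register<IWidgetService>(c => new WidgetService());
        var fromMunq = munq.Resolve<IWidgetService>();
    }
}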

Keep website and webservices warm with zero coding

If you want to keep your websites or web services warm and save users from the long warm-up time after an application pool recycle, an IIS restart, a new code deployment, or even a Windows restart, you can use the tinyget command line tool, which comes with the IIS Resource Kit, to hit the site and services and keep them warm. Here's how:

First get tinyget from here. Download and install the IIS 6.0 Resource Kit on some PC. Then copy tinyget.exe from "C:\Program Files (x86)\IIS Resources\TinyGet" to the server where your IIS 6.0 or IIS 7 is running.

Then create a batch file that will hit the pages and webservices. Something like this:

SET TINYGET=C:\Program Files (x86)\IIS Resources\TinyGet\tinyget.exe

"%TINYGET%" -srv:dropthings.omaralzabir.com -uri:http://dropthings.omaralzabir.com/ -status:200
"%TINYGET%" -srv:dropthings.omaralzabir.com -uri:http://dropthings.omaralzabir.com/WidgetService.asmx?WSDL -status:200

Save this in a batch file and run it as a scheduled task at some interval like 10 minutes and your website will always remain nice and warm.

First I am hitting the homepage to keep the web page warm. Then I am hitting the web service URL with the ?WSDL parameter, which makes ASP.NET compile the service if it isn't already compiled, walk through all the operations and reflect on them, and thus load all related DLLs into memory, reducing the warm-up time when the service is actually hit.

Tinyget takes the server's name or IP in the -srv parameter and the actual URI in the -uri parameter. I have specified the HTTP response code to expect in the -status parameter. It ensures the site is alive and is returning HTTP 200.

Besides just warming up a site, you can do some load testing on it. Tinyget can run multiple threads and loop over a URL. You can literally blow up a site with commands like this:

"%TINYGET%" -threads:30 -loop:100 -srv:google.com -uri:http://www.google.com/ -status:200

 

Tinyget is also pretty useful for running automated tests. You can record HTTP POSTs in a text file and then use it to make HTTP POSTs to some page. Then you can add a matching clause to check for a certain string in the output and ensure the correct response is returned. Thus, with a few simple command line commands, you can warm up a site, run some transactions, validate that the site is giving the correct responses, and run a load test to ensure the server performs well. A very cheap way to get a lot done.

 

Redirecting traffic from http to https with zero coding in IIS

When you want to enforce https on users and redirect any URL hit on http to its exact counterpart on https, you usually do it with an HttpModule written in .NET, install some URL rewriter module in IIS, or set up a dummy website on http and use a meta refresh tag to send traffic to https. There are many solutions out there, but they all require some amount of coding skill. Let me show you a zero coding redirection setup.
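For contrast, here is roughly what the "write an HttpModule" alternative looks like. This is just a minimal sketch to show what the zero coding setup below saves you from, not code you need for this post:

using System;
using System.Web;

// Minimal sketch: redirect every http request to its https counterpart.
// The IIS-only setup described below achieves the same thing with no code.
public class HttpsRedirectModule : IHttpModule
{
    public void Init(HttpApplication app)
    {
        app.BeginRequest += delegate(object sender, EventArgs e)
        {
            HttpContext context = ((HttpApplication)sender).Context;
            if (!context.Request.IsSecureConnection)
            {
                string url = "https://" + context.Request.Url.Host
                             + context.Request.Url.PathAndQuery;
                // Sends a 302; a permanent 301 needs a manual Status/Location header
                // (or Response.RedirectPermanent on .NET 4 and later).
                context.Response.Redirect(url, true);
            }
        };
    }

    public void Dispose() { }
}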

First, go to the properties of the real website which is now listening on both http and https. Change the http port to something dummy like 8083. This will prevent the website from holding onto port 80.

Now create a new website on an empty folder that has read permission on NETWORK SERVICE account. The new website will have the same host headers as the real one, say omaralzabir.com. But it will listen on port 80 only. It won’t listen on https port 443.

Once created, go to properties and Home Directory tab. And do the following:

IIS website properties to redirect from http to https

Things to do here:

  • Select the “A redirection to a URL” option.
  • Enter the https://yourdomain.com and then $S$Q. Remember, there’s no trailing slash after the domain.
  • Select “The exact URL entered above”
  • Select “A permanent redirection for this resource”
  • Set Execute permission to None.
  • You can select the same app pool as the original one.

That’s it.

Update:

The $S represents the subdirectory path. If you hit http://omaralzabir.com/subdir/anotherdir, then $S = /subdir/anotherdir. And $Q represents the query string. Together, they preserve the whole path and query string, so http://omaralzabir.com/subdir/page.aspx?x=1 gets redirected to https://omaralzabir.com/subdir/page.aspx?x=1.

99.99% available ASP.NET and SQL Server SaaS Production Architecture

You have a hot ASP.NET + SQL Server product, growing at a thousand users per day, and you have hit the limit of your own garage hosting capability. Now that you have enough VC money in your pocket, you are planning to go out and host on some real hosting facility, maybe a colocation or managed hosting. So, you are thinking: how do you design a physical architecture that will ensure the performance, scalability, security and availability of your product? How can you achieve four-nine (99.99%) availability? How do you securely let your development team connect to production servers? How do you choose the right hardware for web and database servers? Should you use a Storage Area Network (SAN) or just local disks on RAID? How do you securely connect your office computers to the production environment?

Here I will answer all these questions. Let me first show you a diagram that I made for Pageflakes, where we ensured four-nine availability. Since Pageflakes is a Level 3 SaaS, it's absolutely important that we build a high performance, highly available product that can be used from anywhere in the world 24/7, where end users get quick access to their content with complete personalization and customization, and can share it with others and with the world. So, you can take this production architecture as a very good candidate for a Level 3 SaaS:


Hosting_environment

Here’s a CodeProject article that explains all the
ideas:

99.99% available ASP.NET and SQL Server SaaS Production
Architecture

Hope you like it. Appreciate your vote.



ASP.NET website Continuous Integration+Deployment using CruiseControl.NET, Subversion, MSBuild and Robocopy

You can set up continuous integration and automated deployment for your web application using CruiseControl.NET, Subversion, MSBuild and Robocopy. I will show you how you can automatically build the entire solution, email a build report to developers and QA, and deploy the latest code to IIS, all using CruiseControl.NET, every N minutes.

First get the following:

  • CruiseControl.NET

  • Subversion
    (install the command line tools and add the
    Subversion bin path to PATH environment variable)
  • Robocopy (Windows Vista/2008 has it built-in, here’s the link
    for
    Windows 2003
    )
  • Install .NET Framework. You need it for MSBuild.

You will learn how I have configured continuous integration and deployment for my open source AJAX portal project www.Dropthings.com. The code is hosted at CodePlex. When some developer makes a commit, CruiseControl downloads the latest code, builds the entire solution, emails a build report and then deploys the latest website to IIS 6.0.

After installing CruiseControl.NET, go to Programs -> Cruise Control ->
CruiseControl.NET Config
.

Now keep copying and pasting the following XML blocks and make
sure you understand each block and make necessary changes:


<cruisecontrol>
    <project name="Dropthings" queue="DropthingsQueue" queuePriority="1">

        <workingDirectory>d:\cc\dropthings\code\trunk</workingDirectory>
        <artifactDirectory>d:\cc\dropthings\artifact</artifactDirectory>
        <category>Dropthings</category>
        <webURL>http://localhost/ccnet/</webURL>
        <modificationDelaySeconds>60</modificationDelaySeconds>
        <labeller type="defaultlabeller">
            <prefix>0.1.</prefix>
            <incrementOnFailure>true</incrementOnFailure>
            <labelFormat>000</labelFormat>
        </labeller>
        <state type="state" directory="State" />

First change the working directory. It needs to be the path of the folder where the solution will be downloaded. I generally create a folder structure like this:

  • D:\CC – root for all CC.NET enabled projects
    • ProjectName – root project folder
      • Code – code folder where code is downloaded from Subversion
      • Artifact – CC.NET generates a lot of stuff; it all goes here

Next comes the Subversion integration block:


<sourcecontrol type="svn">
    <trunkUrl>http://localhost:8081/tfs02.codeplex.com/dropthings/trunk</trunkUrl>
    <workingDirectory></workingDirectory>
    <username>***** SUBVERSION USER NAME *****</username>
    <password>***** SUBVERSION PASSWORD *****</password>
</sourcecontrol>

Here, specify the Subversion location from which code should be downloaded into the working folder. You should download the entire solution because you will be building the entire solution using MSBuild soon.

I left <workingDirectory> empty here. This means whatever is specified earlier in the project-level <workingDirectory> is used. Otherwise you can put some relative folder path here, or any absolute folder path.

Now we start building the tasks that CC.NET executes – Build,
Email, and Deploy.


<tasks>
    <artifactcleanup cleanUpMethod="KeepLastXBuilds" cleanUpValue="5" />
    <modificationWriter>
        <filename>mods.xml</filename>
        <path></path>
    </modificationWriter>

    <msbuild>
        <executable>C:\windows\Microsoft.NET\Framework64\v3.5\MSBuild.exe</executable>
        <workingDirectory></workingDirectory>
        <projectFile>Dropthings.msbuild</projectFile>
        <targets>Build</targets>
        <timeout>300</timeout>
        <logger>C:\Program Files (x86)\CruiseControl.NET\server\ThoughtWorks.CruiseControl.MsBuild.dll</logger>
    </msbuild>

This block first says: keep artifacts for the last 5 builds and remove older ones. Artifacts are things like build reports, logs etc. You can increase the value for a longer history.

Then comes the most important task, <msbuild>. The <executable> path points to MSBuild.exe. I am using the .NET 3.5 Framework 64-bit edition. You might have .NET 2.0 or the 32-bit version. So, set the right path to MSBuild.exe here.

<projectFile> maps to an MSBuild file. It's a skeleton MSBuild file which basically says: build this Visual Studio solution file. Here's how the MSBuild file looks:


<Project DefaultTargets="Build" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <Target Name="Build">
    <MSBuild Projects="Dropthings.sln" Targets="Rebuild" />
  </Target>
</Project>

The Dropthings.msbuild and Dropthings.sln files exist in the same trunk folder. This file says: build Dropthings.sln and do a rebuild.

Now you have the build done. Next is to deploy it. You will be using robocopy to copy files from the code folder to a destination folder which is mapped in IIS to a website. Robocopy will do a synchronization of the directories: it will add new files, overwrite old files and remove files from the destination folder which no longer exist in the source folder.

Before you can deploy, you need to stop the website or restart
IIS. Otherwise some files may be in use and you will not be able to
delete or overwrite the files. Here’s how to stop IIS using the
iisreset command
line tool:




<exec>
    <executable>iisreset</executable>
    <buildArgs>/stop</buildArgs>
</exec>

If you do not want to stop the entire IIS, but instead just stop a website and recycle an application pool, you can use the iisweb.vbs script to stop a website and the iisapp.vbs script to recycle an application pool. Here's an example:


<exec>
    <executable>iisweb</executable>
    <buildArgs>/stop "Dropthings"</buildArgs>
</exec>

<exec>
    <executable>iisapp</executable>
    <buildArgs>/a "Dropthings" /r</buildArgs>
</exec>

You need to first register cscript as the default script runtime. In order to do this, go to the command line and enter iisweb. It will tell you that it cannot use wscript to run this script and needs to make cscript the default. Let it make cscript the default.

Now it's time to deploy the latest website files. The following task launches robocopy to do the deployment:




<exec>
    <executable>robocopy.exe</executable>
    <baseDirectory>Dropthings</baseDirectory>
    <buildArgs>. d:\cc\Dropthings\Deploy *.* /E /XA:H /PURGE /XO /XD ".svn" /NDL /NC /NS /NP</buildArgs>
    <buildTimeoutSeconds>60</buildTimeoutSeconds>
    <successExitCodes>1,0</successExitCodes>
</exec>

First you need to correct the robocopy.exe path. For Windows Vista/Windows 2008, keep it as it is. For Windows 2003, you need to specify the full path. You also need to remove the (x86) from the path if you have a 32-bit OS.

Next is the <baseDirectory>. This is relative to the working directory. It's the path of the website folder. The Dropthings website folder is located under the Dropthings folder under trunk. So, I have specified Dropthings as the subfolder where the website files are located. You need to specify your project's website folder's relative path here from the <workingDirectory>.

Next change the path in the <buildArgs> node. The first part is the source ".", which you keep as it is; it means copy files from the baseDirectory. Next is the absolute path to the deployment folder where the website is mapped in IIS. You can use either a relative or an absolute path here. When using a relative path, just keep in mind that robocopy runs from the <baseDirectory> folder.

After the path, keep the *.* and the remaining flags intact. The flags mean:

  • Copy all subdirectories /E
  • Exclude hidden files /XA:H
  • Remove files from the destination that no longer exist in the source /PURGE
  • Do not copy old files /XO
  • Exclude the .svn directory while copying files /XD “.svn”
  • Do not show the list of files and directories being copied /NDL, /NC, /NS, /NP

After the deployment, you need to turn IIS back on or start the
website that you stopped:




<exec>
    <executable>iisreset</executable>
    <buildArgs>/start</buildArgs>
</exec>




Now we have the build and deployment done. Next is to email a nice report to developers and QA. If the build succeeds, email both developers and QA so that they can check out the latest build. But if the build fails, email only the developers.


<publishers>
    <rss/>
    <xmllogger />
    <statistics />

    <email from="admin@yourcompany.com" mailhost="localhost" mailport="25" includeDetails="TRUE"
           mailhostUsername="" mailhostPassword="" useSSL="FALSE">

        <users>
            <user name="Developer1" group="devs" address="dev1@yourcompany.com"/>
            <user name="Developer2" group="devs" address="dev2@yourcompany.com"/>
            <user name="Developer3" group="devs" address="dev3@yourcompany.com"/>

            <user name="QA1" group="qa" address="qa1@yourcompany.com"/>
            <user name="QA2" group="qa" address="qa2@yourcompany.com"/>
            <user name="QA3" group="qa" address="qa3@yourcompany.com"/>
        </users>

        <groups>
            <group name="devs" notification="Always"/>
            <group name="qa" notification="Success"/>
        </groups>

        <converters>
        </converters>

        <modifierNotificationTypes>
            <NotificationType>Always</NotificationType>
        </modifierNotificationTypes>

    </email>

    <modificationHistory onlyLogWhenChangesFound="true" />
</publishers>

First you need to change the <email> block, where you specify the from address, the mail server name, and optionally the credentials of the account that you need to use to send out emails.

Then edit the <users> node and put in your developers and QA.

That's it! You've got the configuration file done. The next step is to launch CruiseControl.NET from Programs -> CruiseControl.NET -> CruiseControl.NET. It will launch a process that executes the tasks according to the configuration. On Windows Vista, you will have to run it with administrative privileges.

There’s also a Windows Service that gets installed. It’s named
CruiseControl.NET. You can start the service as well on a server
and go to sleep. It will do continuous integration and automated
deployment for you.

There's also a web based dashboard that you can use to force a build, stop a build, or see detailed build reports.


image

You can create multiple projects. You can have one project to build trunk code only, but do no deployment. Then you can create another project to build and deploy some branch that's ready for production. You can create another project to build and deploy on a QA server, and so on.


Here’s the full configuration file
that you can use as your
baseline.



Best practices for creating websites in IIS 6.0

Every time I create an IIS website, I follow some steps which I consider best practices for creating any IIS website, for better performance, maintainability, and scalability. Here are the things I do:

Create a separate application pool for each web
application

I always create a separate app pool for each web app because I can then select a different recycle schedule for each app pool. Some heavy traffic websites have a long recycle schedule, whereas low traffic websites have a short recycle schedule to save memory. Moreover, I can choose a different number of processes served by the app pool. Applications that are made for web garden mode can benefit from multiple processes, whereas applications that use in-process session or in-memory cache need to have a single process serving the app pool. Hosting all my applications under the DefaultAppPool does not give me the flexibility to control these per site.

The more app pools you create, the more ASP.NET threads you make available to your applications. Each w3wp.exe has its own thread pool. So, if some application is congesting a particular w3wp.exe process, other applications can run happily in their separate w3wp.exe instances, running under separate app pools. Each app pool hosts its own w3wp.exe instance.

So, my rule of thumb: always create a new app pool for each new web application and name the app pool based on the site's domain name or some internal name that makes sense. For example, if you are creating a new website alzabir.com, name the app pool alzabir.com to easily identify it.

Another best practice: Disable the DefaultAppPool so that
you don’t mistakenly keep adding sites to
DefaultAppPool.


image

First you create a new application pool. Then you create a new
Website or Virtual Directory, go to Properties -> Home Directory
tab -> Select the new app pool.


image

Customize Website properties for performance,
scalability and maintainability

First, map the right host headers to your website. In order to do this, go to the Web Site tab and click the "Advanced" button. Add mappings for both domain.com and www.domain.com. Most of the time, people forget to map domain.com, so visitors who skip typing the www prefix get no page served.


image

Next turn on some log entries:


image

These are very handy for analysis. If you want to measure your bandwidth consumption for specific sites, you need Bytes Sent. If you want to measure the execution time of different pages and find the slow running pages, you need Time Taken. If you want to measure unique and returning visitors, you need the Cookie. If you need to know who is sending you the most traffic, search engines or other websites, you need the Referer. Once these entries are turned on, you can use a variety of log analysis tools to do the analysis, for example the open source AWStats.

But if you are using Google Analytics or something similar, you should have these turned off, especially the Cookie and Referer, because they take up quite some space in the log. If you are using ASP.NET Forms Authentication, the gigantic cookie coming with every request will produce gigabytes of logs per week if you have a medium traffic website.


image

This one is kind of a no-brainer. I add Default.aspx as the default content page so that when visitors hit the site without any .aspx page name, e.g. alzabir.com, they get default.aspx served.


image

Things I do here:

  • Turn on Content Expiration. This makes static files remain in
    browser cache for 30 days, and the browser serves the files from its
    own cache instead of hitting the server. As a result, when your users
    revisit, they don't download all the static files like images,
    javascripts and css files again and again. This one setting
    significantly improves your site's performance.
  • Remove the X-Powered-By: ASP.NET header. You really
    don't need it unless you want to attach the Visual Studio Remote
    Debugger to your IIS. Otherwise, it's just sending 21 bytes on
    every response.
  • Add a "From" header and set the server name. I do this on each
    webserver and specify a different name on each box. It's handy to
    see which server a request is being served from. When you are
    trying to troubleshoot load balancing issues, it comes in handy to
    see whether a particular server is serving the requests.


image

I set the 404 handler to some ASPX page so that I can show a custom error message. There's a 404.aspx which shows a nice friendly message and suggests some other pages that the user can visit. However, another reason to use this custom mapping is to serve extensionless URLs from IIS. Read this blog post for details.


image

Make sure to set ASP.NET 2.0 for your ASP.NET 2.0, 3.0 and 3.5
websites.

Finally, you must, I repeat, MUST turn on IIS 6.0 gzip compression. This turns on the Volkswagen V8 engine that is built into IIS and makes your site screaming fast.


Deploy ASP.NET MVC on IIS 6, solve 404, compression and performance problems

There are several problems with ASP.NET MVC application when
deployed on IIS 6.0:

  • Extensionless URLs give 404 unless some URL Rewrite module is
    used or wildcard mapping is enabled
  • IIS 6.0 built-in compression does not work for dynamic
    requests. As a result, ASP.NET pages are served uncompressed
    resulting in poor site load speed.
  • Mapping wildcard extension to ASP.NET introduces the following
    problems:

    • Slow performance, as all static files get handled by ASP.NET and
      ASP.NET reads the file from the file system on every call
    • Expires headers don't work for static content, as IIS does not
      serve it anymore. Learn about the benefits of the Expires header
      from here. ASP.NET serves a fixed Expires header that makes content
      expire in a day.
    • The Cache-Control header does not produce max-age properly, and thus
      caching does not work as expected. Learn about caching best
      practices from here.
  • After deploying on a domain as the root site, the homepage
    produces HTTP 404.

Problem 1: Visiting your website’s homepage gives 404 when
hosted on a domain

You have done the wildcard mapping, mapped the .mvc extension to the ASP.NET ISAPI handler, and written the route mapping for Default.aspx or default.aspx (lowercase), but still, when you visit your homepage after deployment, you get:


image

You will find people banging their heads on the wall here:

The solution is to capture hits going to “/” and rewrite them to Default.aspx:


image

You can apply this approach to any URL that ASP.NET MVC should be handling for you but isn't. Just see the URL reported on the 404 error page and then rewrite it to a proper URL.
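In case the screenshot above does not come through, here is a minimal sketch of one common way to do that rewrite from Global.asax. This is an illustration of the idea, not necessarily the exact code shown in the screenshot:

using System;
using System.Web;

public class Global : HttpApplication
{
    protected void Application_BeginRequest(object sender, EventArgs e)
    {
        // Capture hits to the site root ("/") and rewrite them to Default.aspx
        // so the wildcard-mapped ASP.NET pipeline (and MVC routing) picks them up.
        if (Request.AppRelativeCurrentExecutionFilePath == "~/")
        {
            Context.RewritePath("~/Default.aspx", false);
        }
    }
}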

Problem 2: IIS 6 compression is no longer working after
wildcard mapping

When you enable wildcard mapping, IIS 6 compression no longer works for extensionless URLs because IIS 6 does not see an extension that is defined in the IIS metabase. You can learn about the IIS 6 compression feature and how to configure it properly from my earlier post.

The solution is to use an HttpModule to do the compression for dynamic requests.

Problem 3: ASP.NET ISAPI does not cache Static Files

When ASP.NET's DefaultHttpHandler serves static files, it does not cache the files in memory or in the ASP.NET cache. As a result, every hit to a static file results in a file read. Below is the decompiled code in DefaultHttpHandler when it handles a static file. As you can see, it reads the file on every hit and only sets the expiration to one day in the future. Moreover, it generates an ETag for each file based on the file's modified date. For best caching efficiency, we need to get rid of that ETag, produce an expiry date far in the future (like 30 days), and produce a Cache-Control header which offers better control over caching.


image

So, we need to write a custom static file handler that will
cache small files like images, Javascripts, CSS, HTML and so on in
ASP.NET cache and serve the files directly from cache instead of
hitting the disk. Here are the steps:

  • Install an HttpModule that installs a Compression Stream
    on Response.Filter so that anything written on Response gets
    compressed. This serves dynamic requests.
  • Replace ASP.NET’s DefaultHttpHandler that listens on *.*
    for static files.
  • Write our own Http Handler that will deliver compressed
    response for static resources like Javascript, CSS, and HTML.


image

Here’s the mapping in ASP.NET’s web.config for the
DefaultHttpHandler. You will have to replace this with your
own handler in order to serve static files compressed and
cached.

Solution 1: An Http Module to compress dynamic requests

First, you need to serve compressed responses that are served by
the MvcHandler or ASP.NET’s default Page Handler. The
following HttpCompressionModule hooks on the
Response.Filter and installs a GZipStream or
DeflateStream on it so that whatever is written on the
Response stream, it gets compressed.


image

These are formalities for a regular HttpModule. The real
hook is installed as below:


image

Here you see that we ignore requests handled by ASP.NET's DefaultHttpHandler and by our own StaticFileHandler, which you will see in the next section. After that, it checks whether the request allows content to be compressed. The Accept-Encoding header contains "gzip" or "deflate" or both when the browser supports compressed content. So, when the browser supports compressed content, a response filter is installed to compress the output.
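If the screenshots are hard to read, the overall shape of the module is something like the following sketch. It is simplified from the description above and is not the exact code in the download:

using System;
using System.IO.Compression;
using System.Web;

public class HttpCompressionModule : IHttpModule
{
    public void Init(HttpApplication app)
    {
        // PreRequestHandlerExecute runs after the handler is mapped but before it writes output.
        app.PreRequestHandlerExecute += delegate(object sender, EventArgs e)
        {
            HttpContext context = ((HttpApplication)sender).Context;

            // Skip requests served by the static file handlers; they compress on their own.
            // (The real module also skips its companion StaticFileHandler here.)
            if (context.Handler is DefaultHttpHandler) return;

            string acceptEncoding = context.Request.Headers["Accept-Encoding"];
            if (string.IsNullOrEmpty(acceptEncoding)) return;

            // Install a compression stream on Response.Filter so whatever the handler
            // writes to the Response gets compressed on the way out.
            if (acceptEncoding.Contains("gzip"))
            {
                context.Response.Filter = new GZipStream(context.Response.Filter, CompressionMode.Compress);
                context.Response.AppendHeader("Content-Encoding", "gzip");
            }
            else if (acceptEncoding.Contains("deflate"))
            {
                context.Response.Filter = new DeflateStream(context.Response.Filter, CompressionMode.Compress);
                context.Response.AppendHeader("Content-Encoding", "deflate");
            }
        };
    }

    public void Dispose() { }
}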

Solution 2: An Http Handler to compress and cache static file requests

Here’s how the handler works:

  • Hooks on *.* so that all unhandled requests get served by the handler
  • Handles some specific file types like js, css, html and graphics files;
    anything else, it lets ASP.NET transmit
  • For the extensions it handles itself, it caches the file content so
    that subsequent requests are served from cache
  • It compresses some specific extensions like js, css and html. It does
    not compress graphics files or any other extension.

Let’s start with the handler code:


image

Here you will find the extensions the handler handles and the extensions it compresses. You should only put text file types in COMPRESS_FILE_TYPES.
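The exact lists are in the screenshot, but they look roughly like this; the field names follow the text above and the extensions are examples, so adjust them to whatever your site serves:

static class StaticFileExtensions
{
    // Extensions the handler serves itself; anything else falls back to ASP.NET.
    public static readonly string[] FILE_TYPES =
        { ".css", ".js", ".html", ".htm", ".png", ".gif", ".jpg" };

    // Only text-based types get compressed; recompressing graphics just wastes CPU.
    public static readonly string[] COMPRESS_FILE_TYPES =
        { ".css", ".js", ".html", ".htm" };
}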

Now start handling each request from
BeginProcessRequest.


image

Here you decide the compression mode based on the Accept-Encoding header. If the browser does not support compression, do not perform any compression. Then check whether the file being requested has one of the extensions that we support. If not, let ASP.NET handle it; you will see how soon.


image

Calculate the cache key based on the compression mode and the physical path of the file. This ensures that no matter what URL was requested, we have one cache entry per physical file; the physical path won't be different for the same file. The compression mode is used in the cache key because we need to store a different copy of the file's content in the ASP.NET cache for each compression mode. So, there will be one uncompressed version, a gzip compressed version and a deflate compressed version.
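In code, the idea boils down to something like this (a sketch; the member names are assumptions, not taken from the download):

using System.Web;

static class CacheKeyHelper
{
    // One cache entry per (compression mode, physical file) pair, no matter which
    // URL the request used to reach the file. compressionMode is "", "gzip" or "deflate".
    public static string GetCacheKey(HttpContext context, string compressionMode)
    {
        string physicalPath = context.Request.PhysicalPath.ToLowerInvariant();
        return compressionMode + ":" + physicalPath;   // e.g. "gzip:c:\web\site\script.js"
    }
}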

Next, check whether the file exists. If not, throw HTTP 404. Then create a memory stream that will hold the bytes of the file or the compressed content. Then read the file and write it into the memory stream, either directly or via a GZip or Deflate stream. Then cache the bytes in the memory stream and deliver them to the response. You will see the ReadFileData and CacheAndDeliver functions soon.


image

This function delivers content directly from the ASP.NET cache. The code is simple: read from cache and write to the response.
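A sketch of that path, with the author's helper calls shown as a comment since their exact signatures are in the download:

using System.Web;

static class CachedDelivery
{
    // Serve straight from the ASP.NET cache when a previous request already
    // produced (and possibly compressed) this file's bytes.
    public static bool TryDeliverFromCache(HttpContext context, string cacheKey)
    {
        byte[] cachedBytes = context.Cache[cacheKey] as byte[];
        if (cachedBytes == null) return false;

        // In the real handler: ProduceResponseHeader(...) first, then WriteResponse(...).
        context.Response.OutputStream.Write(cachedBytes, 0, cachedBytes.Length);
        return true;
    }
}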

When the content is not available in cache, read the file bytes and store them in a memory stream, either as-is or compressed, based on the compression mode you decided before:


image

Here bytes are read in chunks in order to avoid a large memory allocation. You could read the whole file in one shot and store it in a byte array the same size as the file length, but I wanted to save memory allocation. Do a performance test to figure out whether reading in 8K chunks is the best approach for you.
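Here is a sketch of that read loop, using an 8K buffer and optionally wrapping the memory stream in a compression stream. The enum and method names are assumptions for illustration:

using System.IO;
using System.IO.Compression;

enum ResponseCompressionType { None, GZip, Deflate }

static class StaticFileReader
{
    private const int CHUNK_SIZE = 8 * 1024;

    // Reads the file into memory, compressing on the fly when requested.
    public static byte[] ReadFileData(string physicalPath, ResponseCompressionType mode)
    {
        using (MemoryStream memoryStream = new MemoryStream())
        {
            Stream outputStream = memoryStream;
            if (mode == ResponseCompressionType.GZip)
                outputStream = new GZipStream(memoryStream, CompressionMode.Compress, true);
            else if (mode == ResponseCompressionType.Deflate)
                outputStream = new DeflateStream(memoryStream, CompressionMode.Compress, true);

            using (FileStream fileStream = new FileStream(
                physicalPath, FileMode.Open, FileAccess.Read, FileShare.Read))
            {
                // Read in 8K chunks to avoid allocating one huge buffer per file.
                byte[] buffer = new byte[CHUNK_SIZE];
                int bytesRead;
                while ((bytesRead = fileStream.Read(buffer, 0, buffer.Length)) > 0)
                    outputStream.Write(buffer, 0, bytesRead);
            }

            if (outputStream != memoryStream)
                outputStream.Close();   // flush the compression footer into memoryStream

            return memoryStream.ToArray();   // these bytes go into cache and to the response
        }
    }
}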

Now you have the bytes to write to the response. The next step is to cache them and then deliver them.


image

Now the two functions that you have seen several times and have
been wondering what they do. Here they are:


image

WriteResponse has no tricks, but ProduceResponseHeader has much wisdom in it. First it turns off response buffering so that ASP.NET does not store the written bytes in any internal buffer. This saves some memory allocation. Then it produces proper cache headers to cache the file in browser and proxy for 30 days, ensures proxies revalidate the file after the expiry date, and produces the Last-Modified header from the file's last write time in UTC.
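Based on that description, the header-producing function roughly boils down to this (signatures assumed, not copied from the download):

using System;
using System.IO;
using System.Web;

static class ResponseHeaderHelper
{
    public static void ProduceResponseHeader(HttpResponse response, int contentLength,
        string contentEncoding, string physicalPath)
    {
        // Do not buffer: we already have the final bytes, so let them stream out.
        response.Buffer = false;
        response.BufferOutput = false;

        if (!string.IsNullOrEmpty(contentEncoding))
            response.AppendHeader("Content-Encoding", contentEncoding);
        response.AppendHeader("Content-Length", contentLength.ToString());

        // Cache in browser and proxy for 30 days, and make proxies revalidate after expiry.
        response.Cache.SetCacheability(HttpCacheability.Public);
        response.Cache.SetExpires(DateTime.Now.AddDays(30));
        response.Cache.SetMaxAge(TimeSpan.FromDays(30));
        response.Cache.SetRevalidation(HttpCacheRevalidation.ProxyCaches);

        // Last-Modified from the file's last write time in UTC.
        response.Cache.SetLastModified(File.GetLastWriteTimeUtc(physicalPath));
    }
}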

How to use it

Get the HttpCompressionModule and
StaticFileHandler from:

http://code.msdn.microsoft.com/fastmvc

Then install them in web.config. First you install the
StaticFileHandler by removing the existing mapping for
path=”*” and then you install the HttpCompressionModule.


image

That’s it! Enjoy a faster and more responsive ASP.NET MVC
website deployed on IIS 6.0.


Reduce website download time by heavily compressing PNG and JPEG

PNG and JPEG are the two most popular formats for web graphics. JPEG is used for photographs, screenshots and backgrounds, while PNG is used for all other graphics needs including cliparts, buttons, headers, footers, borders and so on. As a result, these two types of graphics files usually make up 80% of the total graphics used in a website. Of course, there's GIF, which is very popular, but as it supports only 256 colors, it is losing its popularity day by day. PNG seems to be the all-round winner for all kinds of graphics needs. As all browsers support PNG well enough and PNG supports alpha transparency, it's surely the best all-purpose graphics format on the web so far. So, if you optimize all the PNG and JPEG on your website and compress them rigorously, you can easily shave several seconds of load time off your website without doing any coding. Especially if your website is graphics rich like Pageflakes, a 30% reduction in the total size of graphics throughout the website is a big performance win.

Optimize all PNG on your website

PNG has a lot of scope for optimization. Regular graphics tools like Photoshop, Paint Shop Pro and Paint.NET all generate PNG using moderate compression, so PNG can be compressed further using advanced compression tools. OptiPNG is such a tool; it can compress PNG and sometimes produce 50% smaller output. At Pageflakes, we have around 380 PNGs which, when compressed using OptiPNG, give us a 40% reduction in total size. This is a big win for us.

Here’s what wikipedia says about OptiPNG:

OptiPNG is an open source command line computer program that
reduces the size of PNG files. The compression is lossless, meaning
that the resulting image will have exactly the same appearance as
the source image.

The main purpose of OptiPNG is to reduce the size of the PNG
IDAT data stream by trying various filtering and compression
methods. It also performs automatic bit depth, color type and color
palette reduction where possible, and can correct some data
integrity errors in input files.

Here's a powershell script that you can run from the root folder of your website. It will scan through all the PNG files in the web tree and run OptiPNG on each file. This takes quite some time if you have hundreds of files, so you should make it a part of your build script.

gci -include *.png -recurse | foreach
 { $fileName = $_.FullName; cmd /c "C:\soft\png\optipng.exe -o7 `"$fileName`"" }

Here I have stored optipng.exe in the C:\soft\png folder.

OptiPNG gives very good compression, but there's even more scope for compression. AdvanceCOMP is the ultimate in compression technology for PNG as it uses the mighty 7zip compression algorithm. It can squeeze down PNG even further, after OptiPNG has done its job, using its maximum compression mode. PNG files are compressed using the DEFLATE algorithm. DEFLATE has compression levels 0 to 9, where 9 is the highest. AdvanceCOMP uses the 7zip DEFLATE encoder, which extends the compression factor even more. During 7zip compression, a much more detailed search of compression possibilities is performed, at the expense of significantly more processor time spent on searching. Effectively, the 10-point scale used in gzip is extended to include extra settings above 9, the previous maximum search level. There will be no difference in decompression speed, regardless of the level of compressed size achieved or time taken to encode the data.

Here's a powershell script that you can run from the root folder of your website. It will scan through all the PNG files in the web tree and run AdvanceCOMP on each file. You need to run AdvanceCOMP after running OptiPNG.

gci -include *.png -recurse | foreach
 { $fileName = $_.FullName; cmd /c "C:\soft\png\advpng.exe
 --shrink-insane -z `"$fileName`"" }

I have collected both optipng and advpng in this
zip
file.

Optimize all JPEG on your website

Unfortunately, there's no tool as powerful as OptiPNG for JPEG that you can run on all your JPEG files to compress them rigorously. A JPEG file is compressed when it is saved; generally all graphics applications give you an option to select the quality ratio of the JPEG being saved. So, you have to consciously make the best compression vs quality choice while saving the JPEG file. However, the libjpeg project has a JPEG optimizer tool that does some optimization on JPEG files. It has a jpegtran utility which does the optimization; according to Wikipedia:

jpegtran – a utility for lossless transcoding between different
JPEG formats. The jpegtran command-line program is useful to
optimize the compression of a JPEG file, convert between
progressive and non-progressive JPEG formats, eliminate
non-standard application-specific data inserted by some image
programs, or to perform certain transformations on a file —
such as grayscaling, or rotating and flipping (within certain
limits) — all done “losslessly” (i.e. without decompressing
and recompressing the data, and so causing a reduction of image
quality due to generation loss).

However, when we ran jpegtran on all the JPEG files in Pageflakes, we were able to reduce the total size of all JPEGs by about 20%. So, that was not too bad.

Here’s how you run jpegtran to get all the jpeg files within
your website directory optimized:

gci -include *.jpg -recurse | foreach
 { $fileName = $_.FullName; $newFileName = $fileName + ".tmp";
cmd /c "C:\soft\jpeg\jpegtran.exe -optimize -outfile `"$newFileName`" `"$fileName`"";
copy $newFileName $fileName; del $newFileName; }

The libjpeg
binaries are uploaded here
for your convenience.

Warning: you have to run each of these powershell commands on a single line. I have broken the commands into multiple lines for better readability.

Let’s save global bandwidth, go green.

10 ASP.NET Performance and Scalability Secrets

ASP.NET 2.0 has many secrets which, when revealed, can give you a big performance and scalability boost. For instance, there are secret bottlenecks in the Membership and Profile providers which can be solved easily to make authentication and authorization faster. Furthermore, the ASP.NET HTTP pipeline can be tweaked to avoid executing unnecessary code that gets hit on each and every request. Not only that, the ASP.NET worker process can be pushed to its limit to squeeze out every drop of performance from it. Page fragment output caching on the browser (not on the server) can save a significant amount of download time on repeated visits. On-demand UI loading can give your site a fast and smooth feel. Finally, Content Delivery Networks (CDNs) and proper use of HTTP cache headers can make your website screaming fast when implemented properly. In this article, you will learn techniques that can give your ASP.NET application a big performance and scalability boost and prepare it to perform well under 10 to 100 times more traffic.

http://www.codeproject.com/KB/aspnet/10ASPNetPerformance.aspx

In this article I have shown the following techniques:

  • ASP.NET Pipeline optimization
  • ASP.NET Process configuration optimization
  • Things you must do for ASP.NET before going live
  • Content Delivery Network
  • Caching AJAX calls on browser
  • Making best use of Browser Cache
  • On demand progressive UI loading for fast smooth
    experience
  • Optimize ASP.NET 2.0 Profile provider
  • How to query ASP.NET 2.0 Membership tables without bringing
    down the site
  • Prevent Denial of Service (DOS) attack

The above techniques can be implemented on any ASP.NET website, especially those that use ASP.NET 2.0's Membership and Profile providers.

You can learn a lot more about performance and scalability
improvement of ASP.NET and ASP.NET AJAX websites from my book –
Building a
Web 2.0 portal using ASP.NET 3.5
.