10 cool web development related articles in 2007

Here’s a list of 10 cool ASP.NET, AJAX and web development
related articles and blog posts that I have written this year that
you might want to take a look at:

13 disasters for production websites and their solutions

Talks about 13 production disasters that can happen to any website
at any time and bring down your business.

Build a Google IG-like Ajax start page in 7 days using ASP.NET AJAX and .NET 3.0

This blockbuster article shows how ASP.NET AJAX, LINQ to XML, LINQ
to SQL and Workflow Foundation can be used to create a Google IG-like
start page in just 7 nights. Learn how to put these
hot technologies together in one project and make a production site out of
it.


Serve extensionless URLs from ASP.NET without using an ISAPI module or
IIS 6 wildcard mapping

Currently there are only two ways to serve an extensionless URL like
www.pageflakes.com/omar that
hits something besides the default document: use a custom ISAPI
module or use IIS 6 wildcard mapping. Both have performance and
scalability problems because both intercept each and every hit.
Learn how you can solve this by using a custom 404 handler.


Request format is unrecognized for URL unexpectedly ending in
/SomeWebServiceMethod

Since the ASP.NET AJAX 1.0 release, Microsoft has prevented JSON hijacking
by requiring a special content type header. But this caused us some
trouble.


Clean up inactive anonymous users from ASP.NET Membership tables

When you store anonymous user profiles using the ASP.NET Membership
provider and Anonymous Identification provider, you soon end up
with lots of idle anonymous user data from users who never come
back. We (Pageflakes) went through a lot of difficulty keeping our
database size down because we allow anonymous users to do almost
everything a registered user can do. This introduces a
scalability challenge. See how we solved this problem.
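As a rough illustration of the kind of cleanup involved (not the exact script from the article), idle anonymous records in the standard ASP.NET membership schema can be purged with queries along these lines; the 30-day cutoff is an assumed threshold:

```sql
-- Hedged sketch: purge anonymous users idle for 30+ days.
-- aspnet_Users.IsAnonymous and LastActivityDate are standard membership columns;
-- related rows in aspnet_Profile must be removed first to satisfy foreign keys.
DELETE FROM aspnet_Profile
WHERE UserId IN (SELECT UserId FROM aspnet_Users
                 WHERE IsAnonymous = 1
                   AND LastActivityDate < DATEADD(day, -30, GETDATE()));

DELETE FROM aspnet_Users
WHERE IsAnonymous = 1
  AND LastActivityDate < DATEADD(day, -30, GETDATE());
```

On a large table, a delete like this is usually run in small batches during off-peak hours so it does not block the live workload.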


Prevent Denial of Service (DOS) attacks in your web application

A web application can be brought to its knees by hitting the
site repeatedly or by calling expensive web services randomly.
Anyone can write a simple loop that hits a web server very
frequently from a high-bandwidth connection and bring your
production server down. See how to prevent such application-level
DOS attacks.
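The general idea can be sketched as a per-IP hit counter kept in the ASP.NET cache; the class name and threshold below are illustrative assumptions, not the article's actual implementation:

```csharp
using System;
using System.Web;
using System.Web.Caching;

// Hedged sketch of application-level DOS protection: count hits per IP
// in a one-minute window and reject clients that exceed a threshold.
public static class RequestThrottle
{
    private const int MaxHitsPerMinute = 100; // assumed limit

    private class HitCounter { public int Count; }

    public static bool IsAllowed(HttpContext context)
    {
        string key = "Hits:" + context.Request.UserHostAddress;
        var counter = context.Cache[key] as HitCounter;

        if (counter == null)
        {
            // First hit in the window: cache a counter that expires in a minute.
            context.Cache.Add(key, new HitCounter { Count = 1 }, null,
                DateTime.Now.AddMinutes(1), Cache.NoSlidingExpiration,
                CacheItemPriority.Normal, null);
            return true;
        }

        // Mutating the cached object does not reset its expiration.
        return ++counter.Count <= MaxHitsPerMinute;
    }
}
```

Calling a guard like this from `Application_BeginRequest` (and returning HTTP 503 when it fails) keeps the check cheap, because the counter lives entirely in memory.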


ASP.NET Ajax extender for multi-column widget drag & drop

It’s an ASP.NET AJAX extender that provides Pageflakes-style drag
& drop functionality between columns and rows.


ASP.NET Ajax in-depth performance analysis

While building an open source start page using ASP.NET AJAX, I did
a lot of performance analysis on the AJAX framework in order to
improve first-time load and the perceived speed of JavaScript-rich
pages. Check out my analysis.


Think you know how to write an UPDATE statement? Think again.

Learn how to optimize common UPDATE statements.


Make a surveillance application that captures the desktop and emails it to you as an attachment

Some time back I needed to capture a certain computer’s desktop in
order to find out what that user was doing every day. So I made a
.NET 2.0 WinForms application which sits in the system tray (optionally),
captures the desktop at a given interval (say every 60 seconds),
and emails the captured images to me as message attachments (say
every 30 minutes).


Today I received the MVP award for the 3rd time, in Visual C#. Thanks
to Microsoft for the award and for setting up my new blog. I will continue
both my MVPS blog and this blog from now on.

Make your website faster using Content Delivery Network

Every request from a browser travels to your server through the
Internet backbones that span the world. The more countries,
continents, and oceans a request has to cross to reach your server,
the slower it is. For example, if your servers are in the USA and
someone from Australia is browsing your site, each request is
literally crossing the planet from one end to the other in order to
reach your server and then coming back again to the browser. If
your site has a large number of static files like images, CSS, and
JavaScript, sending a request for each of them and downloading them
across the world takes a significant amount of time. If you could
set up a server in Australia and redirect users to your Australian
server, each request would take a fraction of the time it takes to
reach the USA. Not only would the network latency be lower, but the
data transfer rate would also be faster, so static content would
download a lot faster. This gives a significant performance
improvement on the user’s end if your website is rich in static
content. Moreover, ISPs provide far greater speed for country-wide
traffic than for the Internet at large, because each country
generally has only a handful of connections to the Internet
backbone that are shared by all ISPs within the country. As a
result, users with a 4 Mbps broadband connection will get the full
4 Mbps from servers within the same country, but as low as 512 Kbps
from servers outside the country. Thus, having a server in the same
country significantly improves site download speed and
responsiveness.

[Figure: average response time for www.pageflakes.com measured from Washington, DC]

The figure above shows the average response time for www.pageflakes.com from
Washington, DC, where the servers are in Dallas, Texas. The average
response time is around 0.4 seconds, and that includes server-side
execution time. Generally it takes around 0.3 to 0.35 seconds to
execute the page on the server, so the time spent on the network is
around 0.05 seconds, or 50 ms. This is really fast connectivity, as
there are only 4 to 6 hops from Washington, DC to Dallas.

[Figure: average response time measured from Sydney, Australia]

This figure shows the average response time from Sydney,
Australia. The average response time is 1.5 seconds, which is
significantly higher than from Washington, DC; it’s almost 4 times
what it is in the USA, with almost 1.2 seconds of overhead on the
network alone. Moreover, there are around 17 to 23 hops from Sydney
to Dallas. So, the site downloads at least 4 times slower in
Australia than it does from anywhere in the USA.

A content delivery network (CDN) is a system of computers
networked together across the Internet that cooperate
transparently to deliver content (especially large media content)
to end users. CDN nodes (clusters of servers in a specific location)
are deployed in multiple locations, often over multiple backbones.
These nodes cooperate with each other to serve requests for content
by end users, and they transparently move content behind the
scenes to optimize the delivery process. A CDN serves a request by
intelligently choosing the nearest server: it looks for the fastest
connectivity between your computer and the nearest node that has the
content you are looking for. A CDN’s strength is measured by the
number of nodes it has in different countries and the amount of
redundant backbone connectivity. Some of the most popular CDNs are
Akamai, Limelight, and EdgeCast. Akamai is used by large companies like
Microsoft, Yahoo, and AOL. It has the best performance throughout the
world because they have servers in almost every prominent city, but
it is a comparatively expensive solution; Akamai only accepts
customers who can spend a minimum of 5K on CDN per month. For smaller
companies, EdgeCast is a more affordable solution.

[Figure: CDN nodes serving cached content from locations near the browser]

This figure shows how the CDN node closest to the browser
intercepts traffic and serves the response. If it does not have the
response in its cache, it fetches the content from the origin server
using a faster route and much more optimized connectivity than the
browser’s ISP can provide. If the content is already cached, it’s
served directly from the node.

There are generally two types of CDN. In one, you upload
content to the CDN’s servers via FTP and get a subdomain in their
domain, like dropthings.somecdn.net. You change all the URLs of
static content throughout your site to download content from the
CDN domain instead of using relative URLs on your own domain. So, a
URL like /logo.gif becomes
http://dropthings.somecdn.net/logo.gif. This is easy to configure
but has maintenance problems: you have to keep the CDN’s
store synchronized with your files all the time, and deployment
becomes complicated because you need to update both your website
and the CDN store at the same time. An example of such a CDN (which
is very cheap) is Cachefly.

A more convenient approach is to store static content on your
own site but use domain aliasing. You store your content in a
subdomain that points to your own domain, like
static.dropthings.com. Then you use a CNAME record to map that
subdomain to a CDN’s nameserver, like cache.somecdn.net. When a
browser tries to resolve static.dropthings.com, the DNS lookup
request goes to the CDN nameserver. The nameserver returns the IP
of the CDN node closest to you that can give you the best
download performance. The browser then sends requests for files to that
CDN node. When the CDN node sees the request, it checks whether it
already has the content cached. If it does, it delivers the
content directly from its local store. If not, it makes a request
to your server and then looks at the cache headers generated in the
response. Based on those headers, it decides how long to cache
the response in its own cache. In the meantime, the browser does not
wait for the CDN node to get the content and return it: the CDN does an
interesting trick on the Internet backbone to route the
request to the origin server, so the browser gets the response
served directly from the origin while the CDN updates its
cache. Sometimes the CDN acts as a proxy, intercepting each
request and fetching uncached content from the origin using a
faster route and optimized connectivity to the origin server.
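The domain aliasing described above comes down to a single DNS record; the hostnames below are the illustrative ones from the text, not real entries:

```
; zone file fragment (hostnames are examples)
static.dropthings.com.    IN    CNAME    cache.somecdn.net.
```

Once this record is in place, any browser resolving static.dropthings.com is answered by the CDN's nameserver, which hands back the IP of the nearest node.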

Request format is unrecognized for URL unexpectedly ending in /SomeWebServiceMethod

At Pageflakes, our web servers’ event logs are being flooded with this error. In the
ASP.NET AJAX 1.0 release, Microsoft added a check that all web
service calls carry Content-Type: application/json in the
request headers. Unless this request header is present, the ASMX
handler throws an exception. This exception is raised directly from
the ScriptHandler, which handles all web service calls made
via ASP.NET AJAX.

This is done for a security reason: it prevents anyone from
feeding off your web services. For example, you might have a
web service that returns some useful information that others might
be interested in consuming. Anyone could just add a
&lt;script&gt; tag pointing to that web service URL and get the
JSON. If that web service is very expensive in terms of
I/O and/or CPU, other websites feeding off your
web service could easily bog down your server.

Now, this backfires when you have HTTP GET-enabled web service
calls that produce response headers to cache the response. For
example, you might have a web method that returns stock quotes. You
have used response caching so that the browser caches the response of
that web method and repeated visits do not produce repeated calls to
that I/O-costly web service. What happens is that proxy gateways or
proxy servers see that their client users are making this
request frequently and that it can be cached. So, they make
periodic calls to that web service and try to precache the response on
behalf of their client users. However, during precache they don’t
send the Content-Type: application/json header. That’s what
I have seen for several proxy servers. As a result, the
exception is thrown and your event log gets flooded with it.

The reason MS (or anyone else) might not have seen this is
that there’s no way to make an HTTP GET response from a web service
call cacheable on the browser unless you do the hack
I mentioned in an earlier blog post.

However, one thing that puzzles me is that we get this error so
frequently, and the request headers look so legitimate, that I am not
100% sure it’s the proxies failing to produce the content-type
header all the time. Somehow the error patterns look like
a registered user trying to use the site and failing. It is
possible that some firewall sees this content type as invalid
and filters out the header.
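To make the failure mode concrete, here is roughly what a legitimate ASP.NET AJAX GET call looks like on the wire; the service and method names are made up:

```
GET /StockQuoteService.asmx/GetQuote?symbol=MSFT HTTP/1.1
Host: www.example.com
Content-Type: application/json; charset=utf-8
```

Drop the Content-Type line, as the proxies apparently do when precaching, and the ScriptHandler rejects the call with the "Request format is unrecognized" exception.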

Windows Vista – built-in disk partition resize utility

When I tried to install Windows Vista on my computer, the first
decision I had to make was how to partition the hard disk. I started
looking for software on the Internet that could resize an existing disk
partition. I found some free tools, but none of them looked
convincing enough to install and try on my computer. So I just
decided to go ahead and install Windows Vista anyway. I had two
partitions on my computer. The first partition was the C drive,
which had Windows XP installed on it, and the other partition was the D
drive, which had all my data. There was no way I could install Windows
Vista on the C drive, so I installed Windows Vista on the D drive.
After installing Windows Vista, I went to Computer Management by
right-clicking the “Computer” icon and selecting “Manage”.
There you will find the Disk Management option. From there, select
the D drive and choose “Shrink”. It will shrink the partition by all
available space. This takes a long time to complete without any
notification, so don’t think Vista has hung. Let it finish and you
will find D has been shrunk and there’s a new partition.

[Figure: Disk Management showing the shrunk D drive and the new partition]

Now move all your data from the D drive to the new drive, say the E
drive.

It is better to defragment the D drive after doing
this.

Note: I found this half-written post in my draft box after 7
months, so this may not be new information. Just to let you
know, for my regular development I went back to Windows XP, because
there’s a significant difference between IIS 6 and IIS 7 and I need
to see how IIS 6 behaves with my applications. Moreover, the Visual
Studio 2005 UI does not mix well with Vista. Since I went back,
XP actually feels quite light and faster than Vista.
Finally, Vista burns out my laptop battery within 40 minutes where XP can
last up to 1.5 hours. This may be because the Sony advanced
power management drivers do not work with Vista.

A significant part of sql server process memory has been paged out. This may result in performance degradation

If you are using SQL Server Standard Edition 64-bit on
Windows 2003 64-bit, you will frequently encounter this problem,
where SQL Server says:

A significant part of sql server process memory has been paged
out. This may result in performance degradation. Duration 0
seconds. Working set (KB) 25432, committed (KB) 11296912, memory
utilization 0%

The numbers in working set and duration will vary. What happens
here is that SQL Server is forced to release memory to the operating
system because some other application, or the OS itself, needs to
allocate RAM.

We went through many support articles like:

  • 918483:
    How to reduce paging of buffer pool memory in the 64-bit version of
    SQL Server 2005
  • 905865:
    The sizes of the working sets of all the processes in a console
    session may be trimmed when you use Terminal Services to log on to
    or log off from a computer that is running Windows Server 2003
  • 920739:
    You may experience a decrease in overall system performance when
    you are copying files that are larger than approximately 500 MB in
    Windows Server 2003 Service Pack 1

But nothing solved the problem. We still had the page-out
problem happening every day.

The server has 16 GB RAM, of which 12 GB is the maximum limit allocated
to SQL Server. 4 GB is left for the OS and other applications. We
have also turned off antivirus and any large backup jobs. 12 GB RAM
should be plenty because there’s no other app running on the
dedicated SQL Server box, but the page-out still happens. When it
happens, SQL Server becomes very slow: queries time out, the website
throws errors, transactions abort. Sometimes this goes on
for 30 to 40 minutes and the website becomes slow and unresponsive during
that time.

I have found what causes SQL Server to page out: the file system
cache somehow gets really high and forces SQL Server to trim
down.

[Screenshot: Perfmon counters showing very high system cache resident bytes]

You can see the system cache resident bytes are very high. During
this time SQL Server gets much less RAM than it needs. Queries
time out at a very high rate, like 15 per second. Moreover, there’s a high
SQL lock timeout rate (around 15/sec, not captured in the
screenshot).

[Screenshot: SQL Server memory counters]

SQL Server max memory is configured at 12 GB. But here it shows
it’s getting less than 8 GB.

While the file system cache is really high, there’s no
process taking a significant amount of RAM.

[Screenshot: process list showing no process using significant RAM]

After I used Sysinternals’ CacheSet to reset the file system cache
and set around 500 MB as the max limit, memory started to free up.

[Screenshot: CacheSet after setting the working set limits]

SQL Server started to see more RAM free:

[Screenshot: SQL Server memory starting to free up]

Then I hit the “Clear” button to clear the file system
cache, and it came down dramatically.

[Screenshot: system cache size after clicking Clear]

Paging stopped. The system cache was around 175 MB only. SQL Server
lock timeouts came back to zero. Everything went back to normal.

So, I believe either some faulty driver or the OS itself
is leaking file system cache in the 64-bit environment.

What we have done is dedicate a person who goes to the
production database servers every hour, runs the CacheSet program,
and clicks the “Clear” button. This clears the file system cache and
prevents it from growing too high.

There are lots of articles written about this problem. However,
the most informative one I have found is from the SQL Server PSS
team:


http://blogs.msdn.com/psssql/archive/2007/05/31/the-sql-server-working-set-message.aspx

UPDATE – THE FINAL SOLUTION!

The final solution is to run this program on Windows
startup:

SetSystemFileCacheSize 128 256

This sets the lower and upper limits for the system cache. You
need to run it on every Windows startup because a restart resets
the cache limit to unlimited.

You can run the program without any parameters to see the
current setting.
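One way to run it on every boot, rather than remembering to do it manually, is a scheduled task triggered at system startup; the installation path below is an assumption:

```
schtasks /create /tn "LimitSystemFileCache" /sc onstart /ru SYSTEM ^
         /tr "C:\Tools\SetSystemFileCacheSize.exe 128 256"
```

Running the task as SYSTEM avoids storing a user password, and the onstart trigger fires before the cache has a chance to grow.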

Download the program from this page:

http://www.uwe-sieber.de/ntcacheset_e.html

Go to the end of the page and you will find the link to
SetSystemFileCacheSize.zip.

13 disasters for production websites and their solutions

When we first went live with Pageflakes back in 2005,
most of us did not have experience running a high-volume mass-consumer
web application on the Internet. In our first year of
operation, we went through all the types of difficulties a web
application can face as it grows. Frequent problems with software,
hardware, and network were part of our daily life. We overcame
a lot of obstacles and established ourselves as one of the top
Web 2.0 applications in the world. From a thousand-user website, we
have grown to a million-user website over the years. We have learned
how to architect a product that can withstand more than 2 million
hits per day, and sudden spikes like 7 million hits in a day. We
have discovered under-the-hood secrets of ASP.NET 2.0 that solve
many scalability and maintainability problems. We have also gained
enough experience in choosing the right hardware and Internet
infrastructure, which can make or break a high-volume web
application. In this article, you will learn about 13 disasters
that can happen to any production website at any time. These real-world
stories will help you prepare well enough that you do
not go through the same problems we did. Being prepared for
these disasters up front will save you a lot of time and money, as
well as build credibility with your users.

We have gone through many disasters over the years. Some of them
are:

  1. Hard drives crashed, burned, and got corrupted several times
  2. A controller malfunctioned and corrupted all disks on the same
    controller
  3. RAID malfunctioned
  4. A CPU overheated and burned out
  5. The firewall went down
  6. Remote Desktop stopped working after a patch installation
  7. Remote Desktop max connections exceeded; we could not log in to
    the servers anymore
  8. The database got corrupted while we were moving the production
    database from one server to another over the network
  9. A developer accidentally deleted the production database
    while doing routine work
  10. The support crew at the hosting service formatted our running
    production server instead of the corrupted server we asked them to
    format
  11. Windows got corrupted and would not work until we
    reinstalled it
  12. DNS went down
  13. The Internet backbone went down in different parts of the
    world

This article of mine explains all these disasters and gives you
the solutions:

http://www.codeproject.com/install/13disasters.asp

Please vote for the article if you like it.

Webcast + Podcast + Blog + Widgets = Pagecast

You have seen webcasts and heard Podcasts. Skype introduced
Skypecast. Pageflakes now gives you Pagecast. Using the Page
publishing feature in Pageflakes, you can make your pages public
and share with the world and thus make Pagecast! On your page, you
can put almost anything you want. Put news feeds, photos, audio,
podcasts, movies etc. You can put cool widgets like Calendar,
To-do-list, Blog, Notes. You are no longer limited to providing
video via Webcast or audio via Podcasts. You can deliver rich
interactive content via Pagecast. Pagecast is a lot more powerful
than blogging or vblogging because you can deliver blog content
along with video, audio, and hundreds of rich interactive
components (flakes).

Check out my Pagecast at: http://www.pageflakes.com/omar

Similarly, you can create a .NET Pagecast with .NET content from
all over the web, an AJAX Pagecast with all AJAX content, a Java
Pagecast, a PHP Pagecast, and so on.

There are thousands of Pagecasts in Pageflakes public page
directory at:

http://www.pageflakes.com/Community/Pages/Page.aspx

Example Public Pagecasts:

David Strom’s Web Informant (Journalist): http://www.pageflakes.com/davidstrom

Second Life Addict: http://www.pageflakes.com/windfeemail/10432298

Amy Winehouse (Musician): http://www.pageflakes.com/amywinehouse

Buddhism: http://www.pageflakes.com/newzaogao/10287762

Adopting Children from Ethiopia: http://www.pageflakes.com/Texasselect191-Widgets

LINUX Help: http://www.pageflakes.com/yrcjaya

Bargain Shopping (Business): http://www.pageflakes.com/marcos1

Wine Lover: http://www.pageflakes.com/ericlflau

The Ultimate Metal Homepage: http://www.pageflakes.com/rodslater

Dog Frisbee: http://www.pageflakes.com/ole1

African-American Entertainment: http://www.pageflakes.com/bronzetrinity

Trekkie: http://www.pageflakes.com/codelayer/10325410

Example Group Pagecasts:

Ralph Posadas Family: http://www.pageflakes.com/ralph4/10106645

East Lothian School District (Scotland): http://www.pageflakes.com/edubuzz/6523900

Onsted High School: http://www.pageflakes.com/akharsh/9789898

Parker Family: http://www.pageflakes.com/ljp530/10001056

Serve extensionless URL from ASP.NET without using ISAPI module or IIS 6 Wildcard mapping

If you want to serve extensionless URLs from ASP.NET 2.0 like the
following:

  • www.store.com/books
  • www.store.com/books/asp.net2.0
  • www.forum.com/post/how-to-serve-extensionless-url

You cannot, unless you use a third-party ISAPI module
or the IIS 6.0 wildcard mapping feature. A third-party
ISAPI module needs to be installed on the server directly; if
you don’t have a dedicated server or a VPS, you cannot do this. IIS
6.0 wildcard mapping makes each and every request go through the
ASP.NET 2.0 ISAPI handler, including URLs ending in .gif, .css, .js,
.html, etc. So, it suffers from a scalability problem. Some
independent research shows there’s a 30% drop in performance in IIS
6.0 when you use wildcard mapping. So, this is not a good solution
either.

Here’s an approach that works without using an ISAPI
module or wildcard mapping in IIS 6.0. When you request an
extensionless URL, you get HTTP 404. This means IIS receives the
request but serves the page configured for HTTP 404; it does not
send the request to the ASP.NET ISAPI. So, if you can forward all HTTP
404s to ASP.NET, you can serve such extensionless URLs. In order to
forward HTTP 404 to the ASP.NET ISAPI, all you need to do is configure
IIS to redirect to some .aspx URL on HTTP 404.

Benefits of this approach:

  • No third-party ISAPI module required
  • No wildcard mapping, thus no performance sacrifice

Here’s how to configure 404 redirection in IIS:

On IIS 6.0, change the 404 default page to /404.aspx and the
type to “URL”. On IIS 7.0, change the 404 default page to
/404.aspx and the type to “ExecuteURL”. Also, change the default
error response to “Custom error pages”.

When you request a URL like “www.shop.com/products/books”,
it will redirect to
“www.shop.com/404.aspx?404;http://www.shop.com/products/books”.
From the Global.asax BeginRequest event, capture this URL and
see whether it’s an extensionless URL request. If it is, do your
URL rewriting for that extensionless URL.

    protected void Application_BeginRequest(object sender, EventArgs e)
    {
        string url = HttpContext.Current.Request.Url.AbsolutePath;

        // HTTP 404 redirection for extensionless URL or some missing file
        if (url.Contains("404.aspx"))
        {
            // On 404 redirection, the query string contains the original URL in this format:
            // 404;http://localhost:80/Http404Test/OmarALZabir

            string[] urlInfo404 = Request.Url.Query.Split(';');
            if (urlInfo404.Length > 1)
            {
                string originalUrl = urlInfo404[1];

                string[] urlParts = originalUrl.Split('?');

                string queryString = string.Empty;
                string requestedFile = string.Empty;

                if (urlParts.Length > 1)
                {
                    requestedFile = urlParts[0];
                    queryString = urlParts[1];
                }
                else
                {
                    requestedFile = urlParts[0];
                }

                if( requestedFile.IndexOf('.') > 0 )
                {
                    // There's an extension, so this is not an extensionless URL.
                    // Don't handle such URLs because these are really missing files.
                }
                else
                {
                    // Extensionless URL. Use your URL rewriting logic to handle it.
                    // Here we just add the .aspx extension to the extensionless URL.
                    HttpContext.Current.RewritePath(requestedFile + ".aspx?" + queryString);
                }
            }
        }
    }

Synchronously execute and get return parameters from Workflow

In my DropThings project, I have used
workflows to develop the business layer, which runs synchronously and
does most of the work in the middle tier. The business layer facade,
named DashboardFacade, has no code except calls to different
workflows. Each workflow serves a particular operation, like
new user visit, existing user visit, adding a tab, or moving a widget
from one column to another. The ASP.NET page calls
DashboardFacade for each user action, and
DashboardFacade in turn calls a workflow to respond to that
user action.

“This is insane!” you are thinking. I know. Please
hear me out on why I went for this approach:

Architects can “design” business facade
functions in terms of activities, and developers can just fill
in a small amount of unit code in each activity.

This is a really good reason, because architects can save the time
spent writing Word documents explaining how things should work. They can
go directly into the Workflow Designer, design the activities, connect
them, design the flow, and verify whether all inputs and outputs are
properly mapped. This is a lot better than drawing
flowcharts, writing pseudocode, and explaining in stylish text how an
operation should work. It is also helpful for developers, because
they can see the workflow and easily understand how to craft the
whole operation. They just open up each activity and write a small
amount of very specific, reusable code. They know what the input to
the activity will be (like function parameters) and what it must
produce (the return value of a function). This makes the activities
reusable, and architects can reuse one activity in many workflows.
Workflows can be debugged right in the Visual Studio Workflow Designer,
so developers can easily find defects in their implementation
by debugging the workflow. Architects can enforce many standards,
like validation, input/output checks, and fault handling, on the
workflow; developers cannot but comply, and thus produce
really good code. Another great benefit for both architect and
developer is that there’s no need to keep a separate
technical specification document up to date, because the workflow is
always up to date and speaks for itself. If someone wants to
study how a particular operation works, they can just print out the
workflow and read it through.

“But what about performance?”, you say. Both you and
I have heard that Workflow Foundation is a pretty big library and can be
a memory hog. Also, the workflow runtime is quite big and takes time
to start up. I did some profiling on the overhead of workflow
execution, and it is very fast for synchronous execution. Here’s
proof from the log you get in the Visual Studio output window:

    b030692b-5181-41f9-a0c3-69ce309d9806 Activity: Get User Guid 0.078125
    b030692b-5181-41f9-a0c3-69ce309d9806 Activity: Get User Pages 0.0625
    b030692b-5181-41f9-a0c3-69ce309d9806 Activity: Get User Setting 0.046875
    b030692b-5181-41f9-a0c3-69ce309d9806 Activity: Get Widgets in page: 189 0.0625
    b030692b-5181-41f9-a0c3-69ce309d9806 Total: Existing user visit 0.265625

The first four entries are the time taken by individual activities
during data access only, not the total time taken to execute the
whole activity. The time entries here are in seconds: the first
four represent the duration of database operations inside
activities, and the last one is the total time for running a workflow
with the four activities shown above plus some extra code. If you
sum up the individual activity execution times for database
operations only, you get 0.2500, which is just 0.015625 seconds less than
the total execution time. This means executing the workflow itself,
along with the overhead of running activities, takes around 0.015 seconds,
which is almost nothing compared to the total effort of doing the
complete operation.

An example of such a workflow is as follows:

    public void MoveWidgetInstance( int widgetInstanceId, int toColumn, int toRow )
    {
        using( new TimedLog(this._UserName, "Move Widget:" + widgetInstanceId) )
        {
            var properties = new Dictionary<string, object>();
            properties.Add("UserName", this._UserName);
            properties.Add("WidgetInstanceId", widgetInstanceId);
            properties.Add("ColumnNo", toColumn);
            properties.Add("RowNo", toRow);

            WorkflowHelper.ExecuteWorkflow( typeof( MoveWidgetInstanceWorkflow ), properties );
        }
    }

MoveWidgetInstance is a method in
DashboardFacade. It just calls a workflow named
MoveWidgetInstanceWorkflow, which does the job of moving a
widget from one column to another, rearranging the widgets in the new
column, pulling up widgets in the old column to remove the empty
space, and updating all widget positions. If I wrote all this
code in one function, it would become quite a big function and require
documentation. But having a workflow means I have a flowchart of
what’s going on, plus some handy reusable activities which I
can reuse in other operations.

Implementing the business layer as workflows has three important
requirements:

  1. Execute workflows synchronously within the ASP.NET request
    thread
  2. Get output parameters from the workflow, which are returned as
    return values from business facade methods
  3. Get exceptions raised from activities synchronously,
    inside the same request thread

WorkflowHelper is a handy class I made, which makes using
Workflow Foundation a breeze, especially from ASP.NET. In the
business layer I need synchronous execution of workflows, whereas
the default behavior of Workflow Foundation is to run them
asynchronously. Moreover, I need return values from workflows after
their execution completes, which is not easily supported due to the
asynchronous nature of workflows. Both of these require some
tweaking of the workflow runtime in order to run successfully in
the ASP.NET environment. You will find the source code of
WorkflowHelper in the code zip file from www.dropthings.com.

The WorkflowHelper.Init function initializes the Workflow
Runtime for the ASP.NET environment. It makes sure there's only
one workflow runtime per Application Domain, because the Workflow
Runtime cannot be created twice in the same application domain. So
it stores a reference to the Workflow Runtime in the Application
context.

    public static WorkflowRuntime Init()
    {
        WorkflowRuntime workflowRuntime;

        // Running in local mode, create and return a new runtime
        if( HttpContext.Current == null )
            workflowRuntime = new WorkflowRuntime();
        else
        {
            // Running in web mode, the runtime is initialized
            // only once per application
            if( HttpContext.Current.Application["WorkflowRuntime"] == null )
                workflowRuntime = new WorkflowRuntime();
            else
                return HttpContext.Current.Application["WorkflowRuntime"] as WorkflowRuntime;
        }

The initialization takes care of both the ASP.NET and
Console/WinForms modes. After the initialization, it registers
ManualWorkflowSchedulerService, which takes care of
synchronous execution of workflows.
Activities.CallWorkflowService is a handy
service that executes a workflow synchronously from
another workflow.
Read this post for details.
These two services make
Workflow Foundation usable from the ASP.NET environment in
practical scenarios.

    var manualService = new ManualWorkflowSchedulerService();
    workflowRuntime.AddService(manualService);

    var syncCallService = new Activities.CallWorkflowService();
    workflowRuntime.AddService(syncCallService);

    workflowRuntime.StartRuntime();

    // In web mode, store the runtime in the application context so
    // that it is initialized only once. In desktop mode, ignore.
    if( null != HttpContext.Current )
        HttpContext.Current.Application["WorkflowRuntime"] = workflowRuntime;

    return workflowRuntime;
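As an aside, on .NET 4 and later the same once-per-AppDomain guarantee can also be expressed with Lazy&lt;T&gt;. This is a minimal sketch of mine, not the article's code, with WorkflowRuntime replaced by a plain object placeholder so the snippet stands alone:

```csharp
using System;

static class RuntimeHolder
{
    // Lazy<T> runs its factory at most once per AppDomain and is
    // thread-safe by default, so even concurrent first calls still
    // get one shared instance.
    static readonly Lazy<object> _runtime =
        new Lazy<object>(() => new object() /* new WorkflowRuntime() in real code */);

    public static object Instance { get { return _runtime.Value; } }
}

class Demo
{
    static void Main()
    {
        // Both calls return the same instance.
        Console.WriteLine(ReferenceEquals(RuntimeHolder.Instance, RuntimeHolder.Instance));
    }
}
```

This avoids the manual null check against the Application context, though it does not cover the web vs. desktop distinction that Init handles.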

The Workflow Runtime is initialized from the Application_Start
event in Global.asax. This ensures the initialization happens only
once per App Domain.

    void Application_Start( object sender, EventArgs e )
    {
        // Code that runs on application startup
        DashboardBusiness.WorkflowHelper.Init();
    }

The runtime is disposed of from the Application_End event in
Global.asax:

    void Application_End( object sender, EventArgs e )
    {
        // Code that runs on application shutdown
        DashboardBusiness.WorkflowHelper.Terminate();
    }
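WorkflowHelper.Terminate is called here but not shown in this excerpt. A sketch of what it presumably does (my assumption, not the article's code) is to mirror Init: pull the cached runtime out of the Application context, stop it, and drop the entry:

```csharp
public static void Terminate()
{
    // Assumption: this body is reconstructed, not taken from the article.
    if (HttpContext.Current != null)
    {
        var runtime = HttpContext.Current.Application["WorkflowRuntime"]
                          as WorkflowRuntime;
        if (runtime != null)
        {
            runtime.StopRuntime();   // stops services, unloads running instances
            runtime.Dispose();
            HttpContext.Current.Application.Remove("WorkflowRuntime");
        }
    }
}
```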

The most interesting function is ExecuteWorkflow, which does the
following:

  • Execute the workflow synchronously
  • Pass parameters to the workflow
  • Upon completion, get output parameters from the workflow and
    return them
  • Handle exceptions raised in the workflow and raise them to the
    ASP.NET exception handler

First, ExecuteWorkflow creates an instance of the workflow and
passes the input parameters to it:

    public static void ExecuteWorkflow( Type workflowType,
        Dictionary<string, object> properties )
    {
        WorkflowRuntime workflowRuntime = Init();

        ManualWorkflowSchedulerService manualScheduler =
            workflowRuntime.GetService<ManualWorkflowSchedulerService>();

        WorkflowInstance instance =
            workflowRuntime.CreateWorkflow(workflowType, properties);
        instance.Start();

The ManualWorkflowSchedulerService executes the workflow
synchronously. The next step is to hook the WorkflowCompleted and
WorkflowTerminated events of the Workflow Runtime so that I can
capture output parameters and exceptions and handle them properly.

    EventHandler<WorkflowCompletedEventArgs> completedHandler = null;
    completedHandler = delegate( object o, WorkflowCompletedEventArgs e )
    {
        if (e.WorkflowInstance.InstanceId == instance.InstanceId)
        {
            workflowRuntime.WorkflowCompleted -= completedHandler;

            // Copy the output parameters into the specified
            // properties dictionary
            Dictionary<string, object>.Enumerator enumerator =
                e.OutputParameters.GetEnumerator();
            while( enumerator.MoveNext() )
            {
                KeyValuePair<string, object> pair = enumerator.Current;
                if( properties.ContainsKey(pair.Key) )
                {
                    properties[pair.Key] = pair.Value;
                }
            }
        }
    };

When the workflow completes, WorkflowCompletedEventArgs gives me
the OutputParameters dictionary. It contains all the public
properties of the workflow. I read all the entries in
OutputParameters and update the input parameters dictionary with
the new values. This is required in the AddWidget function of
DashboardFacade, where I need to know the widget instance created
by the workflow.
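The copy-back logic above can be demonstrated in isolation. This sketch of mine mirrors it without any WF types: only keys the caller already placed in the dictionary are overwritten with output values (the key names here are illustrative, not the article's):

```csharp
using System;
using System.Collections.Generic;

class Demo
{
    // Mirrors WorkflowHelper's copy-back: only keys the caller already
    // put in the dictionary are overwritten with output values.
    static void CopyOutputs(Dictionary<string, object> properties,
                            Dictionary<string, object> outputParameters)
    {
        foreach (KeyValuePair<string, object> pair in outputParameters)
            if (properties.ContainsKey(pair.Key))
                properties[pair.Key] = pair.Value;
    }

    static void Main()
    {
        var props = new Dictionary<string, object> { { "WidgetInstanceId", 0 } };
        var outputs = new Dictionary<string, object>
        {
            { "WidgetInstanceId", 42 },   // the workflow set this public property
            { "InternalState", "x" }      // not requested by the caller; skipped
        };

        CopyOutputs(props, outputs);

        Console.WriteLine(props["WidgetInstanceId"]);
        Console.WriteLine(props.ContainsKey("InternalState"));
    }
}
```

This is why a caller like AddWidget can read the newly created instance back out of the same dictionary it passed in.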

WorkflowTerminated fires when there's an exception. When
any activity inside the workflow raises an exception, this event
fires and workflow execution aborts. I capture this exception and
throw it again so that ASP.NET can trap it using its default
exception handler.

    Exception x = null;
    EventHandler<WorkflowTerminatedEventArgs> terminatedHandler = null;
    terminatedHandler = delegate( object o, WorkflowTerminatedEventArgs e )
    {
        if (e.WorkflowInstance.InstanceId == instance.InstanceId)
        {
            workflowRuntime.WorkflowTerminated -= terminatedHandler;
            Debug.WriteLine( e.Exception );

            x = e.Exception;
        }
    };
    workflowRuntime.WorkflowCompleted += completedHandler;
    workflowRuntime.WorkflowTerminated += terminatedHandler;

    manualScheduler.RunWorkflow(instance.InstanceId);

    if ( null != x )
        throw new WorkflowException(x);

This way, exceptions raised inside a workflow show up in the
regular ASP.NET exception handler.
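Stripped of the WF types, the capture-and-rethrow technique above boils down to a plain C# event pattern: subscribe a delegate, capture the exception into the enclosing scope, unhook inside the handler, and surface the captured value after the synchronous call returns. A self-contained sketch of mine (the Publisher type is illustrative):

```csharp
using System;

class Publisher
{
    public event EventHandler<Exception> Terminated;
    public void Fail(Exception ex) { if (Terminated != null) Terminated(this, ex); }
}

class Demo
{
    static void Main()
    {
        var pub = new Publisher();
        Exception captured = null;

        EventHandler<Exception> handler = null;
        handler = delegate(object o, Exception ex)
        {
            pub.Terminated -= handler;  // unhook so it fires at most once
            captured = ex;              // capture into the enclosing scope
        };
        pub.Terminated += handler;

        pub.Fail(new InvalidOperationException("boom"));  // synchronous "run"
        pub.Fail(new InvalidOperationException("later")); // already unhooked; ignored

        // Back on the calling thread, surface the captured exception
        if (captured != null)
            Console.WriteLine("would rethrow: " + captured.Message);
    }
}
```

Declaring the delegate variable before assigning it is what lets the handler reference itself for the unsubscribe, exactly as WorkflowHelper does with completedHandler and terminatedHandler.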

WorkflowHelper is a reusable class that you can use in your own
project. Just copy the class file out of my project and add it to
yours.