Rescue overdue offshore projects and convince management to use automated tests

I have published two articles on CodeProject recently. The first is a story about an offshore project that was two months overdue: my friend who ran it was paying the team out of his own pocket and drowning in an ever-increasing number of change requests, and we brainstormed together on how to come out of that situation.

Tips and Tricks to rescue overdue projects

The second is about convincing management to adopt automated tests and give developers extra time per sprint, at the cost of reduced productivity for a couple of sprints. It’s hard to negotiate this even with dev leads, let alone managers. Whenever you tell them there will be fewer features and bug fixes delivered over the next three or four sprints because you want to automate the tests and reduce manual QA effort, everyone gets furious and kicks you out of the meeting. Especially in a startup, where every sprint is jam-packed with new features and priority bug fixes to satisfy various stakeholders, including the VCs, it’s very hard to communicate the benefits of automated tests across the board. Let me tell you a story from one of my startups where I had the pleasure of arguing this case and came out victorious.

How to convince developers and management to use automated test instead of manual test

If you like these, please vote for me!

New and Improved Dropthings – the Open Source Web 2.0 AJAX portal

I have made some significant improvements and bug fixes in the latest 2.5.4 release of Dropthings, my open source Web 2.0-style AJAX portal built on ASP.NET 3.5, Linq to SQL, Linq to Xml, Silverlight, Enterprise Library, Unity, Velocity, and more. All the production-quality technologies you want to see in action are there in Dropthings, an open source project that powers critical portals around the world.

You can get the latest code from here:

http://code.google.com/p/dropthings/

I have uploaded some how-to video tutorials to help you get started with Dropthings easily and troubleshoot common problems.

Here’s a list of the new features that were added and stabilized in this release:

  • Run Dropthings under a virtual directory.
  • Widget drag & drop and add/remove improvements, plus many bug fixes for some not-so-common use cases.
  • Velocity caching support. Dropthings can now run in web farm and/or web garden mode and use Velocity as the distributed cache. This gives you more scalability, so you can deploy on a large web farm and run a heavy-traffic website. I have done enough load testing to confirm that Velocity does make Dropthings scale. You can turn Velocity on or off from web.config: just create a cache store named “Dropthings”, turn on the config, and you are good to go (see the sketch after this list).
  • AspectF implementation to put sensitive operations under transaction, retry, logging and error handling. Error logging is more streamlined. There are two log files: one contains information messages and the other contains exceptions. Both are in the App_Data folder.
  • A rich set of xUnit tests in Behavior Driven Development style. Important operations like First Visit and Revisit are now covered by automated tests.
  • Addition of an “admin” role and an “admin” user in the database, who can manage widgets and assign or revoke roles on widgets. You can set up the admin user and admin role in your existing database using the ASP.NET Configuration tool.
  • A new page, /Admin/ManageWidgets.aspx, which is a one-stop shop for managing widgets and permissions. Add widgets very conveniently. See the video tutorials on how to get a new widget coded and deployed in less than 5 minutes.
  • More web.config settings to allow customization of key behaviors of the project.
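
If you want to see roughly what creating that cache store looks like, here is a minimal sketch assuming the Velocity / AppFabric cache administration PowerShell cmdlets (module and cmdlet names vary between the Velocity CTPs and the released AppFabric Caching, so treat this as a guide rather than the exact commands for your build):

Import-Module DistributedCacheAdministration   # needed in released AppFabric; the Velocity CTP ships its own admin console
Use-CacheCluster                               # connect to the locally configured cache cluster
Start-CacheCluster                             # make sure the cache hosts are running
New-Cache -CacheName Dropthings                # create the named cache that Dropthings expects
Get-Cache                                      # verify that the "Dropthings" cache is listed

Once the named cache exists, flip the Velocity setting in web.config and Dropthings will start using it.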

Enjoy the new version. If you purchased the old version, I strongly recommend you take the time to upgrade. There’s no DB schema change, but there’s a good amount of code change. Check out the code commit history for the details of the changes.

Synchronize file date times across multiple servers and solve the IIS ETag problem

When you deploy the same website on multiple web servers, each copy of a file ends up with a different last modified date. As a result, each IIS produces a different ETag for the same static file. If a user hits different servers for the same file (due to load balancing), each IIS responds with a different ETag, and the browser downloads the same file over and over again. If you have three servers, the same user has most likely downloaded the same file three times. This hurts page load performance.
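
A quick way to see this for yourself is to issue a HEAD request against each server and compare the ETag headers. Here is a small sketch; the URLs are placeholders for the same static file on your own load-balanced nodes:

# Compare the ETag each web server returns for the same static file.
# The URLs below are placeholders; point them at the same file on each server.
foreach ($url in @('http://web1/styles/site.css', 'http://web2/styles/site.css'))
{
    $request = [System.Net.WebRequest]::Create($url)
    $request.Method = 'HEAD'
    $response = $request.GetResponse()
    Write-Host ($url + " -> ETag: " + $response.Headers['ETag'])
    $response.Close()
}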

Moreover, if you want to mirror two or more locations using one location as the base, you not only need to copy the same files but also set the same creation date and last modified date on them. Otherwise they aren’t a true mirror. There are various use cases where you need a complete mirror, not only at the file content level but also at the file date-time level.

Here’s a PowerShell script that will do the job for you:

# Path of the base folder. File date times in this folder are used as the base.
$SourceFolder = ".\Folder1"
# Put all the other locations here. These locations must have the same folder structure as the base.
$DestFolders = @('.\Folder2', '.\Folder3')

function sync($sourcePath, $destinationPath)
{
    $sourceFiles = [System.IO.Directory]::GetFiles($sourcePath);
    foreach ($sourceFileName in $sourceFiles)
    {
        $sourceFile = Get-Item $sourceFileName
        $destFilePath = Join-Path -Path $destinationPath -ChildPath $sourceFile.Name
        $destFile = Get-Item $destFilePath

        # Copy the timestamps only when the file sizes match, so a stale copy isn't marked as identical.
        if ($destFile.Length -eq $sourceFile.Length)
        {
            $destFile.LastWriteTime = $sourceFile.LastWriteTime;
            $destFile.CreationTime = $sourceFile.CreationTime;

            Write-Host ("SYNCED: " + $sourceFileName + " -> " + $destinationPath)
        }
        else
        {
            Write-Host ("SIZE DOES NOT MATCH: " + $sourceFileName + " -> " + $destinationPath)
        }
    }

    # Recurse into subfolders; the destination must have the same folder structure.
    $childFolders = [System.IO.Directory]::GetDirectories($sourcePath);
    foreach ($childFolderName in $childFolders)
    {
        $childFolder = Get-Item $childFolderName
        $destFolderPath = Join-Path -Path $destinationPath -ChildPath $childFolder.Name
        $destFolder = Get-Item $destFolderPath
        sync $childFolder.FullName $destFolder.FullName
    }
}

$Source = Get-Item $SourceFolder
foreach ($destFolderName in $DestFolders)
{
    $destFolder = Get-Item $destFolderName
    sync $Source.FullName $destFolder.FullName
}
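
To run it, save the script in the folder that contains the source and destination folders and execute it from a PowerShell prompt. The script file name below is just an example:

# One time only, if your execution policy blocks local scripts
Set-ExecutionPolicy RemoteSigned

# Run from the folder that contains Folder1, Folder2 and Folder3
.\SyncFileDates.ps1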

Redirecting traffic from http to https with zero coding in IIS

When you want to enforce https on users and redirect any URL hit on http to its exact https counterpart, you usually do it with an HttpModule written in .NET, install some URL redirector module in IIS, or set up a dummy website on http and use a meta refresh tag to send traffic to https. There are many solutions out there, and most require some amount of coding skill. Let me show you a zero-coding redirection setup.

First, go to the properties of the real website which is now listening on both http and https. Change the http port to something dummy like 8083. This will prevent the website from holding onto port 80.

Now create a new website on an empty folder that the NETWORK SERVICE account has read permission on. The new website will have the same host header as the real one, say omaralzabir.com, but it will listen on port 80 only. It won’t listen on the https port 443.

Once created, go to Properties, open the Home Directory tab, and do the following:

IIS website properties to redirect from http to https

Things to do here:

  • Select the “A redirection to a URL” option.
  • Enter https://yourdomain.com followed by $S$Q. Remember, there’s no trailing slash after the domain.
  • Select “The exact URL entered above”.
  • Select “A permanent redirection for this resource”
  • Set Execute permission to None.
  • You can select the same app pool as the original one.

That’s it.
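
If you would rather script these settings than click through the properties dialog, the same options can be set on the IIS 6 metabase with adsutil.vbs. This is only a sketch, assuming the redirect website’s metabase ID is 2; look up your actual site ID in IIS Manager before running it:

# Assumption: the redirect website's metabase ID is 2; adjust W3SVC/2 to match your site.
# Sets the redirect URL, the "exact URL" option and the "permanent redirection" option in one go.
cscript C:\Inetpub\AdminScripts\adsutil.vbs SET W3SVC/2/root/HttpRedirect 'https://yourdomain.com$S$Q, EXACT_DESTINATION, PERMANENT'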

Update:

The $S captures the sub-path. If you hit http://omaralzabir.com/subdir/anotherdir, then $S = /subdir/anotherdir. The $Q represents the query string. Together, they reproduce the whole path and query string.