Be careful when querying the aspnet_users, aspnet_membership and aspnet_profile tables used by the ASP.NET 2.0 Membership and Profile providers

Queries like this will happily run on your development environment:

    SELECT * FROM aspnet_users WHERE UserName = 'blabla'

Or you can fetch a user's profile without any problem using:

    SELECT * FROM aspnet_profile WHERE UserId = '...'

You can even nicely update a user's email in the aspnet_membership table like this:

    UPDATE aspnet_membership SET Email = 'newemailaddress@somewhere.com' WHERE Email = '...'

But when you have a giant database on your production server, running any of these will bring your server down. The reason is that although these look like obvious queries you would use frequently, none of these fields is part of any index. So all of the above result in a "table scan" (the worst case for any query) over millions of rows in the respective tables.
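You can catch these yourself before going live; a quick check (a sketch, using SQL Server's showplan option) reveals whether a query will scan instead of seek:

```sql
-- Show the query plan without actually executing the query
SET SHOWPLAN_TEXT ON
GO
SELECT * FROM aspnet_users WHERE UserName = 'blabla'
GO
-- The plan output contains a "Table Scan" or "Clustered Index Scan"
-- operator instead of an "Index Seek" when no index covers the query
SET SHOWPLAN_TEXT OFF
GO
```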

Here's what happened to us. We used fields like UserName, Email, UserID, IsAnonymous etc. in lots of marketing reports at Pageflakes. These are reports which only the marketing team uses, no one else. The site ran fine, but several times a day the marketing team and users would call us and scream "The site is slow", "Users are reporting extremely slow performance", "Some pages are timing out" etc. Usually when they called, we would tell them "Hold on, checking right now" and check the site thoroughly. We used SQL Profiler to see what was going wrong, but we could not find any problem anywhere. The profiler showed queries running fine. CPU load was within parameters. The site ran nice and smooth. We would tell them on the phone, "We can't see any problem, what's wrong?"

So why couldn't we see any slowness when we tried to investigate the problem, while the site became really slow several times throughout the day when we were not investigating?

The marketing team sometimes ran those reports several times per day. Whenever they ran any of those queries, because the fields are not part of any index, server IO and CPU usage shot through the roof.


We have 15,000 RPM SCSI drives, very expensive, very fast. The CPUs are dual-core dual Xeon 64-bit. Both are very powerful hardware of their kind. Still, these queries brought us down due to the huge database size.

But this never happened while the marketing team was on the phone with us trying to find out what was wrong, because while they were talking to us, they were not running any of the reports that bring the servers down. They were working somewhere else on the site, mostly trying to reproduce what the complaining users were doing.

Let’s look at the indexes:

Table: aspnet_users
Clustered Index = ApplicationID, LoweredUserName
NonClustered Index = ApplicationID, LastActivityDate
Primary Key = UserID

Table: aspnet_membership
Clustered Index = ApplicationID, LoweredEmail
NonClustered Index = UserID

Table: aspnet_profile
Clustered Index = UserID

Most of the indexes include ApplicationID. Unless you put ApplicationID = '…' in the WHERE clause, none of the indexes will be used. As a result, all the queries suffer from a table scan. Just put ApplicationID in the WHERE clause (find your ApplicationID in the aspnet_applications table) and all the queries will become blazingly fast.

DO NOT use the Email or UserName fields in the WHERE clause. They are not part of any index; instead, the LoweredUserName and LoweredEmail fields are indexed in conjunction with the ApplicationID field. All queries must have ApplicationID in the WHERE clause.
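Putting this together, index-friendly versions of the earlier queries look like this (a sketch; the '...' placeholders stand for the ApplicationId you find in your own aspnet_applications table):

```sql
-- Look up your application's ID once
SELECT ApplicationId FROM aspnet_applications
WHERE LoweredApplicationName = 'yourapplicationname'

-- Seeks on the clustered index (ApplicationId, LoweredUserName)
SELECT * FROM aspnet_users
WHERE ApplicationId = '...' AND LoweredUserName = LOWER('blabla')

-- Seeks on the clustered index (ApplicationId, LoweredEmail);
-- remember to keep LoweredEmail in sync when you change Email
UPDATE aspnet_membership
SET Email = 'newemailaddress@somewhere.com',
    LoweredEmail = 'newemailaddress@somewhere.com'
WHERE ApplicationId = '...' AND LoweredEmail = LOWER('oldemail@somewhere.com')
```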

Our admin site contained several such reports, and each report ran lots of such queries on the aspnet_users, aspnet_membership and aspnet_profile tables. As a result, whenever the marketing team tried to generate reports, the queries took all the power of the CPU and HDD, and the rest of the site became very slow and sometimes non-responsive.

Make sure you always cross-check the WHERE and JOIN clauses of all your queries against your index configurations. Otherwise you are doomed for sure when you go live.

Calculate code block execution time using “using”

Here’s an interesting way to calculate the execution time of a
code block:

private void SomeFunction()
{
    using (new TimedLog(Profile.UserName, "SomeFunction"))
    {
        ...
    }
}

You get an output like this:

6/14/2006 10:58:26 AM    4b1f6098-8c9d-44a5-93d8-e37394b6ef18    SomeFunction    9.578125

You can measure the execution time of not only a function but also smaller blocks of code. Whatever is inside the "using" block gets logged.

Here's how the TimedLog class does the work:

public class TimedLog : IDisposable
{
    private string _Message;
    private long _StartTicks;

    public TimedLog(string userName, string message)
    {
        this._Message = userName + "\t" + message;
        this._StartTicks = DateTime.Now.Ticks;
    }

    #region IDisposable Members

    void IDisposable.Dispose()
    {
        EntLibHelper.PerformanceLog(this._Message + "\t" +
            TimeSpan.FromTicks(DateTime.Now.Ticks - this._StartTicks)
                .TotalSeconds.ToString());
    }

    #endregion
}

We are using Enterprise Library to do the logging. You can do anything you like in the Dispose method.

The benefit of such a log is that we get a tab-delimited file which we can analyze in many ways using MS Excel. For example, we can generate graphs to see how performance rises and falls during peak and off-peak hours. We can also see whether there are high response times and what the pattern is. All of this gives us valuable indications of where the bottleneck is. You can also find out which calls take the most time by sorting on the duration column.

Cost-effective, failsafe, 2-server load-balanced web and DB servers

We were very poor when we first started Pageflakes, so we needed a really cheap but fully reliable deployment configuration. The requirements were:

  1. Fail safe: if a server goes down, another one can take over
    without users noticing
  2. The database must be mirrored on multiple servers
  3. Web servers will be in load-balanced mode so that in case one
    goes down, the other takes over immediately
  4. Do it with the least hardware and cost

So we thought a lot and could not come up with a reliable solution using fewer than three servers. Then we found one with only two servers. Here's the plan:


We made the above plan for our hosting service provider.

We have two Windows servers, both with IIS 6.0 and SQL Server 2005. Let's call them the Web server and the DB server.

The Web server gets 60% of the web traffic, configured via Network Load Balancing. We used Windows Network Load Balancing to save the cost of buying a separate load balancer. We also used Windows Firewall instead of an external firewall. You can imagine how poor we were. The DB server gets 40% of the web traffic and hosts the database in its SQL Server 2005. We started with SQL Server 2005 Workgroup Edition because it was the only edition we could afford, so we could not use the new database mirroring feature and had to use log shipping instead. The Web server also had SQL Server 2005, but it hosted only a standby database, so we did not have to pay a licensing fee for it. Remember: you do not need to pay for a SQL Server license if the server is hosting standby databases only. The DB server also runs the website in its IIS.

So we have two servers running the website under NLB; the web tier is properly load balanced and failsafe. The database on the DB server is log shipped to the Web server. In case the DB server goes down, we can divert all traffic to the Web server and bring up its standby database in order to run the site solely from the Web server. When the DB server comes back online, we configure log shipping in the opposite direction and divert most of the traffic back to the DB server. Thus the DB server becomes the Web server and the former Web server becomes the DB server. It requires some manual work and is not fully automated, but it is the cheapest solution for a reliable configuration.

So when you add more servers in future, it’s very easy to add
new ones.


Things you must do for ASP.NET 2.0 Membership Provider before going live

Here are some tweaks you should make to your web.config if you are using the ASP.NET 2.0 Membership provider, before you go live on your production server:

  1. Add the "applicationName" attribute to the Profile provider. If you
    do not add a specific name here, the Profile provider will use a GUID.
    So on your local machine you will have one GUID and on the production
    server you will have another. If you copy your local DB to the
    production server, you won't be able to reuse the records available
    in your local DB, and ASP.NET will create a new application on
    production. Here's where you need to add it:

    <profile enabled="true">
      <providers>
        <clear />
        <add name="..." type="System.Web.Profile.SqlProfileProvider"
             connectionStringName="..." applicationName="YourApplicationName"
             description="..." />
      </providers>
    </profile>
  2. The Profile provider automatically saves the profile whenever a page
    request completes. This can result in unnecessary UPDATEs on your DB,
    which carry a significant performance penalty. So turn off automatic
    save and do it explicitly from your code using Profile.Save():

    <profile enabled="true" automaticSaveEnabled="false">
  3. The Role Manager always queries the database to get a user's roles,
    which has a significant performance penalty. You can avoid this by
    letting Role Manager cache role information in a cookie. This only
    works for users whose assigned roles do not exceed the 2 KB cookie
    limit, but that's not a common scenario. So you can safely store role
    info in the cookie and save one DB roundtrip on every request to
    .aspx and .asmx:

    <roleManager enabled="true" cacheRolesInCookie="true">
  4. See my previous post on optimizing two important stored procedures
    used by the Profile provider.
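With automatic save turned off, nothing is persisted until you say so. A minimal sketch of the explicit-save pattern (using IsInvited, one of our custom profile properties, as the example):

```csharp
// With automaticSaveEnabled="false", changes to profile properties
// stay in memory until you persist them explicitly:
Profile.IsInvited = true;   // any custom profile property
Profile.Save();             // one explicit UPDATE, only when you want it
```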

Redirect Traffic from old to new server when you change hosting service

How do you redirect your users when you change hosting services? Currently your domain maps to an IP belonging to your current hosting provider. When you move to a new hosting provider, you get a new IP range for your new servers. So even if you change the DNS mapping, it takes up to four days to get refreshed all over the world. During those four days, some users will go to the new IP and some will go to the old one.

The problem is redirecting all users to the new IP without letting them notice anything.

At Pageflakes, we have done this many times. We had all sorts of problems with our hosting providers and changed servers almost once every month. So we had to come up with a solution that works transparently and without any downtime.

Here's what we do. First we map a new subdomain like new.pageflakes.com to the new server's IP. Then we create a new website (not a virtual directory) on the old web server called "Redirector". It maps to a folder which contains nothing but a global.asax and a web.config. Then we go to Website Properties -> Home Directory -> Configuration and map ASP.NET to receive all web requests. This includes all URLs: .html, .gif, .css, .js etc.



Next, we write the following code in Global.asax which redirects
all traffic to the new server.


protected void Application_BeginRequest(Object sender, EventArgs e)
{
    string url = HttpContext.Current.Request.Url.AbsolutePath;
    string queryParams = HttpContext.Current.Request.QueryString.ToString();

    if (queryParams != "")
    {
        Response.Redirect("http://new.pageflakes.com" + url + "?" + queryParams);
    }
    else
    {
        Response.Redirect("http://new.pageflakes.com" + url);
    }
}

So anyone trying to go to www.pageflakes.com/aboutus.html gets redirected to new.pageflakes.com/aboutus.html. The redirector keeps the query string and logical path intact, so complicated URLs like www.pageflakes.com/flakes/amit/notepad/notepad.html?haha=huhu&hehe=hoho get converted nicely.

After doing this, we stop the default website, which is listening on port 80, and turn on the redirector. By this time, the new server is already up and running on the new subdomain.

Now we make the DNS change and map the new server's IP to www.

Users who still have the old IP in their DNS cache go to the old server and get redirected to the new one. After a while, when their DNS cache gets refreshed and they receive the new IP, all their requests go straight to the new server, and we have nothing more to do. After four or five days, we can safely wipe out the old server and start kicking the old hosting provider's backend.
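One refinement worth considering: Response.Redirect issues a temporary 302. Since the move is permanent, a 301 status tells browsers, proxies and search engines to update their records for good. A sketch of the variation (my suggestion, not what we actually ran):

```csharp
// Inside Application_BeginRequest: issue a permanent (301) redirect
// instead of Response.Redirect's default 302
Response.StatusCode = 301;
Response.AddHeader("Location", "http://new.pageflakes.com" + url);
Response.End();
```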

Anyone have a better idea?

Pageflakes – the AJAX king

Here’s what a Digg article says:


Pageflakes and YouOS – not Google – are the Ajax
Kings

InformationWeek took a close look at Google and about 20
startups to find out who’s the Ajax King. In the “webtop” or
“personalized startpage” category – arguably the most important of
all categories – the winners are Pageflakes and YouOS, ahead of
Google and Microsoft.

I am very happy to see we again got recognized as the best webtop, ahead of Microsoft and Google. The first recognition was in the Web 2.0 Awards. Pleeeeeeeeeease give us a Digg, and don't miss the article on InformationWeek.

Forms authentication failed for the request. Reason: The ticket supplied was invalid. (Solution)

When you turn on web garden mode (multiple processes per application pool) or move to a multi-server load-balanced deployment where several servers serve the same website, you will have Forms Authentication problems. Users will get logged out automatically or will frequently see the yellow screen of death (the ASP.NET error page). This happens because ASP.NET encrypts the login information in a cookie, but the encryption key is unique to each machine and each process. So if one user hits Server #1 and gets a cookie encrypted with its key, and the next hit goes to Server #2, it will fail to decrypt the cookie and will either log the user out or show the general ASP.NET error message.

In order to prevent this on your production server, you need to
remember this before you go live:

The reason a forms auth ticket fails is normally that either the validation key or the decryption key is not in sync across all servers in a web farm. Another potential reason is that both ASP.NET 1.1 and ASP.NET 2.0 applications are issuing forms auth tickets with the same domain and path.

For the first case, setting the validationKey and decryptionKey attributes explicitly in <machineKey> on each web server will solve the problem.

For the second case, setting the validationKey and decryptionKey attributes explicitly in <machineKey> for *both* the ASP.NET 1.1 and ASP.NET 2.0 applications is necessary. Additionally, on the ASP.NET 2.0 apps, the "decryption" attribute in <machineKey> should be set to "3DES".

This is what I learned from Stefan Schackow (Microsoft Atlas team). Here's how the machine.config should look:

    <system.web>
      <processModel autoConfig="true" />
      <machineKey validationKey="..." decryptionKey="..." validation="SHA1" />
    </system.web>

<machineKey> is the node you need to introduce inside the <system.web> node if it does not already exist.

Catch No 1:

How do you generate the machine key? You need to use a utility to produce the key for your PC. I have made an .exe which can generate such keys for you. Here's how you run it:

SecurityKey.exe 24 64

Download here

It takes two parameters, and these are the exact values you have to provide.

For super-cautious IT guys like me: don't worry, it's a .NET 2.0 binary, no virus. Test the strength of your anti-virus software on this .exe if you like. If it can't find any virus, then either I'm smarter than you and have hidden a virus successfully, or there's no virus at all and I'm not as smart as I sound.
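If you'd rather not trust a stranger's .exe, such a generator is only a few lines of code. Here's a minimal sketch (my own, not the downloadable tool, assuming .NET 2.0's RNGCryptoServiceProvider) that prints cryptographically random keys as hex, given byte lengths like 24 and 64 on the command line:

```csharp
using System;
using System.Security.Cryptography;
using System.Text;

class SecurityKey
{
    static void Main(string[] args)
    {
        // e.g. SecurityKey.exe 24 64 -> a 48-hex-char decryptionKey
        // and a 128-hex-char validationKey
        RNGCryptoServiceProvider rng = new RNGCryptoServiceProvider();
        foreach (string arg in args)
        {
            byte[] bytes = new byte[int.Parse(arg)];
            rng.GetBytes(bytes);    // fill with strong random bytes

            StringBuilder hex = new StringBuilder();
            foreach (byte b in bytes)
                hex.Append(b.ToString("X2"));

            Console.WriteLine(hex.ToString());
        }
    }
}
```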

Catch No 2:

You have put the machine keys in machine.config on both machines. You restarted IIS, even restarted your servers. But you still see lots of Event Log error entries showing users are still getting the dreaded "Forms authentication failed for the request. Reason: The ticket supplied was invalid." So what did you do wrong? You call Microsoft support. You post on forums. Everyone says what you did is correct.

Here's what you need to do: wait. Wait two or three days until all those users come back to your website at least once. Those users have cookies encrypted with the previously assigned key pair. Naturally, those cookies will fail to decrypt with the new key pair you have just specified in machine.config. So until all those users get cookies encrypted with the new keys, you will keep seeing the error message. Don't be alarmed if you see it even after a week or a month. That just means some user visited you after a long time, and you are not doing much good at attracting users back to your site. So if you still see such event log entries after a week or two, call your marketing team and ask what kind of marketing they are doing.

It would be really good if there were something like a "Checklist for Going Live with ASP.NET" which stated all these issues.

How to change user name in ASP.NET 2.0 Membership Provider

Profile.UserName is a read-only field. So how do you change a user's name? Many sites use the email address as the user name (like us at Pageflakes), and we want to allow users to change their email address, which in turn changes their user name. So how do you do it?

It looks like there’s no easy way to do it. You have to do the
following:

  1. Create a new user using the new email address
  2. Get the password of the old account and set it on the new
    account. If you can't get the old password via the Membership
    provider, ask the user.
  3. Create a new profile for the new user account
  4. Copy all the properties from the old profile to the new profile
    object
  5. Log the user out from the old account
  6. Auto sign in to the new account so that the user does not notice
    what an incredible thing just happened

Here’s how I do it:

if (Profile.UserName != newUserName)
{
    // Changing the email address of the user. Delete the current user
    // account and create a new one using the new email address but the
    // same password
    if (null != Membership.GetUser(newUserName))
        throw new ApplicationException(
            "There's another user with the same email. Please enter a different email.");

    MembershipUser newUser = Membership.CreateUser(newUserName, currentPassword);

    // Create a profile for the new user and copy all values from the
    // current profile to the new profile
    ProfileCommon newProfile = ProfileCommon.Create(newUserName, true) as ProfileCommon;

    newProfile.IsInvited = Profile.IsInvited;
    newProfile.IsRealUser = Profile.IsRealUser;
    newProfile.Name = newUserName;

    newProfile.Save();

    if (Membership.ValidateUser(newUserName, currentPassword))
    {
        // Remember the old user name before abandoning the session
        string oldUserName = Profile.UserName;

        FormsAuthentication.SignOut();
        Session.Abandon();

        // Delete the old profile and user
        ProfileManager.DeleteProfile(oldUserName);
        Membership.DeleteUser(oldUserName);

        //FormsAuthentication.SetAuthCookie(newUserName, true);
        FormsAuthentication.RedirectFromLoginPage(newUserName, true);
    }
}