Prevent ASP.NET cookies from being sent on every css, js, image request

ASP.NET generates some large cookies if you are using the ASP.NET membership provider. Especially if you are using the Anonymous provider, a typical site will send cookies like the following with every request once a user is logged in, whether the request is for a dynamic page or for any static resource:

ASP.NET_SessionId=bmnbp155wilotk45gjhitoqg; DBAUTH12=2A848A8C200CB0E8E05C6EBA8059A0DBA228FC5F6EDD29401C249D2

That's 517 bytes of worthless data sent from the browser to your web server with every css, js and image request!

You might think 517 bytes is peanuts. Do the math:

  • An average page makes about 40 requests to the server. 40 × 517 bytes ≈ 20 KB per page view.
  • 1M page views = 20 GB.
  • That’s 20 GB of data uploaded to your server for just 1M page views. It does not take millions of users to produce 1M page views; around 100K users visiting your site every day can produce 1M page views every day.

Here’s how to prevent this:

  • Set up a new website and map a different subdomain to it; if your main site is on your primary domain, point a separate subdomain at the new site.
  • Manually change all the <link>, <script>, <img> and css url(…) references, prefixing each static resource with the new subdomain.
  • If you don’t want to do it manually, use this solution I have built before.
  • Add a Global.asax, and in EndRequest apply this trick:
    HttpContext context = HttpContext.Current;
    // Only clear cookies on the static site (the URL was omitted in the original)
    if (context.Request.Url.ToString().StartsWith(""))
    {
        // Collect the names first; the cookie collection cannot be modified while enumerating it
        List<string> cookiesToClear = new List<string>();
        foreach (string cookieName in context.Request.Cookies)
            cookiesToClear.Add(cookieName);
        // Send back an expired cookie for each one received, so the browser stops sending them
        foreach (string name in cookiesToClear)
        {
            HttpCookie cookie = new HttpCookie(name, string.Empty);
            cookie.Expires = DateTime.Today.AddYears(-1);
            context.Response.Cookies.Set(cookie);
        }
    }

    This code reads all the cookies it receives with the request and expires them, so the browser does not send them again. If ASP.NET cookies somehow get injected into this domain, the code takes care of removing them.
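Alternatively, you can keep the static site cookie-free at the configuration level, so ASP.NET never issues these cookies there in the first place. This is a minimal sketch of what the static site’s web.config might disable; the exact elements to turn off depend on which providers your application uses:

    <configuration>
      <system.web>
        <!-- turn off the features that make ASP.NET emit cookies -->
        <sessionState mode="Off" />
        <authentication mode="None" />
        <anonymousIdentification enabled="false" />
        <roleManager enabled="false" />
      </system.web>
    </configuration>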

17 thoughts on “Prevent ASP.NET cookies from being sent on every css, js, image request”

  1. Mmmm… the last thing (cookie cleanup) will work only if ASP.NET is set up to handle each and every request. By default, IIS does not forward non-executable resources (images, css, js, etc.) to ASP.NET, at least not in IIS 6.
    I needed something similar (full asset interception) for a website and had to explicitly set up IIS to forward those requests to ASP.NET.
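    On IIS 7+ in integrated pipeline mode, the equivalent of that explicit wildcard mapping can be sketched in web.config (a real setting, though running every static request through managed modules has its own performance cost):

        <configuration>
          <system.webServer>
            <!-- run managed modules (and Global.asax events) for static files too -->
            <modules runAllManagedModulesForAllRequests="true" />
          </system.webServer>
        </configuration>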

  2. @Omar – why don’t you just use cookieless sessions in your static website’s web.config instead of putting that code in global.asax?

    And when you’re testing your site, make sure you always open a new Incognito browser window, to ensure no previous cookies are sent to the server on the first request.
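    For reference, the cookieless sessions mentioned above are a standard web.config setting; note that this moves the session id into the URL rather than eliminating it:

        <configuration>
          <system.web>
            <!-- session id travels in the URL instead of a cookie -->
            <sessionState cookieless="true" />
          </system.web>
        </configuration>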

  3. @Justin: Cookieless sessions won’t help much, since the URL still contains an encoded key to identify the session. You might save a couple of bytes, but essentially the cookie just moves into the URL. Also, cookieless sessions help only with the ASP.NET session; they do not help with ASP.NET membership provider cookies.
    Also, the problem isn’t about the first visit. As soon as you hit default.aspx and the cookies get generated, they are sent with every subsequent request.

  4. Hi Omar,

    I just want to know: if, for example, I am using the membership provider, which uses cookies, will this code delete those cookies and disturb the membership provider?

    thanks in advance.

  5. Hi Omar,

    I am a regular reader of your posts, whether on your blog or at CodeProject, and I find them very useful and detailed. They have helped me a lot. Thanks for sharing all these really useful posts.


  6. I am not sure what you mean by “the URL still contains an encoded key to identify the session”, but I set session state to “Not enabled” in the new site and applied the settings specified here:
    Those remove unnecessary processing for static content, which is what we always want.

    And I did not like the EndRequest approach, BTW.
    The math part was good for understanding how big the problem really is.


  7. Do you recommend setting up some kind of dynamic robots blocking for the static subdomain?

    I’m thinking of the issue of not 301 redirecting your non-www to www, and then Google gets confused as to which is your real site.

  8. Obviously, having a route is preferable, but I am guessing we could also look at the file being requested: if it ends with .js or .css, we could use the same process.
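    The extension check suggested here could be sketched like this inside the same EndRequest handler (a guess at the implementation, not code from the post):

        // hypothetical guard: only expire cookies for static file types
        string path = context.Request.Url.AbsolutePath;
        if (path.EndsWith(".js", StringComparison.OrdinalIgnoreCase) ||
            path.EndsWith(".css", StringComparison.OrdinalIgnoreCase))
        {
            // ...same cookie-clearing logic as in the post...
        }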
