Web services are the most attractive target for hackers because
even a pre-school hacker can bring down a server by repeatedly
calling a web service that does expensive work. Ajax start pages
like Pageflakes are a prime target for such DOS attacks: if you just
visit the homepage repeatedly without preserving cookies, every hit
produces a brand new user, a new page setup, new widgets and what
not. The first-visit experience is the most expensive one.
Nonetheless, it's the easiest one to exploit to bring down
the site. You can try this yourself. Just write some simple code like
this:
for( int i = 0; i < 100000; i ++ )
{
    WebClient client = new WebClient();
    client.DownloadString("http://www.pageflakes.com/default.aspx");
}
To your great surprise, you will notice that after a couple of
calls you don't get a valid response. It's not that you have
succeeded in bringing down the server; it's that your
requests are being rejected. You are happy that you no longer get
any service, thus you achieve Denial of Service (for yourself). I
am happy to Deny You of Service (DYOS).
The trick I have up my sleeve is an inexpensive way to remember
how many requests are coming from a particular IP. When the number
of requests exceeds the threshold, further requests are denied for some
duration. The idea is to remember the caller's IP in the ASP.NET
Cache and maintain a count of requests per IP. When the count
exceeds a predefined limit, reject further requests for a
specific duration, like 10 minutes. After 10 minutes, allow requests
from that IP again.
I have a class named ActionValidator which maintains a count of
specific actions like First Visit, Revisit, Asynchronous Postback,
Add New Widget, Add New Page etc. It checks whether the count for
a specific action from a specific IP exceeds the threshold value
or not.
public static class ActionValidator
{
    private const int DURATION = 10; // 10 min period

    public enum ActionTypeEnum
    {
        FirstVisit = 100,   // The most expensive one, choose the value wisely
        ReVisit = 1000,     // Welcome to revisit as many times as the user likes
        Postback = 5000,    // Not much of a problem for us
        AddNewWidget = 100,
        AddNewPage = 100,
    }
The enumeration contains the types of actions to check for and
their threshold values for a specific duration – 10 minutes.
A static method named IsValid does the check. It
returns true if the request limit has not been exceeded, false if the
request needs to be denied. Once you get false, you can call
Response.End() and prevent ASP.NET from proceeding further.
You can also switch to a page which shows "Congratulations!
You have succeeded in a Denial of Service attack."
public static bool IsValid( ActionTypeEnum actionType )
{
    HttpContext context = HttpContext.Current;

    // Crawlers are rejected outright
    if( context.Request.Browser.Crawler ) return false;

    // One cache entry per action type per client IP
    string key = actionType.ToString() + context.Request.UserHostAddress;

    HitInfo hit = (HitInfo)(context.Cache[key] ?? new HitInfo());

    // The enum value doubles as the threshold for that action
    if( hit.Hits > (int)actionType ) return false;
    else hit.Hits ++;

    // First hit: store the counter with an absolute expiration of DURATION minutes
    if( hit.Hits == 1 )
        context.Cache.Add(key, hit, null, DateTime.Now.AddMinutes(DURATION),
            System.Web.Caching.Cache.NoSlidingExpiration,
            System.Web.Caching.CacheItemPriority.Normal, null);

    return true;
}
The cache key is built from a combination of the action type and the
client IP address. First it checks whether there's an entry for
the action and the client IP in the Cache. If not, start the
count and remember the count for that IP in the cache for the
specified duration. The absolute expiration on the cache item ensures
that after the duration the cache item is cleared and the count
restarts. When there's already an entry in the cache, get
the last hit count and check whether the limit has been exceeded. If
not, increase the counter. There is no need to store the
updated value in the cache again by doing Cache[key] = hit, because
the hit object is stored by reference, so changing it means it gets
changed in the cache as well. In fact, if you did put it into
the cache again, the cache expiration would restart and break the
logic of restarting the count after the specific duration.
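Just to make the by-reference behavior concrete, here is a tiny standalone sketch (the Counter class and the "demo" key are made up purely for illustration, they are not part of ActionValidator):

using System;
using System.Web;
using System.Web.Caching;

public class Counter { public int Hits; }   // hypothetical stand-in for HitInfo

public static class CacheByRefDemo
{
    public static void Run()
    {
        Cache cache = HttpRuntime.Cache;

        // Store the counter once, with a 10-minute absolute expiration.
        cache.Add("demo", new Counter(), null, DateTime.Now.AddMinutes(10),
            Cache.NoSlidingExpiration, CacheItemPriority.Normal, null);

        // Mutate the cached instance directly: no re-insert is needed,
        // and the original expiration stays intact.
        ((Counter)cache["demo"]).Hits++;

        int hits = ((Counter)cache["demo"]).Hits;   // now reads 1
    }
}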
The usage is very simple:
protected override void OnInit(EventArgs e)
{
    base.OnInit(e);

    // Check if revisit is valid or not
    if( !base.IsPostBack )
    {
        // Block cookie-less visit attempts
        if( Profile.IsFirstVisit )
        {
            if( !ActionValidator.IsValid(ActionValidator.ActionTypeEnum.FirstVisit) ) Response.End();
        }
        else
        {
            if( !ActionValidator.IsValid(ActionValidator.ActionTypeEnum.ReVisit) ) Response.End();
        }
    }
    else
    {
        // Limit number of postbacks
        if( !ActionValidator.IsValid(ActionValidator.ActionTypeEnum.Postback) ) Response.End();
    }
}
Here I am checking specific scenarios like first visit, revisit,
postbacks etc.
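The same check can guard the more expensive web service calls too. As a rough sketch (the WidgetService class, its AddWidget method and the CreateWidgetForCurrentUser helper are hypothetical, not actual Pageflakes code), an add-widget call might be wrapped like this:

using System.Web;
using System.Web.Services;

public class WidgetService : WebService
{
    [WebMethod]
    public string AddWidget(string widgetName)
    {
        // Reject the call if this IP has already added too many widgets
        // within the configured duration.
        if( !ActionValidator.IsValid(ActionValidator.ActionTypeEnum.AddNewWidget) )
        {
            Context.Response.End();
            return null;
        }

        // Hypothetical widget-creation call, not part of the article's code
        return CreateWidgetForCurrentUser(widgetName);
    }

    private string CreateWidgetForCurrentUser(string widgetName)
    {
        // ... actual widget creation would go here ...
        return widgetName;
    }
}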
Of course you can put in some Cisco firewall and prevent DOS
attacks, and you will get a guarantee from your hosting provider that their
entire network is immune to DOS and DDOS (Distributed DOS) attacks.
But what they guarantee against are network-level attacks like TCP SYN
floods or malformed packet floods. There is no way they can analyze the
packets and figure out that a particular IP is trying to load the site too
many times without supporting cookies, or trying to add too many
widgets. These are application-level DOS attacks, which
hardware cannot prevent; they must be handled in your own
code.
There are very few websites out there which take such precautions
against application-level DOS attacks. Thus it's quite easy to
make servers go mad by writing a simple loop and hitting expensive
pages or web services continuously from your home broadband
connection. I hope this small but effective class will help
you implement DOS attack prevention in your own web applications.
Update
Here's the code of the full ActionValidator class:
// Copyright (c) Omar AL Zabir. All rights reserved.
// For continued development and updates, visit http://msmvps.com/omar

using System;
using System.Data;
using System.Configuration;
using System.Web;
using System.Web.Security;
using System.Web.UI;
using System.Web.UI.WebControls;
using System.Web.UI.WebControls.WebParts;
using System.Web.UI.HtmlControls;

namespace Dropthings.Web.Util
{
    public static class ActionValidator
    {
        private const int DURATION = 10; // 10 min period

        /*
         * Type of actions and their maximum value per period
         */
        public enum ActionTypeEnum
        {
            None = 0,
            FirstVisit = 100,   // The most expensive one, choose the value wisely
            ReVisit = 1000,     // Welcome to revisit as many times as the user likes
            Postback = 5000,    // Not much of a problem for us
            AddNewWidget = 100,
            AddNewPage = 100,
        }

        private class HitInfo
        {
            public int Hits;

            private DateTime _ExpiresAt = DateTime.Now.AddMinutes(DURATION);
            public DateTime ExpiresAt { get { return _ExpiresAt; } set { _ExpiresAt = value; } }
        }

        public static bool IsValid( ActionTypeEnum actionType )
        {
            HttpContext context = HttpContext.Current;
            if( context.Request.Browser.Crawler ) return false;

            string key = actionType.ToString() + context.Request.UserHostAddress;

            HitInfo hit = (HitInfo)(context.Cache[key] ?? new HitInfo());

            if( hit.Hits > (int)actionType ) return false;
            else hit.Hits ++;

            if( hit.Hits == 1 )
                context.Cache.Add(key, hit, null, DateTime.Now.AddMinutes(DURATION),
                    System.Web.Caching.Cache.NoSlidingExpiration,
                    System.Web.Caching.CacheItemPriority.Normal, null);

            return true;
        }
    }
}
Mate, you write very interesting stuff. You do realise that if someone wants to do a DOS badly, you cannot stop it, right?
Apart from shutting down the machine.
IP addresses can be faked. As a matter of fact, MAC addresses can be faked too. I don't know of any solution that can really prevent DOS. Also, an organised DOS attack spread across a continent would be even worse.
DOSdos
Hi Omar,
Thanks for your article. I wonder, though, whether there are any built-in features in IIS that could help stop DOS attacks. Your code is very good, but the problem is that you have to make ASP.NET handle all files, as the attack could be done by trying to download any type of file: images, js files, etc.
Great article. However, I am trying to understand why your code does not allow search engines in? Your code returns false for 'not valid' when checking for context.Request.Browser.Crawler. Doesn't that prevent search engines from indexing your page?
Hello Omar, question:
What about a company using only one external IP address behind a router doing NAT? When employees try to access the web app, will they be denied access?
This isn't DDOS prevention.
Your web server still has to accept the connection and work with it, so it is still kept busy.
So an attacker can still lock up your web server.
For better DDOS prevention you have to work at the firewall level – but even that is not perfect.
Waleed,
ASP.NET does not handle requests for static files like JS, CSS, HTML, etc. unless you have wildcard mapping turned on. So this code only protects calls going to ASP.NET.
Peter Tran,
You can take out the Crawler check. My intention was to block crawlers, but that blocks valid crawlers as well. You can also check whether the crawler name is “Unknown”.
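For example, a softer version of the check might look roughly like this (whether unidentified agents really report their name as "Unknown" depends on your browser capability files, so treat that string as an assumption):

// Only reject unidentified crawlers; known crawlers such as search engine bots
// fall through to the normal per-IP check.
HttpBrowserCapabilities browser = context.Request.Browser;
if( browser.Crawler && browser.Browser == "Unknown" ) return false;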
mlb,
Proxies that hide users' IPs and use the public IP of the proxy will be treated as one IP. So you just adjust the threshold values in such a way that they allow such proxies but still prevent request floods.
Google Me,
This prevents application-level DDOS to some extent. It saves your application from burning out CPU/disk. As the check is minimal code, it takes much less CPU than actually executing a page or some functionality.
Network-level DOS attacks are handled by firewalls.
Can someone give me more info on the following:
Profile.IsFirstVisit??
Do we have to create this property? It is not a profile property.
Where are we getting the HitInfo() class from? I can't seem to find its source.
Any help is greatly appreciated.
>>It's a custom boolean property that I added to Profile.
1. What is a “Profile?” What is a “HitInfo?” What is a “Profile.IsFirstVisit?” Are these things just pulled out of thin air? These things are unknown in MSDN.
2. Also, why is a “var” being used in C#?
>>var hit =(HitInfo)(context.Cache[key] ??
new HitInfo());
3. What class does OnInit derive from?
Am I the only one that cannot compile this example at all?
Do you have a downloadable file Omar? What references did you set up?
Denial of Service attack prevention should be done at the hardware/firewall level, NOT in your web application.
It is too expensive and error-prone to do it in your application. Firewalls/hardware/routers are adept at doing such a thing.
Please read this para:
“Of course you can put in some Cisco firewall and prevent DOS attacks, and you will get a guarantee from your hosting provider that their entire network is immune to DOS and DDOS (Distributed DOS) attacks. But what they guarantee against are network-level attacks like TCP SYN floods or malformed packet floods. There is no way they can analyze the packets and figure out that a particular IP is trying to load the site too many times without supporting cookies, or trying to add too many widgets. These are application-level DOS attacks, which hardware cannot prevent; they must be handled in your own code.”
You cannot add logic like “someone is trying to add too many widgets” to your firewall/hardware/routers.
Hi Omar,
Thanks for the info. I was wondering why you don't replace the cached item after updating HitInfo.Hits? Is the object in the cache a different copy from the object that is incremented?
“if( hit.Hits == 1 )
context.Cache.Add(key, hit, null, DateTime.Now.AddMinutes(DURATION),
System.Web.Caching.Cache.NoSlidingExpiration, System.Web.Caching.CacheItemPriority.Normal, null);”
Why not
“
if (hit.Hits > 1) {
Cache.Remove(key);
}
Cache.Add();
“
As the cache is an in-memory cache, the Hit object is stored by reference. So making any modification to the object updates the actual object inside the cache.
Why didn't you write this DDOS-protection code in an HttpModule? It would be faster to process requests and deny them (if a DDOS is detected) at the HttpModule level, wouldn't it?
I don't want to do the calculation on each and every call, only where expensive operations are performed.
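For anyone who does prefer the global approach, a minimal sketch of such a module might look like the following (the DosGuardModule name and the choice of ReVisit as a catch-all action are illustrative only; my own approach is to call IsValid just around the expensive operations):

using System;
using System.Web;
using Dropthings.Web.Util;   // where ActionValidator lives, per the full listing above

// Hypothetical module: applies the same per-IP throttling to every ASP.NET request.
public class DosGuardModule : IHttpModule
{
    public void Init(HttpApplication application)
    {
        application.BeginRequest += OnBeginRequest;
    }

    private static void OnBeginRequest(object sender, EventArgs e)
    {
        HttpApplication app = (HttpApplication)sender;

        // Treat every request as a "revisit" style hit; a real module would
        // pick the action type based on the URL being requested.
        if( !ActionValidator.IsValid(ActionValidator.ActionTypeEnum.ReVisit) )
        {
            app.Context.Response.StatusCode = 503;   // Service Unavailable
            app.CompleteRequest();                   // skip the rest of the pipeline
        }
    }

    public void Dispose() { }
}

The module would still need to be registered in web.config under <httpModules> (or <system.webServer>/<modules> in IIS 7 integrated mode) before it takes effect.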
Is this a joke or a bug?
if( context.Request.Browser.Crawler )
return false; <----
You do not want crawlers on your site?
I need help! Someone threatened to ddos my site! What should i do?!
I created the following to replace Profile.IsFirstVisit:

public bool IsFirstVisit()
{
    if (Session["IsFirstVisit"] == null)
    {
        Session["IsFirstVisit"] = "false";
        return true;
    }
    else
    {
        return false;
    }
}
I only wish I had found it sooner.
Hi Omar,
Good article. I have read through the comments, and pretty much agree with the opinion that this will not stop DDOS, and will only be effective against very basic DOS attacks.
However, having said that, this type of code does have its place.
Where I would use this is for trying to protect my content against screen-scraping export. It's pretty common that people who want to steal your content will write a simple application to repeatedly hit the site and screen-scrape the results. This code sample could be tuned such that an attempt to do this would need to be performed so slowly that it would take an inordinate amount of time to extract a useful portion of the data.
However, more advanced attacks of this kind, which are either distributed or use some spoofing mechanism, would again be impossible to prevent.
To respond on your behalf to the comment about corporate environments, well, you will be tuning your parameters to allow the maximum throughput before overloading your resources, so, if a single IP was reaching that level, then regardless of them being a single user, or multiple inside a proxy, I would still cut them off because they’re going to take the server down! So the point is moot.
If you really want to prioritise them, well, that would need to be examined on a case by case basis. No way around that.
Thanks for taking the time to publish your ideas and code.
hi omar
can you put up the .cs file for download?
it is hard to combine and compile the code parts from here
i forgot another point.
there is a module for IIS which can prevent DOS attacks (not perfectly).
read this page:
http://www.iis.net/download/DynamicIPRestrictions
i think that firstvisit should be changed somewhere, am i right?
I am sorry, cache will not prevent any DDOS attack.
Actually, if you have pissed off a large botnet owner, there is nothing you can do, other than pay a large amount of cash to host your website in a cloud-based hosting environment.
Let's say I am not using WebClient (which most hackers obviously do not use) and I am not using any caching mechanism, just sending HTTP GET packets to your Search.aspx?string=*, thus forcing your web application to run a 'SELECT * FROM info WHERE topic LIKE %'whatever'%'.
Your DB + application server will crash instantly.
This is what most DDoS attacks look like.
1. You can deny a certain amount of requests per IP/per TimeSpan etc.
2. Caching will not help you.
Max Brin,
What type of caching are you talking about? He's using the ASP.NET Cache object, not the browser cache.
@Max are you implying a SQL injection vulnerability as well?
Because, if so, the topic of DoS attacks becomes moot – you have far bigger problems.
I am not even going to comment on Max’s confusion with respect to SQL and Cache… The points he makes are not even relevant.
…
To say that no hardware can prevent a DOS attack is a gross overstatement. All hardware needs software to run, unless it's an analog circuit board. All firewalls have software in them. The reason people say that hardware firewalls can't prevent DoS attacks is that prevention of a DOS attack is specific to your application. There is nothing to stop someone from building a firewall which is programmable and configurable... so WHY NOT DO IT YOURSELF?
One way to do it is to add another dedicated server which will act as your IP filter. Let that server do all the IP caching and filtering, sending you only valid requests.
Another thing nobody mentioned is adding a sense of rhythm to the filter. Most scrapers, at least the not-so-smart ones, will hit your site at equal intervals. So if you measure the time difference between each hit from a particular IP, even if it's a second or more, you can do something about that too! Granted, there is nothing to prevent them from randomizing the intervals, which is what I did back in my web-scraping days.
Rhythm may or may not be the means to detecting a distributed DOS attack as well. If you can figure out how to detect a pattern across different IPs, you can block them all. For instance, if someone sets up 100 robots, each hitting you at one-second intervals but starting at T+0, T+0.01, T+0.02, ... T+0.99, they have effectively set up a network of bots that each hit you once a second but collectively hit you 100 times per second! So rhythm is important. (A rough sketch of the per-IP rhythm idea follows after this comment.)
I have experience in signal processing back from the days when I was writing trading black boxes, where I was looking for similar price movements across 3000 stocks, and I think I have some code that can be customized to do just this.
I am launching a new website soon, and the topic of DOS is one of the things that keeps me up at night now.
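A rough, purely illustrative sketch of that per-IP rhythm idea (the class name, sample size and tolerance are made up; it is not from the article):

using System;
using System.Collections.Generic;

// Hypothetical rhythm detector: remembers the last few request timestamps per IP
// and flags clients whose hits arrive at suspiciously regular intervals.
public static class RhythmDetector
{
    private const int SAMPLE_SIZE = 10;          // hits to remember per IP
    private const double TOLERANCE_MS = 50;      // max spread between gaps to call it "regular"

    private static readonly Dictionary<string, Queue<DateTime>> _hits =
        new Dictionary<string, Queue<DateTime>>();

    public static bool LooksLikeARobot(string ip)
    {
        lock( _hits )
        {
            Queue<DateTime> times;
            if( !_hits.TryGetValue(ip, out times) )
                _hits[ip] = times = new Queue<DateTime>();

            times.Enqueue(DateTime.Now);
            if( times.Count > SAMPLE_SIZE ) times.Dequeue();
            if( times.Count < 5 ) return false;   // not enough data yet

            // Measure the spread of the gaps between consecutive hits; a tiny
            // spread means machine-like regularity.
            DateTime[] t = times.ToArray();
            double min = double.MaxValue, max = double.MinValue;
            for( int i = 1; i < t.Length; i++ )
            {
                double gap = (t[i] - t[i - 1]).TotalMilliseconds;
                if( gap < min ) min = gap;
                if( gap > max ) max = gap;
            }
            return (max - min) < TOLERANCE_MS;
        }
    }
}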
Excellent info!
Hi rick, I have a lot of experience preventing and attacking with DDOS.
In the old days I wrote a botnet which was then published and downloaded by many hackers/script kiddies.
DDoS is not preventable, period.
I will explain.
The purpose of DDoS is to take down a service.
When you have a large net of bots, there is no way anybody can prevent DDoS.
When approaching a service, whether via HTTP or any other protocol, you are making a request, thus forcing the server to answer you by running its logic.
This logic can be cached (ultimate for speed), can be protected with smart anti-DOS algorithms and many other IP-filtering techniques.
When attacking with a large DOS network you are simulating a normal user's visit, but with a great number of users, thus forcing the server to serve those users.
There is no signature (we can randomize the user agent, timing, etc.), the IP is different for each bot, and there is no algorithm that can stop this, unless your algorithm stops the service itself to prevent the full DOS of the machine it's running on.
Imagine your server gets 250,000 requests from completely different PCs (IP, user agent, country, etc.).
How would you deny exactly those requests and keep providing service to your regular, legitimate clients?
I promise you, there is no way.
Eventually, DDoS is amount vs. amount.
If you have a 50-server cluster, it would be much easier for you to handle 250,000 requests per second.
If we attack with 600,000 bots, you will probably die.
“Web services are the most attractive target for hackers.” This is very true. I had this experience when my site was hacked, and I really felt depressed when that happened. I did not know what to do. But with the help of my friends who are experts at this stuff, my site was recovered.
Hey guys,
Omar's solution is a very good one, and one thing that I find very interesting is that Microsoft took this exact approach. There is now an IIS plugin that does exactly that. It is called Dynamic IP Restrictions and can be found here: http://www.iis.net/download/DynamicIPRestrictions
Regards to you all, and hats off to Omar for coming up with this so long before Microsoft did 🙂
Serban
Hi Omar
Why don’t you use Global.asax?
Nicely done, sir.