Denial of service

When you consider all the attacks a successful web site is likely to come under, the most feared is denial of service: a hacker, or potentially a group of hackers, working to stop your web site from operating as intended. This might involve taking your database server offline, making your whole web site run slowly, or, worse, taking your web site offline entirely.

There are three methods of denial of service (DoS), of which the first two are essentially the same:

  1. A malicious user with a fast Internet connection bombards your web server with requests, thereby overloading it

  2. A malicious user with accomplices, who may be unwitting, bombards your web server with requests, thereby overloading it. In this situation the attackers do not need fast Internet connections - 100 requests each from 10,000 people are as damaging as 1,000,000 requests from one person.

  3. A malicious user finds a hole in your web site that forces your server to perform an inordinate amount of work, thereby overloading the server.

Of the three, the first two are essentially impossible to defend against - the world's largest sites have been taken offline by this form of denial of service, and there is nothing you can do about it, irrespective of whether you are using PHP.

The last option, however, is something you can guard against. If you have holes in your code that can be exploited by outsiders to cause your web server to chew up 99% of your CPU time, you are in trouble.

A popular mistake is to write code that results in URLs like this:

www.example.com/article.php?file=aboutus.php
www.example.com/article.php?file=products.php
www.example.com/article.php?file=legal.php

The code for article.php reads in the $_GET['file'] variable, then include()s the necessary file into the script. This might make sense at first, but consider what happens if a clever user modifies the URL to this:

www.example.com/article.php?file=article.php

What will happen is that article.php will load, then include() article.php, which will load, then include() article.php, which will load, then... and so on. This will continue until your server hits the maximum execution time for a script and terminates it. During that time, however, your web server will be performing a large amount of unnecessary work, and will be slower for other clients connecting to it.
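The self-include loop described above can be reproduced without a web server. A minimal sketch, in which the temporary directory and the recursion cap are illustrative (a counter stands in for PHP's max_execution_time so the demo terminates cleanly instead of running until the script is killed):

```php
<?php
// A runnable sketch of the self-include loop described above.
// The recursion cap is illustrative; it stands in for PHP's
// max_execution_time so the demo stops on its own.

$dir = sys_get_temp_dir() . '/include_demo_' . getmypid();
mkdir($dir);

// This is the vulnerable article.php: it include()s whatever
// $_GET['file'] names, with no checks at all.
file_put_contents($dir . '/article.php', '<?php
    $GLOBALS["depth"]++;
    if ($GLOBALS["depth"] < 50) {   // stand-in for the timeout
        include($GLOBALS["dir"] . "/" . $_GET["file"]);
    }
');

$GLOBALS['depth'] = 0;
$GLOBALS['dir']   = $dir;
$_GET['file']     = 'article.php';   // the malicious request

include($dir . '/article.php');
echo "article.php included itself " . $GLOBALS['depth'] . " times\n";

unlink($dir . '/article.php');
rmdir($dir);
```

On a real server each of those fifty inclusions would be genuine parsing and execution work; multiply that by thousands of simultaneous requests and the cost becomes clear.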

Now consider what would happen if that same malicious user loaded that URL three times in quick succession - or thirty. From there, consider what would happen if that user loaded the URL three thousand times - nothing difficult, since that can be managed even over a slow connection. At three thousand almost-simultaneous connections, even a normal web server would have trouble coping. But if each of those three thousand requests resulted in a CPU-consuming infinite include() loop, the server would simply stop responding to new requests and may well crash.

Author's Note: if you think three thousand pages is too much for just one client to handle, think again - HTTP has a special request method called "HEAD" that makes the server process the page fully but return only the page's header information, which is usually only around 100 bytes. Three thousand times 100 bytes is three hundred thousand bytes, or roughly 300 kilobytes - well within the reach of even lowly modem users.

The moral of the story is that you should always keep in mind the possibility that malicious users will use your own code against you. The most obvious solution to the problem detailed here is not to include files based upon a variable, but if that is not possible then at least consider using include_once() to stop the recursion.
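One way to act on that advice is to keep the raw variable away from include() altogether: look the requested name up in a fixed whitelist and refuse everything else. A sketch under stated assumptions (the function name and page names are illustrative, not from this article):

```php
<?php
// A sketch of the whitelist approach: the public page name is looked
// up in a fixed array, so the raw $_GET value never reaches include().
// Function and file names here are illustrative.

function resolve_article($requested)
{
    // Only these files may ever be included. Note that article.php
    // itself is deliberately absent, so the self-include loop
    // described earlier cannot start.
    $pages = array(
        'aboutus'  => 'aboutus.php',
        'products' => 'products.php',
        'legal'    => 'legal.php',
    );
    return isset($pages[$requested]) ? $pages[$requested] : null;
}

// article.php would then do something like:
//   $file = resolve_article($_GET['file']);
//   if ($file !== null) {
//       include_once($file);  // include_once() as a second line of defence
//   } else {
//       header('HTTP/1.0 404 Not Found');
//   }
```

This also stops related tricks, such as passing a path like ../../etc/passwd, because anything not named in the array is simply rejected.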

 

