
Preventing Directory Traversal attacks in PHP

Directory traversal attacks occur when your program reads or writes a file whose name is based on input that can be maliciously tampered with. Combined with log poisoning, this can lead to an attacker gaining remote shell access to your server.

At its simplest, it could be including a file like this:

echo file_get_contents($_GET['sidebar']);

The intention is that you call the URL with a parameter indicating which sidebar content to load, like this: http://foo.bar/myfile.php?sidebar=adverts.html

This is really terrible practice and no experienced developer would do it, because the parameter is entirely under the visitor's control.
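For example, nothing stops an attacker from sending a request like this (an illustrative URL) and reading arbitrary files from your server:  http://foo.bar/myfile.php?sidebar=../../../../etc/passwd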

Another common place where directory traversal attacks can occur is in displaying content based on a database call: if the filename stored in the database originally came from user input, it can still have been tampered with.

If you are reading from or writing to a file based on some input (like GET, POST, COOKIE, etc.) then make sure that you strip out path information.  The PHP function basename() will remove the directory portion and leave you with only a filename.
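A minimal sketch of what this looks like (the content/ directory is just a placeholder for wherever you keep these files):

// basename() strips any directory components from the supplied value,
// so "../../etc/passwd" becomes just "passwd"
$file = basename($_GET['sidebar']);
echo file_get_contents('content/' . $file);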

This is still not foolproof, however, as an attacker would still be able to read files in the same directory.

A safer way to do it is to whitelist the files that are allowed to be included.  Whitelisting is safer than blacklisting: instead of trying to exclude every malicious combination, we allow only a known set of safe options.

Consider the following code as an alternative to the above:

$page = $_GET['page'];
// Only these pages may ever be included
$allowedPages = array('adverts', 'contacts', 'information');
// Strict comparison, with basename() kept as an extra layer of defence
if ( in_array($page, $allowedPages, true) )
{
    echo file_get_contents(basename($page . '.html'));
}
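A slight variation on the same idea (a sketch, not the only way to do it) is to map the allowed keys to filenames, so the request value is never used to build a path at all:

// Map of allowed page keys to the files they load
$pages = array(
    'adverts'     => 'adverts.html',
    'contacts'    => 'contacts.html',
    'information' => 'information.html',
);
$page = $_GET['page'];
if (isset($pages[$page])) {
    echo file_get_contents($pages[$page]);
}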

You should consider configuring PHP to disallow opening remote URLs with the file stream wrappers by setting allow_url_fopen to Off in your php.ini file.  This does mean that you can't use any function that relies on the file stream (like file_get_contents) to read a URL (you'll need to use curl instead), but it does prevent an attacker from tricking those functions into pulling remote code into your site.
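If you do need to fetch a remote URL with allow_url_fopen disabled, a minimal curl sketch looks something like this (the URL is just a placeholder):

$ch = curl_init('https://api.example.com/data');   // placeholder URL
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);    // return the body instead of printing it
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, false);   // don't follow redirects blindly
$response = curl_exec($ch);
curl_close($ch);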

At the system configuration level it's ideal to have each site running in a chroot jail.  By confining the user your webserver runs as to a specific directory, you can limit the impact of a traversal attack.
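If you're running PHP under PHP-FPM, for example, each pool can be jailed with its chroot directive; a minimal sketch of the pool configuration (the pool name and path are just placeholders):

[mysite]
chroot = /var/www/mysite
; when chrooted, chdir is relative to the chroot
chdir = /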

So in summary:

  1. Use basename() on any variable you use to include a file
  2. Set the allow_url_fopen setting to Off in php.ini
  3. Whitelist the files that are allowed to be included

