26 August 2015

Setting up a new user in Ubuntu from scratch

Adding new users to Ubuntu is easy because of the convenience tools that exist.

Start with the command

sudo useradd -d /home/testuser -m testuser

This creates the user and sets up a default home directory.  The user doesn't have a password yet, but you can set one with sudo passwd testuser if you want.

Then create a directory .ssh in their home directory.  Create a file called authorized_keys inside it and copy the contents of the user's public key into that file.

Chown the .ssh directory (and the file) to the user, then chmod the directory to 700 and the file to 600.
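The steps above can be sketched like this.  It runs against a scratch directory with a stand-in key so it's safe to try anywhere; on the real server you'd use /home/testuser and the user's actual public key.

```shell
# Scratch directory standing in for the user's home; swap in /home/testuser
# and the user's real public key on an actual server.
SCRATCH="$(mktemp -d)"
echo "ssh-rsa AAAAB3Nza...example testuser@laptop" > "$SCRATCH/dummy.pub"

mkdir -p "$SCRATCH/.ssh"
cat "$SCRATCH/dummy.pub" >> "$SCRATCH/.ssh/authorized_keys"

# sshd is fussy about permissions: 700 on .ssh, 600 on authorized_keys
chmod 700 "$SCRATCH/.ssh"
chmod 600 "$SCRATCH/.ssh/authorized_keys"

# On the real server, also make the user own the lot:
# chown -R testuser:testuser /home/testuser/.ssh
```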

Make sure that /etc/ssh/sshd_config is set up to deny logging in by password.
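The relevant lines in /etc/ssh/sshd_config look something like this:

```
PubkeyAuthentication yes
PasswordAuthentication no
ChallengeResponseAuthentication no
```

On Ubuntu, run sudo service ssh restart afterwards to apply the change.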

The user should now be able to log in with their key by setting up .ssh/config on their home machine like this:
Host foo
    HostName server.ip.address
    User testuser
    IdentityFile ~/.ssh/id_rsa

After that, ssh foo will connect them to the server as testuser.


18 August 2015

Fixing missing msvcp110.dll in xampp on Windows

I need to use a Windows server to deploy a program I'm busy writing.

I kept getting a problem with Apache starting up that prevented it from loading the MS SQL Server PDO drivers.

The message was:
"The program can't start because msvcp110.dll is missing from your computer. Try reinstalling the program to fix the problem."

This usually relates to the Visual C++ Redistributable for Visual Studio 2012 package not being installed.

I had previously installed this (it's available on the Microsoft site) but was still getting the error.

Eventually I stumbled on a topic on the Apache Friends forum which advised copying the msvcp110.dll file to both
the \xampp\apache\bin and the \xampp\php directories.

Apparently Apache wasn't able to find the file on the OS path.  In my case it was already in the php directory but not alongside the other Apache binaries.

After copying it there Apache restarted without errors and the PDO driver for MS-SQL was enabled.

11 August 2015

Fixing broken thumbnails in Drupal


If your autoscaled images are not loading on Drupal, here are some steps to troubleshoot the problem.

Firstly, if you have logging enabled then visit http://yoursite.com/admin/reports/dblog to see a list of the events.

If you see events with the message "Unable to generate the derived image located at...." then check the permissions on your files directory.  This is usually /sites/default/files.
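A minimal sketch of the permissions fix, assuming www-data is your web server's user (it is on a stock Ubuntu install).  It's demonstrated against a scratch directory standing in for sites/default/files so it's safe to run anywhere:

```shell
# Scratch directory standing in for /var/www/yoursite/sites/default/files
FILES="$(mktemp -d)/files"
mkdir -p "$FILES/styles"

# The web server must be able to write derived images under files/styles
chmod -R 775 "$FILES"

# On the real server, also hand ownership to the web server user:
# sudo chown -R www-data:www-data /var/www/yoursite/sites/default/files
```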

Then check that the GD library is installed for PHP.  On Ubuntu you can install it with sudo apt-get install php5-gd

If you don't see any events, try opening the image URL in a new browser tab to confirm that your web server is rewriting the request to index.php instead of trying to serve a file straight out of the directory.

On Nginx you should consider using location blocks like this:

   # These next locations help with thumbnails - https://www.drupal.org/node/2374961  
   location @rewrite {  
     rewrite ^/(.*)$ /index.php?q=$1 last;  
   }  
   location ~ ^/sites/.*/files/styles/ {  
     try_files $uri @rewrite;  
   }  

This makes sure that requests to locations in the styles directory are routed to index.php if the requested file does not exist in the filesystem.

13 July 2015

Setting up Nginx as a reverse proxy for Apache with SSL termination

Reverse Proxy diagram from Wiki Commons
We're currently hosting client sites on a Rackspace server and using their Load Balancer feature to terminate SSL so that we don't need multi-site certificates.

We only attach one node to the Load Balancer, so we're paying for more than we're using.  My proof of concept is to use Nginx to terminate SSL and proxy to the Apache server.  This will save us £225 per load balancer, and since we're using ten of them that's quite a significant saving.

My first step was to spin up a free tier EC2 instance running Ubuntu 14.04 LTS.  I guess you can replace this with your favourite cloud or on-the-metal server.

Then I installed my packages. These are the ones I remember so YMMV.

 sudo apt-get install nginx apache2 fail2ban php5-fpm mcrypt php5-mcrypt openssl php5-cli php5 libapache2-mod-php  

My network diagram is slightly different from the picture for this post in that the web server is hosted on the same machine as the proxy.

I decided to run Apache on port 8000 and listen only to connections from localhost. Nginx would listen on port 80 and forward requests to Apache. I decided to let Nginx serve static content because it's pretty quick at doing so and this saves Apache from being overwhelmed by requests.

Configuring Apache

My first port of call was to edit /etc/apache2/ports.conf and make sure that my Listen line looks like this: Listen 127.0.0.1:8000

Then I created two virtual hosts to test with.  Here's a sample:

 <VirtualHost *:8000>  
      ServerName dummy1.mydomain.co.uk  
      ServerAdmin webmaster@localhost  
      DocumentRoot /var/www/dummy1/  
      ErrorLog ${APACHE_LOG_DIR}/dummy1_error.log  
      CustomLog ${APACHE_LOG_DIR}/dummy1_access.log combined  
 </VirtualHost>  

I made a simple index.php file in two new directories /var/www/dummy1 and /var/www/dummy3 which just output two server variables for me to test with. I also copied an image file into those directories so that I could check how static assets would be served.

 <?php  
 echo $_SERVER['SCRIPT_FILENAME'] . '<br>';  
 echo $_SERVER['SERVER_SOFTWARE'] . '<br>';  

Configuring Nginx

I decided to use self-signed certificates for testing and to reserve dummy2 for a trial run of a free SSL certificate.  There are quite a few certificate signers who will give you a 30-day trial certificate.

I created an /etc/nginx/ssl directory (I prefer not to clutter my conf.d directory) and made subdirectories for my sites under that.

I created self-signed certificates (commands are at the top of my host file) and set up the vhosts like this:
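The embedded vhost config didn't survive the import, so here's a minimal sketch of what one of them looked like.  The domain, certificate paths, and the openssl command are placeholders along the lines of what I used:

```nginx
# Generate a self-signed certificate for testing, e.g.:
# openssl req -x509 -nodes -days 365 -newkey rsa:2048 \
#   -keyout /etc/nginx/ssl/dummy1/server.key -out /etc/nginx/ssl/dummy1/server.crt

server {
    listen 80;
    listen 443 ssl;
    server_name dummy1.mydomain.co.uk;

    ssl_certificate     /etc/nginx/ssl/dummy1/server.crt;
    ssl_certificate_key /etc/nginx/ssl/dummy1/server.key;

    root /var/www/dummy1;

    # Let Nginx serve static assets itself
    location ~* \.(jpg|jpeg|png|gif|ico|css|js)$ {
        expires 30d;
    }

    # Hand everything else to Apache on the loopback interface
    location / {
        proxy_pass http://127.0.0.1:8000;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }
}
```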

Now when I hit a static file over HTTP or HTTPS, Nginx serves it up directly.  Inspecting the response headers with your favourite browser's debug tools confirms that images are served by Nginx.  Visiting the index file shows that the correct one is loading and that it's being handled by Apache.  Lastly, checking the certificate shows that each site is using its own.

That has the potential to save my company £2200, which is a happy thing to be able to do in your first week while your boss is watching :)

10 July 2015

Securing Jenkins with OAuth

Jenkins is pretty easy to secure with the help of some useful plugins.

The first plugin I suggest using is an OAuth provider.  Our repositories are hosted on Bitbucket so I'm using their OAuth, but there is also a GitHub OAuth plugin.  The instructions for setting up the plugin are very clear (see the plugin page).

When you're configuring Jenkins to use OAuth security, remember to leave the Authorization setting at "logged in users can do anything" for now.  We'll change this later, but we don't want to get locked out of Jenkins when we apply the security settings.

Now install the Role-based Authorization Strategy plugin (see the plugin page).

Add a new group called "Anonymous" and uncheck everything.

When a user logs in via OAuth they'll be shown a message by Jenkins saying that they don't have any permissions.  This means that not everybody with a Bitbucket account can access your site, so that's a good thing.

You just need to add them in the roles plugin settings.  Click Manage Jenkins, then Manage and Assign Roles.  Click Assign Roles and add the user, then tick the boxes for the roles you want to assign them.




Checking the SSL certificates for a list of domains

We have a number of domains that are secured by SSL and need to be able to automate checks for certificate validity and expiry.

Luckily there is a script that does exactly this.  On Ubuntu you can apt-get install ssl-cert-check, but there are copies of the script online in case your distro doesn't have it as a package.

Create a file with a list of the domains you want to check and the port to check on.  It should look something like this:
 yourdomain.com 443  
 www.anotherdomain.com 443  
 www.yetanotherclientdomain.com 443  

Let's assume you called your file domainlist.txt.

You can then run ssl-cert-check -f domainlist.txt to have the tool run through the list and print the status and expiry date of each domain to the console.

The options shown in the script's help page let you have it email you if a certificate is going to expire soon:

ssl-cert-check -a -f domainlist.txt -q -x 30 -e yourmail@foo.com

If you get a message about a missing mail binary, you'll see that the script (line 243) looks in a variety of locations for a binary called mail or mailx.  On Ubuntu an appropriate binary is contained in the heirloom-mailx package, so installing that will solve the problem.

27 May 2015

Allowing the whole wide world to read your S3 bucket

This is a bucket policy that you can use to allow the whole world to read the files in your S3 bucket.

You might want to do this if you're serving static web content from S3 and don't need the fine grained control that the Amazon documentation details.
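The policy embedded in the original post didn't come through, but the standard world-readable policy is well known; your-bucket-name below is a placeholder for your own bucket:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "PublicReadGetObject",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::your-bucket-name/*"
    }
  ]
}
```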


You will still need to set up permissions on the bucket but this policy will let people read the files you're storing on S3.