So you think you’re in control of your website?

I’ve recently been approached about a project to work on, part-time and on the side.  That in itself isn’t unusual, but it’s one of a very small minority that I think is a good idea and has some promise.  I’ll not go into detail, as I don’t yet know what role I’d play or to what extent I’d be involved, but I had a look into it over the weekend and discovered something which probably affects many small businesses early on, and something which could be disastrous for them.  Their SEO sucks.

It’s not that great a shock when you discover they are a small business with limited technical knowledge, but even a quick SEO scan of their site highlighted some serious errors.  The main one was load time: 30+ seconds, thanks to around 20 JavaScript files, about the same number of CSS files, and loads of images (relevant images, though).  This wouldn’t be as bad if some of those files were served from external content delivery networks (CDNs), but they were all loaded from the same domain.

This got me thinking about other projects I’ve worked on where a general CMS has been involved and a semi-generic theme has been used.  They have all tended to include a lot of files in the header, without spreading them out in any way.  If these are then used without any optimisation, the experience for the end user, as well as the overall search engine ranking, is going to be awful.  For businesses which operate solely online, this is a disaster.

I use an online tool, SeoSiteCheckup, to get an overview of a site, and here’s a quick run-down using my own site as an example, which scores 75%:

I’m missing a meta description in the header (this is a bug at the moment: there’s a tag with the information, but it’s labelled wrongly).  This is an important piece of information, as it gives search engines an idea of what the site is about, and it tends to be the snippet shown beneath your page title in search results.
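The fix is a one-line tag in the page’s `<head>`; the wording below is just a placeholder, not my actual description:

```html
<head>
  <!-- name="description" is the attribute search engines look for;
       the content text is a placeholder example -->
  <meta name="description" content="Portfolio and photo galleries — a short summary of what the site is about.">
</head>
```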

I don’t have a sitemap.  Sure, the navigation is straightforward, but that’s no use to a search engine, which simply wants a list of the pages on the site.  A sitemap tells it which pages to crawl and index so people can find what they are looking for.
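A sitemap is just an XML file listing your URLs.  A minimal one (the URLs and date here are placeholders) looks like this, and you can then point search engines at it with a `Sitemap:` line in robots.txt:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2016-01-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/galleries</loc>
  </url>
</urlset>
```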

I’m making more than 20 HTTP requests.  It’s only 22 at the moment, and could go up as high as 28 when I add more galleries (as there’s a limit of 9 to display on the home page).  This means the browser has to make 22 trips to my server to get the information it needs to build the page.  Since browsers will typically only open around six concurrent connections to any one host, the remaining requests sit in a queue, which is going to slow load times down.  Going back to the point earlier about using a CDN (or similar), these requests could be shared out across different hosts in order to speed up the loading of the site.  As my site uses Twitter Bootstrap for helping the layout, that’s instantly 3 requests I could offload to somewhere else; and I could also move jQuery to a CDN to reduce the load on my server further.  Another option to fix this issue would be to create a subdomain (or two) for static content (such as Bootstrap, jQuery and some other libraries I use).  That keeps the files on my server, where I have control, but because it’s a different request hostname the browser will fetch them in parallel with requests to the main domain.
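Offloading those libraries is just a matter of swapping the local paths in the header for CDN URLs.  A sketch of the idea (the local paths and library versions here are assumptions, not my actual setup):

```html
<!-- Before: everything served from my own domain -->
<!-- <script src="/js/jquery.min.js"></script> -->
<!-- <script src="/js/bootstrap.min.js"></script> -->

<!-- After: jQuery and Bootstrap fetched from public CDNs,
     freeing up my server's connection limit for my own assets -->
<script src="https://code.jquery.com/jquery-1.12.4.min.js"></script>
<link rel="stylesheet" href="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.7/css/bootstrap.min.css">
<script src="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.7/js/bootstrap.min.js"></script>
```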

My site (home page) loading time is clocking in just shy of 4 seconds.  This is slower than average, and really slow in absolute terms.  I know this is down to the main background image, so I will need to reduce the size of that.  It won’t make for a great viewing experience on 4K monitors (it’s currently a 4K image), but I have to weigh that up against usability for everyone else.
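Shrinking the image is a one-liner with ImageMagick, assuming it’s installed; the filenames here are placeholders:

```shell
# Resize a 4K (3840px-wide) background down to 1920px wide and recompress.
# -resize 1920x preserves the aspect ratio; -quality 80 trades a little
# fidelity for a much smaller file. Filenames are placeholders.
convert background-4k.jpg -resize 1920x -quality 80 background-1920.jpg
```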

This site also has an issue where the www. and non-www. versions serve the same content, but they don’t look the same to search engines, so they can be treated as duplicates.  Similarly, the IP address of the server doesn’t redirect to the domain name, so that could be seen as duplicate content too.
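On an Apache server (an assumption; `example.com` is a placeholder), a couple of mod_rewrite lines in `.htaccess` fix both problems at once, 301-redirecting the non-www host and any request made by bare IP to the canonical domain:

```apache
RewriteEngine On
# Anything that isn't the canonical host (the non-www domain, or the
# raw server IP) gets a permanent (301) redirect to www.example.com
RewriteCond %{HTTP_HOST} !^www\.example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```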

These are all simple things to do, so I’ll address them and come back with the change in score to see how much difference a few small changes can make.

All of the issues I’ve pointed out above are simple to fix, but only if you know what you’re doing.  The sad fact is, small businesses are not likely to know what they are doing, and therefore struggle with these aspects of SEO.  Worse still, they could be using a generic template bought from the internet for a general CMS (nothing wrong with using a general CMS), but those templates may have split everything out into lots of little files, which creates a huge overhead.  Add that to the various plug-ins/widgets/modules used by a CMS and it’s a recipe for a slow site with a poor user experience.  Not what you want at any stage in business, least of all when you’re trying to make a name for yourself and sustain yourself.

With that in mind, have a think about just how much control you really have over your site, and then probably think again.  If need be, hire somebody who knows what they are doing, and get them to fix the simple issues for you.