5 Critical Web Developer SEO Mistakes

Let me start out by saying – most web developers don’t make these mistakes, and are very good at what they do. The vast majority of developers I’ve worked with over the years at theMediaFlow have listened, and have asked good, probing, difficult-to-answer questions about site migrations and best practice in very specific situations.

But every now and again I encounter a really simple, critical mistake which has more recently been OKed by the development company’s “in-house SEO Specialist”. I get that companies offering web development want to offer a broad range of services to their clients, and I’m absolutely certain that a lot of them hire people who know what they are doing.

Sometimes, however, working as a silo of one within what can be a large organisation, they don’t have a team of people to bounce ideas off, to test theories with, or to experiment alongside and see what happens in circumstances that are a little unusual.

I have occasionally met with initial resistance from a handful of developers who think I’m about to recommend some dark-arts magic on a website: filling the site with spam content and firing up automated tools to gain thousands of links to the client’s website. Perhaps these fears are well founded and that is what they’ve encountered in the past. But once they see the sorts of changes I’m actually going to recommend, and realise a lot of it is “best practice” they might have initially missed, they are usually much more willing to engage with the process.

Mistake 1: Domain/New Website Migrations Part 1 – No Content Audit

I get the mind-set of how this happens. You engage a new web developer/designer to build a lovely new website. You’re really excited about creating a new vision and direction for the company. Everything is full steam with the “new”. You develop a new brand for the company and get them a coveted “brandable” domain to go with it.

So you plough on, designing the flashy new pages that really sing to the audience. You’ve worked out who your customer groups are and created a design and interface that speaks to them. You cut down the content pages on the website to reduce the clutter from the navigation – after all, they’re not really needed.

But did you check? Did you audit the content to determine what pages were attracting visitors and interest?

Did you look at which pages were driving conversions?

I’ve been brought into these sorts of situations after the event and had to review what used to work on what was a large, content-rich website but has now been cut down to the bare bones. The conversion rate increases you might see from better calls to action and a clearer user journey through the checkout process won’t feel like much of a gain if you lose half of your search traffic over the course of a few weeks.

It’s much easier to identify at an early stage which elements of the attraction content are working and incorporate those into a new website than it is to “claw them back again” later.
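
As an illustration of how simple that check can be, here’s a rough sketch in Python, assuming you’ve exported landing pages with their sessions and conversions from analytics as a CSV and have a plain list of the URLs planned for the new site – the file names and column names are made-up examples, not a definitive process:

```python
import csv

# URLs that will exist on the new website (hypothetical export)
with open("new_site_urls.txt") as f:
    planned = {line.strip() for line in f if line.strip()}

# Old-site landing pages that have no home in the new structure
dropped = []
with open("landing_pages.csv", newline="") as f:
    for row in csv.DictReader(f):  # expects columns: url, sessions, conversions
        if row["url"] not in planned:
            dropped.append((int(row["sessions"]), int(row["conversions"]), row["url"]))

# The pages earning traffic or conversions that are about to disappear
for sessions, conversions, url in sorted(dropped, reverse=True):
    print(f"{sessions:>8} sessions  {conversions:>5} conversions  {url}")
```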

Mistake 2: Domain/New Website Migrations Part 2 – No Redirection Plan

If I had a pound for every encounter I’ve had with this over the years, I’d be at least slightly richer…

From my SEO mind-set, I struggle to see how this ever gets completely ignored. It’s a poor user experience and it’s clearly not good for your search visibility, yet it happens all the time.

Sometimes it’s not entirely neglected, but just isn’t implemented correctly. There might be one or more 301 redirect chains, and I still quite often see sites mistakenly using temporary 302 redirects rather than permanent 301s. Redirects might only have been mapped for the most popular pages, while other older URLs just 404, or even redirect to a 404.
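
To illustrate, here’s a minimal sketch of what a planned redirect map could look like as nginx configuration – the old and new paths are made-up examples, and the point is that each old URL reaches its final destination with a single, permanent 301 rather than via a chain or a temporary 302:

```nginx
server {
    server_name www.example.com;   # hypothetical new site

    # Hypothetical one-to-one mapping of old URLs to their new equivalents
    location = /old-services.html { return 301 /services/; }
    location = /about-us.php      { return 301 /about/; }

    # Anything genuinely retired should return its own 404/410 deliberately,
    # rather than redirecting visitors to a page that itself 404s.
}
```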

Either way – this is something that needs to be planned in advance. It’s so much easier than trying to unravel it after the event and working out what should have been done.

Mistake 3: Having a Version of a Client Website Accessible on Their Own Domain

I’ve seen this more times than I would like over the last year, from large and small developers alike.

Sometimes you need a development version of a website available to show changes before they are applied to the main website. Sometimes I’ve come across situations where a developer has simply said “the site has to be available here”, even when it’s a live mirror of the version on the client’s server.

If you have to have a version of a client website available on the web, then please stop search engines from being able to crawl and index it. I’d recommend employing any or all of the following methods:

1. Authorisation – make people sign in to view the content, rather than it being visible to any user who can find the address.
2. IP restriction – only let the IP addresses that genuinely need to view this content see it.
3. X-Robots-Tag – set this HTTP header at server level to apply noindex across the whole domain (a rough sketch of all three methods follows below).
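
Here’s a rough sketch of how all three could be combined in an nginx configuration for a hypothetical staging domain – the hostname, IP address and password file path are placeholders rather than a definitive setup:

```nginx
server {
    server_name dev.example.com;             # hypothetical staging hostname

    # 3. X-Robots-Tag: tell search engines not to index anything served here
    add_header X-Robots-Tag "noindex, nofollow" always;

    # 2. IP restriction: only named addresses can view the staging site
    allow 203.0.113.10;                       # placeholder office/VPN address
    deny  all;

    # 1. Authorisation: require a username and password on top of the IP rule
    auth_basic           "Staging - authorised users only";
    auth_basic_user_file /etc/nginx/.htpasswd;
}
```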

I wouldn’t recommend robots.txt as the initial method of keeping this out of the index, but it can be used if the damage has already been done and a site of this type has been indexed. You’d upload a “disallow all” instruction, verify the site in Webmaster Tools and request removal of the whole domain. One final word of warning with this – if the development site is an exact duplicate of the live website, be incredibly careful not to let this file end up on the client website.
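
For reference, the “disallow all” instruction is just a two-line robots.txt served from the development domain – and, as above, this exact file must never find its way onto the live client site:

```
User-agent: *
Disallow: /
```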

Mistake 4: Lack of Crawl Error Awareness

I’m surprised by how often this one crops up.

It amazes me how often links are hard coded into sites that point to development areas, generating an error.

How small syntax mistakes can generate a 404’ing URL for every live page of your website.

How pages can be changed without updating internal links on the website at the same time.

Perhaps we as SEOs are at an advantage here. Using crawling tools like Screaming Frog, Xenu or Deep Crawl, we can mimic how a search engine spider might travel through a website and identify these issues quickly. I don’t know how widely used tools like this are in the web development sphere, but I don’t know many developers who would think to do this (I’d love to hear if I’m wrong on this one – I’m sure there are exceptions to all of these rules!).
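
For anyone without access to those tools, even a small script can surface the same problems. Here’s a minimal sketch of an internal broken-link checker in Python, assuming the requests and beautifulsoup4 packages are installed – the start URL is just a placeholder:

```python
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

def find_broken_links(start_url, max_pages=200):
    """Crawl internal links from start_url and report URLs that error."""
    domain = urlparse(start_url).netloc
    to_visit, seen, broken = [start_url], set(), {}

    while to_visit and len(seen) < max_pages:
        url = to_visit.pop()
        if url in seen:
            continue
        seen.add(url)
        try:
            resp = requests.get(url, timeout=10)
        except requests.RequestException as exc:
            broken[url] = str(exc)
            continue
        if resp.status_code >= 400:
            broken[url] = resp.status_code
            continue
        if "text/html" not in resp.headers.get("Content-Type", ""):
            continue  # only parse HTML pages for further links
        for a in BeautifulSoup(resp.text, "html.parser").find_all("a", href=True):
            link = urljoin(url, a["href"]).split("#")[0]
            if urlparse(link).netloc == domain:  # stay on the same site
                to_visit.append(link)
    return broken

if __name__ == "__main__":
    for url, status in find_broken_links("https://www.example.com/").items():
        print(status, url)
```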

Being aware that problems can occur in markup, that links can break and that link destinations can change is important, and it’s easier to keep on top of these issues as and when they happen than to wait until the site is falling down with errors and try to fix them all then.

Mistake 5: Duplicating Titles & Descriptions Throughout A Website By Default

I really do appreciate the reasons this area is often neglected. Writing titles and descriptions for new pages after having slaved over the content for an age can be boring. When it comes to my own blog posts, I must confess that I don’t always follow the best practices I preach to my customers (do as I say, not as I do).

I can understand applying a level of automation to titles and descriptions. Clients are busy people – they want to be busy doing the things that make them money, selling products or services, whatever those might be. Writing a hand-crafted title and description for every new product is a bit of a pain when you are loading hundreds or thousands a day.

I like the sites I work on to have a unique title on every page, and reducing duplication of these can really improve search traffic to the website. Often finding duplicate titles means finding duplicate content at the same time, so identifying these is doubly useful. Automation generally works better for titles, particularly on ecommerce sites, as each product has its own unique name and the category elements can easily be pulled in.

We often find Meta Descriptions are not handled as well automatically in these situations. What we’d often see is the home page Meta Description repeated across large swathes of a website, or a press or news section where every article uses the same description rather than anything specific to the content on that page.
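
To make the “sensible automation” point concrete, here’s a tiny sketch of templated titles and descriptions that stay unique per product – the field names and wording are made-up examples rather than recommended copy:

```python
def product_title(product, site_name="Example Store"):
    # Unique product name plus category keeps every title distinct
    return f"{product['name']} | {product['category']} | {site_name}"

def product_description(product):
    # Pull something page-specific in, rather than repeating the home page text
    summary = product["summary"][:120]
    return f"Buy the {product['name']} from our {product['category']} range. {summary}"

print(product_title({"name": "Blue Widget", "category": "Widgets", "summary": "A durable blue widget."}))
```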

I’ve done many an audit on a site where large groups of pages inherit the home page title. If there was one page whose title and description you shouldn’t replicate, it’s that one!

Ensuring unique titles and descriptions on all pages can really help every page of a website become a route into it. Obviously there is a lot more that can be done with messaging to entice the click and with proper keyword targeting, but uniqueness of this data is the first place I’d start if I wasn’t able to spend the time optimising each and every listing individually.

Conclusion

I would like to reiterate my opening statement – I know most web developers don’t make these mistakes, and those that do likely do so out of ignorance rather than malice. But these are potentially huge, avoidable mistakes that are much easier to prevent ahead of the event than to fix afterwards.

Throughout my career I’ve tried to help and advise developers on SEO concerns and to work closely with them – after all, these are the people who have to implement what you recommend a lot of the time, and it’s a lot easier to do this when all parties are on the same side. Explaining the reasons for changes can mean not having to make the same recommendations over and over again, and leaves you with time to get on with actual “online marketing” SEO work rather than just the “fix everything that is broken” kind.

I won’t lie – I quite enjoy unravelling a website’s “technical spaghetti” tangle and it’s one of the things I think I’m best at. But it would be much easier if we didn’t have to do this in the first place.
