Top SEO sins for small businesses to avoid


There are many articles on the web offering good advice about Search Engine Optimisation – making sure your website has every chance of ranking well in search engine results (mostly Google’s). Yet we still see small business websites that use poor or even ‘black hat’ techniques in an attempt to rank above their competitors. So, whether you’re simply unaware of the pitfalls or have been badly advised, here are some SEO sins you should avoid.

Duplicate titles and descriptions – Each page of your website has what are called ‘metatags’ in its source code, which describe the content of the page. The three that matter most for SEO are title, description and keywords. The keywords metatag is no longer used by Google for ranking, probably because it was so heavily spammed. The other two are used, however, and appear as the page’s title and snippet in search engine results. It makes sense, therefore, to give every page its own title and description, so that search engines and users can tell your pages apart.
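As a rough sketch, the relevant part of a page’s source code might look something like this (the business name and wording are invented purely for illustration):

    <head>
      <title>Wedding Cakes in Bristol | Jane's Bakery</title>
      <meta name="description" content="Hand-made wedding cakes, baked to order and delivered across Bristol. Book a free consultation.">
    </head>

Another page on the same site – the contact page, say – would carry its own distinct title and description.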

Overuse of images and/or Flash – Google’s crawlers can’t read images or Flash content very well, so if your website is built mostly from either, Google won’t be able to work out what your site is about. Ensure your site also includes text in the form of h1 headings and plenty of main body copy, and that every image has an alt attribute describing what it shows – this helps both search engines and visitors using screen readers.
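For example, a crawlable page might combine a heading, body text and a described image along these lines (the file name and wording here are again just an illustration):

    <h1>Hand-made Wedding Cakes</h1>
    <p>We design, bake and decorate bespoke wedding cakes to order.</p>
    <img src="three-tier-cake.jpg" alt="Three-tier white wedding cake decorated with sugar roses">

The alt text is what Google (and a screen reader) falls back on when it can’t see the image itself.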

Keyword stuffing – This is the practice of cramming your keywords into your website content as many times as possible. It’s an old ‘black hat’ technique that the search engines sussed long ago, and it’s likely to get your website penalised. It’s also fairly obvious to your visitors and makes you look untrustworthy. Do your keyword research, of course, to find out which words and phrases your visitors use to find you – but make sure those words appear in a logical and readable fashion.

Cloaking – This is the dodgy technique of showing different content or URLs to search engines and to human visitors. In short, what the search engine thinks it sees is not what the user is actually shown. It’s regarded as a deceptive practice and can get your site removed from the search engine index.

Hidden text – This is the practice of ‘hiding’ text on your website so that it’s visible to search engines but not to the human eye. Examples include text set in the same colour as the background (eg white text on a white background) or text tucked behind an image. It’s another own goal that could get you removed from the search engine index altogether.
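To show what we mean – strictly as an example of what not to do – hidden text often looks something like this in a page’s source code:

    <!-- Text styled white-on-white: invisible to visitors, visible to crawlers -->
    <p style="color: #ffffff; background-color: #ffffff;">
      wedding cakes cheap wedding cakes best wedding cakes
    </p>

If you spot anything like this in your own site’s code (perhaps left there by a previous developer), have it removed.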

Duplicate and/or scraped content – Duplicating text within your own website, or copying it from someone else’s, is a good way to get downgraded. Be careful when setting up social media profiles, too – if the text is copied word for word from your website, that can count as duplicate content as well and won’t help your search results.

Overused links – It’s good practice to link to other relevant sites, and to have internal links that help users navigate easily through your information. In the early days of search, Google used the volume of links in and around a site to decide rankings – so, inevitably, the use of links went through the roof as people tried to beat the system. However, last year Google clamped down on excessive linking (with its Penguin update) and downgraded many sites. Make sure the links you have are genuinely useful to real visitors, and check regularly for broken links (ie links that lead only to a 404 error page), which frustrate visitors and do your rankings no favours.

Not including your address on your website – This one is included because in the UK it’s a legal requirement for your business address to appear on your website, so it’s a sin from that point of view (see Keeping Your Website Legal). It’s also a missed opportunity: if you want to attract local customers, leaving your address out will hurt your local search rankings.

Robots.txt file applied to your whole website – There may be some pages on your website that you don’t want Google to index and potentially show in search engine results. To stop Google’s crawler from reading those pages, you can use a robots.txt file to tell it which ones to ignore. But if you don’t really know what you’re doing, you can accidentally apply this to more pages than you intended – or even to your whole site! If your site seems to have disappeared from search engine results, get your webmaster to check your robots.txt file.
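As a rough guide, the difference between blocking one folder and blocking everything can come down to a single character (the folder name here is just an example):

    # Blocks crawlers from the /private/ folder only
    User-agent: *
    Disallow: /private/

    # Blocks crawlers from the ENTIRE site
    User-agent: *
    Disallow: /

The file lives at the root of your site (eg www.yoursite.co.uk/robots.txt), so it’s quick for your webmaster to check.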

A Digital Footprint Audit can identify where you may be falling foul of these guidelines and set out a strategy to improve your online visibility. Contact us now for a no-obligation chat.