Net Effect: SEO 101
The science of web search is a constantly moving feast. In his new column Net Effect, Stewart Baines outlines the basics.
Ever wonder how to rise above the noise?
Finding something on the internet - something you don't know much about, rather than a 'who is my local plumber' type of search - requires some skill.
As a business technology writer, it's part of my day job: I have to know how to phrase my queries, which sites will reveal certain types of results, where on a page to scan for meaningful content, where nuggets of useful data are buried and what is mere hyperbole. I get there eventually.
Most web users don't have the time or patience to develop the skills of a professional researcher so their search experience is the antithesis: type something into Google, choose the top three results, one of which will probably be Wikipedia.
So how do you make your site end up in the top three, or at least on the first page of results, when users search for generic terms ('nuts and bolts') rather than specific ones ('Acme Inc')? (Note: you will nearly always come top in a search for your own name, unless you're really called Acme Inc.)
Search engine optimisation, or SEO - this sometimes arcane art - promises to raise your reputation among the billions upon billions of pages that Google scans and indexes every month. Many who practise SEO do so without really knowing whether it works. In fact, by being gung-ho with SEO, companies might actually damage their web marketing.
Let's start with a very basic introduction to how search engines work.
You submit your site to a search engine, which sends out spiders to crawl the parts of your site that you want searched (pages that you do not want indexed should be listed in your robots.txt file). The spiders look at the keywords, meta tags, links and headlines. These pages are cached on the search engine's web servers, and the engine runs proprietary algorithms over this vast database to prioritise the pages it thinks users want when they perform a search.
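As a concrete illustration, a robots.txt file sits at the root of your site and tells well-behaved spiders which paths to leave out of the index. The directory names here are purely hypothetical:

```
User-agent: *
Disallow: /admin/
Disallow: /drafts/
```

The `*` means the rules apply to every spider; anything not disallowed remains fair game for crawling.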
Initially search engines focused largely on meta tags submitted about a page's contents but this was open to widespread abuse, particularly by porn sites masquerading as something as innocuous as pages about lawn mowers or stock quotes, part of a practice known as 'spamdexing'.
The focus shifted then to the links between sites: the more links to a site, the more popular it was deemed. Again, this was open to abuse by 'blackhat' marketers in a practice known as 'link spam'.
With Google's rise to dominance, search techniques have evolved significantly. Google uses something like 200 different criteria to match the billions of pages in its caches with your search request.
One of these is PageRank, which estimates the likelihood that a given page will be reached by a web user who randomly surfs the web, following links from one page to another. Google also weighs links, keyword frequency, meta tags, headlines and site structure.
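The 'random surfer' idea behind PageRank can be sketched in a few lines of code. This is a simplified illustration, not Google's implementation: the three-page link graph and the damping factor are illustrative assumptions.

```python
# Minimal sketch of the "random surfer" model behind PageRank.
# The link graph and damping factor are illustrative, not Google's data.

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links out to."""
    pages = list(links)
    n = len(pages)
    rank = {page: 1.0 / n for page in pages}  # start with equal rank
    for _ in range(iterations):
        # With probability (1 - damping) the surfer jumps to a random page.
        new_rank = {page: (1 - damping) / n for page in pages}
        for page, outlinks in links.items():
            if not outlinks:
                # Dangling page: spread its rank across all pages.
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                # Otherwise, share rank equally among the pages it links to.
                for target in outlinks:
                    new_rank[target] += damping * rank[page] / len(outlinks)
        rank = new_rank
    return rank

# Three hypothetical pages: A and C both link to B, so B ranks highest.
links = {"A": ["B"], "B": ["C"], "C": ["A", "B"]}
ranks = pagerank(links)
print(sorted(ranks, key=ranks.get, reverse=True))  # → ['B', 'C', 'A']
```

The point of the sketch is that rank flows along links: page B comes top not because of its own content but because two other pages point at it, which is exactly why inbound links matter so much in the rest of this column.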
No one - outside of Google - knows quite how they do this, and the algorithms are constantly evolving. You can find out more on Google's help page.
We do know some basics: Google likes sites that…
… have strong links to other sites, and by strong I mean inbound links from other sites relevant to your business (and most definitely not from a 'link farm' - a collection of sites that all link to every other site in the group - a 'blackhat' technique).
Other factors are important too: sites that are regularly updated attract the spiders more often, which helps boost their position in the search rankings; personal details feed into the algorithm; so does the use of keywords within the page (do they correspond with your outbound links, meta tags and headlines?); and so on.
Fine-tuning your website to optimise for these factors can be a very long and expensive process, especially considering SEO experts have only observed evidence to base their expertise upon. In other words, they have seen what they believe works and what doesn't, but they don't actually know why.
This may be an acceptable activity for large companies that can dedicate resources to SEO but many smaller companies probably have never even heard of it.
A recent BT Business survey of more than 70,000 websites found that only one in 3,000 is using keywords, only one in 130 is using code in its site to make it visible to the search engine spiders, and just 11 per cent update their website every couple of days.
Doubtless those that do update are doing so for product news rather than SEO, but it does highlight that web marketing is not a level playing field.
Stay tuned for our next column in which we'll cover how much you should trust SEO, and offer tips on how to make your site easier for searchers to find.
Stewart Baines is a director of Futurity Media, a leading technology marketing consultancy. Stewart's background is in journalism and industry analysis with expertise in collaboration, communications and green IT. Over the past 10 years, Futurity Media has worked with some of the biggest names in the technology business. Futurity Media's philosophy is simple: take complex technologies and make them accessible to a business audience. To find out more about Futurity Media's approach, go to www.futuritymedia.com