How to avoid search engine blacklisting
IT Pro Guide: Get SEO right and you can guarantee your website a steady stream of search engine-generated traffic. Get it wrong and you could irrevocably damage your business. Marcus Austin explains how to avoid the dreaded search engine blacklist.
There are two truisms when it comes to the Internet: if you open a connection to the Internet, within a few minutes someone will try to hack into it; and if you build a website, it won't be long before someone from an SEO (search engine optimisation) company rings to ask if you want to be number one on Google.
SEO companies have sprung up all over the Internet in the last three years, all claiming the same thing: that they can get your website to the top of the search engine listings. The problem is that many are unscrupulous about how they go about this, and the search engines are beginning to wise up.
In February 2006 BMW Germany's site was de-listed from Google's index, and for a period of 24 hours anyone searching Google for BMW.DE would not have been able to find it. BMW were let off quite lightly: a de-listed site can normally take weeks, if not months, to get back into the index.
Many companies don't realise quite how big an effect Google and the other search engines have on site traffic. A good natural listing (a site that appears in the main listings rather than in the paid-for listings) on the combined search engines can account for up to 80 per cent of some sites' traffic, and Google accounts for 80 per cent of all search engine traffic. Take away Google and you could, in theory, cut traffic to a site by up to 64 per cent.
De-listing can be painful
The key lesson to learn from the BMW situation is that de-listing can happen to anyone, no matter how big or small. As we've seen, it can be painful too, for your bottom line as well as your corporate image, and you should do everything possible to stop it from happening. This is, however, not as straightforward as it seems.
According to Google employee and blogger Matt Cutts, the German BMW site had been removed for violating the guideline: "Don't deceive your users or present different content to search engines than you display to users."
Google and the other engines, however, neither reveal how their search algorithms work (they don't want SEO companies to know how to influence the results unfairly) nor publish exactly what is and isn't acceptable SEO practice. Google's guidelines, and those of the other search engines, are hazy.
The key point is to be open in everything you do. A search engine catalogues websites with applications known as bots. These applications spider the Web and your site, reading all the information contained within it and following any links on the pages. They are "dumb" applications. They can't pull down drop-down menus, they can't read text in gif or jpg files, they can't type names into search boxes, they can cope with Flash but they'd rather not, and they're particularly frightened of getting stuck in loops. Above all, they hate being cheated.
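To make this concrete, here is a minimal sketch of what a "dumb" bot actually sees in a page, using only Python's standard-library HTML parser. The page content is invented for illustration; note that the text in the image and the JavaScript-built menu are simply invisible to it, while plain text and ordinary links are collected:

```python
from html.parser import HTMLParser

class BotView(HTMLParser):
    """Collects what a simple crawler can actually see in a page:
    plain text and the href targets of <a> tags. Scripts, images
    and anything rendered by JavaScript are invisible to it."""
    def __init__(self):
        super().__init__()
        self.links = []
        self.text = []
        self._skip = False  # inside <script> or <style>

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip = True
        elif tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

    def handle_endtag(self, tag):
        if tag in ("script", "style"):
            self._skip = False

    def handle_data(self, data):
        if not self._skip and data.strip():
            self.text.append(data.strip())

# Hypothetical page: one heading, one plain link, one image
# holding text, and a script-driven drop-down menu.
page = """
<html><body>
  <h1>Widgets Ltd</h1>
  <a href="/products.html">Products</a>
  <img src="contact-details.gif">
  <script>buildDropdownMenu();</script>
</body></html>
"""
bot = BotView()
bot.feed(page)
print(bot.links)  # ['/products.html']
print(bot.text)   # ['Widgets Ltd', 'Products']
```

Everything in the gif and everything behind the script is lost to the bot, which is exactly why content you want indexed should also exist as plain text and plain links.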
SEO companies have traditionally used various practices designed to fool the dumb bot into thinking a page is more useful than it actually is. One method is hidden text: presenting hundreds of keywords to the bot but concealing them from users by making them the same colour as the background, or putting them at the foot of the page in very small text. Another is "cloaking": serving the bot a different page from the one users see. A third is the so-called "doorway page", stuffed with keywords and phrases but never seen by the visitor, because it automatically redirects to another page, often with completely different content on it.
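Cloaking is also what makes these tricks risky: the engines can catch it with a simple comparison. The sketch below simulates that check; `serve_page` is a stand-in for a cloaked site's behaviour (in practice the engine would fetch the same URL over HTTP with different User-Agent headers) and the threshold is an invented heuristic, not Google's actual test:

```python
def serve_page(user_agent: str) -> str:
    """A hypothetical cloaked site: it sniffs the User-Agent and
    serves a keyword-stuffed page only to the crawler."""
    if "Googlebot" in user_agent:
        return "cheap widgets best widgets buy widgets widget sale"
    return "Welcome to Widgets Ltd"

def looks_cloaked(url_handler) -> bool:
    """Request the same page as a crawler and as a browser,
    then diff the vocabulary each one was served."""
    bot_words = set(url_handler("Googlebot/2.1").lower().split())
    user_words = set(url_handler("Mozilla/5.0").lower().split())
    # Flag pages where most of what the bot saw never reaches users.
    hidden = bot_words - user_words
    return len(hidden) > len(bot_words) // 2

print(looks_cloaked(serve_page))  # True
```

A site that serves everyone the same content passes this check trivially, which is the whole point of the "be open in everything you do" advice above.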
Adopt 'best practice'
Google and all the other search engine sites want you to be honest, and to display to the bot exactly what the browser will see. Richard Gregory, Regional Sales Manager at Google, explains Google's policy: "We don't offer advice or clearly defined dos and don'ts at an SEO level, but we do provide webmaster guidelines, which are pretty broad, that are available on the site. There are some technical parts to those guidelines, e.g. what sort of code you should be using, but generally it's a list of best practices that we would recommend.
"Because every website is different it's hard to have a clear list of dos and don'ts. Including a list would just encourage behaviour where people repeatedly try techniques until they find one that works that we are not aware of. We're trying to provide the best service for our users, and to provide the most relevant information to those users. Anything that we feel goes against that, or that tries to influence it externally, we will try to minimise the impact of."
Choose an accredited company
If you are going to go with an SEO company then there are several things to consider. The good companies will not guarantee to get you to any particular position on a search engine overnight; they understand that good search results are a long-term objective rather than something achieved quickly. They're also likely to be members of either SEMPO (the Search Engine Marketing Professional Organisation, www.sempo.org) or the SMA (the Search Marketing Association, www.sma-uk.org). Neither SEMPO nor the SMA guarantees, however, that their members will not or have not used techniques the search engines frown upon.
Andy Atkins-Kruger, SMA President, explains their policy: 'What we are insisting on from members of the SMA is transparency. We ask our members to explain the risks of using SEO methods and to explain that SEO is much more of a long-term strategy.' He adds, 'When you go for the safe approach and you focus on quality content and delivering a rich experience to the user, then you generally sail through algorithm updates.'
Start SEO early
If you look at the sites that are high up in the listings, the one thing you notice is how clean and simple they are. They're usually the sites that thought about search early in the design process. As Simon Wiffen, a senior web designer at Sense Internet, points out, search needs to be included from the start. 'SEO shouldn't be viewed as a discrete process in a website's development. Too often it's approached retrospectively, and that causes frustration with performance and, eventually, reliance on underhand techniques to catch up with competitors.'
A good way of assessing how search engine-friendly your site is: check how closely it conforms to the BSI guidelines on web accessibility (www.bsi-global.com/ICT/PAS78/index.xalter), and run it against the W3C guidelines and tools on accessibility (www.w3.org/WAI/ER/tools/complete).
SEO companies can help your business get better listings on search engines, and it's a good idea to have them on board as early as possible. As with everything else, however, if an offer sounds too good to be true, it usually is.
FIVE WAYS OF AVOIDING BEING BLACKLISTED
- Don't deceive users, or present different content to search engines than you display to users.
- Don't employ cloaking or redirects.
- Don't load pages with irrelevant keywords that aren't in the main site text.
- Don't create multiple pages, sub-domains, or domains with substantially duplicate content.
- Avoid "doorway" pages created just for search engines or other approaches such as affiliate programs with little or no original content.
FIVE STEPS TO A GOOD SEARCH ENGINE LISTING
- Keep your code to a minimum. Using CSS (Cascading Style Sheets) for presentation is a very strong starting point for a good search engine ranking. You are immediately clearing out the "noise" of unnecessary markup and letting the search engines hear your message clearly.
- Design your content around the reader not the search engine.
- Write useful and relevant meta descriptions and keywords and make these specific to each page.
- Cross link intelligently and work with third parties to increase link activity to your site. Analyse and adapt to changes in the market.
- Remember that Content is King. Understand your users and design and tailor your site to their needs.
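The third step above, page-specific meta descriptions, is easy to audit automatically. The sketch below uses only the standard library on invented page content; the regex is a deliberate simplification (real pages may order attributes differently or use single quotes) but it shows the check: flag any description that is missing or shared between pages.

```python
import re
from collections import defaultdict

# Simplified pattern: assumes name="description" comes before
# content="..." with double quotes. Real-world HTML varies.
META_RE = re.compile(
    r'<meta\s+name="description"\s+content="([^"]*)"', re.IGNORECASE)

def audit_descriptions(pages: dict) -> dict:
    """Map each meta description (or None for 'missing') to the
    pages using it, keeping only the entries that need fixing:
    missing descriptions and descriptions shared by several pages."""
    seen = defaultdict(list)
    for url, html in pages.items():
        m = META_RE.search(html)
        seen[m.group(1) if m else None].append(url)
    return {desc: urls for desc, urls in seen.items()
            if desc is None or len(urls) > 1}

# Hypothetical mini-site: one unique description, one duplicated
# pair, and one page with no description at all.
pages = {
    "/":  '<meta name="description" content="Widgets Ltd home">',
    "/a": '<meta name="description" content="Our widget range">',
    "/b": '<meta name="description" content="Our widget range">',
    "/c": "<p>No description at all</p>",
}
print(audit_descriptions(pages))
# {'Our widget range': ['/a', '/b'], None: ['/c']}
```

Running a check like this across a site before launch catches exactly the kind of duplicate boilerplate that makes pages look interchangeable to a search engine.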