
Search engine optimisation 2012

By Jack Willis from Marketing Grin

Biography

Jack Willis is the Managing Director of Marketing Grin, an online marketing agency that provides SEO services to businesses of varying sizes. Jack has over 5 years’ experience in online marketing and particularly enjoys getting stuck into online marketing campaigns where he can use all of his online knowledge. He also has a good level of experience running Google AdWords campaigns, as an AdWords qualified professional, and running social media optimisation campaigns.

Search engines

The search engines constantly update their algorithms, but in 2012 Google took this to a new level: we even saw some reputable companies that have always applied white hat, best practice approaches to SEO get hit by Google penalties. So what has changed, why has it changed, and where does this leave website owners?

Google has always said that you are not allowed to create links in order to manipulate the search results, and that any links built purely for the purpose of improving rankings are considered web spam and black hat. Most people interpreted web spam as meaning automatically generated links and believed that if they built links manually, they were safe from any penalty.

The release of Google Penguin proved just how wrong this belief was. In actual fact, Google and the other search engines can’t tell how links were built: there is no difference between a manually created link and one built with automation tools like Winautomation or Ubot. They can only analyse link profiles and identify certain trends.

Prior to 2012

Before 2012, Google would ignore poor links and only take the good links into account. If you had some poor links pointing at your site, Google’s advice was not to worry about them. In the past, the best practice approach when building links was to put your keywords in the anchor text, as it made the links more relevant and so they carried more link juice. However, you can only choose the anchor text if you have built the links yourself, so sites with low anchor text diversity tended to be sites with a high number of manually created links. Remember, Google says you are not allowed to manually create links for the purpose of improving search rankings. If people have linked to your site naturally, the chances are they will have used different anchor texts, giving you huge anchor text diversity.

Times have changed

The internet is becoming filled with poor quality content designed purely to boost websites’ search rankings. In an attempt to reduce this, Google started to penalise sites with low anchor text diversity by introducing Google Penguin. This changed the SEO world as we knew it: poor quality links now have a negative effect on sites.

What are poor quality links?

These are links that weren’t built naturally. Google believes that links should be created naturally, by people writing good quality content and others sharing it. Any links built purely for the purpose of improving your site’s rankings are considered web spam. This doesn’t mean you can’t manually create links any more, just that they have to look naturally created.

Google Penguin

Google Penguin is an algorithmic penalty that runs roughly once a month at random times, penalising sites with unnatural link profiles, i.e. link profiles that do not appear to have been created naturally because they have low anchor text diversity. Google Penguin caused a huge shake-up in the SEO world and marked a change in how links are built.

Negative SEO

The difficulty with Google Penguin is that it opened the door to negative SEO. Some savvy competitors will deliberately point spammy, keyword-heavy links at a site in an attempt to lower its anchor text diversity and trigger a penalty, as explained by Matt Cutts last week.

Google Disavow tool

Google launched a disavow links tool in early October 2012 to protect against negative SEO: users can submit the links from their link profile that they wish Google to ignore. The disavow tool is essentially a crowd-sourcing tool where webmasters do a lot of the spam filtering on behalf of Google. The difficulty, though, is that a lot of SEOs don’t know what they are doing and will simply disavow all of their links and start again.
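For illustration only (this example is not from the original article): the file you upload to the disavow tool is a plain text list with one link per line, where lines starting with # are comments and a domain: prefix tells Google to ignore every link from that domain. The domains below are made-up examples.

    # links we asked to have removed without success (made-up examples)
    http://spammy-directory.example.com/listing/your-site
    http://article-farm.example.net/seo-article-123
    # ignore every link from this whole domain
    domain:low-quality-links.example.org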

Example

Imagine you put together a really good link building campaign and saw some good movement in the search engine results pages. Someone else then built links on the same sites, didn’t diversify their anchor text, got hit by Google Penguin, and so reported those links to Google, causing those links to be devalued. There is nothing wrong with the sites providing the links; the person in the second case simply didn’t know what they were doing. Is this a useful tool or a recipe for disaster?

How to protect against Google Penguin penalties

Protecting against Google penalties is simply a case of knowing your link profile. You should be running regular live link reports across all of your links that detail your exact match anchors (e.g. the keyword on its own), your broad match anchors (e.g. the keyword with extra text), and the anchor text that doesn’t include any keywords.
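As a rough sketch (not from the original article), a report like that could be put together in a few lines of Python. The link data structure, field names and keyword below are assumptions made purely for the example.

    from collections import Counter

    def classify_anchor(anchor, keyword):
        """Bucket one anchor text relative to a target keyword."""
        a, k = anchor.lower().strip(), keyword.lower().strip()
        if a == k:
            return "exact match"   # the keyword on its own
        if k in a:
            return "broad match"   # the keyword with extra text
        return "no keyword"        # branded, naked URL, generic, etc.

    def anchor_report(links, keyword):
        """links: list of dicts like {"url": ..., "anchor": ...} (assumed format)."""
        counts = Counter(classify_anchor(link["anchor"], keyword) for link in links)
        total = sum(counts.values()) or 1
        return {bucket: round(100 * n / total, 1) for bucket, n in counts.items()}

    # made-up data
    links = [
        {"url": "http://example-blog.com/post", "anchor": "cheap widgets"},
        {"url": "http://example-news.com/item", "anchor": "buy cheap widgets online"},
        {"url": "http://example-forum.com/t/1", "anchor": "www.yourdomain.com"},
    ]
    print(anchor_report(links, "cheap widgets"))
    # {'exact match': 33.3, 'broad match': 33.3, 'no keyword': 33.3}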

There are no exact figures to aim for, as every niche is different, but if any one of your keywords accounts for 30% or more of your anchor text, you should look to reduce it by building links with generic anchors. A natural link profile will have a high percentage of branded anchor text (your company’s name) and a high proportion of naked URLs (www.yourdomain.com).
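Again purely as a sketch, the 30% rule of thumb and the branded/naked URL check described above could be expressed like this; the threshold comes from the paragraph above, while the input formats, brand name and domain are invented for the example.

    def flag_over_concentrated(exact_match_share, threshold=30.0):
        """exact_match_share: {keyword: % of all anchors that exactly match it} (assumed input).
        Returns keywords whose share is at or above the threshold."""
        return [kw for kw, pct in exact_match_share.items() if pct >= threshold]

    def natural_profile_share(links, brand="your company", domain="www.yourdomain.com"):
        """Rough share of branded and naked URL anchors, the signals of a natural profile."""
        total = len(links) or 1
        branded = sum(1 for link in links if brand.lower() in link["anchor"].lower())
        naked = sum(1 for link in links if domain.lower() in link["anchor"].lower())
        return {"branded %": round(100 * branded / total, 1),
                "naked URL %": round(100 * naked / total, 1)}

    # made-up figures: "cheap widgets" is over-concentrated, so dilute it with generic anchors
    print(flag_over_concentrated({"cheap widgets": 42.0, "widget shop": 12.5}))
    # ['cheap widgets']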