Penguin giving spammy sites an increasingly icy reception
Research carried out by an online marketing company has suggested that Google’s Penguin updates are becoming increasingly intolerant of websites that are heavily linked to by dubious sources.
When Penguin was first introduced in April 2012, it was designed to target sites with link profiles that consisted of more than 80% links deemed to have been manipulated. As the Penguin algorithm change approaches its first birthday, this figure has gradually dropped to 50%, the report says.
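As a rough illustration of the kind of threshold the report describes, the check amounts to comparing the share of manipulated links in a profile against a cut-off. The figures are the report's; the function names and the per-link classification here are entirely hypothetical, since Google does not publish how it judges individual links:

```python
# Hypothetical sketch of the reported threshold logic. How individual
# links get flagged as "manipulated" is assumed to happen elsewhere.

def manipulated_share(links):
    """Return the fraction of links in a profile flagged as manipulated."""
    if not links:
        return 0.0
    flagged = sum(1 for link in links if link["manipulated"])
    return flagged / len(links)

def at_risk(links, threshold=0.5):
    """True if the profile crosses the reported Penguin threshold
    (0.8 at launch in April 2012, reportedly nearer 0.5 now)."""
    return manipulated_share(links) > threshold

# Example profile: two of three inbound links judged manipulated.
profile = [
    {"url": "example-directory.com", "manipulated": True},
    {"url": "news-site.com", "manipulated": False},
    {"url": "paid-links.net", "manipulated": True},
]
```

On these made-up figures, a profile that was safe under the original 80% threshold can fall foul of the reported 50% one, which is the tightening the research describes.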
The increased emphasis on targeting such sites suggests Google is now more confident that Penguin is able to identify dubious links, and means that website designers will need to focus harder on making sure that their link profiles are up to scratch.
The apparent clampdown on spam-heavy sites is another example of how fast-changing the world of search engine optimisation is, and how closely online marketers need to monitor the behaviour of search engines.
It has long been known by those in the SEO industry that Google is fond of sites with original, regularly updated content. Journalistic news updates are one way to attract the approval of Google and other search engines, and sites with well-written and relevant stories tend to be the ones that curry the most favour when Penguin and Panda updates come out.
Google’s Penguin updates differ from its Panda algorithm changes in that they specifically seek to penalise ‘black hat’ SEO techniques – like buying links, overusing irrelevant keywords and duplicating content. The ultimate aim of both, however, is to ensure that websites of good quality rank highly.
Unlike with Panda, webmasters whose sites have slipped down Google’s rankings after a Penguin update are unlikely to find that an appeal to Google will help their cause. Affected sites need to be repaired before they can move back up the rankings.