Saturday, 14 June 2014

Does Google’s Panda 4.0 Hate Too Much Syndicated Content?



On May 20, Google officially rolled out Panda 4.0, the latest version of its Panda search algorithm. It was a major update, and many sites across the web were reportedly affected in a significant way. Glenn Gabe, a digital marketing consultant and contributor to Search Engine Watch, analysed 27 websites hit by Panda 4.0. Let’s take a look at what he found.
On a positive note, Gabe said some of his clients reported a surge in organic traffic after they made the appropriate changes to their websites and Google implemented its latest algorithm update. He added that some industry experts achieved higher rankings, and that several “Phantom” victims also recovered after the update.
But if some websites benefitted from the launch of Panda 4.0, there were also those that “got pummelled” by the tech giant’s latest move. According to Gabe, many companies lost more than 60% of their Google organic traffic overnight; some of the websites he analysed lost almost 75%. This is a clear indication of just how far-reaching the impact of Panda 4.0 is.
But what caused these websites to suffer the brunt of Panda’s anti-spam power? Gabe speculated that problems with content syndication might have something to do with it. “There were a number of companies that reached out to me with major Panda hits (losing greater than 60 percent of their Google organic traffic overnight). Upon digging into several of those sites, I noticed a pretty serious syndication problem,” he wrote.
According to him, those websites heavily featured syndicated content, and much of that content “got hammered” when Google rolled out Panda 4.0. Unlike in past updates, however, no specific niche or category appeared to be singled out by the latest algorithm change.
Gabe added that one probable reason those sites were significantly affected is that none of them had the optimal technical setup, such as using the rel=canonical tag to point to the original content on another domain. Inconsistent backlinks might also have contributed to the penalty: according to Gabe, some pieces of syndicated content on the affected sites linked back to the original article on the third-party website, while others did not.
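To illustrate the setup Gabe describes, here is a minimal sketch of a cross-domain rel=canonical tag. It would sit in the `<head>` of the syndicated copy and point Google at the original article; the URL below is a hypothetical example, not one from Gabe’s analysis:

```html
<!-- Placed in the <head> of the page that republishes (syndicates) the article. -->
<!-- The href is a hypothetical example of the original publisher's URL. -->
<link rel="canonical" href="https://www.original-publisher.example/articles/panda-analysis" />
```

With this tag in place, the syndicating site signals that ranking credit for the content should consolidate to the original publisher. Where adding the tag is not possible, consistently linking back to the original article from within the syndicated copy is a common fallback, which is exactly the consistency Gabe found lacking on the hit sites.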
To avoid being negatively affected by future Panda updates, Gabe advised website owners and marketers to handle syndicated content very carefully, and to be mindful of how much syndicated content they publish on their site relative to original content.
