Over the last while, the public outings weren’t the ONLY news in town. Remember the whole content farms and scrapers discussion? We’ve already had the algorithm change that was supposed to improve attribution and limit scrapers. And there were rumblings that big changes were coming down the pipe.
Remember all that?
It seems Matt was also busy this week with a post over on the Google Blog about yet another update.
“… in the last day or so we launched a pretty big algorithmic improvement to our ranking – a change that noticeably impacts 11.8% of our queries – and we wanted to let people know what’s going on. This update is designed to reduce rankings for low-quality sites – sites which are low-value add for users, copy content from other websites or sites that are just not very useful. At the same time, it will provide better rankings for high-quality sites – sites with original content and information such as research, in-depth reports, thoughtful analysis and so on.”
Now, for the record, the attribution algorithm change he mentioned last time was only supposed to affect about 2% of queries. As you can see, that has been bumped up considerably with this one: 11.8% is nearly six times that figure. That, my friend, is a substantial percentage.
What’s the deal?
There have been grumblings from my SEO Dojo mates and WMW members alike over the last few days about something being afoot. They do mention ‘original content’, which suggests the attribution element is also in the fold. Beyond that, we have the terms ‘high quality sites’ (not sure what that means), and “research, in-depth reports, thoughtful analysis” as an example. That one is interesting in that it seems to hint at deeper semantic analysis.
Matt says that it “is important for high-quality sites to be rewarded, and that’s exactly what this change does.” Once again, this might be authority, trust, or semantic approaches; the specifics aren’t entirely obvious at this point. Nor are some of the reactions I’ve heard over the last few days, and we have been looking for common elements.
He also touched on some data comparisons showing the changes were fairly in line with data from the site-blocking Chrome extension.
What are they doing?
From the early testing we’ve been doing, this is certainly not directly aimed at websites such as eHow, Yahoo Answers, HowStuffWorks, About.com and the like. They are all still fairly prominent in some query spaces. But that makes sense, because I did say that algorithmic solutions were the way to go (not nuking on a site-by-site basis à la Blekko).
What is certain, as always, is that there will be winners and losers (the losers generally scream louder). If this does work well, that’s great! I not only work on search engines, I am a user as well. Fighting scrapers and improving relevancy and quality should be a good thing, right? They have been “tackling these issues for more than a year, and working on this specific change for the past few months.” Let’s hope it works as advertised.
From what we’ve heard so far, this seems more geared towards ‘thin content’ than ‘content farms’ per se, but we’ll still let Danny have the ‘Farmer Update’ because it’s all semantics and just fun to say.

With the sudden change Google made to the algo, we can’t deny that the big sites are the ones who actually feel the losses. I’m just thinking that maybe Google did this in order to select those who have proven quality. I was surprised that Ezine was one of those that actually experienced a drop. Everyone is concerned now and keeps asking Google what the change was for.
Evolution is integral to all paths of life, including industry and technology (which encompasses SEO). This article is a straightforward commentary on the algorithm change. Elsewhere, however, there just seems to be moaning, complaints and indignation. Come on, folks – things change! Instead of grumbling all over the show, jump on board and start innovating and mapping out the new ground – before someone else does!
OK, so Google had a love affair with content sites but has now fallen out with them over their so-called low-quality tendencies. The big question is: how does a search bot judge content to be of little use to the reader when the writer has used every SEO trick in the book to rank the same way an expert would write the article?
Good post for SEOs and webmasters.