The Bell Tolls for Thee
Written by Doc Sheldon
Tuesday, 10 April 2012 12:30

Folks have been in an uproar lately over recent fluctuations in their rankings in Google’s SERPs… like that’s never happened before, right?

Conjecture, as is always the case in the SEO world, runs rampant, with everyone and his cousin tossing out theories about what’s on Google’s agenda. Some have an air of logic to them - some may be drug-induced.

[Image: Google is watching you]

Let’s take a look at a couple that seem to have some credibility:

1. The "Fishing for Suckers" Theory

With the recent spate of GWT notices, the folks at the ‘Plex are just fishing for intel… hoping to get site owners to out themselves for bad behavior.

This is one that might make a little sense, actually. While I think it’s ridiculous to believe that Google would ever launch a notice campaign for the sole purpose of getting people to out themselves, I can certainly see them thinking “and if some of them out themselves, so much the better”.

One theory is that the recent upswing in the number of GWT notices is simply caused by a recent lowering of the algorithm thresholds, which triggered more “transgressions”. This actually makes some sense, since many of the examples cited in some notices were for links that have been in existence for quite some time.
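
To make that concrete, here's a toy sketch in Python (the scores, URLs and thresholds are all invented; nothing here reflects Google's actual scoring) showing how lowering a threshold can suddenly flag links that have sat unchanged for years:

    # Toy illustration only - not Google's actual logic. Each link's
    # "spam score" is compared against a threshold; lowering the
    # threshold flags links that have existed for quite some time.
    links = [
        {"url": "example.com/old-directory", "spam_score": 0.55, "age_years": 4},
        {"url": "example.com/paid-footer",   "spam_score": 0.72, "age_years": 3},
        {"url": "example.com/fresh-mention", "spam_score": 0.20, "age_years": 0},
    ]

    def flagged(links, threshold):
        return [link["url"] for link in links if link["spam_score"] > threshold]

    print(flagged(links, 0.70))  # old threshold: only the worst link trips it
    print(flagged(links, 0.50))  # lowered threshold: a 4-year-old link now "transgresses"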


What’s really behind the increased number of notices, and why do some cite examples of offending links while others offer none? We may never really know the answer to that first part, but I think the second part comes down to people mistakenly assuming that all notices are generated for the same reasons. C’mon, people! Haven’t we learned yet that assuming anything is foolish, at best?


2. The Akismet Theory

Google is using Akismet data to identify spammers.

This one is interesting, because it seems highly unlikely, yet the evidence supports it. Barry Schwartz posted (while his wife was in labor… congrats on the new baby girl, Barry!) that a Google spokesperson denied it. But we don’t know exactly how the question was asked or whether or not they’d admit it even if it were true. And to be fair, it could be perfectly true, regardless. What if, for instance, they use Akismet data that’s been aggregated elsewhere (there are several spam-lists out there)? That wouldn’t be ‘using Akismet’, strictly speaking.

The main point of this one is that a GWT notice cited a link that was never published, because it had been erroneously marked as spam by a webmaster running Akismet. The comment was somewhat lengthy and not spammy at all, but it was inadvertently earmarked as such, and it consequently showed up as an example in GWT. How does that happen?

The idea that a comment being marked as spam might have a dampening effect on the entity leaving that comment isn’t a strange notion. I can even see how allowing such comments to remain on a blog could be considered detrimental to the user experience. And personally, I have no issue with Google possibly using spam reports of comments to dampen or negate those comments in the graph(s) (a toy sketch of what that might look like follows the list below). But I would hope that:

  • (a) there would be an equitable way of human verification and/or reconsideration, and
  • (b) there would be some mechanism in place to preclude abuse.
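
Here's that toy sketch. Everything in it is hypothetical: the 0.1 dampening factor, the per-reporter cap and the function itself are invented, purely to illustrate points (a) and (b) above.

    # Hypothetical sketch of dampening a spam-flagged comment link,
    # with (a) human reconsideration and (b) an abuse cap. All values
    # are invented; this is not anyone's real system.
    from collections import Counter

    MAX_FLAGS_PER_REPORTER = 10   # point (b): cap a trigger-happy reporter
    reporter_flags = Counter()

    def comment_link_weight(base_weight, spam_flagged, reporter, appeal_upheld):
        """Return the (possibly dampened) weight of a comment link."""
        if not spam_flagged:
            return base_weight
        if appeal_upheld:             # point (a): a human review restores the link
            return base_weight
        reporter_flags[reporter] += 1
        if reporter_flags[reporter] > MAX_FLAGS_PER_REPORTER:
            return base_weight        # point (b): ignore a reporter who flags everything
        return base_weight * 0.1      # dampen rather than negate outright

    print(comment_link_weight(1.0, spam_flagged=True, reporter="blog-a.com", appeal_upheld=False))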

Takeaways

As is often the case in such situations, there’s no concrete answer. The best we can do is look at the evidence, make our own decisions on how to proceed and keep our minds (and one eye) open. Here’s my take on them:


1. I think that changes to the algo caught some pre-existing items that suddenly fell on the wrong side of the threshold. That is bound to have been noticed, and some massaging of the algo is likely to take place very shortly. I would suggest that if you find yourself in this situation, you take a “wait and watch” stance. If you were one of those borderline cases, chances are that you’ll see some recovery before long. Of course, if you fall far from “borderline”, you’d be well advised to examine your link profile and do some spring cleaning.

However, I wouldn’t even consider filing a reconsideration request unless you can’t find your site in the index at all. Fix the problems, if any, and see what happens. If you weren’t de-indexed entirely, you’ll probably see a comeback.

2. I won’t call Google a liar on this. If they say “we don’t use Akismet to flag spam”, I’ll accept that for what it says… with the codicil that it doesn’t preclude them from acquiring such data from another source. As stated earlier, I think such action would probably be beneficial, if properly managed.

Do I suspect they’re doing something along these lines? Absolutely! I saw the evidence cited by William in his WPW post, and I do not believe that it’s simply correlative.

What I find interesting is that, more than usual, some sites took a hit and others got a boost. Which points out something I’ve said many times… if your rankings change, it may have something to do with your site, or it may have something to do with the other sites in that SERP.

Did you go down, or did someone else simply push ahead of you? Obviously, if you had several competitors suddenly pass you up on several keywords, it was your site that took a hit. But if one competitor had a number of pages that passed you up for just one KW, then it might well be that they simply got a boost.
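
If you track your rankings, a rough heuristic along those lines might look something like this. It's purely illustrative; the thresholds, domains and keywords are all made up.

    # Invented heuristic for the question above: did you take a hit,
    # or did someone else simply get a boost?
    def diagnose(passed_by):
        """passed_by maps each keyword to the competitors that passed you."""
        keywords_lost = [kw for kw, rivals in passed_by.items() if rivals]
        distinct_rivals = {r for rivals in passed_by.values() for r in rivals}
        if len(keywords_lost) >= 3 and len(distinct_rivals) >= 3:
            return "Several competitors passed you on several KWs: your site likely took a hit."
        if len(keywords_lost) == 1 and len(distinct_rivals) == 1:
            return "One competitor passed you on one KW: they probably just got a boost."
        return "Mixed signals: wait and watch."

    print(diagnose({
        "blue widgets":   ["rival-a.com", "rival-b.com", "rival-c.com"],
        "cheap widgets":  ["rival-a.com", "rival-d.com"],
        "widget reviews": ["rival-b.com", "rival-e.com"],
    }))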

And as always, it’s important to remember that dips in ranking may be caused by a number of things… never forget to consider that inbound links may have simply been dampened, losing some value.

What I think we’re experiencing right now is just one more step toward a new ranking concept for Google. We have already seen that rankings seem to be more site-oriented than strictly page-oriented. And as Bill Slawski recently discussed, author authority may be an important part of ranking, going forward. As it would represent a step away from the link graph, I find that an intriguing and attractive notion. It would certainly fall in line with the recent entity-driven focus we’ve seen.

As always, we’d love to see your comments/ideas on these two issues. Sound off!


Doc Sheldon is a retired business management consultant, and a perpetual student of all things SEO. He’s also a copywriter, providing professional webcopy, articles and press releases. He’s been involved in SEO for a little over five years, and writing professionally for over thirty.
