On a rather unremarkable Tuesday, beyond the heat, the folks at the WSJ tracked me down with an interesting story. Tis a tale of a very disgruntled CEO who had been mauled by a Panda and lived to tell the tale. Care to hear it? Sure. Why not?
It seems that the fine folks over at HubPages have been having a hard time due to the Google ‘Panda’ update from earlier this year. Not surprising, given they were #17 on the list of hardest-hit sites. And since that time, CEO Paul Edmondson has been shouting it from the mountain tops, from Techcrunch to WebPro News, the Google boards and elsewhere.
There was even a thread on WMW about using this approach and citing the Hubpages moves.
This was somewhat interesting because as far as we’ve heard there is no getting up from the Panda mauling. The only REAL fix was to do away with the crap content and build out a better offering. Or that’s been the story at least. It seems Hubpages is claiming to have a work-around.
Can you recover from Panda?
The story broke yesterday in the WSJ with: Site Claims to Loosen Google Death Grip.
Being somewhat close to this story I can say that it does seem that there is something going on. They took some of the authors from the site and put them on sub-domains. At which time, some articles that weren’t ranking on the main domain, started to rank again. Of course, the data I have seen from this is limited and I can’t state definitively that this is actually the case (don’t have much before/after data).
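To make the move concrete, here is a minimal sketch of the kind of URL rewrite such a migration implies. This is my own illustration, not HubPages' actual code: it assumes a hypothetical structure where author pages live at `example.com/<author>/<article>` and get moved to `<author>.example.com/<article>` (presumably with a 301 redirect from the old URL).

```python
from urllib.parse import urlparse, urlunparse

def to_author_subdomain(url: str) -> str:
    """Map a main-domain author page to its author sub-domain.

    Hypothetical example:
      http://example.com/paul/panda-recovery
      -> http://paul.example.com/panda-recovery
    """
    parts = urlparse(url)
    segments = [s for s in parts.path.split("/") if s]
    if not segments:
        # No author segment in the path; nothing to move.
        return url
    author, rest = segments[0], segments[1:]
    new_netloc = f"{author}.{parts.netloc}"
    new_path = "/" + "/".join(rest)
    return urlunparse(parts._replace(netloc=new_netloc, path=new_path))
```

In practice the old URLs would then 301 to the new ones so existing links and indexed pages carry over, but again, that is an assumption about how one *would* do it, not inside knowledge of how HubPages did.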
And of course the obvious: now that they’ve run off to the WSJ, Google is sure to know about it and its shelf life will be limited. In fact, in talking with a Googler yesterday, it seems they were quite aware of the situation already.
So, I wouldn’t go running off just yet to try this on your own site (if it was hit by Panda).
Dear Mr. Edmondson
The part we (my cohorts and I at the SEO Training Dojo) found somewhat stunning is that Hubpages would go running over to the WSJ to talk about it. Paul, brother, the first rule of fight club is that we don’t talk about fight club. Sheesh. Your ongoing desire to poke the bear will ultimately cost you and your staff. Bad form, ol’ chap. Next time get in touch with me first, ok?
And you have been vocal about how Panda has affected you. What about Tumblr? Squidoo? From what I know, they weren’t heavily affected, and they are comparable UGC sites. The real secret to getting past Panda is simply to work harder at getting the house in order.
Some other things to consider: Are SEOs manipulating the system? See here and here. Are there a lot of dupes from RSS? Have you considered changing the policies and judgements on when the no-follow is removed? And not to be cheeky, but the Hubpages site even has a hub post about using sites like yours for building links (with an author score of 96).
Let’s look at this small snippet from your Techcrunch post:
“We are concerned that Google is (1)targeting platforms other than its own and stifling competition by reducing viable platform choices simply by (2)diminishing platforms’ ability to rank pages. Google is (3)not being transparent about their new standards, which prevents platforms like ours from (4)having access to a level playing field with Google’s own services. We want to comply with and exceed Google’s standards. Google has my contact information. Hope to hear from them soon. ”
1. Sadly untrue, as others in his niche weren’t affected to the same degree.
2. Again, not entirely accurate. Google has made massive and minute changes to the algos many times over the years. That’s why we SEOs have a profession. We adapt and move on. They can as well (as noted with the recent sub-domain approach).
3. Yes, because the nefarious types that manipulate the engines would have a field day. Then he’d complain about the crap in the SERPs, right?
4. See #2.
Anyway… Look me up, Paul, we’ll do lunch! Have your peeps contact mine. M’kay?
Anyway, it was an interesting experience, and it does make me wonder whether our perceptions (as SEOs) of how Google treats sub-domains these days are accurate. In the past we were told there is a tight relation from subs to the core domain because of various manipulations in the past. Does this show that may no longer be the case? Or is it just a Panda anomaly? It’s hard to say.
Should you be looking at this as a Panda fix? I am not entirely sure that it’s going to be worth the investment of resources. Google is certainly aware of this situation and as such I’d likely advise just cleaning house instead.
I shall update this post if I hear anything more on the situation.
Here is some of the test data Paul posted in this thread (on HPs):
Paul Edmondson (subdomain activated 06/23/11)
Here are some analytics screenshots from that thread:
And of course, Aaron grabbed his tin-foil and got into the fray as well.