For many years now, SEOs have talked about various behavioural and query data points as potential ranking factors with Google and other search engines. You know the ones; behavioural data such as:
- Query history (search history)
- SERP interaction (selections and bounce rates)
- User scrolling behaviour (on the selected page)
- User document printing behaviour
- Adding a document to favourites (bookmarks)
- Data from different applications (application focus: IM, email)
- Surfing habits (frequency and time of day)
- Interactions with advertising
- Demographic and geographic data
- Explicit editorial signals (such as +1 voting)
- Dwell time (time on page)
They are usually quoted by SEO aficionados because, well, they just seem logical.
Sadly our instincts, no matter how well honed, aren’t enough to go by. And testing? I’ve argued many times that this is also fool’s gold, in that we can glean only insights into the nature of the beast that way, nothing definitive. Why not, you ask? I won’t get into it too much, but just consider the myriad of potential situations and markets, plus our inability to isolate that which we don’t know, and the limited viability of (small-scale) tests becomes quite self-evident.
But the question remains; are they worthy of being ranking factors? Is click-through rate a ranking signal?
Not ready for prime time
Ok, first let’s get a little perspective, shall we? I have been learning about behavioural data and search engines since my early fascinations in 2006. From 2007 on I started writing about it, including the posts Beware: Google is watching you! and What every SEO should know about Personalized Search. At that time I was actually quite bullish on the concepts and felt they might be playing a part in the search (ranking) algorithms.
Of course, as anyone following along over the years would tell ya, I began to doubt this and said as much in 2009 with the (now infamous) The final word on bounce rates as a ranking signal.
Enough with the history lesson, let’s get back to the issue at hand shall we?
Like most behavioural signals, click-through rates are noisy at best in most situations. Let us first consider the obvious: they are easily spammable. Yes, patterns might emerge, but by and large the quality improvements are likely offset by potential spam issues. Second, we have the whole problem of click bias; that is to say, the habit people have of clicking the first result, then the second, and so on (see Understanding search user behaviour for more).
This creates some MASSIVE noise as far as click-through rate metrics are concerned.
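To make the click-bias problem concrete, here is a minimal sketch of one well-known correction, clicks-over-expected-clicks (COEC): observed clicks are compared against what the result’s SERP positions alone would predict. The per-rank baseline numbers below are purely illustrative assumptions, not anything Google has published.

```python
# Hypothetical baseline CTR by SERP position (rank 1..5): how often a
# result at that rank gets clicked regardless of its actual quality.
POSITION_PRIOR = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}

def coec(impressions_by_rank, clicks):
    """Clicks Over Expected Clicks: a score above 1.0 means the result
    out-performed what its positions alone would predict."""
    expected = sum(POSITION_PRIOR[rank] * n
                   for rank, n in impressions_by_rank.items())
    return clicks / expected if expected else 0.0

# A result shown 100 times at rank 3 that drew 25 clicks:
score = coec({3: 100}, 25)  # expected clicks = 10, so COEC = 2.5
```

The point of the sketch: a raw 25% CTR at rank 3 and a raw 25% CTR at rank 1 mean very different things once position bias is stripped out.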
Is all lost for CTR and behavioural metrics as ranking factors? Maybe. Maybe not. Last year we had a major infrastructure update from Google dubbed Caffeine. This was an interesting twist because it may mean more processing power which could be used in areas such as this for greater spam detection or a deeper layer of personalization (more on that in a moment). It was a bit of a shining light… a ray of hope if you will.
You see, most testing that has been done with these types of signals showed they can indeed create greater relevance in the results. The glaring issue, of course, was the spam potential and being able to weave the various metrics into something of value.
You can’t simply take one factor (bounce rate, click-through rate) and derive valuable data from it. One would most certainly need to have them all in play, which of course means more potential areas to police for spam. Did Caffeine enable that for Google? Hard to say.
A personalized world
The next piece in the puzzle is personalization. What’s interesting here is that one certainly can’t spam one’s own profile, which solves the first problem. Let us remember that personalization was once ‘search history’ and has now evolved (at Google at least) into ‘surfing history’ as well.
What is still a limitation, though, is the fact that users are categorized. It is not a pure form of personalization on a user-by-user basis. Meaning that if you can crack the entity categorizations and spam accordingly, you can indeed spam in a personalized search world. Not ideal at all (for the search engineers).
Another newer element of personalization is, of course, the social graph (which we’ve written about a TON here on SNC, most recently with Google Social Search; seriously, WTF people?). This is yet another layer of personalization that can also help better categorize users and hopefully cut down on the spam issue.
Will personalization make behavioural metrics more valuable in search? Most certainly. I just don’t know if we’re there yet.
Query revisions and recommendation engines
Another element, and a general oversight among SEOs, is that there can be many ‘signals’ that aren’t used for actual rankings. A few common ones are discovery and indexation. Another, more related to this CTR discussion, is query data.
Have you ever wondered where query refinements, recommendations and suggestions (like Google Suggest) come from? Most of them come from the query data that engines glean from users. It doesn’t play into the rankings of documents per se, but it does give insight into how successful/satisfied the user may or may not be with the results. I can see this as one area where these signals/metrics can come into play.
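As a toy illustration of that idea, suggestions can be mined from session logs by looking at what users typed next after a given query. The log format and queries below are made up; real systems would obviously weight by volume, recency and much more.

```python
from collections import Counter, defaultdict

def build_suggestions(sessions):
    """Map each query to the queries most often issued right after it
    within the same session (a crude refinement-mining sketch)."""
    followers = defaultdict(Counter)
    for queries in sessions:
        for q, next_q in zip(queries, queries[1:]):
            if next_q != q:
                followers[q][next_q] += 1
    return {q: [s for s, _ in c.most_common(3)] for q, c in followers.items()}

# Hypothetical sessions, each a list of queries in the order they were typed:
sessions = [
    ["jaguar", "jaguar car", "jaguar xf price"],
    ["jaguar", "jaguar animal"],
    ["jaguar", "jaguar car"],
]
suggestions = build_suggestions(sessions)
# suggestions["jaguar"] -> ["jaguar car", "jaguar animal"]
```

Notice that no document ranking is involved at any point; the query stream alone is enough to power refinements, which is exactly why a signal can be valuable to an engine without being a ranking factor.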
So what’s the verdict?
I don’t bloody know, ok? Anyone telling you they know, bare-faced and 100% factually, about almost anything in the Google black box is a little bonkers.
What we can say, though, is that it is certainly unlikely that CTR (and other behavioural data points) is being used as a ranking signal in any meaningful way by the major search engines at this point.
Should we (as SEOs) ignore it? Certainly not. While the prospect of these playing a larger role is looming, the fact remains that in many cases we’re talking about user engagement, which is ALWAYS a good thing. Great websites, great marketing and great engagement mean users are talking about you.
That my friends, is most certainly good SEO.
David – you bring up a solid argument for not using CTR alone as a ranking signal. With the recent push for good, quality content I think using a combination of CTR, dwell time, and bounce rate would be more likely than using a single metric alone. It will be interesting to see qualitative studies come out post-Panda.
Agreed. As noted along the way, the real issue, other than spam, was gleaning meaningful data (that improved search quality) without causing too much strain on the processes. A cost-vs-benefit approach, from what I understand.
Ultimately, from what I know, one really does need to use more than a few implicit feedback signals to really make it work. Individually the signals become quite noisy.
As for Panda, I doubt much of that was related to implicit feedback changes. Caffeine is the more likely suspect. Panda was only some (stated) 12% of queries; a change to implicit feedback would likely affect a much larger percentage.
Based on the work I am currently doing, I strongly believe they are at least monitored by Google, given some of the behaviour and footprints we have seen in code.
I would anticipate they hold a very small amount of weight right now. Logically, after Panda I would expect their weight to increase in the future.
I’m dubious about CTR as a singular data point for ranking calculation, though Bing kind of said they use it (though not explicitly how): http://searchengineland.com/bing-uses-click-through-rate-in-ranking-algorithm-52386. Here’s a thought: if you looked at aggregate user behaviour metrics, such as normalised CTR and bounce rate per query, you might be able to weed out the bad guys by identifying anything standing out, i.e. any grossly different user behavioural signals. Maybe.
So yeah, CTR alone is probably not an important positive ranking signal, but if you’re already ranking well and your CTR / bounce rate suddenly went way off, could you lose a few ranking positions? Hm.
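That normalise-and-flag idea can be sketched very roughly: compute each result’s CTR for a query, then flag anything that sits far from the pack. The z-score threshold and the numbers are invented for illustration; nobody outside the engines knows what they actually do.

```python
from statistics import mean, stdev

def flag_outliers(ctr_by_url, z_threshold=2.0):
    """Return URLs whose CTR sits more than z_threshold standard
    deviations from the mean CTR for this query."""
    values = list(ctr_by_url.values())
    if len(values) < 2:
        return []
    mu, sigma = mean(values), stdev(values)
    if sigma == 0:
        return []
    return [url for url, ctr in ctr_by_url.items()
            if abs(ctr - mu) / sigma > z_threshold]

# Hypothetical per-query CTRs; one result behaves grossly differently:
query_ctrs = {"a": 0.10, "b": 0.12, "c": 0.11, "d": 0.09,
              "e": 0.10, "f": 0.11, "g": 0.70}
suspects = flag_outliers(query_ctrs)  # only "g" stands apart
```

Whether such an outlier would be rewarded, discounted or investigated as spam is, of course, exactly the open question of this post.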
Well, no one knows for sure, that’s absolutely true. But I’d err on the side of believing that Google does use CTR and other behavioral metrics.
We know that Google measures satisfaction on the basis of a long-click versus short-click metric. How that is derived isn’t entirely clear. Is it dwell time, the number of clicks on the target site, scrolling on the target page, or the amount of time before returning to the SERP? There are plenty of ways you could construct a long-click metric. I’m guessing Google has and will continue to experiment in this area.
But those all point to behavioral metrics in some form.
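One plausible construction of that long/short click distinction, purely as a guess: a click counts as “long” if the user dwelt past some threshold before (or without) returning to the SERP. The 30-second cutoff is an assumption for the sketch, not a known Google value.

```python
LONG_CLICK_SECONDS = 30  # assumed threshold, for illustration only

def classify_click(dwell_seconds, returned_to_serp):
    """Label a SERP click as 'long' (satisfied) or 'short' (pogostick)."""
    if returned_to_serp and dwell_seconds < LONG_CLICK_SECONDS:
        return "short"
    return "long"

def satisfaction_rate(clicks):
    """Fraction of a result's clicks that were long clicks.
    Each click is a (dwell_seconds, returned_to_serp) pair."""
    labels = [classify_click(d, r) for d, r in clicks]
    return labels.count("long") / len(labels) if labels else 0.0

# e.g. satisfaction_rate([(5, True), (120, True), (45, False)]) -> 2/3
```

Swap dwell time for scroll depth or on-site clicks and you get a different but equally defensible metric, which is precisely why the derivation isn’t clear from the outside.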
For CTR, we know that they track it both in AdWords (where it is used as a driving factor in quality score) and in Google Webmaster Tools. But could it be used as a ranking factor?
While the noise and spam might make it difficult to use, I think the scale and computing power available at Google likely allow them to weed that out and to normalize results. Further, I think they probably understand that different types of queries produce different patterns of behavior. So CTR and even pogosticking will have different benchmarks based on where those queries reside in the taxonomy.
And we know Google has that taxonomy, based on when it displays the dreaded Onebox, what it chooses to display in advanced search and even by looking at Google AdPlanner classification.
So, I don’t think that CTR alone is a signal, but there’s a stack of evidence that points to the use of CTR in greater user-behaviour feedback metrics.
So what do you think of the reasonable surfer patent? I’d also agree; I was skeptical about user behavior as a ranking factor, but then when you look at how footer links have been devalued you can see that it applies in some cases.
I’d agree with the comment from AJ here!
David – I think there are two questions in regard to click-through rates (CTRs): 1) is Google utilizing external site CTRs (including those from Google SERPs); and 2) is Google using internal site CTRs as a signal. My speculation is that Google is utilizing internal site CTRs as a ranking signal. If interested in why I came to this conclusion, check out my post on “link prominence”.
A couple of years ago, a Microsoft product manager indicated to me that in generating PPC quality scores, they were using internal clicks to secure pages as a proxy for conversions.
I always like it when former Google engineers, from the search quality team, tell me things like this:
“It’s pretty clear that any reasonable search engine would use click data on their own results to feed back into ranking to improve the quality of search results.”
That’s Edmond Lau: http://www.quora.com/Did-Bing-intentionally-copy-Googles-search-results
There are other good bits in there from Microsoft and Google search engineers, too.
This conversation I’ve just cited is specific to the Bing / Google debate about the former engine copying their results, but you’ll note much of the pertinent quotes (such as the one I’ve shared here) talk about “click data” being used in meaningful ways to rank results.
Some form of CTR is likely being used and contributing to ranking scores in Google and Bing. Edmond and others just said as much.
I wrote more about it here: http://searchenginewatch.com/article/2064047/Maximizing-Your-CTR-for-SEO-in-Organic-Results
Both Google and Bing are already using CTR and bounce rates as one of their major signals. Bing even stated last week that it is a bigger signal than backlinks…
If your site is average it may not play a big role, but one might suspect that unusually poor CTR could be easily detected. Doesn’t Google already use this to some degree in AdWords? It’s definitely worth optimizing descriptions and getting rich snippets that are tempting to click on, with good titles. I am willing to bet it plays a role.
[…] David Harry addressed the question in a post published on Search News Central entitled “Are Click-Through Rates a Viable Ranking Factor?” […]