|Keyword Density is a Myth and Should Stay in 2003|
|Written by Brian Harnish|
|Monday, 24 June 2013 11:50|
Readers of Search News Central are no strangers to the many myths that continue perpetuating themselves throughout the SEO industry. What I would like to discuss in more depth in this article is keyword density. First off, I would like to introduce Matt Cutts, who has a few things to say about keyword density :)
(Video: Matt Cutts discusses keyword density; the original embed is no longer available.)
Next, we need to go back in time to around 2003 or so. In that heyday, everyone used keyword density as an SEO strategy. It was perfectly (kind of) acceptable to repeat keywords many times on a page in order to increase rankings across many different search engines.
While pursuing such a strategy made many documents unreadable and left them reeking of low quality, most search engines, including Google, didn't have much in place to defend against the tactic. Then the Florida update hit in November of '03, and that presented a problem: SEOs could no longer sprinkle keywords liberally on a page as an easy way to increase rankings. Repeating as many keywords as possible became known as "over-optimization."
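For readers who never saw the tactic up close, keyword density is simply the share of a page's words that match a target term. A minimal sketch of the calculation (the sample text is my own illustration, not from any real page):

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Fraction of words in `text` that match `keyword`, case-insensitively."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return hits / len(words)

# A 2003-style stuffed snippet: 3 of 10 words are the target keyword.
sample = "Cheap shoes! Buy cheap shoes online. Cheap shoes shipped fast."
print(round(keyword_density(sample, "cheap"), 2))  # -> 0.3
```

A density that high is exactly the kind of signal the Florida update started punishing; the point of the article is that chasing any particular number here is wasted effort.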
So why is focusing on keyword density no longer a valid SEO strategy? Let's look at the main reasons.
1. User Experience
When was the last time anyone wanted to read copy stuffed with repetitions of a word that add no context or information for the reader? This practice tends to produce machine-readable text that is boring and teeters on the edge of spam, depending on the intentions of the person writing it. If the intent is to cram in as many keywords as possible to maximize density, then Google is really going to have a problem with it.
Targeting relevance enhances many aspects of a document, which in turn enhances readability. Enhanced relevance and readability increase usability, which is yet another ranking factor. Good UX is essential to making sure a reader is satisfied that the document they searched for actually answers their query.
2. User Intent
By targeting relevance and using keywords appropriately, you improve the likelihood that your document will come up as a result of being matched to user intent. When subtle variations of a keyword phrase are entered into a search engine, some of those variations may be treated as the same phrase.
But if a phrase carries a different user intent, a different set of results will come up based on that intent. It's all based on user intent and relevance. That's why Google continues to fight spam in any and all of its forms: spam does nothing to improve the Google user experience, and queries that return junk results only annoy users of the search engine. They will skip over your result and go to the next one that better matches their query.
If you want to improve the UX around your website, make sure you create documents that match the intent of what users are actually searching for.
3. Readability
Another reason keyword density should not be paid (much) attention is that adding keywords to a document at random destroys readability, which is another aspect of user experience. If only a machine can read the document, why would any human audience bother reading further? This is why Google goes after keyword spam. Content that leans solely on keyword density as a way to increase rankings is going to be filtered out of the search results by Panda. That is the only way Google can deliver results that properly match their users' queries: readable documents that deliver a quality user experience.
It's a fine line: integrating keywords naturally without slipping back into the old tactic of including as many keywords as possible in a document. As Matt Cutts says, find someone who can read your document for you. They should be able to tell you, "Hey, wait a minute, this sounds totally artificial. You should rewrite it so it sounds more natural."
4. Spam, spam, and more spam
The biggest reason is that documents laced with hundreds of keyword repetitions look like spam, so much so that they become unreadable for a normal human being. There is a reason Google wants to eliminate spam from its index: it makes users angry, irritable, and just plain disgusted with the results. If users stumble across enough spam in an index, they will leave it for another one. So you can see why it's important to avoid looking like spam.
If you have found that your site has been hit by the latest Panda iteration, here are a few steps you can take to recover. I present: the three-step content audit (believe me, the information you will uncover is priceless).
1. Examine the content on your site, and perform a content audit of every page. Do this by creating an Excel spreadsheet listing every page of your site. This is easy to do with Excel and Firefox.
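If you would rather generate that page list programmatically, here is a minimal sketch in Python, assuming the site publishes a standard sitemap.xml (the file name and column headings are my own illustration, not part of any official tool):

```python
import csv
import xml.etree.ElementTree as ET

# Standard sitemap namespace defined by the Sitemaps protocol.
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text: str) -> list[str]:
    """Extract every page URL (<loc> element) from a sitemap.xml document."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(f"{SITEMAP_NS}loc")]

def write_audit_sheet(urls: list[str], path: str) -> None:
    """Start a content-audit spreadsheet: one row per page, blank review columns."""
    with open(path, "w", newline="") as fh:
        writer = csv.writer(fh)
        writer.writerow(["URL", "Needs rewrite?", "Notes"])
        for url in urls:
            writer.writerow([url, "", ""])
```

The resulting CSV opens directly in Excel, giving you the same one-row-per-page audit sheet the step above describes.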
2. In your content audit, go over all the pages.
Find the pages that are not quite right and refine them. Mark every page in the spreadsheet that needs changes. Remove the extra keywords that don't make sense, remove internal links that don't make sense, and get this cleanup into gear. Rewrite entire pages if necessary. Remember, you have built up a not-so-solid reputation, and it takes great content to overcome a hit from Panda.
3. Meta Tags, Meta Tags, Meta Tags
Of course, what kind of content audit would it be if you didn't examine all of the meta tags on each page? Examine your title tags and meta descriptions to ensure these items are of the highest quality.
Typically, if you're not already aware, you want your title to be between 50 and 60 characters and your meta description to be around 150 characters or less, so that neither gets cut off on search results pages. I hate even mentioning the meta keywords tag at this point (Google stopped using it years ago), so don't spend any real time on it. Take a couple of seconds to add a couple of keywords, leave it at that, and never think about it again.
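To check those lengths at scale during the audit, here is a minimal sketch using Python's standard-library HTML parser. The 50-60 and roughly-150 character limits are the rules of thumb mentioned above, not hard constraints from Google:

```python
from html.parser import HTMLParser

class MetaAudit(HTMLParser):
    """Collect the <title> text and meta description from an HTML page."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and a.get("name", "").lower() == "description":
            self.description = a.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def audit(html: str) -> list[str]:
    """Flag title/description lengths against the rules of thumb above."""
    parser = MetaAudit()
    parser.feed(html)
    issues = []
    if not 50 <= len(parser.title) <= 60:
        issues.append(f"title is {len(parser.title)} chars (aim for 50-60)")
    if len(parser.description) > 150:
        issues.append(f"description is {len(parser.description)} chars (keep it near 150 or less)")
    return issues
```

Run `audit()` over each page in your spreadsheet and paste any flags into the notes column.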
These steps should help you recover from Panda if you have been hit. It helps to be as thorough as possible. Throughout your audit you may discover other items that need attention (like your link profile).
But, alas, I digress. I could talk all day about keyword density and the technical ramifications of building it into your SEO strategy. The last word, especially from Matt Cutts: use your keywords naturally, in a way that is readable to people, not just to search engines.
Sprinkle keywords throughout the document naturally, in a way that communicates relevance. Don't keyword spam, and don't keyword stuff. By following these methods, you should be able to clean up your site's reputation and weather whatever updates may occur in the future.
|Last Updated on Monday, 24 June 2013 15:29|