Keyword Density is a Myth and Should Stay in 2003
Written by Brian Harnish
Monday, 24 June 2013 11:50

Readers of Search News Central are no strangers to the many myths that continue to perpetuate themselves throughout the SEO industry. In this article I'd like to take a more in-depth look at one of them: keyword density. First, I'd like to introduce Matt Cutts, who has a few things to say about keyword density :)

Click through to watch Matt Cutts' video on keyword density (the embed here is broken).

Next, we need to go back in time to around 2003. Back in that heyday, nearly everyone used keyword density as an SEO strategy: it was (kind of) acceptable to repeat keywords on a page many times over in order to increase rankings across many different search engines.
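For context, the old metric was trivially simple to compute, which is part of why it was so easy to abuse. Here is a rough, illustrative sketch of the calculation as it was typically understood (keyword occurrences divided by total word count); it is not anything Google actually uses:

```python
import re
from collections import Counter

def keyword_density(text: str, keyword: str) -> float:
    """Return the keyword's share of total words, as a percentage."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    count = Counter(words)[keyword.lower()]
    return 100.0 * count / len(words)

sample = "Buy widgets here. Our widgets are the best widgets for widget lovers."
density = keyword_density(sample, "widgets")  # 3 of 12 words -> 25.0
```

Note that "widget" and "widgets" count separately here; crude accounting like this is one more reason a raw density number says little about relevance.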

While pursuing such a strategy left many documents unreadable and reeking of low quality, most search engines, including Google, didn't have much in place to defend against the tactic. That changed when the Florida update hit in November of '03. This presented a problem: SEOs could no longer repeat keywords liberally on a page as an easy way to increase rankings. Cramming in as many keyword repetitions as possible became known as "over-optimization."

So why is focusing on keyword density no longer a valid SEO strategy? Let's look at the main reasons.

1. User Experience

When was the last time anyone wanted to read copy that repeated a word over and over without providing any additional context or information for the reader? This practice tends to produce machine-readable text that is boring and teeters on the edge of spam, depending on the intentions of the person writing it. If the intent is to stuff the document with as many keywords as possible in order to maximize its keyword density, then Google is really going to have a problem with it.

Instead, to enhance the user experience, the best thing to do is to integrate keywords into the document naturally: sprinkle them throughout and use them in places where context and relevance matter. It's all about relevance now, rather than keyword quantity.

Targeting relevance enhances many different aspects of the document, which in turn enhances readability. Enhancing relevance and readability increases usability, which is yet another ranking factor. Good UX is essential to making sure a reader is happy and satisfied that the document they searched for has answered their query.

2. User Intent

Targeting relevance also improves how documents are matched to user intent for a given query. By targeting relevance and using keywords appropriately, you improve the likelihood that your document will come up as a match for what the user actually meant. Subtle variations of a keyword phrase entered into the search engines may be treated as the same phrase.

But if a phrase carries a different user intent, a different set of results will come up based on that intent. It's all based on user intent and relevance. That's why Google continues to fight spam in any and all of its forms: spam does nothing to improve the Google user experience, and queries that return junk results only annoy users of the search engine. They will skip over your result and go to the next one that better matches their query.

If you want to improve the UX around your website, make sure you also create documents that match the intent behind what users are actually searching for.

3. Readability

Another reason keyword density should not be paid much attention is that adding keywords to a document at random destroys readability, which is itself an aspect of user experience. If only a machine can read the document, why would any human audience bother reading further? This is why Google is going after keyword spam: content that focuses solely on keyword density as a way to increase rankings will simply be filtered out of the search results by Panda. Delivering readable documents with quality user experiences is the only way Google can return results that properly match their users' queries.

It's a fine line between integrating keywords and sliding back into the old tactic of cramming as many keywords as possible into a document. As Matt Cutts says, find someone who can read your document for you. They should be able to tell you, "Hey, wait a minute, this sounds totally artificial. You should rewrite it so it sounds more natural."

4. Spam, spam, and more spam

The biggest reason is that documents laced with hundreds of keyword repetitions look like spam, so much so that they become unreadable for a normal human being. There is a reason Google wants to eliminate spam from its index: it makes users angry, irritable, and just plain disgusted with the results. If users stumble across enough spam in an index, they will leave it for another one. That's why it's important to avoid looking like spam.

If you have found that your sites have been hit by the latest Panda iteration, here are a few steps you can take to recover. I present: the three-step content audit (believe me, the information you will uncover is priceless).

1. Examine the content on your site, and perform a content audit of every page. Do this by creating an Excel spreadsheet listing every page of your site.

This is very easy to do with Excel and Firefox:

  1. First, create a new, blank folder.
  2. Using your favorite FTP program, download your site (only the HTML/PHP/ASP/whatever extension pages in the root directory) to this folder.
  3. Using Windows Explorer (this works just fine in Windows 7), navigate to the folder you created and downloaded your site to, and grab the exact location from the address bar at the top. Left-click it (this should turn it into an address), highlight it, and copy the path, either by right-clicking and choosing Copy or by pressing Ctrl+C while it is highlighted.
  4. Bring up Firefox.
  5. Paste the path to the folder in Firefox by either right-clicking in the address bar and left-clicking on paste, or use your keyboard's ctrl + v function to paste it.
  6. Hit enter. Firefox should open up a page that displays the entire contents of this folder in links.
  7. Highlight everything, and copy the highlighted selection.
  8. Bring up Excel and create a new spreadsheet.
  9. Click Edit > "Paste Special" in Excel, and paste as text.
  10. This should paste everything into three or four columns.
  11. Get rid of the three columns on the right.
  12. Using Find & Replace, enter File: into the Find field and http://www.yoursiteurl.com/ into the Replace field, then click Replace All.
  13. The easiest, most efficient way to turn the list into clickable links is to send it to yourself in Outlook. (Depending on how many links you have, it may not create them all.) Send the entire list to yourself, and Outlook will create the links automatically. Now you can open each page of your site individually with a click and perform the content audit.
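If you'd rather not round-trip through Firefox, Excel, and Outlook, the steps above can also be sketched as a short script. This is an illustrative alternative, not part of the original process; the folder name, site URL, and CSV layout below are placeholders you'd swap for your own:

```python
import csv
from pathlib import Path

SITE_ROOT = "http://www.yoursiteurl.com/"  # same placeholder URL as in step 12
LOCAL_COPY = Path("site_download")         # folder your FTP program downloaded into
PAGE_TYPES = {".html", ".htm", ".php", ".asp"}

def build_url_list(folder: Path, site_root: str) -> list[str]:
    """Map each downloaded page file in the root directory to its live URL."""
    return sorted(
        site_root + path.name
        for path in folder.iterdir()
        if path.suffix.lower() in PAGE_TYPES
    )

# Write the audit spreadsheet as CSV, one row per page.
if LOCAL_COPY.is_dir():
    with open("content_audit.csv", "w", newline="") as fh:
        writer = csv.writer(fh)
        writer.writerow(["URL", "Needs changes?", "Notes"])
        for url in build_url_list(LOCAL_COPY, SITE_ROOT):
            writer.writerow([url, "", ""])
```

Open content_audit.csv in Excel and you have the same audit spreadsheet, links included.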

 

2. In your content audit, go over all the pages.

Find the pages that are not quite right and refine them. It's easy to do by marking every page in the spreadsheet that needs changes. Remove all of the extra keywords that don't make sense, and remove internal links that don't make sense. Get this cleanup into gear: rewrite entire pages if necessary. Remember, you have built up a not-so-solid reputation, and it takes great content to overcome a hit by Panda.

3. Meta Tags, Meta Tags, Meta Tags

Of course, what kind of content audit would it be if you didn't examine all of the meta tags on each page? Examine your title tags and meta descriptions to ensure these items are of the highest quality.

Typically, if you're not already aware, you want your title to be between 50 and 60 characters and your meta description to be around 150 characters or less so it does not get cut off in search results pages. I hesitate to even mention the meta keywords tag at this point (Google doesn't use it, and it died years ago), and spending much time on it is worthless. Take a couple of seconds to add a couple of keywords, leave it at that, and never think of it again.
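Those character limits are easy to check mechanically during the audit. A minimal sketch follows; the thresholds are the rough guidelines from this article, and actual search-result truncation is based on pixel width rather than character count, so treat them as rules of thumb:

```python
def check_meta(title: str, description: str) -> list[str]:
    """Flag titles and meta descriptions outside the rough length guidelines."""
    warnings = []
    if not 50 <= len(title) <= 60:
        warnings.append(f"title is {len(title)} chars (aim for 50-60)")
    if len(description) > 150:
        warnings.append(f"description is {len(description)} chars (aim for 150 or less)")
    return warnings

# Example: a too-short title and an overlong description both get flagged.
problems = check_meta("Short title", "x" * 160)
```

Run this over each row of your audit spreadsheet and you have a quick pass/fail column for step 3.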

These steps should help you recover from Panda if you have been hit. It helps to be as thorough as possible. Throughout your audit you may discover other items that need attention (like your link profile).

But, alas, I digress. I could talk all day about keyword density and the technical ramifications of building it into your SEO strategy. The last word, especially from Matt Cutts: use your keywords naturally, in a way that is readable to people, not just search engines.

Sprinkle keywords throughout the document naturally, in a way that communicates relevance. Don't keyword spam, and don't keyword stuff. By following these methods, you should be able to clean up your site's reputation and avoid being hit by future updates.

Brian Harnish -

Building websites is Brian Harnish's passion. He's been building websites since the summer of 1998, when he taught himself HTML, took a college Photoshop class to create the graphics, and built his first real website. He thoroughly enjoys reading almost anything and everything he can get his hands on regarding SEO, search marketing, and web design & development.

Last Updated on Monday, 24 June 2013 15:29
 
