Not too long ago, the most relevant result would have been the web page that contained the most instances of the keyword (search term). It didn't take long for some people to notice that if they filled a page with that keyword, the page would be smiled upon by Google. The result was thousands upon thousands of web pages generated by software that repeated the same keyword over and over, the rest of the text being practically nonsense. Here is an example using the keyword knitting:
“Knitting is popular. You can get information on knitting on the internet, and there are many sites dealing with the subject of knitting. If you want information on knitting, the information superhighway is full of experts on knitting, and there are even lots of sites where you can purchase knitting that meets your requirements.”
Meaningless, and providing no information whatsoever. The software could substitute any other keyword for knitting, and the text would make the same sense, or lack of it. However, it was great spider food. The spiders loved it, and many people made fortunes on AdSense using such software. However, Google found out and changed the rules. It now uses what are commonly called LSI (latent semantic indexing) techniques to judge the relevance of each web page to the keyword.
It is not a true use of LSI, but that is what it is called, so let's stick with it. The Google algorithm looks at the keyword and decides what other words would be expected to appear if a human were writing about that topic. For wool, it would be looking for sheep, knitting, knitwear, stitches, and so on. If these words do not appear, your page could be ranked lower. Nor is the algorithm looking for any specific keyword density: even 1% could be too high if the keyword would not be expected to appear that often in normal speech or writing.
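As a rough illustration of the idea (not Google's actual method, which is not public), here is a toy Python sketch that scores a page by how many hand-picked related terms it contains. The term list and the scoring are my assumptions for demonstration; real latent semantic indexing derives related terms statistically from large document collections.

```python
# Toy sketch: score a page by its coverage of related terms.
# The term list is hand-picked for illustration; real LSI derives
# such associations statistically from large document collections.
import re

RELATED_TERMS = {"sheep", "knitting", "knitwear", "stitches", "yarn"}

def related_term_coverage(page_text: str) -> float:
    """Return the fraction of the related terms that appear in the page."""
    words = set(re.findall(r"[a-z]+", page_text.lower()))
    return len(RELATED_TERMS & words) / len(RELATED_TERMS)

page = "Our wool comes from local sheep and is ideal for knitting."
print(f"Coverage: {related_term_coverage(page):.0%}")  # -> 40%
```

A page about wool written by a human would naturally score well on a check like this, while software-generated keyword spam would not.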
The algorithm looks for these related terms, so you need only a few placements of the keyword itself: as the title of your page in H1 tags; in the headline of your article, say, in H2 tags; once in the first 100 characters of the text; and once in the last paragraph. That should be enough. Perhaps once more every 400 words, but no more than that. To see why, consider a 400-word article held at a 3% keyword density: that is 12 occurrences of the keyword, which would be regarded as keyword stuffing.
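To make that arithmetic concrete, here is a minimal Python sketch of a keyword-density check. The function name and the single-word-keyword assumption are mine, for illustration only; no search engine publishes the thresholds it actually uses.

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Percentage of words in `text` that equal `keyword`.

    Assumes a single-word keyword; phrases would need n-gram counting.
    """
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for word in words if word == keyword.lower())
    return 100.0 * hits / len(words)

# The worked example from the article: a 400-word page held at a
# 3% density contains 0.03 * 400 = 12 occurrences of the keyword.
print(int(0.03 * 400))  # -> 12
```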
If you stick to these guidelines and use plenty of text related to your keyword, your page should be regarded as valuable to anybody using that keyword as their search term, and it should rank high in the search results for it.
The reason search engines no longer reward a high keyword density is that they have wised up to those who tried to take advantage.
Do you want to learn more about how I do it? I have just completed my brand new guide to generating massive traffic, 'Triple Your Traffic Fast'.
Download it free here: Triple Your Traffic Fast
Download a free article marketing guide here: Secrets of Article Writing
Do you want to learn how to build a massive list fast? Click here: Email List Building
Raymond Nesa is an experienced web marketer specializing in article marketing, traffic generation, and list building.