Tuesday, February 24, 2009

Search Engine Ranking Mistakes To Avoid

by Pierre Basson

1. Optimizing a website for non-performing key phrases

The first step in any search engine optimization campaign is to select the best keyword phrases for which to optimize your website. If you choose the wrong ones, all the time and effort spent getting the website to rank highly will be wasted. If no one is searching for the keyword phrases you chose, or if they don't attract relevant, targeted visitors to your website, then top rankings will bring you very little benefit.

2. Putting too many key phrases in the Meta Keywords tag

Sites that enter hundreds of keyword phrases in the Meta Keywords tag (hidden at the top of the page), in the hope that covering every possible phrase will win a better ranking for each of them, are wasting their time. It won't help. Despite popular opinion, the Meta Keywords tag has hardly any importance any more where search engine positioning is concerned. Consequently, simply inserting keyword phrases into the Meta Keywords tag will not win a site a better ranking.

3. Using the same key phrases over and over

Another typical error is to repeat target keyword phrases over and over in the main text of pages and in the Meta Keywords tag. Because so many sites have tried this tactic, and many still do, the search engines watch for it constantly. They can penalise a website that repeats keyword phrases in this way, a practice called keyword stuffing. Simply repeating the keyword phrases again and again has ceased to be effective.
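The stuffing problem can be made concrete with a simple density check. Below is a minimal sketch (plain Python; the example page texts and the density measure are my own invention, not any engine's actual formula) that counts how much of a page's copy is taken up by one phrase. A figure far above a few percent is exactly the kind of repetition engines flag.

```python
import re

def keyword_density(text: str, phrase: str) -> float:
    """Percentage of the words in `text` accounted for by
    occurrences of `phrase` (a simple illustrative measure)."""
    words = re.findall(r"[a-z']+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    hits = sum(
        1 for i in range(len(words) - n + 1)
        if words[i:i + n] == phrase_words
    )
    return 100.0 * hits * n / len(words) if words else 0.0

stuffed = "cheap flights cheap flights book cheap flights now cheap flights"
natural = "Compare fares and book cheap flights with our simple search tool"
print(keyword_density(stuffed, "cheap flights"))  # 80.0 -- obvious stuffing
print(keyword_density(natural, "cheap flights"))  # far lower
```

The stuffed example devotes 80% of its words to the phrase; natural copy mentioning the phrase once sits well below that.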

4. Adding lots of similar doorway pages

Another myth is that because each search engine's ranking formula varies, you must make different versions of your pages for different search engines. While this sounds reasonable in theory, it is counter-productive in practice. Anyone using this method will quickly accumulate hundreds of additional pages, which become a problem to manage. Furthermore, although the pages are meant for different search engines, they all wind up being almost the same. Search engines are often capable of detecting when a website has built such cookie-cutter doorway pages, and may penalise or even ban the website from their index as a result. Instead of building a separate page for each search engine, create one page optimized for one keyword phrase for all the search engines together.
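A rough sketch of why near-identical doorway pages are easy to detect: even a naive word-overlap measure scores two lightly reworded pages as close duplicates. The Jaccard similarity used here is my illustration, not any engine's actual algorithm, and the page texts are invented.

```python
import re

def jaccard(a: str, b: str) -> float:
    """Jaccard similarity between the word sets of two page texts."""
    sa = set(re.findall(r"[a-z]+", a.lower()))
    sb = set(re.findall(r"[a-z]+", b.lower()))
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

# Two hypothetical doorway pages, "tuned" for different engines:
page_a = "Buy blue widgets online. Our blue widgets ship fast."
page_b = "Buy blue widgets online. Our blue widgets ship quickly."
print(jaccard(page_a, page_b))  # 0.75 -- a strong duplicate signal
```

Swapping one word leaves the pages three-quarters identical by this crude measure; real duplicate detection is far more sophisticated, so such pages stand even less chance of passing unnoticed.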

5. Using Hidden Text

Hidden text is text set in the same color as the background of the web page, so it is invisible to viewers of the page. For example, if the background of a page is white and white text is added to it, that text is concealed and considered hidden text. Many web designers, in order to get high rankings in the search engines, used to fill their pages with these extra concealed keywords. There is, after all, a limit to how many additional keyword phrases can be worked into a page before it starts to read unnaturally to human visitors.

So, to conceal the extra keywords from human visitors while keeping the page keyword-rich, many web designers added keyword phrases in the same color as the background.

This way the search engines can index the keyword phrases, but human visitors cannot see them. The search engines have long since caught on to this technique, and penalise pages that contain such text. They may even penalise an entire website if a single page in it has such concealed text.

The issue is that search engines sometimes end up penalising sites that never intended to use concealed text. For example, suppose a web page has a white background and contains a table with a black background, and white text is placed inside that table. Although the text is perfectly visible to human visitors, some search engines may interpret it as concealed text, ignoring the fact that the background of the table is actually black.
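The false positive described above comes from comparing the text colour to the wrong background. A minimal sketch (invented colour values and function names, not any engine's real logic) of the difference between the naive check and a context-aware one:

```python
def naive_hidden(text_color: str, page_bg: str) -> bool:
    """Naive check: compare text colour to the page background only."""
    return text_color.lower() == page_bg.lower()

def context_aware_hidden(text_color: str, nearest_bg: str) -> bool:
    """Correct check: compare against the nearest enclosing background."""
    return text_color.lower() == nearest_bg.lower()

# White text inside a black table on a white page:
print(naive_hidden("#FFFFFF", "#ffffff"))          # True  -> false positive
print(context_aware_hidden("#FFFFFF", "#000000"))  # False -> text is visible
```

The naive check flags perfectly readable text as hidden, which is exactly the mistaken penalty the paragraph above warns about.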

6. Building Web Pages Containing Only Graphics

Search engines only understand text - they don't understand graphics. If a website has a lot of graphic content and very little text content, it is less likely to get a high listing. To improve the rankings, the graphics must be accompanied by keyword-containing text so that the spidering robots can understand what the website is about.
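One practical place to attach keyword text to graphics is the ALT attribute of each image. A small sketch (hypothetical file names; a real audit would use an HTML parser rather than regular expressions) that flags image tags lacking descriptive alt text:

```python
import re

def images_missing_alt(html: str) -> list:
    """Return the <img> tags that have no non-empty alt attribute."""
    imgs = re.findall(r"<img\b[^>]*>", html, flags=re.IGNORECASE)
    return [tag for tag in imgs
            if not re.search(r'alt="[^"]+"', tag, flags=re.IGNORECASE)]

page = '<img src="logo.gif"><img src="widget.jpg" alt="blue widget photo">'
print(images_missing_alt(page))  # ['<img src="logo.gif">']
```

Running a check like this over a graphics-heavy site quickly shows how much of its content is invisible to the spiders.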

7. Ignoring the NOFRAMES Tag When Your Website Uses Frames

Many search engines can't parse frames. For sites built with frames, such search engines only consider what is inside the NOFRAMES tag. Many web designers make the error of putting something like this in the NOFRAMES tag: "This website contains frames, but your browser doesn't support them." For the search engines that can't read frames, this is all the content they will index, which makes the chances of a good listing much smaller. The NOFRAMES tag should instead carry the real textual content of the page.
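To see what a frames-blind engine actually gets, you can extract just the NOFRAMES content and count the indexable words. A sketch with invented markup (a real check would use an HTML parser):

```python
import re

def noframes_words(html: str) -> list:
    """Words inside the NOFRAMES tag -- all a frames-blind engine indexes."""
    m = re.search(r"<noframes>(.*?)</noframes>", html,
                  flags=re.IGNORECASE | re.DOTALL)
    if not m:
        return []
    return re.sub(r"<[^>]+>", " ", m.group(1)).split()

bad = ("<frameset cols='20%,80%'><frame src='nav.html'>"
       "<noframes>This website contains frames, but your browser "
       "doesn't support them.</noframes></frameset>")
good = ("<frameset cols='20%,80%'><frame src='nav.html'>"
        "<noframes><p>Blue widgets, fast shipping, and more: read our "
        "full widget buying guide here.</p></noframes></frameset>")
print(len(noframes_words(bad)), len(noframes_words(good)))
```

The first page offers the engines a ten-word apology; the second at least gives them real copy to index, and in practice the NOFRAMES block should hold far more.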

8. Using Cloaking of Pages

Page cloaking is a tactic of showing different web pages under different conditions. People generally attempt page cloaking for two reasons: a) to conceal the source code of their optimized pages from competitors, and b) to keep human visitors from seeing a page that looks great to the search engines but doesn't necessarily look great to humans. The problem is that when a website is cloaked, the search engines cannot index the same page that visitors will actually see, and if they index something else, they cannot be sure they are providing relevant results. When a search engine discovers that a website is using cloaking, it will most likely blacklist the website from its index permanently.
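Mechanically, cloaking usually means serving different content based on who is asking. A hedged sketch of the idea (invented crawler tokens and page bodies; shown only to illustrate what the engines penalise, not as a recommendation):

```python
# Typical user-agent cloaking: return one page to crawlers, another to people.
CRAWLER_TOKENS = ("googlebot", "slurp", "bingbot")  # illustrative list only

def serve_page(user_agent: str) -> str:
    """Return different HTML depending on the requesting user agent."""
    if any(tok in user_agent.lower() for tok in CRAWLER_TOKENS):
        return "<html>keyword-dense page built for crawlers</html>"
    return "<html>visual page built for people</html>"

print(serve_page("Googlebot/2.1"))
print(serve_page("Mozilla/5.0 (Windows NT 10.0)"))
```

Because the crawler never sees what visitors see, the engine's index no longer reflects reality, which is precisely why this is treated as deception.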

9. Utilizing Pre-Programmed Submission Tools

To save time, many webmasters use automatic submission software or services to submit their sites and pages to the search engines. Submitting a website to the search engines manually takes a lot of time, and automatic submission tools reduce that time. But search engines dislike automatic submissions and may sometimes blacklist sites that use them, although this may change as such software becomes more common.

10. Submitting too often

Submitting too many pages at a time to the search engines can be a problem, and may trigger the search engines' algorithms to ignore most of the submitted pages. Submitting one page per day is a safe pace: while some search engines will accept more than one page per day from a domain, a few will only accept one.

About the Author

Pierre Basson is a prolific blog writer. Visit www.rankatlanta.com to see more helpful articles and blogs.