If your website is not user friendly, your targeted keywords may appear on your site, yet your website may still fail to appear in the search engine results pages even when you search using those particular words. This can happen for a number of reasons, including a weak navigation method, heavy use of sophisticated web technologies, and a rigid site design.
Another way to boost the user friendliness of your website is to make sure that there are no errors in the HTML code. Whether you are using a WYSIWYG web editing tool or keying in the HTML code directly, it is advisable to check for errors using CSS and HTML validators.
The next strategy for making your site more user friendly is to include descriptive text for the images, Flash files and videos on your site, because the search engines are unable to determine their topics from the images and videos alone. The technique is to add text, in the HTML code that places the images, videos or Flash files on the web page, that describes the content for the spiders.
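As an illustration, this describing text is usually supplied through the alt attribute on images, or as fallback text inside an embed element (the file names below are hypothetical):

```html
<!-- Descriptive alt text gives the spider a readable topic for the image -->
<img src="blue-widgets.jpg" alt="Our range of blue widgets for home offices">

<!-- For a Flash file, place a text description inside the element itself;
     browsers that show the Flash ignore it, but spiders can read it -->
<object data="product-tour.swf" type="application/x-shockwave-flash">
  <p>A guided tour of our blue widget product line.</p>
</object>
```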
For those who use a content management system, such as blogging software or a blogging service, a strategy for making a site user friendly is to remove the apparent duplication of content on the website. This situation occurs when automated content management services, such as blogging software and services, generate alternate URLs for the same article, thereby resulting in link dilution. The remedy for this problem is to find a way to disable the particular feature in the blogging service or software that allows one article to be reached through various URLs.
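To make the problem concrete, a blogging system might expose the very same article at several addresses like these (all hypothetical), splitting any inbound links among them:

```
http://www.example.com/2005/06/blue-widgets.html
http://www.example.com/archives/123
http://www.example.com/category/widgets/blue-widgets.html
```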
Do not use image maps without text links that follow.
Image maps are images or graphics in which different parts of the image are hyperlinked to a different page. Often many horizontal or vertical navigation bars are really single images (buttons) that take you to another section of your site when you click on a certain part of the image. There is nothing wrong with having image links. However, if you do, you must be certain you also have text links on each of your pages (especially your home page) to the same pages the image links to.
Search engines get their data through spiders. When a spider finds your web site, it first loads your “robots.txt” file. The “robots.txt” file can be used to tell a spider which (if any) sections and/or pages on your web site it should avoid. Once the spider has finished with your “robots.txt” file, it will most likely move on to your home page. From there it will explore your web site and index all of the pages it finds (usually up to a maximum of a few dozen). The problem is that the spider cannot “see” images, only text. It can only move to another page if there is a text link to it. Therefore, if all your links are images, it will not find the rest of your site and will move on to the next site in its never-ending list.
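A minimal “robots.txt” file, placed in the root of your site, might look like this (the directory names here are made up for illustration):

```
# Apply these rules to all spiders
User-agent: *
# Keep spiders out of areas you do not want indexed
Disallow: /private/
Disallow: /print-friendly/
```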
As you can see, it is crucial to have text links to each main section of your web site on every one of your pages.
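For example, an image-map navigation bar can simply be paired with a row of plain text links to the same pages (file names and coordinates hypothetical):

```html
<img src="navbar.gif" usemap="#mainnav" alt="Site navigation">
<map name="mainnav">
  <area shape="rect" coords="0,0,100,30" href="products.html" alt="Products">
  <area shape="rect" coords="100,0,200,30" href="contact.html" alt="Contact">
</map>

<!-- Text links the spider can actually follow -->
<p><a href="products.html">Products</a> | <a href="contact.html">Contact</a></p>
```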
Things you should not do:
Do not use drop down menus without accompanying text links.
On a similar note, spiders are also unable to follow the links that are in drop down menus. If you use a drop down menu to help the user select which section of your site to go to next, be sure to also use text links somewhere else on your page.
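For instance, a script-driven drop-down menu can be backed up with ordinary anchors elsewhere on the page (page names hypothetical):

```html
<!-- Spiders cannot follow the options in this menu -->
<select onchange="window.location = this.value;">
  <option value="about.html">About Us</option>
  <option value="services.html">Services</option>
</select>

<!-- ...but they will follow these text links -->
<p><a href="about.html">About Us</a> | <a href="services.html">Services</a></p>
```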
Do not use frames.
A cardinal sin in the world of search engines is using frames. Frames became popular a few years ago because they enabled web site developers to easily change content displayed across all of a site's pages by changing only one document. Since then, web site development programs such as Dreamweaver have provided the ability to use templates. Coding techniques, such as server-side includes and global variables, have enabled more developers to change uniform features (such as design, navigation bars, or text footers) by changing only one document.
However, some web site owners have persisted in using frames. This will be detrimental to their search engine efforts. You see, frames start with a frameset. The frameset is simply code that tells the browser which two or more HTML files to display in the browser. The problem comes from the fact that the search engines are able to read the code in the frameset, but can’t follow the code to the actual frames (the HTML files). This causes the frameset pages to receive very poor rankings. If you do choose to use frames, be sure to put optimized content within the noframes tag on each frameset page. Managing this will be very time consuming if you have more than a few pages. My best advice to you is to not use frames.
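A sketch of what such a frameset page could look like, with crawlable fallback content inside the noframes tag (file names and copy hypothetical):

```html
<frameset cols="20%,80%">
  <frame src="menu.html">
  <frame src="content.html">
  <!-- Spiders and frame-incapable browsers read this instead -->
  <noframes>
    <body>
      <h1>Blue Widgets</h1>
      <p>Optimized, keyword-rich copy goes here, along with
      <a href="menu.html">text links</a> to the framed pages
      so the spider can reach the rest of the site.</p>
    </body>
  </noframes>
</frameset>
```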
Do not use dynamic content on pages you want to be indexed.
Dynamic content is content that is generated on the fly from data in a database. It can cause significant problems for the search engine spiders and, in turn, cause your site to be penalized or your dynamic content not to be indexed at all.
Dynamic content is generally used only by larger sites or by experienced developers. So if you have no clue what dynamic content is or how to use databases on your web site, you shouldn’t have any cause for concern. Dynamic content can usually be spotted by looking for a ? or & symbol in a page URL, or a .pl, .cgi, .php, or .asp page extension.
There is no inherent problem in using dynamic content. What you must avoid is using dynamic content on pages you wish to optimize for the search engines. Also, if you are using dynamic content be sure you create a “robots.txt” file on your server in which you tell spiders to stay away from indexing this area of your site.
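A “robots.txt” file along these lines would keep spiders out of a database-driven area of the site (the directory names are hypothetical):

```
User-agent: *
# Block the script and catalog areas that generate dynamic pages
Disallow: /cgi-bin/
Disallow: /catalog/
```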
The reason search engines have problems with dynamic content is quite simple. Again, they are mindless spiders that follow text links. Since dynamic pages are not really “there”, but rather are created on the fly depending on what parameters are placed in the address, a spider could potentially become trapped in a large database-driven site. It would have to index the entire database and would be stuck in a loop until it did so, potentially crashing the site it was on. For this reason, most search engines have disabled the ability to index dynamic content. Google is the only engine that will index dynamic content, although it will only index a few pages before it forces the loop to stop and moves on.
The reason I mention this information on dynamic content is that there are quite a few technically adept people out there who can create wonderful database-driven web sites but do not learn about the marketing side until later. If you are one of them, simply make static copies of the dynamic pages you want to optimize, and be sure to restrict the spiders from the dynamic area through your “robots.txt” file.