
Search Engine Friendly URLs

A URL is an important part of a website, so it pays to make it SEO friendly. Doing so can give your website a boost in the natural search engine results and, in turn, bring many more potential visitors to your site.

There are a couple of things to remember when building SEO friendly URLs:

1. Placing your desired keyword in your URL can improve your ranking, but think carefully before changing an existing URL. Unless you can redirect the old address to the new one (see the sketch after this list), plan on keeping a URL in place for a long time: a changed URL can leave broken links behind, and that will hurt your rankings badly, especially if one of your pages is already ranking well in the search engines.

2. Keywords in the URL will not by themselves guarantee a great ranking, but they will improve your chances. All else being equal, a web page with the keyword in its URL will tend to outrank a page without it, and as a result will potentially get more visits.
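If you do end up moving a page, a 301 (permanent) redirect passes both visitors and spiders from the old address to the new one. A minimal sketch, assuming an Apache server where you can edit the .htaccess file (the file names below are made up for illustration):

    # .htaccess - permanently redirect the old page to its new keyword-rich URL
    Redirect 301 /old-page.html http://www.widget.com/blue-widgets.html

With this in place, anyone requesting the old URL (including a search engine that still has it indexed) is sent to the new one, so the link is not broken.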

Here are guidelines that will be helpful:

  • Keep It Clean
    The shorter and cleaner your URL is, the better as far as search engines are concerned. Long query strings (the kind associated with dynamic URLs) are confusing to search engine spiders. Keeping it short and clean (i.e. http://www.widget.com/4/basic.html vs. http://www.widget.com/cgi-bin/gen.pl?id=4&view=basic ) makes it a heck of a lot easier for a search engine to figure out what your page is about, and helps it distinguish folder names and find tie-ins to keywords. Static URLs are also easier on users and webmasters alike: a dynamic URL full of parameters is easy for a user to forget and easy for a webmaster in a hurry to mistype. On top of that, a parameter-laden URL makes it much easier for a potential hacker to see how a site is structured and then go in and create mischief. There are methods for rewriting URLs that allow dynamic URLs to be "cloaked" behind cleaner ones (see the sketch after this list), and a static URL won't be affected by changes in programming language (i.e. Perl vs. Java) from one page to another.
  • Mind File Extensions
    URLs that end in .html are favored by search engines. There are exceptions to this so make sure that you know your system well and that it’s friendly with search engine requirements.
  • Spammy URLs
    If you still have doubts about your URL, go to a resource like seomoz.com. It offers spam detection tools that will help weed out things like spam words, and it gives tips on hyphens, subdomain depth, domain length and digits. Again, all of this information is freely available online.
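As mentioned under "Keep It Clean", dynamic URLs can be "cloaked" behind cleaner, static-looking ones. A minimal sketch, assuming an Apache server with mod_rewrite enabled and using the example URLs above (the script name and parameters are only illustrative):

    # .htaccess - serve the clean URL /4/basic.html from the underlying dynamic script,
    # so visitors and spiders only ever see the short, static-looking address
    RewriteEngine On
    RewriteRule ^([0-9]+)/basic\.html$ /cgi-bin/gen.pl?id=$1&view=basic [L]

Other servers have their own rewriting mechanisms, but the idea is the same: the clean URL is what appears in links and search results, while the dynamic script does the work behind the scenes.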

To review:

  • Keep them short, keep them clean.
  • Keep the underscores out and replace them with dashes.
  • Use lower-case to avoid case-sensitive dilemmas.
  • Keep as many parameters as possible out of dynamic URLs. Not only is it better from a search engine standpoint, it also makes it more difficult for the site to be hacked.

How to Make Your Website Search Engine Friendly

If your website is not search engine friendly, your targeted keywords may appear on your pages and yet your site may still not show up in the search engine results, even when you search using those particular words. This can happen for a number of reasons, including a weak navigation scheme, heavy use of sophisticated web technologies, and a rigid site design.

An important strategy for making a site search engine friendly is to provide a good navigation scheme. One of the complex web technologies that can make your site effectively invisible to search engines is the use of JavaScript-generated links for navigating to the other pages of the website. Search engines cannot find their way to those pages because they cannot interpret JavaScript, and the same problem applies to links embedded in Flash files. The solution is to include simple HTML links in addition to the links inside the JavaScript and Flash files.
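For example, a page whose navigation is generated by JavaScript might also carry a set of plain HTML links to the same pages. A rough sketch (the page names are made up):

    <!-- JavaScript-driven navigation that spiders cannot follow -->
    <script type="text/javascript">
      function go(page) { window.location.href = page; }
    </script>
    <a href="javascript:go('products.html')">Products</a>

    <!-- plain HTML links to the same pages, which spiders can follow -->
    <p>
      <a href="products.html">Products</a> |
      <a href="services.html">Services</a> |
      <a href="contact.html">Contact</a>
    </p>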

Another way to boost the search engine friendliness of your website is to make sure that there are no errors in the HTML code. Whether you are using a WYSIWYG web editing tool or keying in the HTML code directly, it is advisable to check for errors using HTML and CSS validators.

The next strategy is to include text for the images, Flash files and videos on your site, because the search engines cannot determine their topics from the images and videos alone. The technique is to include text in the HTML code that places the images, videos or Flash files on the page, describing their content for the spiders.
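A simple way to do this is descriptive alt text on images and fallback text inside the tags that embed Flash or video. A sketch (the file names are placeholders):

    <!-- descriptive alt text tells the spider what the image shows -->
    <img src="blue-widget.jpg" alt="Blue widget with chrome finish" />

    <!-- fallback text inside the object tag describes the Flash movie -->
    <object data="widget-demo.swf" type="application/x-shockwave-flash" width="400" height="300">
      <p>Video demonstration of the blue widget being assembled.</p>
    </object>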

For those who use a content management system, such as blogging software or a blogging service, another strategy is to remove the apparent duplication of content on the website. This situation occurs when automated content management tools generate alternate URLs for the same article, resulting in link dilution. The remedy is to find a way to disable the feature in the blogging service or software that allows one article to be reached through several different URLs.

Do not use image maps without text links that follow.
Image maps are images or graphics in which different parts of the image are hyperlinked to different pages. Often, horizontal or vertical navigation bars are really single images (buttons) that take you to another section of your site when you click on a certain part of the image. There is nothing wrong with having image links. However, if you use them, you must be certain you also have text links on each of your pages (especially your home page) to the same pages the image links to.
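For instance, an image-map navigation bar can be paired with ordinary text links elsewhere on the page. A sketch (the image, coordinates and page names are made up):

    <!-- graphical navigation bar implemented as an image map -->
    <img src="navbar.gif" alt="Site navigation" usemap="#nav" />
    <map name="nav">
      <area shape="rect" coords="0,0,100,30" href="products.html" alt="Products" />
      <area shape="rect" coords="100,0,200,30" href="about.html" alt="About Us" />
    </map>

    <!-- text links to the same pages so spiders (and all visitors) can follow them -->
    <p><a href="products.html">Products</a> | <a href="about.html">About Us</a></p>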

Search engines get their data through spiders. When a spider finds your web site, it first loads your "robots.txt" file. The "robots.txt" file can be used to tell a spider which sections and/or pages on your web site (if any) it should avoid. Once the spider has finished with your "robots.txt" file, it will most likely move on to your home page. From there it will explore your web site and index the pages it finds (usually up to a maximum of a few dozen). The problem is that the spider cannot "see" images, just text. It can only move to another page if there is a text link to it. Therefore, if all your links are images, it will not find the rest of your site and will move on to the next site in its never-ending list.

As you can see, it is crucial to have text links to each main section of your web site on every one of your pages.
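For reference, a "robots.txt" file is just a plain text file placed in the root of your site. A minimal example (the directory names are hypothetical) that lets spiders crawl everything except a couple of areas:

    # robots.txt - placed at http://www.widget.com/robots.txt
    User-agent: *
    Disallow: /cgi-bin/
    Disallow: /admin/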

Here are some more things you should not do:

Do not use drop-down menus without accompanying text links.
On a similar note, spiders are also unable to follow links that are inside drop-down menus. If you use a drop-down menu to help the user select which section of your site to go to next, be sure to also place text links somewhere else on your page.
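For example, a form-based drop-down menu can sit alongside plain text links to the same sections. A sketch (the page names are made up):

    <!-- drop-down navigation that spiders cannot follow -->
    <select onchange="window.location.href = this.value;">
      <option value="">Choose a section...</option>
      <option value="products.html">Products</option>
      <option value="support.html">Support</option>
    </select>

    <!-- text links to the same sections elsewhere on the page -->
    <p><a href="products.html">Products</a> | <a href="support.html">Support</a></p>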

Do not use frames.
A cardinal sin in the world of search engines is using frames. Frames became popular a few years ago because they enabled web site developers to easily change content displayed across all of a site's pages by changing only one document. Since then, web site development programs such as Dreamweaver have provided the ability to use templates, and coding techniques such as server side includes and global variables have enabled more developers to change uniform features (such as design, navigation bars, or text footers) by editing a single document.

However, some web site owners have persisted in using frames, and this will be detrimental to their search engine efforts. Frames start with a frameset. The frameset is simply code that tells the browser which two or more HTML files to display in the browser window. The problem is that the search engines can read the code in the frameset, but can't follow it through to the actual frames (the HTML files). This causes frameset pages to receive very poor rankings. If you do choose to use frames, be sure to put optimized content within the <noframes> tag on each frameset page (see the sketch below). Managing this will be very time consuming if you have more than a few pages. My best advice to you is to not use frames.
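If you do use frames anyway, the <noframes> content is what spiders and frame-less browsers will read. A rough sketch of a frameset page (the file names are made up):

    <frameset cols="200,*">
      <frame src="menu.html" name="menu" />
      <frame src="content.html" name="content" />
      <!-- content that search engines and non-frame browsers will see -->
      <noframes>
        <body>
          <h1>Blue Widgets from Widget.com</h1>
          <p>Optimized descriptive text here, plus
             <a href="menu.html">text links</a> to the framed pages.</p>
        </body>
      </noframes>
    </frameset>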

Do not use dynamic content on pages you want to be indexed.
Dynamic content is content that is generated on the fly from data in a database. It can cause significant problems for search engine spiders and, in turn, cause your site to be penalized or your dynamic content not to be indexed at all.

Dynamic content is generally only found on larger sites or sites built by experienced developers, so if you have no clue what dynamic content is or how to use databases on your web site, you shouldn't have any cause for concern. Dynamic content can usually be spotted by looking for a ? or & symbol in the page URL, or a .pl, .cgi, .php, or .asp page ending.

There is no inherent problem in using dynamic content. What you must avoid is using dynamic content on pages you wish to optimize for the search engines. Also, if you are using dynamic content, be sure to create a "robots.txt" file on your server in which you tell spiders to stay away from that area of your site (a Disallow line like the one in the robots.txt example earlier).

The reason search engines have problems with dynamic content is quite simple. Again, spiders are mindless programs that follow text links. Since dynamic pages are not really "there" but are created on the fly depending on what parameters appear in the address, a spider could potentially become trapped in a large database-driven site: it would have to index the entire database and would be stuck in a loop until it did so, potentially crashing the site it was on. For this reason, most search engines have disabled the ability to index dynamic content. Google is the only engine that will index dynamic content, although it will only index a few pages before it forces the loop to stop and moves on.

The reason I mention this information on dynamic content is that there are quite a few technically adept people out there who can build wonderful database-driven web sites but do not learn about the marketing side until later. If you are one of them, simply make static copies of the dynamic pages you want to optimize, and restrict the spiders from the dynamic area through your "robots.txt" file.

Do not place JavaScript above your meta tags
Search engines often have trouble reading meta tags placed after JavaScript on web pages. If your web site uses JavaScript, be sure to place all of your meta tags above the JavaScript code.
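In practice this just means keeping the head section ordered so that the title and meta tags come before any script. A minimal sketch (the tag contents are only illustrative):

    <head>
      <title>Blue Widgets | Widget.com</title>
      <!-- meta tags first, where spiders expect to find them -->
      <meta name="description" content="Hand-built blue widgets shipped worldwide." />
      <meta name="keywords" content="blue widgets, widget store" />

      <!-- JavaScript afterwards - or better still, in an external .js file -->
      <script type="text/javascript" src="menu.js"></script>
    </head>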

In general, having JavaScript in your source code will make it harder for the search engines to find what they are really looking for: the text on your page. If possible, do not use JavaScript on your optimized pages, and if you must, do so sparingly.