In part 2 of our series about SEO best practices, we discussed optimizing the different types and formats of content you might be using on your website. We talked about crawl and indexation issues, touched briefly on issues specific to domain names, and explored the strategy behind planning the architecture of the information found within your website. Let’s continue with part 3, the final piece of the SEO best practice puzzle.
Carefully choose a keyword to target for each of your web pages. Do not target a keyword based on search volume alone; your keywords need to be relevant to the page you are optimizing. If you target the wrong keyword for a page, you run the risk of a higher bounce rate, which can impact your position in the search engines.
The positioning of a keyword within your web page is important. Your keyword should appear:

- once in your title tag
- once in your Meta description tag
- once in your H1 heading tag
- two to three times within the body of your page, with one occurrence close to the H1 tag and one occurrence emphasized using a bold tag

Finally, be certain to have at minimum one internal link pointing to this page that uses the keyword within the anchor text.
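Put together, these placement rules look something like the following sketch, where the keyword “blue widgets” and all URLs are placeholders:

```html
<head>
  <!-- Keyword once in the title tag -->
  <title>Blue Widgets | Example Store</title>
  <!-- Keyword once in the Meta description tag -->
  <meta name="description" content="Browse our selection of blue widgets, in stock and ready to ship.">
</head>
<body>
  <!-- Keyword once in the H1 heading tag -->
  <h1>Blue Widgets</h1>
  <!-- One body occurrence close to the H1 -->
  <p>Our blue widgets are built to last.</p>
  <!-- One body occurrence emphasized with a bold tag -->
  <p>Every <b>blue widgets</b> order ships with a two-year warranty.</p>
</body>
```

Elsewhere on the site, an internal link such as `<a href="/blue-widgets/">blue widgets</a>` supplies the keyword-rich anchor text.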
Using a keyword too many times within your page is called keyword stuffing. If the page content does not read naturally to a visitor, that is a problem. Be sure to use proper sentence structure and correct grammar within the content of your pages.
Every link that you place on a web page dilutes the PageRank that the page passes through each of its other links. Keep the number of links on a single page to a reasonable number, and don’t have more than 100 links on a single page.
Lists of Links
Any time you list several links consecutively on your web page, use HTML or CSS list markup to provide structure to the list of links; otherwise those links might be devalued or discounted by the search engines.
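For example, a block of related links might be marked up as an unordered list (the class name and URLs here are placeholders):

```html
<ul class="related-links">
  <li><a href="/guides/">Guides</a></li>
  <li><a href="/tutorials/">Tutorials</a></li>
  <li><a href="/faq/">FAQ</a></li>
</ul>
```

CSS can then lay the list out horizontally or remove the bullets without sacrificing the structural markup.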
Rate of Building Links
A higher than usual rate of new backlinks pointing to a website or web page is not necessarily a negative issue. Sometimes websites can gain lots of new backlinks immediately after publishing something controversial or hugely popular on their website. The rate at which your website gains new backlinks only becomes a problem if the links are coming from low quality or topically unrelated websites.
Using relative URLs is common practice within the developer community. Unfortunately, relative URLs can cause problems, because every internal link is written without <http://www.yourdomain.com/> at the beginning of the link URL. With regard to search engine optimization, it is always best practice to use absolute URLs when linking to anything.
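The difference looks like this (the domain and paths are placeholders):

```html
<!-- Relative: resolved against the URL of the page it appears on -->
<a href="../products/widgets.html">Widgets</a>

<!-- Absolute: unambiguous no matter where the page is served from -->
<a href="http://www.yourdomain.com/products/widgets.html">Widgets</a>
```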
If you link to the same URL twice on a single web page, know that only the first link will be counted by the search engines. Repeating the same link with different anchor text is pointless and should be avoided.
Flash files cannot be read by search engines. Don’t use Flash for navigation menus, unless you have an alternate text based navigation menu somewhere else on your web page.
If pull-down forms and submit buttons are the only way to navigate a website, then you can expect that the search engines will not be able to crawl and index all of your web pages. Search engines cannot understand and use forms.
Search Engine Robots
A 302 redirect refers to the “Moved Temporarily” 302 header response code returned by a web server when a 302 redirect has been set up. Search engines do not pass any link value through a 302 redirect, so only use one if your redirect really is temporary.
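On an Apache server with mod_alias enabled, for instance, the two redirect types might be set up in an .htaccess file like this (the paths and domain are placeholders):

```apache
# Temporary: the old URL keeps its link value while /sale/ is offline
Redirect 302 /sale/ http://www.yourdomain.com/holding-page/

# Permanent: link value is passed on to the new URL
Redirect 301 /old-page/ http://www.yourdomain.com/new-page/
```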
Cloaking refers to displaying a page of content to the search engine robots that is different than the content that you would display to a human visitor. Cloaking is considered unethical when it is implemented for that reason and should be avoided. Cloaking can be detected and can result in a search engine penalty.
The Meta refresh tag is used to redirect to another URL after a specified number of seconds. If you use this tag with a zero second delay, it is treated by the search engines as a “Moved Permanently” 301 redirect. This type of redirect is not always reliable because the browser handles the redirect and sometimes the browser can stall or freeze.
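A zero-second Meta refresh is placed in the head of the page and looks like this (the target URL is a placeholder):

```html
<meta http-equiv="refresh" content="0; url=http://www.yourdomain.com/new-page/">
```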
The Meta robots tag is used in much the same way as a directive inside a robots.txt file, but because it is a Meta tag you can specify on a per-page basis whether the page should be crawled and indexed. If you use the Meta robots tag, ensure that you are using it correctly.
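For example, the tag can keep a page out of the index while still allowing its links to be followed, or block both:

```html
<!-- Keep the page out of the index, but follow its links -->
<meta name="robots" content="noindex, follow">

<!-- Keep the page out of the index and ignore its links -->
<meta name="robots" content="noindex, nofollow">
```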
The robots.txt file is the first file that a search engine robot will look for on your website. This file should be placed in the root HTDOCS folder or equivalent of your website. It tells the search engines if they have permission to crawl your web pages. Be sure to validate your robots.txt file and ensure that it is not blocking access to content that you want to be indexed.
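A minimal robots.txt might look like this, where the disallowed paths are placeholders for sections you do not want crawled:

```
User-agent: *
Disallow: /admin/
Disallow: /cgi-bin/

Sitemap: http://www.yourdomain.com/sitemap.xml
```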
Use of nofollow
The rel=nofollow attribute used in link statements is meant to be used only when you cannot vouch for the content of external links. When you use this attribute the search engines will not pass any value to the URL that you are linking to.
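In markup, the attribute is added directly to the link statement (the URL is a placeholder):

```html
<a href="http://www.example.com/some-page/" rel="nofollow">a page we cannot vouch for</a>
```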
Just like the Meta robots tag is used to tell the search engines whether or not to crawl and index a page, the x-robots-tag header can be used as an element of the HTTP header response for a given URL to provide direction to the search engines as to how they should treat the page. Google documents the x-robots-tag and its usage guidelines in its webmaster documentation.
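This header is useful for non-HTML files, which cannot carry a Meta robots tag. As a sketch, on an Apache server with mod_headers enabled you could keep all PDF files out of the index like this:

```apache
# Adds "X-Robots-Tag: noindex" to the HTTP response for any PDF
<FilesMatch "\.pdf$">
  Header set X-Robots-Tag "noindex"
</FilesMatch>
```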
When you construct URLs and name your web pages, do not use underscores as word separators, common as they are in URL paths. Use hyphens instead: the search engines treat hyphens as word separators, while underscores are ignored.
URLs that contain special characters like “&” and “?” and carry session IDs and other parameters can be problematic for search engines, and those characters and parameters should be avoided. If they cannot be avoided, then you should have a clean URL for the same content and use the rel=canonical tag to tell the search engines to pay attention only to the clean URL.
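The tag goes in the head of the parameterized page and points at the clean URL (both URLs below are placeholders):

```html
<!-- In the head of http://www.yourdomain.com/products?id=42&sessionid=abc123 -->
<link rel="canonical" href="http://www.yourdomain.com/products/widgets/">
```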
Lengthy URLs are difficult for people to remember, and they can also be truncated in search engine results. Try to keep your URLs as short as possible.
Wow! If you’ve read through the entire 3 part series about SEO best practices and gotten this far, all of the things that we discussed might seem like a lot to keep track of. It doesn’t have to be difficult though. Just use common sense and always keep your website visitors in mind when you search engine optimize the content on your web pages.
Brendon Turner is co-founder of WebDevCompany.com. Feeling overwhelmed by all of this SEO stuff? Leave it to the pros at Web Development Company Inc. Request a Free SEO Analysis and receive a detailed report of your website, including tips to improve your SE rankings and traffic.