Wednesday, 16 May 2012

How to have your SEO documentation ready in just 15 minutes

As a marketer, I know that getting complete SEO documentation ready is the most vital step towards a happy online presence, and a slight mistake can throw a spanner into the entire online marketing endeavor. Obviously, this is a time-consuming process, so you might be wondering how you are supposed to get the documentation ready in just 15 minutes.
To be clear, I am not going to ask you to do a thorough audit of the website, make a strategic revision of the information architecture, or create a compelling content strategy to keep the Panda at bay. This documentation covers factors that can be fixed easily and still help you see some difference in the volume of organic search traffic. These may sound elementary to advanced SEO professionals, but they are techniques you cannot do without:
Check Robots.txt File: Make sure the robots.txt file is in the root folder of the website. This might sound funny, but some website owners place it in the wrong location all the time. Once you locate the robots.txt file, check whether it is blocking any important pages of the website. If the robots.txt file contains two different sets of instructions for two different search engines, be extra careful and suggest changes only when you have a clear idea of what the client wants.
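
If you want to run this check quickly, here is a minimal Python sketch using the standard library's robotparser. The site address and the list of pages are placeholders; swap in the client's real URLs.

    from urllib.robotparser import RobotFileParser

    # Hypothetical site and pages; replace with the client's own URLs.
    SITE = "https://www.example.com"
    IMPORTANT_PAGES = ["/", "/products/", "/blog/"]

    parser = RobotFileParser(SITE + "/robots.txt")
    parser.read()  # fetch and parse the file from the root folder

    for page in IMPORTANT_PAGES:
        # can_fetch() reports whether a given crawler may access the page
        allowed = parser.can_fetch("Googlebot", SITE + page)
        print(page, "allowed" if allowed else "BLOCKED")
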
Check Meta Data: Meta data is one of the most important parts of the website, as search engines are most likely to use it to produce the snippet. Make sure the Meta data does not look spammy, and do not stuff it with every keyword imaginable. Keep it simple and snappy and use keywords reasonably; there is no need to go overboard with it.
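
To eyeball the Meta data quickly, a short script can pull the title and meta description and flag obvious length problems. This sketch assumes the third-party requests and BeautifulSoup libraries are installed, and the length limits are rough rules of thumb, not official cut-offs.

    import requests
    from bs4 import BeautifulSoup

    url = "https://www.example.com/"  # placeholder URL
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

    title = soup.title.string.strip() if soup.title and soup.title.string else ""
    desc_tag = soup.find("meta", attrs={"name": "description"})
    desc = desc_tag.get("content", "").strip() if desc_tag else ""

    # Rough, commonly quoted limits; longer text tends to get truncated.
    print("Title (%d chars): %s" % (len(title), title))
    if len(title) > 65:
        print("  -> title may be truncated in the snippet")
    print("Description (%d chars): %s" % (len(desc), desc))
    if not desc or len(desc) > 160:
        print("  -> description is missing or too long")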

Canonical: Run Screaming Frog, and once the crawl is complete, try to figure out whether the same content is accessible via different URLs. If it is, you need to set up a canonical URL (a link rel="canonical" tag in the head) for the preferred version. This only takes a moment or two, but it goes a long way towards fixing duplicate content problems. Also make sure all the non-WWW URLs of the website are redirected to their WWW counterparts; otherwise every URL will have a duplicate version with and without www.
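
Checking the non-WWW redirect takes a single request. The sketch below uses the requests library and a placeholder domain; it asks the server for the non-WWW version without following redirects and prints what comes back.

    import requests

    # Placeholder domain; use the client's real one.
    resp = requests.get("https://example.com/", allow_redirects=False, timeout=10)

    if resp.status_code in (301, 308):
        print("Permanent redirect to:", resp.headers.get("Location"))
    elif resp.status_code in (302, 307):
        print("Temporary redirect (should be permanent):", resp.headers.get("Location"))
    else:
        print("No redirect (status %d): www/non-www duplicates likely" % resp.status_code)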

Error Page: Just type something random after a URL, and if the website fails to turn up a custom template for such cases, you need to suggest developing a custom 404 error page. That page should be served whenever the server returns a 404 response code. This will help you retain a significant section of visitors who might otherwise bounce from the website.
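
The response code matters as much as the template: a "not found" page that returns 200 looks like a soft 404 to search engines. Here is a quick check with the requests library; the nonsense path and domain are just examples.

    import requests

    # Deliberately nonsense path on a placeholder domain.
    resp = requests.get("https://www.example.com/xyzzy-no-such-page", timeout=10)

    if resp.status_code == 404:
        print("Good: the server returns a real 404")
    elif resp.status_code == 200:
        print("Soft 404: the error template is served with a 200 status code")
    else:
        print("Unexpected status:", resp.status_code)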

Content: Yup, you don't have the time to check the quality of all the content, but you can easily tell whether the content was developed for search engines or for the users. If the content is stuffed with keywords and conveys little meaning to human visitors, you need to take it off the website at the earliest.
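
"Stuffed" is partly a judgment call, but a crude density count catches the worst offenders. The copy, keyword, and threshold in this sketch are all made up for illustration; anything past a few percent for a single phrase deserves a closer look.

    import re

    def keyword_density(text, keyword):
        words = re.findall(r"[a-z']+", text.lower())
        hits = len(re.findall(re.escape(keyword.lower()), text.lower()))
        return 100.0 * hits / max(len(words), 1)

    copy = "Cheap shoes, cheap shoes online, buy cheap shoes today"  # sample copy
    density = keyword_density(copy, "cheap shoes")
    print("Density: %.1f%%" % density)
    if density > 3.0:  # arbitrary rule-of-thumb threshold
        print("Looks stuffed: rewrite for human visitors")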

Navigation: Check that the navigation is simple and that there are no complexities involved. The navigation should not be built in Flash or JavaScript files, as search engine bots still find those tough to parse. So the navigation has to be in simple HTML, so that search engines can discover the other links without the least hassle along the way.
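
To approximate what a crawler sees, you can strip a page down to its plain-HTML anchors; if the main navigation does not show up in the list, it is probably buried in Flash or script. A minimal sketch assuming the requests and BeautifulSoup libraries, with a placeholder URL:

    import requests
    from bs4 import BeautifulSoup

    url = "https://www.example.com/"  # placeholder URL
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

    # Only plain <a href> links appear here; anything rendered by
    # Flash or JavaScript will be missing from this list.
    for link in soup.find_all("a", href=True):
        print(link["href"], "-", link.get_text(strip=True))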
