How to perform an SEO audit on your website for free

Search engine optimization is an ongoing discipline for any business owner looking to hold on to market share online. Hiring a third-party company to audit your previous campaigns can be a great way to analyze the data, but it is not always necessary to spend the money it takes to hire one.

Here are a few things you can do entirely on your own to audit your search engine optimization campaign.

– First, you must be sure that you have the right infrastructure so that nothing slips through the cracks.

Although it can be fun to jump right into analysis, resist that temptation. First, run a free tool that crawls your entire website just as a search engine spider would. Xenu’s Link Sleuth is a good option that is completely free.

You will need to configure the tool so that it behaves the same way as your preferred search engine’s crawler. For instance, if Yahoo search results matter most to you, configure the tool to crawl the way Yahoo does. The exact configuration depends on the tool you choose.
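One quick sanity check along these lines is to fetch a page while presenting a search engine crawler’s User-Agent string, so you can compare what the spider sees with what a browser sees. Here is a minimal Python sketch; the helper name is ours, and the two User-Agent strings are representative examples, not an exhaustive list.

```python
from urllib.request import Request

# Representative crawler User-Agent strings (illustrative, not exhaustive).
CRAWLER_AGENTS = {
    "google": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
    "bing": "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)",
}

def build_crawler_request(url, engine="google"):
    """Build a urllib Request that identifies itself as the chosen crawler."""
    return Request(url, headers={"User-Agent": CRAWLER_AGENTS[engine]})
```

Pass the resulting request to `urllib.request.urlopen` and compare the response against a normal browser fetch; large differences can point to cloaking or crawler-specific problems.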

– Consult the search engines after you have done your own crawl.

No one gets unfettered access to the search engines’ servers, but every major search engine offers free webmaster tools that you can use to analyze your results. If your website is not yet registered with your preferred search engine’s webmaster tools program, take the time to do so first. This helps in more ways than one, and it is essential to a free search engine optimization audit.

– Make sure that your website pages are accessible.

If people cannot access your website, then it might as well not be there. The first step of the analysis portion of your audit is to manually check the robots.txt file to make sure that it does not block any pages you want crawled. If you are using Google Webmaster Tools, you can also identify the URLs that your robots.txt file blocks.

Also check each page’s robots meta tag: a stray noindex or nofollow value can keep a page out of the index even when robots.txt allows it.
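You can run the robots.txt check offline with Python’s standard-library parser. The rules below are made-up examples; substitute the contents of your own robots.txt file.

```python
from urllib.robotparser import RobotFileParser

# Example robots.txt rules (assumed for illustration).
robots_txt = """\
User-agent: *
Disallow: /private/
Disallow: /checkout/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Check whether a crawler may fetch each URL you care about.
for url in ["http://example.com/", "http://example.com/private/report.html"]:
    allowed = parser.can_fetch("Googlebot", url)
    print(url, "->", "allowed" if allowed else "BLOCKED")
```

Feed it the list of URLs from your crawl and flag anything important that comes back blocked.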

– Check the HTTP status codes on your website.

If you have any deep webpages that return errors, you may be limiting your visibility with search engines and users. Fix any URLs that return soft 404 errors, meaning pages that serve “not found” content with a 200 status code; a simple redirect, or a genuine 404 response, takes care of these. If at all possible, use 301 HTTP redirects instead of 302 redirects or any JavaScript-based redirects. The 301 option passes the most link juice.
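If your crawler can export each URL with its status code and a snippet of the body, a small script can triage the results. This is an illustrative sketch: the classification rules (including the “page not found” text heuristic for soft 404s) are assumptions to adapt to your own site.

```python
# Triage rules are illustrative, not definitive.
def triage(url, status, body=""):
    """Suggest an action for one crawled URL based on its status and body."""
    if status == 404:
        return "broken: 301-redirect to the closest live page"
    if status == 302:
        return "temporary redirect: change to a 301 if the move is permanent"
    if status == 200 and "page not found" in body.lower():
        return "soft 404: return a real 404 or 301-redirect instead"
    return "ok"

# Hypothetical crawl export: (url, status code, body snippet).
crawl = [
    ("/about", 200, "<h1>About us</h1>"),
    ("/old-promo", 302, ""),
    ("/typo-page", 200, "Sorry, page not found"),
]
for url, status, body in crawl:
    print(url, "->", triage(url, status, body))
```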

– Next, check your XML site map.

Your XML sitemap should be a well-formed XML document that follows the sitemap protocol. Search engines are very particular about its formatting. You should also be sure that your sitemap is submitted to your webmaster tools accounts.
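A quick well-formedness check can be scripted with Python’s built-in XML parser. This sketch covers only two basics, parseability and the standard namespace; the full sitemap protocol has more rules (for example, a 50,000-URL cap per file), and the sample sitemap below is made up.

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def check_sitemap(xml_text):
    """Return the <loc> URLs if the sitemap parses and uses the standard
    namespace; raise an error otherwise. Illustrative checks only."""
    root = ET.fromstring(xml_text)  # raises ParseError if not well-formed
    if root.tag != f"{{{SITEMAP_NS}}}urlset":
        raise ValueError("root element is not <urlset> in the sitemap namespace")
    return [loc.text for loc in root.iter(f"{{{SITEMAP_NS}}}loc")]

# Assumed example sitemap.
sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://example.com/</loc></url>
  <url><loc>http://example.com/products</loc></url>
</urlset>"""
print(check_sitemap(sample))
```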

– Next, conduct a manual test of your sales funnel.

How many clicks does it take to get from the landing page to your final conversion page? If there is any way to shorten this path, you may want to do so. Extraneous steps invite a higher bounce rate and can confuse the major search engines when they crawl your site. In earlier eras of the Internet, duplicate content may even have helped you, but today it only reduces the credibility of your website in the eyes of the major search engines.
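If you have mapped out your internal links (by hand or from a crawl export), counting clicks is a shortest-path problem. Here is a small breadth-first-search sketch; the page names below are made up for illustration.

```python
from collections import deque

def click_depth(links, start, goal):
    """Minimum number of clicks from start to goal over a link graph,
    or None if the goal is unreachable."""
    seen, queue = {start}, deque([(start, 0)])
    while queue:
        page, depth = queue.popleft()
        if page == goal:
            return depth
        for nxt in links.get(page, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, depth + 1))
    return None

# Hypothetical site structure: page -> pages it links to.
site = {
    "/landing": ["/products", "/about"],
    "/products": ["/product/widget"],
    "/product/widget": ["/cart"],
    "/cart": ["/checkout"],
}
print(click_depth(site, "/landing", "/checkout"))  # 4 clicks
```

If the number that comes back is larger than you expected, look for steps in the funnel you can merge or remove.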