Fixing Supplemental Results with Robots.txt


Correcting supplemental results and directing search engine bots to your important content can be an overwhelming task if you don’t know what approach to take. Supplemental results aren’t the end of the world, but if you are trying to get traffic from Google’s search results you might be out of luck: Google only shows supplemental results after all of the other relevant results have been displayed. There are three main reasons Google puts pages into its supplemental results:

Duplicate content
Pages with no content
Pages with no links

Today we are going to concentrate on duplicate content.

Duplicate content is one of the fastest ways to get your blog pages into Google’s supplemental results. To avoid it, you need to direct Google to the pages you actually want indexed. With more and more blogs popping up every day, it’s very important to have a blog plan and understand the best ways to increase your traffic.

The organization of your blog can be a huge factor in determining which pages are included in Google’s main index and which end up in supplemental results. Because you can easily add posts to multiple categories, you can quickly end up with multiple versions of each post. Those extra versions split the strength of each post and give Google several different ways to link to it; you want Google to see only one version of each post.

Another quick way to duplicate your posts is with your comment feeds and trackback URLs. Each time you publish a new post and people either comment on it or trackback to it from their own blogs, new URLs can be created and show up in the SERPs. The problem arises when these comment or trackback URLs get more attention than the actual post; the original post can then be pushed into supplemental results.
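To make the problem concrete, here is a rough sketch of the extra URLs a single post can pick up. The domain and paths below are hypothetical and depend entirely on your permalink and plugin setup, but the pattern is typical of a default WordPress install:

    http://example.com/my-post/             (the permalink you want indexed)
    http://example.com/category/seo/        (category archive repeating the post)
    http://example.com/category/wordpress/  (second category, same content again)
    http://example.com/my-post/feed/        (comment feed for the post)
    http://example.com/my-post/trackback/   (trackback URL for the post)

Every one of those addresses can serve essentially the same content as the permalink, which is exactly the kind of duplication that pushes pages into supplemental results.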

You can fix all of these problems with a properly configured robots.txt file. The robots.txt file is a plain text file that sits in the root directory of your website and basically tells Google and every other search engine where it can and can’t go.
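As a starting point, here is a minimal robots.txt sketch that blocks the category, feed, and trackback URLs from the example above. The specific paths are assumptions based on a typical WordPress permalink structure, so check them against your own URLs before using anything like this:

    User-agent: *
    Disallow: /category/
    Disallow: /*/feed/
    Disallow: /*/trackback/

One caveat: a plain path prefix like /category/ is understood by every major crawler, but the wildcard (*) patterns are only guaranteed to work with Googlebot and a few other bots, so test before you rely on them.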

WordPress offers many suggestions on how to edit your robots.txt file, and Google’s Webmaster Tools is another important resource. If you set up an account with Google Sitemaps, you can try out additions to your robots.txt file and test which URLs will be blocked before uploading the file and potentially harming your website. Editing your robots.txt file can be dangerous, so go slowly and change only what you need to.

Blocking unimportant or duplicate content helps you gain traffic by increasing the number of pages you have in Google’s main index and reducing the number of pages stuck in its supplemental results.

Drew Stauffer is the CEO and founder of Alibi Productions, a website promotion & online marketing firm based in Greenville SC.

2 Responses

  1. Sarah Lewis

    I’ve been using the Duplicate Content Cure plugin for WordPress. It takes the headache out of the process for me and seems to help my search engine results (well, I haven’t tested this thoroughly yet, but it should :) ).

  2. Drew Stauffer

    @Sarah

    Thanks for the heads up. I’ll definitely check it out.