Fixing Supplemental Results with Robots.txt

Correcting supplemental results and directing search engine bots to your important content can be an overwhelming task if you don't know what approach to take.

Supplemental results aren't the end of the world, but if you are trying to get traffic from Google's search results, pages stuck there are unlikely to help: Google only shows supplemental results after all the relevant results from its main index have been displayed.

There are three main reasons Google puts pages into its supplemental results:

  • Duplicate content
  • Pages with no content
  • Pages with no links

Today we are going to concentrate on duplicate content.

Duplicate Content

Duplicate content is one of the fastest ways to get your blog pages into Google's supplemental results. To avoid this, you need to direct Google to the pages you want indexed. With more and more blogs popping up every day, it's very important to have a blog plan and understand the best ways to increase your traffic.

The organization of your blog can be a huge factor in determining which pages are included in Google's main index and which end up in supplemental results. Because you can easily add posts to multiple categories, you can quickly end up with multiple versions of each post. Multiple versions, however, mean that you are dividing the strength of each post and giving Google several different URLs for the same content. You want Google to see only one version of each post.

Another quick way to duplicate your posts is with your comment feeds and trackback URLs. Each time you post a new story or article on your blog and people either comment on it or trackback to it from their blog, a new link can be created in the SERPs. The problem arises when these comments or trackbacks get more attention than the actual post: the original post can be pushed into supplemental results.

You can fix all these problems with the proper configuration of your robots.txt file. The robots.txt file is a text document that goes in the root of your website and tells Google and every other search engine where it can and can't go.
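For a WordPress blog, a minimal robots.txt along these lines would keep crawlers out of the duplicate category, feed, and trackback URLs. The exact paths depend on your permalink settings, so treat this as a sketch, not a drop-in file:

```
User-agent: *
Disallow: /feed/
Disallow: /comments/
Disallow: /trackback/
Disallow: /category/
```

Upload it as robots.txt in your site's root (so it is reachable at yoursite.com/robots.txt). Note that a plain Disallow rule matches URL paths by prefix: `Disallow: /category/` blocks /category/news/, but `Disallow: /feed/` does not block /my-post/feed/.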

WordPress offers many suggestions on how to edit your robots.txt file. Another important resource is Google's Webmaster tools. If you set up an account with Google Sitemaps, you can try out various rules in your robots.txt file and test which URLs will be blocked before uploading the file and potentially harming your website. Editing your robots.txt file can be dangerous, so go slow and edit only what you need to.
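Besides Google's own tester, you can sanity-check a draft locally before uploading it. A minimal sketch using Python's standard-library `urllib.robotparser` (the domain and rules here are hypothetical examples):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical draft rules -- adjust to match your own blog's URL structure.
draft = """\
User-agent: *
Disallow: /feed/
Disallow: /trackback/
Disallow: /category/
"""

parser = RobotFileParser()
parser.parse(draft.splitlines())

# A canonical post URL should stay crawlable...
print(parser.can_fetch("*", "https://example.com/my-first-post/"))  # True

# ...while the duplicate category and feed URLs are blocked.
print(parser.can_fetch("*", "https://example.com/category/news/"))  # False
print(parser.can_fetch("*", "https://example.com/feed/"))           # False
```

One caveat: this module implements the original prefix-matching rules, so wildcard patterns like `Disallow: */trackback/` (which Googlebot understands) won't be evaluated here.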

Blocking unimportant or duplicate content will allow you to gain traffic by increasing the number of pages you have in Google's main index and reducing the number of pages you have in its supplemental results.

Drew Stauffer is the CEO and founder of Alibi Productions, a website promotion & online marketing firm based in Greenville SC.
