Penguin Recovery Plan: Eliminating Over-Optimization


There’s no doubt about it – it’s a tricky time to work in SEO.  If Google’s recent Penguin update has made one thing clear, it’s that even seemingly safe SEO tactics can be turned around and labeled as web spam.  Just ask any of the thousands of people online today reporting negative impacts despite following established SEO best practices!

Obviously, there’s no clear solution that’ll guarantee penalty recovery at this point, given the rate at which search engine changes are being rolled out and the unprecedented effect they’re having on the SERPs.

However, we do feel confident in making a few recommendations based on Google’s stated Webmaster Guidelines, including techniques for minimizing on-page over-optimization, backlink over-optimization and low-quality content.  Hopefully you’ll find these tips helpful, whether you’re recovering from a search engine penalty or protecting your site from future algorithm updates!

1 – Minimize on-page over-optimization

On-page SEO is all about manipulating specific web page elements in order to improve your rankings in the SERPs.  For many webmasters, on-page optimization is their first priority, as it’s a heck of a lot easier to tweak a few variables in your site’s code than it is to engage in a complex backlinking campaign.

The downside, however, is that it’s also a lot easier for the search engine spiders to pick up on the over-optimization of these specific elements.  As usual, there’s no definitive cutoff as to what specific percentages or practices lead to a site being labeled as “over-optimized,” although any of the following factors could play a role in these calculations:

  • Title tags – Title tags are one of the easiest potential over-optimization occurrences for the search engines to catch, so you’ll need to revise these elements if your current title tags read something like, “Keyword 1, Keyword 2, Keyword 3, etc.”  A far better template is to use one instance of your target keyword (assuming it’s relevant to the content on your page) and your site’s brand name, for a maximum of 60-70 total characters.
  • Heading tags – Similarly, avoid optimizing every single one of your website’s header tags perfectly in order to prevent excess scrutiny by the search engines.  It’s a much better idea to craft useful, informative header tags that help your audience to engage with your content than to pack these key areas full of keyword repetitions.
  • Internal links – Although creating links between the pages of your site can help ensure that all of your content is indexed appropriately and help your readers navigate your pages more easily, optimizing their anchor text with military precision is another dead giveaway that tells the search engines you’ve over-optimized your site.  Use caution here, and only include internal links when they make sense and can be created in a natural-looking way.
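To make the title-tag guidance above concrete, here’s a minimal sketch of how you might flag the most obvious over-optimization signals programmatically.  The 70-character cap and the “repeated keyword” and “comma-list” rules are illustrative heuristics only, not Google’s actual thresholds:

```python
import re

def check_title_tag(title, target_keyword, max_length=70):
    """Flag common over-optimization signals in a title tag.

    The length cap and the repetition/comma-list rules are
    heuristics for illustration, not official Google rules.
    """
    issues = []
    if len(title) > max_length:
        issues.append(f"too long ({len(title)} chars, max {max_length})")
    # Count (case-insensitive) occurrences of the target keyword.
    hits = len(re.findall(re.escape(target_keyword), title, flags=re.IGNORECASE))
    if hits > 1:
        issues.append(f"keyword repeated {hits} times")
    # A long comma-separated run of short phrases often signals keyword stuffing.
    if title.count(",") >= 3:
        issues.append("looks like a comma-separated keyword list")
    return issues

# A stuffed title trips two of the checks; a clean one passes.
print(check_title_tag("Blue Widgets, Buy Widgets, Cheap Widgets, Widgets", "widgets"))
print(check_title_tag("Widgets & More | Acme", "widgets"))  # → []
```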

Hopefully by now, you get the idea.  The days of keyword stuffing your target phrases into every element of your website’s construction are long gone.  Eliminate these dated techniques if they’re present on your website, and make sure to take a more reasonable approach to content creation and website optimization in the future.

2 – Clean up bad links and implement new backlinking best practices

Whether or not you’ve received one of the most recent rounds of Google Webmaster Tools notifications alerting site owners to the presence of unnatural links, your current and past backlinking practices need just as much of an overhaul as your on-page optimization techniques.

The following are a few of the specific elements you’ll need to address on your own site, using data pulled from Majestic SEO, Open Site Explorer or any similar tool that generates a list of the links pointing back at your site.

  • Assess linking domain diversity – Too many links coming from too few domains is a key indicator of off-page over-optimization, especially if the majority of your backlink profile is comprised of “sitewide” links (i.e., links that appear in the header or footer sections of other websites).  If possible, contact the owners of these sites to request the removal of your link.  If you don’t get a response, undertake a more natural linkbuilding campaign in order to dilute their influence on your site’s SEO.
  • Assess linking domain PageRank distribution – In addition to looking at the specific URLs that are sending links to your site, pull information on the PageRank of each page that’s linking to you.  Natural backlink profiles are composed primarily of low-PageRank pages, though past optimization efforts may have had you courting high-PageRank pages exclusively.  If the PageRank scores of the pages linking back to your site are distributed more evenly than would occur naturally, either request the removal of some links or refocus your efforts on obtaining more low-PageRank links.
  • Check for “bad neighborhood” backlinks – “Bad neighborhood” links refer to backlinks originating from adult sites, gambling sites or other pages in vice-driven industries.  These links have no place in your backlink profile, so if you see them, do whatever you need to in order to get rid of them.  If you can’t, at least make a record of your attempts to use as documentation in future reconsideration requests if you’re penalized for them.
  • Assess anchor text distribution – Anchor text distribution is another key indicator of over-optimization.  Really, how natural do you think it looks to have thousands of links pointing back at your website, all featuring the same limited set of target keyword variations?  That kind of thing would never happen in the real world, which means that it no longer has a place in your SEO toolbox.  Going forward, either expand your set of target keyword variations significantly or eliminate the optimization of this variable entirely.
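Once you’ve exported your link data, the diversity and anchor-text checks above lend themselves to simple scripting.  This is a rough sketch only: the `(source_url, anchor_text)` input format and the 30% concentration cutoff are assumptions for illustration, not values taken from Majestic SEO, Open Site Explorer or any other tool:

```python
from collections import Counter
from urllib.parse import urlparse

def backlink_report(links, anchor_threshold=0.30):
    """Summarize domain diversity and anchor-text concentration.

    `links` is a list of (source_url, anchor_text) tuples, e.g. as
    exported from a backlink tool.  The 30% cutoff for flagging an
    over-used anchor is an arbitrary illustrative threshold.
    """
    domains = Counter(urlparse(url).netloc for url, _ in links)
    anchors = Counter(anchor.lower() for _, anchor in links)
    total = len(links)
    flagged = [a for a, n in anchors.items() if n / total > anchor_threshold]
    return {
        "total_links": total,
        "unique_domains": len(domains),
        "top_domains": domains.most_common(3),
        "over_used_anchors": flagged,
    }

# Hypothetical export: three of four links reuse the same exact-match anchor.
links = [
    ("http://blog-a.example/post1", "cheap blue widgets"),
    ("http://blog-a.example/post2", "cheap blue widgets"),
    ("http://forum-b.example/thread", "cheap blue widgets"),
    ("http://news-c.example/story", "Acme Widgets"),
]
report = backlink_report(links)
print(report["unique_domains"], report["over_used_anchors"])
# → 3 ['cheap blue widgets']
```

A profile where one anchor phrase dominates like this is exactly the pattern the bullet above warns against.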

Yes, it’s true – some of these recommendations represent pretty major departures from past SEO best practices.  The idea of giving up on anchor text optimization entirely is likely more than a little uncomfortable for webmasters who have grown accustomed to following a certain set of best practices, but the reality is that times are changing.

Google has made no secret about its desire to rid the SERPs of spammy results, and whether or not you agree with the way it’s going about making these changes, there’s some pretty damning evidence that even the most innocuous of white hat techniques could be used as indicators of potential web spam.

So even if your site hasn’t been hit by Panda or Penguin, don’t think you won’t be affected in the future.  With the rate at which recent changes have been occurring, chances are Google’s got something up its sleeve that could impact your traffic and rankings – unless you’re prepared to look at SEO and website management in a whole different way.

3 – Focus on content quality

With all of that in mind, we simply can’t overstate the importance of content quality.  Google’s primary goal is to produce the best possible search results for its users, and that means weeding out web spam and enacting algorithm changes that reward high-value content.

So what does quality content look like?

We’ve mentioned it here before, but one of the best resources for assessing the value of your content is Amit Singhal’s seminal “23 Questions” post, which outlines the general framework Google uses to separate quality articles from their spammy counterparts.  Really take an objective look at your content, and if it doesn’t measure up to the standards laid out there, either revise it or remove it to avoid future penalties.

If you find yourself blinded to content quality after too many years of keyword stuffing and optimization, ask a non-SEO friend to read your content.  If they wouldn’t be willing to pull out their credit card after reading your articles, you’ve got some revising ahead of you!

What do you think?  Do Google’s recent changes represent the “end of SEO” as we know it?  Do you plan to change your promotional techniques as a result?  Share your thoughts in the comments section below!


6 Responses

  1. graham

    Again, another great post mate.  I personally make sure my websites are at least 90% written for my customers and not the search engines, but I make sure there are enough images for me to add keywords to the alt and title attributes, make sure the image file name contains a keyword, and place the image next to related keyword text.  Don’t know if anyone else gets the same results, but it seems to work for me and my customers.

  2. Jasjotbains

    Wonderful post Sujan!  I was hit by both Panda and Penguin, but was able to recover from Panda.  I have started re-organising my site, removing excess internal links from posts and instead using tags for navigation within posts.  The content has also been reworked to read well for readers rather than the search engines.  Although the work is only about 50% complete, I am already witnessing a rise in rankings and social engagement!

  3. Ashok Khandekar

    Hi Sujan,

    Great post as usual.  My travel-related site was badly hit by Panda/Penguin.  The main reason, I think, was that there was too little content compared to the number of images.  I am increasing the content now; let’s see what happens.  I only hope that future updates won’t affect it again.

    Ashok

  4. Prashant Bairwa

    Good article.  I think Google is now very tough on spammers and wants to give the best search results to the end user.  This is not the end of SEO, but the beginning of showing real, high-quality results.

  5. Rajiv Pandey

    Amazingly explained, Sujan…

    I agree with all the factors and elements you have stated, and I wonder if you’ll share some more tweaks and tools to make the process more convenient…

    It would be no surprise if Google rolls out social sharing and authority-based updates, where social integration of your content becomes mandatory, in a sense, if you want to rank at the top.  As you have already stated on your personal blog (http://www.sujanpatel.com/seo/why-i-love-google-algorithm-changes/), SEO will always keep us on our toes, and that’s the best thing :-) (survival of the best)… isn’t it?

    Thanks again for an informative post…

  6. Jhilmil

    Very informative and very nicely explained.