20 SEO Strategies for Sites Affected By Google Panda

by Jason Acidre on May 3, 2011 · Search


The Google Panda Update sounds harmless, but it's not. It has recently slapped a lot of huge sites, even trusted/reputable ones, off Google's search result pages (globally). The update's known major signals mostly target sites with poor-quality content, pages that have external duplicates, and dormant user behavior once visitors land on a page.

I have been receiving a lot of inquiries from big websites recently (through my link marketing services page) about fixes for this specific update, which made me decide to create this post. The strategies listed below are mostly based on my own research across various web sources, actual analysis/auditing of prospective clients' sites, and personal observations.

On-site and Content Management:

  • Identify weak inner pages of the site by tracking each page's mozRank and page authority. List them all in an Excel spreadsheet that includes the necessary data (PageRank, mozRank, page authority, content quality, and whether the page has a duplicate). You can speed this process up by exporting the data via Open Site Explorer's pro version, which already includes each page's page authority and number of linking domains. Prioritize tracking pages that have duplicates, which can be checked manually through Copyscape.
  • Remove low-quality inner pages of the site from Google's index by temporarily tagging them noindex (or by excluding them via robots.txt; the first sketch after this list shows a quick way to verify the exclusions), as these pages drag down the search rankings of the site's important/strong landing pages. You can then improve these pages' content while they are inaccessible to search engines, so they can regain their rankings (especially for long-tails) once they are ready to be republished and indexed again.
  • Track the site's crawl errors via Google Webmaster Tools and fix 404 errors by employing 301 redirects (the second sketch after this list audits this in bulk).
  • Identify high-quality inner pages of the site that have strong mozRank, PageRank (optional), page authority, and a good amount of incoming links. Make these authority pages link internally (within the content) to the site's important pages, using targeted keywords as anchor text where the page is relevant to the destination page.
  • Identify pages that have consistently brought substantial traffic to the site by tracking their performance (monthly visitors, low bounce rates, etc.) in the months before the Google Panda update via Google Analytics; the third sketch after this list shows one way to spot the pages that dropped. Set these pages as high priority: preserve and improve their content, and acquire links to them from authority websites. It's best to list the prioritized pages and the actions to implement on each in an Excel spreadsheet, so you can monitor their performance once the changes are executed.
  • Optimize the content of the site's important landing pages (pages that aim to rank on SERPs: the homepage/site description, category pages, product pages, etc.) by adding or improving useful sections such as comprehensive product or category descriptions (make them as unique as possible).
  • Leverage browser caching and reduce disruptive elements (such as ads or slow-loading opt-in forms) on the site's important landing pages, as these are signals of poor content quality and/or page usability (the second sketch after this list also flags missing caching headers).
  • Integrate a blog into the site (especially for large ecommerce sites) and use it to its full potential. A strategic internal linking structure built through blog posts lets you support the site's important pages and generate more leads through exceptionally written content and strong calls to action. The blog is also a good channel for earning high-quality, natural incoming links, for publishing link bait (highly linkable content), for enriching the campaign's social media marketing efforts, and for emphasizing the site's importance/authority.
  • Create a high-quality support page for each targeted keyword (a blog post or an individual page that links to the corresponding important page). These support pages target the same keywords as their destination pages to help those pages regain their search rankings.
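For the noindex/robots.txt step above, it's easy to accidentally block pages you still want crawled, or to leave a supposedly excluded page crawlable. Below is a minimal Python sketch that double-checks the robots.txt side of that work using only the standard library; the domain and page URLs are placeholders, not real examples from an audit. Note that it only verifies robots.txt exclusions; a noindex meta tag has to be checked in the page markup itself.

```python
# Minimal sketch: confirm which pages robots.txt actually blocks.
# Uses only Python's standard library; the URLs below are placeholders.
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.set_url("http://www.example.com/robots.txt")  # hypothetical site
rp.read()

# Hypothetical list of low-quality pages you intended to exclude
pages = [
    "http://www.example.com/thin-page-1.html",
    "http://www.example.com/thin-page-2.html",
]

for url in pages:
    if rp.can_fetch("Googlebot", url):
        print(url, "is still crawlable -- check your Disallow rules")
    else:
        print(url, "is blocked from Googlebot, as intended")
```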
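The spreadsheet, crawl-error, and caching items above all start from the same raw data: the live HTTP behavior of each URL. This second sketch collects that data in bulk into a CSV you can sort in Excel. It assumes a plain text file of URLs (one per line) and the third-party requests library; the file names are hypothetical.

```python
# Minimal audit sketch: flag 404s, record where redirects land, and note
# missing Cache-Control headers, writing everything to a CSV audit sheet.
# Requires the third-party "requests" library; urls.txt is hypothetical.
import csv
import requests

with open("urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

with open("audit.csv", "w", newline="") as out:
    writer = csv.writer(out)
    writer.writerow(["url", "status", "redirects_to", "cache_control"])
    for url in urls:
        try:
            r = requests.get(url, allow_redirects=True, timeout=10)
            # r.history is non-empty when the URL redirected somewhere else
            status = r.history[0].status_code if r.history else r.status_code
            writer.writerow([
                url,
                status,                       # 404s and 301s show up here
                r.url if r.history else "",   # final destination, if moved
                r.headers.get("Cache-Control", "missing"),
            ])
        except requests.RequestException as e:
            writer.writerow([url, "error", str(e), ""])
```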
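For the traffic-tracking item, Google Analytics can export page-level visit counts as CSV, and a few lines of Python will split them into before/after buckets around the update's rollout. A third sketch, assuming a hypothetical export with page, date (YYYY-MM-DD), and visits columns; the cutoff date is the approximate U.S. rollout of the update, so adjust it for your market.

```python
# Minimal sketch: compare each page's traffic before and after the update.
# Assumes a hypothetical Google Analytics CSV export with "page", "date"
# (YYYY-MM-DD), and "visits" columns.
import csv
from collections import defaultdict
from datetime import date

CUTOFF = date(2011, 2, 24)  # approximate U.S. rollout of the Panda update

before = defaultdict(int)
after = defaultdict(int)

with open("analytics_export.csv") as f:
    for row in csv.DictReader(f):
        day = date.fromisoformat(row["date"])
        bucket = before if day < CUTOFF else after
        bucket[row["page"]] += int(row["visits"])

# Print pages sorted by biggest traffic drop first: those are the ones
# to preserve, improve, and build authority links to.
for page in sorted(before, key=lambda p: after[p] - before[p]):
    print(page, before[page], "->", after[page])
```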

Link Prospecting and Competitor Analysis

  • Extract link data and link opportunities from top competitors who were not affected by the Google Panda update.
  • Track relevant authority sites/blogs linking to them and try to acquire links from those sites.
  • Check your competitors' robots.txt files (e.g. theirwebsite.com/robots.txt) to see what they have chosen to keep out of the index, and observe how well those changes have worked for them (a quick fetch sketch follows this list).
  • Find their best pages via Open Site Explorer and check through Copyscape whether those pages have duplicate copies. Examine how they have managed to re-optimize those top pages and note it down, to get better ideas on how to fix your own content.
  • Identify the top keywords sending traffic to the site's competitors through SEMrush, and study those competitors' anchor text distribution through Open Site Explorer.
  • Continuously look for link opportunities through other sources, such as manual Google searches, vertical sites, or directory listings.
  • List all the prospected sites in an Excel sheet, including information such as each site's PageRank, the webmaster's name, and contact details (email, Twitter, etc.).
  • Compare the site's important pages to top competitors' ranking pages (for each targeted keyword) and determine the sections and aspects that make those pages rank highly on SERPs. Improve the site's important pages based on what those top-ranking pages have (in-depth content delivery, social engagement, number of incoming authoritative links, etc.).
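Since a competitor's robots.txt is public, the check above takes seconds to automate. A minimal sketch using only the standard library (the domain is a placeholder):

```python
# Minimal sketch: fetch a competitor's robots.txt and list what they are
# keeping out of the index. The domain below is a placeholder.
from urllib.request import urlopen

url = "http://www.competitor-example.com/robots.txt"
with urlopen(url) as resp:
    for line in resp.read().decode("utf-8", errors="replace").splitlines():
        if line.strip().lower().startswith(("user-agent", "disallow")):
            print(line.strip())
```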

Link Building

  • Produce high-quality contextual links through guest blogging (on relevant blogs), using keyword-rich anchor text pointing to the site's important pages. It's important to keep producing high-quality, trusted links to each important page at this time, to consistently improve and sustain its search rankings.
  • Produce high-utility, highly linkable content to publish on the site, or use it as link bait. Consistently promote these resourceful pages (as well as the site's existing high-quality pages) through content outreach: email requests, social media/networking, Q&A sites, etc. These pages can also serve as strong support for the site's important keywords in obtaining/regaining high rankings.
  • Make social sharing buttons prominent and drive social shares to the site's weak inner pages, as signals from social media/networking sites can hint to search engines that these previously thin pages have been re-optimized and updated.

Core Principles:

  • Weed out all of the site's poor-content pages first, particularly those with duplicates that carry no proper link attribution to your site, because these pages are pulling your important keywords' rankings off the SERPs.
  • For ecommerce sites, thin pages don't actually need long descriptions; you just need to make them as unique as you can, and perhaps better than the manufacturers' descriptions.
  • Focus on generating high-quality links (content-driven links through guest blogs, networking, and social media efforts) to support the site's important keywords and push their rankings back up.
  • Re-optimize the site's important pages by improving their content, by directing internal links to them from the site's other authority pages, and by building links from authority websites.
  • Only present pages on your site that genuinely offer value to users. Produce more high-quality content to back up your previously harmed pages.

More Hints from Google

Amit Singhal recently shared more hints on the Google Webmaster Central Blog about how Google tries to write algorithms that separate trustworthy content from the rest:

  • Would you trust the information presented in this article?
  • Is this article written by an expert or enthusiast who knows the topic well, or is it more shallow in nature?
  • Does the site have duplicate, overlapping, or redundant articles on the same or similar topics with slightly different keyword variations?
  • Would you be comfortable giving your credit card information to this site?
  • Does this article have spelling, stylistic, or factual errors?
  • Are the topics driven by genuine interests of readers of the site, or does the site generate content by attempting to guess what might rank well in search engines?
  • Does the article provide original content or information, original reporting, original research, or original analysis?
  • Does the page provide substantial value when compared to other pages in search results?
  • How much quality control is done on content?
  • Does the article describe both sides of a story?
  • Is the site a recognized authority on its topic?
  • Is the content mass-produced by or outsourced to a large number of creators, or spread across a large network of sites, so that individual pages or sites don’t get as much attention or care?
  • Was the article edited well, or does it appear sloppy or hastily produced?
  • For a health related query, would you trust information from this site?
  • Would you recognize this site as an authoritative source when mentioned by name?
  • Does this article provide a complete or comprehensive description of the topic?
  • Does this article contain insightful analysis or interesting information that is beyond obvious?
  • Is this the sort of page you’d want to bookmark, share with a friend, or recommend?
  • Does this article have an excessive amount of ads that distract from or interfere with the main content?
  • Would you expect to see this article in a printed magazine, encyclopedia or book?
  • Are the articles short, unsubstantial, or otherwise lacking in helpful specifics?
  • Are the pages produced with great care and attention to detail vs. less attention to detail?

If you liked this post, you might want to subscribe to my feed or follow me on my new Twitter account or Facebook page.

Image Credit: CoolSurface

Jason Acidre

Jason Acidre is Co-Founder and CEO of Xight Interactive, marketing consultant for Affilorama and Traffic Travis, and also the sole author of this SEO blog. You can follow him on Twitter @jasonacidre and on Google+.

