How to Recover from Panda Dance

by Jason Acidre on August 28, 2013


In early June this year, Matt Cutts (head of Google’s search spam team) confirmed in his talk with Danny Sullivan at SMX Advanced that Panda will now roll out monthly, over 10 of every 30 days – more commonly known as the “Panda Dance”.

This update means that Panda’s filters are now being slowly integrated into Google’s core search ranking algorithm. The update officially rolled out on June 25, 2013.

With the Panda Dance continually testing and rolling out changes across various verticals over the past couple of months, it’s expected to keep causing ranking fluctuations in the coming weeks/months.

I’ve seen two different patterns of decrease in search traffic since the Panda Dance rolled out.

1. The first is a gradual decrease due to ranking fluctuations.

2. The second is a sudden drop in search traffic.

In case you aren’t aware of what the Panda update is, here’s a brief description (as aptly defined by Mark Traphagen in his comprehensive report on the Google Panda Dance):

Panda is after site quality. Is the content really what a searcher would want to find?

In this post, I’ll cover most of the things we did to recover a site’s search visibility. Below are some optimization methods you can try implementing to recover from ranking fluctuations, or to prevent them from hurting your site’s ability to rank.

Authorship and other Schema/Microdata markups

Authenticity has been a really big thing in this new age of search (and will definitely be a big part of its future as well).

Rich-snippet optimization seems to be one of the best ways to respond to these recent algorithmic changes, as it was one of the first things we did that showed almost immediate results (three of the sites we recently optimized hadn’t implemented authorship markup yet).

The reason may be the signals it sends to search engines: it makes the site’s content look more authentic, makes it easier for search engines to understand, and makes the site’s search listings more appealing to users (higher CTR) when displayed in search results.

Some of the markups that you can implement for your site’s pages:

  • Authorship markup (rel=”author”) pointing to the author’s Google+ profile.
  • Publisher markup (rel=”publisher”) for the brand’s Google+ page.
  • Schema.org microdata matching the page’s content type (e.g., Article, Product, Review, Event).
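
As a minimal sketch, here’s what authorship markup plus basic Article microdata could look like in a blog post’s HTML (the profile URL, names, and dates below are placeholders, not actual values from any site):

    <!-- Authorship: link the page to the author's Google+ profile
         (the profile URL is a placeholder) -->
    <link rel="author" href="https://plus.google.com/your-profile-id"/>

    <!-- Schema.org Article microdata (values are illustrative) -->
    <article itemscope itemtype="http://schema.org/Article">
      <h1 itemprop="headline">Your Post Title</h1>
      <meta itemprop="datePublished" content="2013-08-28"/>
      By <span itemprop="author">Author Name</span>
      <div itemprop="articleBody">
        <!-- post content -->
      </div>
    </article>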

Improve your low performing landing pages

Understand what your low-performing landing pages lack. Check the pages of your site that get a good volume of traffic but have low engagement (low visit duration and pages per visit) and high bounce rates.

Start with the pages you believe are most important, and optimize these landing pages mainly to increase user dwell time. Several areas you can improve to make visitors stay longer on the page/site:

  • Make sure that the information provided or the context of the content matches the title of the page/keywords it is targeting (or matches the intent/search queries that are commonly used to find that content).
  • Add more thematically relevant internal links in the content – to make visitors check your site’s other strong pages.
  • Improve the page’s loading speed.
  • Optimize the page’s readability for skim readers – break the content into shorter paragraphs, use bold text on important phrases, etc. (a minimal sketch follows this list).
  • Reduce distractions, such as banner ads and/or pop-ups.
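
To illustrate the readability point above, here’s a minimal sketch of a skim-friendly content structure (the copy is just filler):

    <h2>Subheading that tells skimmers what this section covers</h2>
    <p>Keep paragraphs short – two to three sentences at most – so
       readers can scan the page quickly.</p>
    <p>Use <strong>bold text on the key phrases</strong> you want
       skim readers to catch, and break long explanations into lists:</p>
    <ul>
      <li>One idea per bullet</li>
      <li>An internal link to a related page on your own site</li>
    </ul>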

For more tips on reducing your pages’ bounce rates, you can check out these guides from Search Engine Watch and Crazy Egg.

Update evergreen landing pages

If you’re working on a site that has been around for more than a year, checking and updating your top landing pages or content assets (the ones constantly receiving a good volume of search traffic) is another great method to implement.

For example, one of our clients has a ton of useful/evergreen content in their site’s blog/resources sections that constantly drives traffic to their site.


However, most of their content assets haven’t been updated in years. Making them more comprehensive seemed like a great way not just to maintain their search rankings, but also to rank better for the other keywords these assets already rank for but weren’t originally optimized for.


Optimize your top landing pages for these other search terms through:

  • Including the other keyword variations (those with high engagement rates) in the page’s meta tags and/or mentioning them within the body of the content (see the sketch after this list).
  • Using the other keyword variations as anchor texts for the internal links directing to the landing page.
  • Adding more details/information as well as page elements (such as images, videos, etc…) in the content to give more ranking power to the page. In short, to make the page more relevant and comprehensive.
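
As a minimal sketch (the keywords, brand, and URL below are made up), here’s how a keyword variation can be worked into a landing page’s meta tags and into an internal link’s anchor text:

    <!-- On the landing page itself: the keyword variation in the
         title tag and meta description (examples are made up) -->
    <title>Link Building Strategies (and Outreach Tips) | YourBrand</title>
    <meta name="description"
          content="Actionable link building strategies and outreach tips."/>

    <!-- On another page of the site: an internal link using the
         keyword variation as its anchor text -->
    <a href="/link-building-strategies/">link building outreach tips</a>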

You can also check out the extensive guide I published earlier this year on implementing this type of keyword audit/discovery and optimization process.

Block crawlers from accessing poor content and duplicate pages

This has been the best-known practice for fighting Panda (ever since the first version of the update). Aside from a website’s overall quality, Panda also targets pages accessible in search results that have poor user engagement (a signal of irrelevance and/or lack of quality).

Several tips on finding duplicate/thin content or other site errors that might affect your site’s ranking ability:

  • Compare the number of pages in your sitemap vs. the number of pages indexed by Google (if the # of indexed pages is far greater than the # of pages in your sitemap, the site probably has duplication issues).
  • Check the “HTML improvements” report on Google Webmaster Tools, and see if it’s reporting duplicates on your pages’ meta tags.
  • Check if the site has “Crawl Errors”. This feature on GWT may also show you URL parameters that are being crawled by search engines (check if these parameters are being indexed by using advanced search operators on Google search).


Make sure that search crawlers won’t be able to index your site’s poor/duplicate pages – use the “noindex” meta tag on these pages, or block access through your site’s robots.txt file (see the sketch below).
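
For example, a thin or duplicate page could carry a robots meta tag like the one below; alternatively, a whole section can be blocked via robots.txt (a minimal sketch – the path is a placeholder):

    <!-- In the <head> of the thin/duplicate page: keep it out of
         the index, but still let crawlers follow its links -->
    <meta name="robots" content="noindex, follow"/>

    <!-- robots.txt alternative, blocking crawl access to a whole
         section (the path is a placeholder):

         User-agent: *
         Disallow: /search-results/
    -->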

Also, here’s a detailed guide on using Google Webmaster Tools for technical SEO audits.

Build new signals

When you start making changes on your site, it’s important to build new signals so that search engines can re-crawl and index the changes you’ve made.

Some of the ways you can send strong signals to search engines:

  • Acquire links from topically relevant authority websites.
  • Create and launch new content assets.
  • Build brand signals within the site: add social proof and trust indicators (testimonials, badges, etc.) to important pages, include your brand name in your pages’ title tags (an often overlooked tactic that we’ve also implemented on one of our clients’ websites – see the sketch after this list), and build branded links to the site.
  • Share your updated content on social networks (social signals).
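
For the branded title tags mentioned above, the format can be as simple as appending the brand name after the page’s target keywords (a minimal sketch – names are placeholders):

    <!-- Target keywords first, brand name last (placeholders) -->
    <title>Technical SEO Audit Guide | YourBrand</title>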

For more tips you can check out my guides on building brand signals and advanced off-page SEO.

Optimize for Local Search

Since Google now bases many search results on the searcher’s location and device, local SEO can also be a good method to add to your optimization campaign (and a way to make sure you get more search visibility for your website).

On implementing local SEO:

  • Set up page(s) on your site that cater to geo-targeted users. These pages can include your business address and local phone number (or you can also create content specifically targeted to certain cities/states) – see the sketch after this list.
  • Get your business/website listed on Google Places for Business.
  • Build citations for your website (here’s a great list of local business directories).
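
For the geo-targeted pages in the first item, schema.org’s LocalBusiness microdata can reinforce the business address and phone number (a minimal sketch – all business details are placeholders):

    <div itemscope itemtype="http://schema.org/LocalBusiness">
      <span itemprop="name">Your Business Name</span>
      <div itemprop="address" itemscope
           itemtype="http://schema.org/PostalAddress">
        <span itemprop="streetAddress">123 Example St.</span>
        <span itemprop="addressLocality">Your City</span>,
        <span itemprop="addressRegion">ST</span>
      </div>
      Phone: <span itemprop="telephone">(555) 555-0123</span>
    </div>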

If you’re looking for more resources on this topic, you can visit this complete guide on local SEO from Koozai.

Wait and observe

There are times when all you can really do is wait (for a new algorithmic update or refresh). What’s important is to make sure your site genuinely provides value to its visitors/users, and that you follow ethical practices in link building and in marketing the site/business as a whole.

Monitoring what’s happening in the search space (specifically the ranking algorithm and SERP fluctuations) is vital these days. The good news is that there are sources you can always check to keep yourself updated, or to determine whether your site has been hit by a new update – like Mozcast and Moz’s Google algorithm change history.


The methods mentioned above are just some of the things our team has tried in order to overcome the recent Panda Dance. You can also try them to help prevent your site from being affected by future algorithmic updates targeting low-quality sites, though they might not necessarily be the ultimate solution for already affected websites.

If you liked this post, you can subscribe to my feed and follow me on Twitter @jasonacidre.

Jason Acidre

Jason Acidre is Co-Founder and CEO of Xight Interactive, marketing consultant for Affilorama and Traffic Travis, and also the sole author of this SEO blog. You can follow him on Twitter @jasonacidre and on Google+.

