Technical SEO Audit with Google Webmaster Tools

by Jason Acidre on September 24, 2012


There are so many tools these days that can make the process of site auditing and optimization much easier, and I'm betting that several of them are already running through your head. But sometimes, the best ones are those offered for free.

Google's Webmaster Tools is certainly at the top of my list. This browser-based web application from Google has a ton of functionality that can help determine your site's condition from one end to the other.

It's particularly useful in areas that really matter for search optimization, such as site structure and performance, as well as content-level issues that the site should be fixing or improving.

So in this post, I’ll share a few of its features that you can use to easily analyze and optimize your site for search.

Finding Duplicate and Thin/Poor Pages

Webmaster Tools offers lots of features that can help you identify poor content pages that could be affecting how your site performs on search results.

Nowadays, it's really important to weed out pages that may not be very useful to searchers. Allowing search engines to access and index a site's thin and duplicate pages can harm the ability of all its other pages to rank (think Panda), because these pages mostly serve irrelevant and unusable content to search users.

To find possible duplicate and thin pages within a site, I usually start by comparing the number of pages in the sitemap vs. the number of pages already indexed by Google.

On Webmaster Tools, go to “Optimization”, then to “Sitemap”:

There are two ways to compare the pages from your sitemap to the indexed pages on Google. The first one is by searching all the site’s pages on Google search:
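For example, a plain "site:" query (example.com below is just a placeholder for your own domain) displays Google's rough count of indexed pages at the top of the results, which you can then compare against the number of URLs submitted in your sitemap:

    site:example.com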

The second method is through Google Webmaster Tools’ Index Status. Go to “Health”, and then to “Index Status”:

By doing this, you'll get a rough estimate of how many thin/duplicate pages from the site have already been indexed by Google.

This will then make it easier for you to know how many pages you'll want removed from Google's index (by tagging these pages "noindex" or by blocking access through your robots.txt).

There are several ways to find thin and possibly duplicate pages within the site, but the best place to start is Google Webmaster Tools' HTML Improvements feature. You can start off by going to "Optimization", and then choose "HTML Improvements":

From there, you can instantly get clues about possible issues that are causing duplication within your site and easily identify pages (URL parameters, session IDs and/or pagination problems) that you should block search engines from indexing.

Check whether each URL parameter is being indexed by Google, and take note of the count for each to assess whether there are still more possible duplicate/poor content pages within the site. You can use the "site:" and "inurl:" search operators for this task.
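For instance, assuming a hypothetical "sessionid" URL parameter, a query like the one below will list (and roughly count) the indexed pages carrying that parameter; repeat it for each parameter you've noted:

    site:example.com inurl:sessionid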

You can also get clues from the site’s crawl error data. Go to “Health”, and choose “Crawl Errors”. See the URLs, particularly the extended URL strings being crawled by Google:

Bonus: You can check your site's "tag" and "search" folders too, and see if they are being indexed by Google. These folders commonly provide poor and irrelevant content to search users, and they can hurt your important pages' ability to earn better search rankings.

Once you have identified the pages that could be hurting your site's overall rankings due to duplication and improper indexation, you can start removing these pages from Google's index, either by tagging them "noindex" or by blocking bots from accessing them via robots.txt (a sketch of both options follows).
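As a minimal sketch (the /tag/ and /search/ paths below are just examples, echoing the bonus tip above), the "noindex" route is a meta tag placed in the head section of each page you want dropped from the index:

    <meta name="robots" content="noindex, follow">

The robots.txt route, on the other hand, blocks crawler access to entire folders:

    User-agent: *
    Disallow: /tag/
    Disallow: /search/

One caveat: crawlers can't see a "noindex" tag on pages they're blocked from fetching, so pick one method per page.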

Crawl Errors

The next one is pretty basic, but definitely as important as the first. Ensuring that search crawlers have no issues accessing the site's internal pages is necessary, as this aspect of site optimization improves both the user experience and the crawling process.

Crawlability also factors into rankings: a site's condition in terms of crawl errors can signal to search engines whether the site (or its content) is fit to be served to their users.

Identifying the pages that cause crawl errors (which may return various response codes) is easy with Google Webmaster Tools. You can get this data through the "Health" > "Crawl Errors" feature of the toolset.

The next step is to gauge how important each page causing crawl errors is to the site, since distinguishing their importance will point you to the fix each one needs (you can download the list of all the pages with errors in Excel format).

After prioritizing the pages with errors, manually check the pages that link to them (as these links are how search crawlers reach the problem pages on your site). This will help you decide which solution to take for each issue.

The most common fixes for crawl errors on a site (a sample redirect rule follows the list):

  • Reviving the page at a new or old URL (if the non-existent page is important), then 301 redirecting the old URL to the new one.
  • 301 redirecting the page to another relevant page/category (if the page is linked to from external websites).
  • Removing internal links pointing to the 404 page (if the page is not that important).
  • Fixing the page (if the issue was caused by server-end or coding errors).
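For the 301 redirect fixes, here's a minimal sketch assuming an Apache server and hypothetical URLs (the same result can be achieved through your CMS or other server configurations):

    # .htaccess: permanently redirect the dead URL to its replacement
    Redirect 301 /old-page/ http://www.example.com/new-page/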

HTML Improvements

Another feature of WMT that I think is mostly overlooked by its users is HTML Improvements, which can be found under the "Optimization" tab.

This feature allows webmasters to see pages of their site that may cause problems for both user experience and search performance. This includes pages that have:

  • Duplicate meta descriptions
  • Long meta descriptions
  • Short meta descriptions
  • Missing title tags
  • Duplicate title tags
  • Long title tags
  • Short title tags
  • Non-informative title tags
  • Non-indexable content

The list for each potential page-level issue can guide you on what changes or improvements to implement for the pages that search crawlers may have flagged as causing indexation problems for your site.
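For reference, here's a hypothetical head section that would pass these checks: a unique, descriptive title kept to roughly 65-70 characters and a unique meta description of roughly 150-160 characters (the commonly cited limits):

    <title>Technical SEO Audit Checklist for Online Stores</title>
    <meta name="description" content="A step-by-step technical SEO audit checklist covering crawl errors, duplicate content, site speed and structured data for e-commerce sites.">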

Site Speed

Gaining insights into how your site performs in terms of page loading times is also possible with Google Webmaster Tools. Just go to "Labs", and then choose "Site Performance".

The performance overview that this feature provides will give you a better understanding of whether you need to optimize this aspect of your (or your client's) website.

Site speed has been an important ranking factor for quite some time now, so using other tools like Google's Page Speed or Pingdom is a good way to flesh out your client recommendations. With those, you can include the specific areas/elements of the site that are affecting its loading time.

Search Queries

The “Search Queries” feature – which can be found under the “Traffic” tab – is also a great way to track the progress of your SEO campaign or to determine if the site has been hit by a penalty/algorithmic update.

The search queries graph in the image above is from a website that was hit by the first Penguin update (April 2012). With this feature, we're able to see the progress of our campaign in regaining the site's search visibility.

Another great way to make use of this feature's data is to download the table of search queries (with each query's performance on SERPs: CTR, impressions, average position and number of clicks) in Excel format.

This list can help you improve your campaign in terms of optimizing, targeting and discovering high-performing keywords (based on average search position, number of impressions and click-through rate).
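As a quick sanity check on the downloaded table, remember that CTR is simply clicks divided by impressions. Assuming a hypothetical layout with impressions in column B and clicks in column C, you can recompute it right in the sheet:

    =C2/B2

Format the cell as a percentage and fill it down; queries with high impressions but low CTR are usually the best candidates for title and meta description rewrites.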

Structured Data

This new function in GWT is also a great addition to your campaigns, especially for giving site recommendations to your clients. You can easily find this feature under "Optimization" – choose "Structured Data":

The Structured Data feature will also tell you if the site isn't using any type of schema/microdata or authorship markup on its pages. You can then suggest adding it to your clients to improve their website's performance in search.

But if the site has already implemented schema/microdata markup, clicking on each type listed in the "Structured Data" table will show you all the pages that use that particular type of markup.

You can then test some of these pages with the Structured Data Testing Tool to check that their markup is working, and to preview how the pages' snippets will most likely appear in the search results.
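If the site has no markup at all, even a small snippet is enough for the report (and the testing tool) to pick up. Here's a minimal, hypothetical example using the schema.org Article type in microdata:

    <div itemscope itemtype="http://schema.org/Article">
      <h1 itemprop="name">Technical SEO Audit with Google Webmaster Tools</h1>
      <span itemprop="author">Jason Acidre</span>
    </div>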

Link Profile Analysis

The thing that I love most about Google Webmaster Tools is the amount of site data available for extraction. This includes a site's full link data, which makes an efficient link profile analysis very doable.

What I usually do when using Google Webmaster Tools for link profile analysis is to download the entire list of external domains linking to the site.

You can start by going to “Traffic”, and then to “Links to your site”:

Check the full list of domains “who linked the most” to your site by clicking on “More”. Then download the entire list by choosing “download this table”:

You'll now have the full list of domains linking to your site in Excel format:

Download Niels Bosma's SEO Tools for Excel (unzip the file after downloading it, and drag the SEO Tools XLL add-in file onto the spreadsheet that you've just downloaded from Webmaster Tools):

I use this tool to add more metrics for each domain listed in the Excel sheet, which helps me better understand the site's entire link profile.

Next is to add the Alexa Reach score for each listed domain (I chose Alexa Reach so I can easily classify the listed domains in the latter part of this audit process – and because the Alexa Popularity function doesn't seem to work these days).

You can start by clicking on the fourth cell after the domain name (D2), then selecting "Offpage" from the "SEOTools" tab, and choosing "AlexaReach":

After choosing "AlexaReach", a pop-up window will appear. The next step is to simply click on the domain name (in cell A2) and hit "Enter".

The chosen cell will then show the current Alexa Reach score of the first listed domain. Copy the formula in that cell (press Ctrl+C on D2) and paste it down to the last cell in that column (to automatically pull in the Alexa Reach scores for all the listed domains).
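If you'd rather skip the menus, the same action simply inserts the add-in's formula into the cell (assuming the add-in exposes the function as AlexaReach(), matching the menu item's name), so you can also type it directly and fill it down:

    =AlexaReach(A2)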

Note: An Alexa Reach score of 0 means the domain hasn't been ranked by Alexa (N/A). This metric works much like Alexa Popularity: the lower the number, the better (e.g., Google is ranked #1 and Facebook #2).

With this upgraded list, you can analyze many areas of a site's link profile. For instance, you can easily see if your site is getting sitewide links from low-quality domains, just by sorting the "number of links" column from largest to smallest (and checking the Alexa Reach of the domains with the most links):

After sorting the second column of the spreadsheet, you'll be able to see the low-quality domains that may be carrying sitewide links to your site:

Another way to utilize this list when auditing links is to use it to determine the ratio of low-quality vs. high-quality domains linking to your site.

What I usually do at this stage of the audit is sort the list by the listed domains' Alexa Reach, from largest to smallest.

From there, I copy the entire D column (where all the Alexa Reach numbers are) and paste it into a new Excel worksheet. After pasting the numbers into the new sheet, I segment them into four parts:

  • High Alexa Rank (1,000,000+)
  • Decent Alexa Rank (100,000 – 999,999)
  • Low Alexa Rank (1 – 99,999)
  • No Alexa Rank (0)

Then do a quick count of each segment and list the total for each type of domain, preferably in a new tab of the worksheet (sample counting formulas below):
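Assuming you pasted the Alexa Reach numbers into column A of the new sheet (a hypothetical layout), the four buckets can be counted with plain COUNTIF/COUNTIFS formulas, one per segment:

    =COUNTIF(A:A,">=1000000")                   High Alexa Rank (1,000,000+)
    =COUNTIFS(A:A,">=100000",A:A,"<=999999")    Decent Alexa Rank
    =COUNTIFS(A:A,">=1",A:A,"<=99999")          Low Alexa Rank
    =COUNTIF(A:A,0)                             No Alexa Rank (unranked)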

That way, I can create a chart that depicts the types of domains linking to the site (you can easily create the chart by choosing the "Insert" tab, and then "Column"):

The list that you have created with the help of Webmaster Tools can also be used to prune the links that might be passing little to no value to your site (or could be damaging its ranking performance).

Also, if you have created your own chart, you can easily assess whether the site has participated in low-quality link schemes in the past, based on the ratio of low-quality vs. high-quality domains linking to it.

Lastly, the data you can extract from Webmaster Tools' "most linked content" report can also help you evaluate whether the site has been over-optimizing, or is even being hit by a negative SEO campaign (which actually happened to my blog months ago).

There are so many things you can do and so much data you can explore with Google Webmaster Tools. And the best thing about it is that Google is continuously enhancing the toolset – so take advantage of it!

If you liked this post, you can subscribe to my feed and follow me on Twitter @jasonacidre.

Jason Acidre

Jason Acidre is Co-Founder and CEO of Xight Interactive, marketing consultant for Affilorama and Traffic Travis, and also the sole author of this SEO blog. You can follow him on Twitter @jasonacidre and on Google+.


Comments

Jonathan September 24, 2012 at 11:25 am

Good post man, I also like the URL parameters feature under Configuration in WMT – it's a great way of seeing a snapshot of what Google has crawled on your site and spotting any problems that might be wasting crawl.


Caleb Donegan September 24, 2012 at 4:31 pm

Awesome article. I get in the habit of looking at specific metrics and making the corrections based off the same data I always look at. Easy to forget how much information lives in webmaster tools. Thanks for the checklist!


John Garry September 25, 2012 at 1:22 am

Another brilliant post from you, Jason. I really enjoy reading your articles, since there are many useful tips in them. What I learned here is that I can check my site’s “tag” and “search” folders to see if they are being indexed by Google, because, as you correctly stated, these are commonly providing poor and irrelevant content to search users and can decrease the search rankings.


Kenny Fabre September 25, 2012 at 6:11 am

Kaiser

Google Webmaster Tools is an awesome tool, and thanks for the info. Now I will go and do this for myself.


Alex September 25, 2012 at 8:47 am

Great post – I think a lot of people doing SEO get used to their tools and forget about how useful WMT can be. There are a lot of powerful insights to be gained from the data in there, and you’ve shown that perfectly.


Paige C. Willey September 25, 2012 at 10:12 am

This is an incredibly comprehensive post! So thorough and informative. WMTs is incredibly powerful. I think it’s interesting that you mention Alexa Rank. You don’t hear much about it in the SEO world these days. I’ll definitely try some of these things out.


Sanjib September 25, 2012 at 10:14 am

Hello Jason,

Such a fantastic post. Your posts are always useful and always have something for readers to learn. We all want to know ways to simplify site audits and SEO. We usually follow our hearts and move ahead, but if there is something that is technically easy and executable, then we just go with it.

Thanks a lot,
Regards
Sanjib


Nick Stamoulis September 25, 2012 at 11:30 am

The “Site Performance” option is a great way to analyze page load times. Many times sites can get weighed down with old code or high resolution images that cause the page to take extra time to render. Reviewing your site performance can help you identify these areas to clean up, which will boost your page load time. Faster loading pages tend to rank better and add to the overall user experience.


Kevin Gallagher September 26, 2012 at 8:37 am

Great article, people can learn a lot from this. Not sure why you are looking at Alexa rankings though?


Allan Duncan September 27, 2012 at 7:11 am

I like the SEO Tools for Excel part. But of course, everything you have written here is valuable to someone doing an SEO audit. Thumbs up bro!



Slavik September 28, 2012 at 3:12 am

Thank you for a good article, I will wait for new ones.



Onos Clinton September 28, 2012 at 5:54 am

What a brilliant post, I enjoyed every bit of it and learnt new SEO techniques. Thanks for sharing.


Rob @ Atlanta Homes September 29, 2012 at 5:54 pm

Holy crap man, you really outdid yourself with this post.

This is GOLD.

I’ve barely been scratching the surface with the Webmaster tools.

Thanks.


Nawaz September 30, 2012 at 10:54 am

This is a complete tutorial on Webmaster Tools. I often use this tool to see the search queries and the ranking positions of keywords on Google.com


Jack September 30, 2012 at 12:48 pm

Great post, Jason. I think that your blog is really helpful, especially for beginner bloggers such as myself who do not have much experience with all this technical stuff. The only thing I’m really good for is writing content.


James October 1, 2012 at 3:19 am

Again, pretty awesome insight. GWT is the top-of-the-line tool when I do technical audits for my clients. Bookmarked!


Thomas Kane October 1, 2012 at 4:38 pm

Excellent Post


Paul@e-Marketing Strategy blog October 2, 2012 at 3:11 pm

Some good points there.

I'm a big fan of Google WMT too. I especially like the targeting-by-country feature, for instance when my clients have registered .coms on a US server but actually want to target the UK.


Pavel October 3, 2012 at 9:57 am

I've been using WMT for years but I have to admit that I wasn't getting the most out of it, since I was mostly focusing on link counts and search queries. This is truly a great guide for WMT and I've really learned a lot here. Thanks Jason!


Swamykant@YourDigitalSpace October 4, 2012 at 12:25 am

Interesting post. Actually, I have been using Google Webmaster Tools for a very long time, and I follow most of the things mentioned in this post on a regular basis. It really helps with SEO.


Josh Malone October 24, 2012 at 7:49 am

Great post. I really liked the link comparisons of high, mid, and low quality as well as the html improvements. I have a few pages that I need to delete that are showing up in my sitemaps and I removed from my internal links (low value to me). Any idea how to see which pages are in the WT index so I can remove them?


prabhat@geek4share November 2, 2012 at 4:29 am

This post has opened my eyes. I didn't know what to do with Webmaster Tools; I only used it to index my site every time I wrote a blog post. I checked my account after reading this post and found duplicate meta descriptions and titles.
Thanks for sharing the post.


Bryson November 8, 2012 at 2:00 pm

Hi,

I would absolutely LOVE to see you write/develop/create more, much, much more content of this nature. Thanks for such useful information.


Richard November 11, 2012 at 4:05 pm

This is so far the most comprehensive and complete post I have seen regarding Webmaster Tools and with lots of interesting ideas of how to use it. I think Webmaster Tools has some great facilities to help us to maintain our websites, but with so many functions it is hard to understand what’s what, so thank you for sharing all this.


Rajkumar Jonnala January 3, 2013 at 9:43 am

Thanks for giving a detailed explanation of each and every corner of the Webmaster tool. The best free SEO tool, used by many professional SEOs.


Maja January 7, 2013 at 9:11 am

This is one of the greatest jobs. I read this whole article, opened my Webmaster Tools account and tried all the tips that you have written here.


Specialist SEO @ iNet SEO January 16, 2013 at 6:23 am

Fantastic walkthrough :)

However, if anyone wants a snapshot view of a web-page, they can have a look at a tool I had developed and then made available for all to use for free:

http://www.inetseo.co.uk/seo-services/seo-audit/free-seo-audit

Andy




Prashant@ Blogging tips February 24, 2013 at 12:03 am

We have been using Webmaster Tools for a long time; such a nice tool. Just one thing I am waiting for in Webmaster Tools: a PR-related feature that can provide information and ways to improve PR.


Danny Howard May 7, 2013 at 3:09 pm

Hi Kaiser

Great detailed article, great tips to use. I love using GWMT; I use it on a daily basis along with Google Analytics.

I don't always like using Excel for SEO, but it's great to get more data to see the low-quality sites vs. the better-quality sites linking to you, to determine your link profile and avoid penalties in the future.

Cheers

Danny Howard


Sana June 30, 2013 at 1:51 pm

I have a question: what is the disadvantage of "Duplicate Title Tag" in the Google Webmaster report? Can it lower the ranking of a site?


irsah indesigns July 6, 2013 at 2:12 pm

Great insights Jason.

We've been using Google Analytics to automatically send our clients' website analyses (via custom preferences on landing pages etc.); now we can go more in depth with GWMT by providing useful data analysis, like link juice, to present to our clients. Thanks for the tips.

Best wishes.


Glen Wilson July 25, 2013 at 11:26 am

Just found your blog from seoteky and I am so glad I did. I have picked up so many valuable tips in here for Google Webmasters, thanks so much.

Now I'll be burying my head in my account and referencing this post a lot.


Suhas August 12, 2013 at 5:14 am

Hi Jason,
I wonder how much time and effort it took you to compose this post. I can only say that all your efforts are worth reading. Thanks for this exclusive post. A very detailed one about Webmaster Tools.


Muriba September 22, 2013 at 11:47 pm

Hi Jason, this has been one of the best tutorials I have read about SEO and technical audits. I found your blog yesterday and it has good content. Thanks for sharing. I'll use some of it to improve my skills, and I'd like to translate some of it to share with Spanish-speaking readers, with a link back to you. Thanks again.


Asher November 20, 2013 at 1:31 pm

Impressive and informative!



