Technical SEO Audit with Google Webmaster Tools

By Jason Acidre on Sep 24, 2012 in Search | 40 comments

There are so many tools these days that can make the process of site auditing and optimization so much easier, and I’m betting that several of them are already running through your head. But sometimes, the best ones are those offered for free.

Google’s Webmaster Tools is certainly at the top of my list. This browser-based web application from Google has a ton of features that can help you assess your site’s condition from one end to the other, particularly in the areas that really matter for search optimization (such as site structure and performance, as well as content-level issues that the site should be fixing or improving).

So in this post, I’ll share a few of its features that you can use to easily analyze and optimize your site for search.

Finding Duplicate and Thin/Poor Pages

Webmaster Tools offers lots of features that can help you identify poor content pages that could be affecting how your site performs on search results.

Nowadays, it’s really important to weed out pages that may not be very useful to searchers. Allowing thin and duplicate pages to be accessed and indexed by search engines can harm every other page’s ability to rank (think Panda), because these pages mostly serve irrelevant and unusable content to search users.

To find possible duplicate and thin pages within a site, I usually start by comparing the number of pages in the sitemap vs. the number of pages already indexed by Google.

On Webmaster Tools, go to “Optimization”, then to “Sitemaps”:

There are two ways to compare the pages in your sitemap against the pages indexed by Google. The first is searching for all of the site’s pages on Google search (with the “site:” operator):

The second method is through Google Webmaster Tools’ Index Status. Go to “Health”, and then to “Index Status”:

By doing this, you’ll get a rough estimate of how many thin/duplicate pages from the site have already been indexed by Google.
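To sketch the sitemap side of that comparison in code: the snippet below counts the URLs declared in an XML sitemap, so you can set that figure against the indexed total from Index Status or a “site:” search. The domain, sitemap and indexed count are invented for illustration.

```python
# Count the <loc> entries in an XML sitemap and compare against the
# number of indexed pages. example.com and all figures are made up.
import xml.etree.ElementTree as ET

SITEMAP_XML = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://example.com/</loc></url>
  <url><loc>http://example.com/about</loc></url>
  <url><loc>http://example.com/blog/post-1</loc></url>
</urlset>"""

NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def count_sitemap_urls(sitemap_xml):
    """Count the URL entries declared in a sitemap document."""
    root = ET.fromstring(sitemap_xml)
    return len(root.findall(NS + "url/" + NS + "loc"))

sitemap_count = count_sitemap_urls(SITEMAP_XML)
indexed_count = 120  # hypothetical figure from Index Status
print("indexed but not in sitemap:", indexed_count - sitemap_count)
```

A large gap in either direction is the signal to dig further: many more indexed pages than sitemap pages often means parameter/session-ID duplicates are in the index.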

This will then make it easier for you to know how many pages you’ll need to remove from Google’s index (by tagging these pages with “noindex” or by blocking access through your robots.txt).

There are several ways to find thin and possibly duplicate pages within a site, but the best place to start is Google Webmaster Tools’ HTML Improvements feature. Go to “Optimization”, and then choose “HTML Improvements”:

From there, you can instantly get clues about issues that are causing duplication within your site, and easily identify pages (with URL parameters, session IDs and/or pagination problems) that you should be blocking search engines from indexing.

Check whether each URL parameter is being indexed by Google, and take note of the count for each to assess whether there are more possible duplicate/poor content pages within the site. You can use the “site:” and “inurl:” search operators for this task.
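For example, queries along these lines will show what Google has indexed for a given parameter (example.com and the parameter names are placeholders, not parameters from any real site):

```
site:example.com inurl:sessionid=
site:example.com inurl:sort=
site:example.com inurl:page=
```

If a query like these returns a large number of results for a parameterized URL pattern, that pattern is a strong candidate for blocking from indexation.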

You can also get clues from the site’s crawl error data. Go to “Health”, and choose “Crawl Errors”. Look at the URLs, particularly the extended URL strings being crawled by Google:

Bonus: check your site’s “tag” and “search” sections too, and see if they are being indexed by Google. These commonly serve poor and irrelevant content to search users, and can hurt your site’s ability to earn better rankings for its important pages.

Once you have identified the pages that could be hurting your site’s overall rankings through duplication and improper indexation, you can start removing them from Google’s index, either by tagging them with noindex or by blocking bots from accessing them via robots.txt.
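For reference, the two blocking methods look like this (the paths below are hypothetical examples from the “tag” and “search” sections mentioned earlier, not recommendations for any specific site):

```
<!-- On each thin/duplicate page, inside <head>: -->
<meta name="robots" content="noindex, follow">

# Or in robots.txt (note that this blocks crawling rather than
# indexing, so pages already in the index may linger; the noindex
# tag is usually the surer way to get a page removed):
User-agent: *
Disallow: /search/
Disallow: /tag/
```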

Crawl Errors

The next one is pretty basic, but definitely as important as the first. Ensuring that search crawlers have no issues accessing the site’s internal pages is necessary, as this aspect of site optimization improves both user experience and the crawling process.

Crawlability is also used as a ranking signal by search engines: a site’s crawl-error status helps indicate whether the site (or its content) is fit to be served to their users.

Identifying the pages that cause crawl errors (which can come with several different response codes) is easy with Google Webmaster Tools. You can get this data through “Health” > “Crawl Errors”.

The next step is to gauge how important each page causing crawl errors is to the site, since knowing their importance will point you to the fixes each one needs (you can download the full list of pages with errors in Excel format).

After prioritizing the pages with errors, manually check the pages that link to them (as these are the paths search crawlers take to reach the problem pages). This will help you decide which solution to apply for each issue.

The most common fixes you can make for crawl errors on a site:

- 301-redirect moved or deleted pages to the most relevant live pages
- fix or remove broken internal links that point to error pages
- restore important pages that were removed by mistake
- let intentionally removed pages return a proper 404/410 status
- keep the XML sitemap free of URLs that no longer resolve
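As a rough sketch, you can triage the downloaded crawl-error list by response code. The mapping below reflects common practice rather than any official WMT checklist, and the URLs are invented:

```python
# Map each crawl-error response code to a common fix. These
# recommendations are general practice, not an official checklist.
FIXES = {
    404: "301-redirect to the closest relevant page, or fix the links pointing here",
    410: "intentional removal: also remove internal links pointing to the page",
    500: "server error: check the application/server logs",
    503: "temporary outage: check server capacity and uptime",
}

def recommend_fix(status_code):
    """Return a common fix for a given crawl-error response code."""
    return FIXES.get(status_code, "manually review this response code")

crawl_errors = [
    ("http://example.com/old-page", 404),
    ("http://example.com/broken", 500),
]
for url, code in crawl_errors:
    print(url, "->", recommend_fix(code))
```

Sorting the export by code first, then by page importance, keeps the manual review focused on the errors that actually matter.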

HTML Improvements

Another feature of WMT that I think is mostly overlooked by its users is HTML Improvements, which can be found under the “Optimization” tab.

This feature allows webmasters to see pages of their site that may cause problems for both user experience and search performance. This includes pages that have:

- duplicate, missing, too long or too short title tags
- duplicate, too long or too short meta descriptions
- non-indexable content

The list for each potential page-level issue can guide you on what changes to implement for the pages that search crawlers may have flagged as causing indexation problems for your site.
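If you export the flagged pages, grouping duplicate title tags is a quick first pass. The URL-to-title mapping below is invented for illustration:

```python
# Group duplicate title tags from a crawl or an HTML Improvements
# export. The URLs and titles here are made up.
from collections import defaultdict

titles = {
    "http://example.com/shoes?sort=price": "Buy Shoes Online",
    "http://example.com/shoes": "Buy Shoes Online",
    "http://example.com/about": "About Us",
}

def duplicate_titles(url_titles):
    """Return titles shared by more than one URL, with those URLs."""
    by_title = defaultdict(list)
    for url, title in url_titles.items():
        by_title[title].append(url)
    return {t: urls for t, urls in by_title.items() if len(urls) > 1}

dupes = duplicate_titles(titles)
print(dupes)  # only "Buy Shoes Online" is duplicated
```

In this example, the parameterized URL sharing a title with its clean counterpart is exactly the kind of duplication pattern the report surfaces.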

Site Speed

Google Webmaster Tools can also give you insights into how your site performs in terms of page load time. Just go to “Labs”, and then choose “Site Performance”.

The performance overview this feature provides will give you a better understanding of whether you need to optimize this aspect of your (or your client’s) website.

Site speed has been an important ranking factor for quite some time now, so using other tools like Google’s PageSpeed or Pingdom is a good way to flesh out your client recommendations with the specific areas/elements of the site that affect its loading time.
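Once you have per-page load times (from Site Performance, PageSpeed or Pingdom), flagging the slow pages is straightforward. The figures below are invented:

```python
# page -> average load time in seconds (hypothetical measurements,
# as you might collect from Site Performance or Pingdom)
load_times = {
    "/": 1.2,
    "/blog": 4.8,
    "/contact": 0.9,
    "/gallery": 6.1,
}

def slow_pages(times, threshold=3.0):
    """Return pages slower than the threshold, slowest first."""
    return sorted(
        (page for page, secs in times.items() if secs > threshold),
        key=times.get,
        reverse=True,
    )

print(slow_pages(load_times))  # ['/gallery', '/blog']
```

The threshold is arbitrary here; pick one that matches the performance target you set for the client.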

Search Queries

The “Search Queries” feature – which can be found under the “Traffic” tab – is also a great way to track the progress of your SEO campaign or to determine if the site has been hit by a penalty/algorithmic update.

The search queries graph in the image above is from a website that was affected by the first Penguin update (April 2012). With this feature, we’re able to see the progress of our campaign in regaining the site’s search visibility.

Another great way to use this feature’s data is to download the table of search queries (with each query’s SERP metrics: impressions, clicks, CTR and average position) in Excel format.

This list can help you improve your campaign in terms of optimizing, targeting and discovering high-performing keywords (based on average search position, number of impressions and click-through rate).
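For instance, with the downloaded table you can surface queries with lots of impressions but a weak click-through rate, which are often quick wins for title/snippet optimization. The data below is made up:

```python
# Rows mirror the Search Queries export:
# (query, impressions, clicks, average position). Figures are invented.
rows = [
    ("seo audit",       12000, 240, 8.2),
    ("webmaster tools",  9000, 900, 3.1),
    ("link analysis",    4000,  40, 9.5),
]

def low_ctr_opportunities(data, min_impressions=5000, max_ctr=0.05):
    """Return (query, ctr, position) rows worth optimizing first."""
    found = []
    for query, impressions, clicks, position in data:
        ctr = clicks / impressions
        if impressions >= min_impressions and ctr <= max_ctr:
            found.append((query, round(ctr, 3), position))
    return found

print(low_ctr_opportunities(rows))  # [('seo audit', 0.02, 8.2)]
```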

Structured Data

This new function in GWT is also a great addition to your campaigns, especially for giving site recommendations to your clients. You can find this feature under “Optimization” > “Structured Data”:

The Structured Data feature will also tell you if the site isn’t using any schema/microdata or authorship markup on any of its pages. You can then suggest implementing it to your clients to improve their website’s search performance.

If the site has already implemented schema/microdata markup, clicking on each type listed in the “Structured Data” table will show you all the pages that use that type of markup.

You can then test some of these pages using the Structured Data Testing Tool to see if their markups are working well, as well as to see how the pages’ snippets will most likely be seen in the search results.
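For reference, a minimal schema.org microdata snippet of the kind the report (and the testing tool) recognizes might look like this; the markup below is an illustrative sketch, not taken from any audited site:

```
<article itemscope itemtype="http://schema.org/Article">
  <h1 itemprop="headline">Technical SEO Audit with Google Webmaster Tools</h1>
  <span itemprop="author">Jason Acidre</span>
  <time itemprop="datePublished" datetime="2012-09-24">Sep 24, 2012</time>
</article>
```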

Link Profile Analysis

What I love most about Google Webmaster Tools is the amount of site data available for extraction. This includes a site’s full link data, which makes an efficient link profile analysis very doable.

What I usually do when using Google Webmaster Tools for link profile analysis is to download the entire list of external domains linking to the site.

You can start by going to “Traffic”, and then to “Links to your site”:

Check the full list of domains “who linked the most” to your site by clicking on “More”. Then download the entire list by choosing “download this table”:

You’ll now have the full list of domains linking to your site in Excel format:

Download Niels Bosma’s SEO Tools for Excel (unzip the file after downloading it, and drag the SEO Tools XLL add-in file onto the spreadsheet you’ve just downloaded from Webmaster Tools):

I use this tool to add more metrics for each domain listed in the Excel sheet, which helps me better understand the site’s entire link profile.

Next, add the Alexa Reach score for each listed domain (I chose Alexa Reach so I can easily classify the listed domains later in this audit process; the Alexa Popularity function doesn’t seem to work these days).

Start by clicking the fourth cell after the domain name (D2), select “Offpage” from the “SEOTools” tab, and then choose “AlexaReach”:

After choosing “AlexaReach”, a pop-up window will appear. Next, simply click the domain name (cell A2) and hit “Enter”.

The chosen cell will now show the current Alexa Reach score of the first listed domain. Copy the formula from that cell (press Ctrl+C on D2) and paste it down to the last cell in the column to automatically pull the Alexa Reach scores for every listed domain.

Note: an Alexa Reach score of 0 means the domain hasn’t been ranked by Alexa (N/A). This metric works much like Alexa Popularity: the lower the number, the better (e.g., Google is ranked #1 and Facebook #2).

With this upgraded list, you can analyze many areas of a site’s link profile. For instance, you can easily see whether your site is getting sitewide links from low quality domains, just by sorting the “number of links” column from largest to smallest (and checking the Alexa Reach of the domains with the most links):

After sorting the second column of the spreadsheet, you’ll be able to spot low quality domains that may contain sitewide links to your site:

Another way to use this list when auditing links is to determine the ratio of low quality vs. high quality domains linking to your site.

What I usually do in this part of the audit is sort the list by the domains’ Alexa Reach, from largest to smallest.

From there, I copy the entire D column (where all the Alexa Reach scores are) and paste it into a new Excel worksheet. After pasting the numbers, I segment them into 4 parts (high quality, mid quality and low quality domains, based on their Alexa Reach scores, plus the unranked domains with a score of 0).

Then I do a quick count of each column and list the totals for each type of domain (preferably on a new tab of the worksheet):

That way, I can create a chart depicting the types of domains linking to the site (you can easily create the chart via the “Insert” tab, then “Column”):
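The segment-and-count step can also be sketched in code. The domains and scores below are invented, and the tier cut-offs are arbitrary examples to pick to fit your own data; only the “0 means unranked” rule comes from how Alexa reports N/A domains:

```python
# Bucket linking domains by Alexa Reach score and tally each tier.
from collections import Counter

domains = [
    ("example-news.com", 5000),
    ("some-blog.net", 250000),
    ("spammy-directory.info", 8000000),
    ("unranked-site.org", 0),
]

def tier(alexa_reach):
    """Classify a domain by its Alexa Reach (cut-offs are arbitrary)."""
    if alexa_reach == 0:
        return "unranked"  # Alexa N/A
    if alexa_reach <= 100000:
        return "high quality"
    if alexa_reach <= 1000000:
        return "mid quality"
    return "low quality"

counts = Counter(tier(score) for _, score in domains)
print(counts)  # one domain in each tier for this sample
```

The resulting counts are exactly what the Excel column chart visualizes: the ratio of quality tiers across the site’s linking domains.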

The list you have created with the help of Webmaster Tools can also be used to prune links that pass little to no value to your site (or could be damaging its ranking performance).

Also, if you have created your own chart, you can easily assess whether the site has participated in low quality link schemes in the past, based on the ratio of low quality vs. high quality domains linking to it.

Lastly, the data you can extract from Webmaster Tools’ “most linked content” can also help you evaluate whether the site has been over-optimizing, or is even being hit by negative SEO campaigns (which actually happened to my blog months ago).

There are so many things that you can do and data you can explore with Google Webmaster Tools. And the best thing about it is that Google is continuously enhancing the toolset – so take advantage of it!

If you liked this post, you can subscribe to my feed and follow me on Twitter @jasonacidre.

Jason Acidre

Jason Acidre is Co-Founder and CEO of Xight Interactive, marketing consultant for Affilorama and Traffic Travis, and also the sole author of this SEO blog. You can follow him on Twitter @jasonacidre and on Google+.



  1. Jonathan

    September 24, 2012

    Good post man, I also like the URL parameters feature under Configuration in WMT – it’s a great way of seeing a snapshot of what Google has crawled on your site and to spot any problems that might be wasted crawl.

  2. Caleb Donegan

    September 24, 2012

    Awesome article. I get in the habit of looking at specific metrics and making the corrections based off the same data I always look at. Easy to forget how much information lives in webmaster tools. Thanks for the checklist!

  3. John Garry

    September 25, 2012

    Another brilliant post from you, Jason. I really enjoy reading your articles, since there are many useful tips in them. What I learned here is that I can check my site’s “tag” and “search” folders to see if they are being indexed by Google, because, as you correctly stated, these are commonly providing poor and irrelevant content to search users and can decrease the search rankings.

  4. Alex

    September 25, 2012

    Great post – I think a lot of people doing SEO get used to their tools and forget about how useful WMT can be. There are a lot of powerful insights to be gained from the data in there, and you’ve shown that perfectly.

  5. Paige C. Willey

    September 25, 2012

    This is an incredibly comprehensive post! So thorough and informative. WMTs is incredibly powerful. I think it’s interesting that you mention Alexa Rank. You don’t hear much about it in the SEO world these days. I’ll definitely try some of these things out.

  6. Sanjib

    September 25, 2012

    Hello Jason,

    Such a fantastic post. Your posts are always useful and have something to learn for the readers. We all want to know ways to simplify site audit and SEO. We follow our heart and move ahead usually but if there is something which is technically easy and executable,then we just go with it.

    Thanks a lot,

  7. Nick Stamoulis

    September 25, 2012

    The “Site Performance” option is a great way to analyze page load times. Many times sites can get weighed down with old code or high resolution images that cause the page to take extra time to render. Reviewing your site performance can help you identify these areas to clean up, which will boost your page load time. Faster loading pages tend to rank better and add to the overall user experience.

  8. Allan Duncan

    September 27, 2012

    I like SEO tools for Excel part. But of course everything you have written here are all valuable to someone doing SEO audit. Thumbs up bro!

  10. Onos Clinton

    September 28, 2012

    what a brilliant post, i enjoyed every read and learnt new SEO techniques. Thanks for sharing.

  11. Nawaz

    September 30, 2012

    This is a complete tutorial on webmaster Tool. I often used this tool to see the search queries and the position of the keywords ranking wise on Google.com

  12. Jack

    September 30, 2012

    Great post, Jason. I think that your blog is really helpful, especially for beginner bloggers such as myself who do not have much experience with all this technical stuff. The only thing I’m really good for is writing content.

  13. James

    October 1, 2012

    Again, pretty awesome insight. GWT is the top of the line tool when I do technical audit to my clients. Bookmarked!

  14. Some good points there.

    I’m a big fan of Google WMT too, I especially like the targeting by country feature especially when my clients have registered .coms on a US Server but are actually wanting to target the UK for example.

  15. Pavel

    October 3, 2012

    I’ve been using WMT for years but I have to admit that I wasn’t getting the most of it since I was mostly focusing on link count and search queries. This is truly a great guide for WMT and I’ve really learned a lot here. Thanks Jason!

  16. Josh Malone

    October 24, 2012

    Great post. I really liked the link comparisons of high, mid, and low quality as well as the html improvements. I have a few pages that I need to delete that are showing up in my sitemaps and I removed from my internal links (low value to me). Any idea how to see which pages are in the WT index so I can remove them?

  17. prabhat@geek4share

    November 2, 2012

    this post has opened my eyes. i don’t know what to do with webmaster tools. i only use it to index my site everytime i write a blog post.i have checked my account after reading this post and i found duplicate meta descriptions and titles.
    thanks for sharing the post

  18. Bryson

    November 8, 2012

    I would absolutely LOVE to see you write/develop/create more, much, much more content, around this nature. Thanks for the such a useful information.

  19. Richard

    November 11, 2012

    This is so far the most comprehensive and complete post I have seen regarding Webmaster Tools and with lots of interesting ideas of how to use it. I think Webmaster Tools has some great facilities to help us to maintain our websites, but with so many functions it is hard to understand what’s what, so thank you for sharing all this.

  20. Rajkumar Jonnala

    January 3, 2013

    Thanks for giving a detailed explanation of each and every corners of the Webmaster tool. Best free SEO toool used by many professional SEO’s.

  21. Maja

    January 7, 2013

    It is one of the greatest job. I have read all this article and opened my webmaster tool and learn all the tips that you have written in this article.

  24. We are using Webmaster tool from a long time, such a nice tool. Just one thing i am waiting in webmaster tool; a PR related tool that can provide information and ways to improve PR.

  25. Hi Kaiser

    Great detailed article, great tips to use. I love using GWMT I use it on a daily bases along with Google analytics.

    I don’t like using Excel for SEO sometimes but it great to get more data to see the low quality sites vs the more better quality site linking to you to determine your link profile and avoid penalties in the future.


    Danny Howard

  26. Sana

    June 30, 2013

    I have question what is the disadvantage of ” Duplicate Title Tag” in Google Webmaster Report. Is can lower the ranking of a site.

  27. Great insights Jason.

    We’ve been using Google Analytic to send automatically our client’s website analysis, (via custom preferences on landed pages etc), now we can go more in depth with GWMT by providing useful data analysis like link juice to present to our clients. Thanks for the tips.

    Best wishes.

  28. Glen Wilson

    July 25, 2013

    Just found your blog from seoteky and I am so glad I did. I have picked up so many valueable tips in here for Google Webmasters, thanks so much.

    Now I’ll be burying my head in my account and referencing this post so much now.

  29. Suhas

    August 12, 2013

    Hi Jason,
    I wonder what time and efforts it required for you to compose this post. I can only say that all your efforts are worth reading. Thanks for this exclusive post. A very detailed one about Web Master tools.

  30. Muriba

    September 22, 2013

    Hi Jason, this have been one of the (I think the best) tutorials that I have been about SEO and technical audits. I knew your blog yesterday and this have a good content. Thanks for share. I’ll use some of your content to improve my skills and I like to translate some of that to share with people in spanish with your respective link. Thanks again.

