Posts

How I increased my blog’s Search Traffic by 44% in under a month

Update: This method of on-page optimization is still very effective. However, the research process for uncovering search terms that could be valuable to your SEO efforts has changed, due to changes made in Google Analytics (“not provided” data).

I’ve updated 2 sections of this post – on discovering keywords via Google Search Console and on effectively reoptimizing pages by providing “direct answers” to queries.  

I’ve been blogging for over 2 years now, but I didn’t really optimize my blog for search extensively – which is kind of odd, since I work as an SEO.

From the middle of last year up to last December was a very busy stretch for our fast-growing team. Growing our company and working with our clients left me little time to maintain this blog (especially to publish new posts on a regular basis, which led to a gradual loss in search traffic).

So when 2013 started, I decided to re-optimize parts of my blog, a week before I published my recent post about Advanced SEO tips for blogs.

The result was definitely interesting: I improved my blog’s search traffic by 43.92% with minimal effort, in the first 24 days of the year.

Read more

Technical SEO Audit with Google Webmaster Tools

There are so many tools these days that can make the process of site audits and optimization much easier, and I’m betting that several are already running through your head. But sometimes, the best ones are the free ones.

Google’s Webmaster Tools is certainly at the top of my list. This browser-based web application from Google has a ton of features that can help determine your site’s condition from one end to another.

Particularly in areas that really matter for search optimization, such as site structure and performance, as well as content-level issues that the site should be fixing/improving.

So in this post, I’ll share a few of its features that you can use to easily analyze and optimize your site for search.

Finding Duplicates and Thin/Poor pages

Webmaster Tools offers lots of features that can help you identify poor content pages that could be affecting how your site performs on search results.

Nowadays, it’s really important to weed out pages from a site that may not be very useful to searchers. Allowing thin and duplicate pages to be accessed and indexed by search engines might harm all the other pages’ ability to rank (Panda), because these pages mostly serve irrelevant and unusable content to search users.

In finding possible duplicate and thin pages within a site, I usually start by comparing the number of pages from the sitemap vs. the number of pages already indexed by Google.

On Webmaster Tools, go to “Optimization”, then to “Sitemap”:

There are two ways to compare the pages from your sitemap to the indexed pages on Google. The first one is by searching all the site’s pages on Google search:

The second method is through Google Webmaster Tools’ Index Status. Go to “Health”, and then to “Index Status”:

By doing this, you’ll get a rough estimate of how many thin/duplicate pages from the site have already been indexed by Google.

This will then make it easier for you to know how many pages you’ll need to remove from Google’s index (by tagging these pages with “noindex” or by blocking access through your robots.txt).
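If you prefer to script the sitemap side of this comparison, a short Python sketch can count the URLs listed in a sitemap file, which you can then compare against the indexed count from Index Status. The file name and location are assumptions; adjust them for your own site:

```python
import xml.etree.ElementTree as ET

# The standard sitemap protocol namespace.
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def count_sitemap_urls(sitemap_path):
    """Count the <url> entries in a sitemap.xml file."""
    tree = ET.parse(sitemap_path)
    return len(tree.getroot().findall("sm:url", NS))
```

A large gap between this number and the indexed count reported by Google is your first clue that thin or duplicate pages are being indexed.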

There are several ways to find thin and possible duplicate pages within the site. But the best place to start is Google Webmaster Tools’ HTML Improvements feature. You can start off by going to “Optimization”, and then choosing “HTML Improvements”:

From there, you can instantly get clues for possible issues that are causing duplication within your site and easily identify pages (URL parameters, session IDs and/or pagination problems) that you should be blocking search engines from indexing.

Check whether each URL parameter is being indexed by Google, and take note of the count for each to assess whether there are more possible duplicate/poor content pages within the site. You can use the “site:” and “inurl:” search operators for this task.
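To get a quick sense of how many parameter-driven duplicates you are dealing with, you can also group a list of crawled or indexed URLs by their parameter-free path. A minimal Python sketch (the URLs here are hypothetical):

```python
from collections import defaultdict
from urllib.parse import urlparse

def group_by_path(urls):
    """Group URLs by scheme+host+path, so that versions differing only
    in query parameters (a common source of duplicate content) cluster
    together. Returns only the paths with more than one URL."""
    groups = defaultdict(list)
    for url in urls:
        p = urlparse(url)
        groups[f"{p.scheme}://{p.netloc}{p.path}"].append(url)
    return {path: us for path, us in groups.items() if len(us) > 1}
```

Each cluster in the result is a candidate for canonicalization, noindexing, or parameter handling.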

You can also get clues from the site’s crawl error data. Go to “Health”, and choose “Crawl Errors”. See the URLs, particularly the extended URL strings being crawled by Google:

Bonus: You can check your site’s “tag” and “search” folders too, and see if they are being indexed by Google. These commonly provide poor and irrelevant content to search users, and can hurt your site’s ability to get better rankings for its important pages.

Once you have identified the pages that could be hurting your site’s overall rankings due to duplication and improper indexation, you can start removing these pages from Google’s index, by tagging them with noindex or by blocking bots from accessing them via robots.txt.
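Before relying on robots.txt rules, it helps to verify that they actually block the folders you intend. Python’s built-in robotparser can test rules offline; the Disallow lines below are hypothetical examples matching the “tag” and “search” folders mentioned above:

```python
from urllib import robotparser

# Hypothetical rules; in practice, point this at your live /robots.txt.
RULES = """
User-agent: *
Disallow: /tag/
Disallow: /search/
"""

rp = robotparser.RobotFileParser()
rp.parse(RULES.splitlines())

def is_blocked(url, agent="Googlebot"):
    """True if the given crawler would be disallowed from fetching the URL."""
    return not rp.can_fetch(agent, url)
```

Remember that robots.txt prevents crawling but not necessarily indexing of already-known URLs; noindex is the surer removal route for pages that are already in the index.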

Crawl Errors

The next one is pretty basic, but definitely as important as the first tip in this post. Ensuring that search crawlers have no issues accessing the site’s internal pages is necessary, as this aspect of site optimization improves both the user experience and the crawling process.

It is also used as a ranking factor by search engines: a site’s crawl errors/status can signal whether the site (or its content) is fit to be served to their users.

Identifying the pages that cause crawl errors (which may vary on several response codes) is easy with Google Webmaster Tools. You can easily get this data through the “Health” > “Crawl Error” feature of the toolset.

The next step is to gauge how important each page causing crawl errors is to the site, since distinguishing their importance will lead you to the necessary fix for each (you can download the list of all the pages with errors in Excel format).

After identifying the priority level of the pages with errors, manually check the pages that link to them (as these are how search crawlers reach the problem pages on your site). This will help you decide what solution to take for each issue.

The most common fixes for crawl errors on a site:

  • Reviving the page on a new or old URL (if the non-existent page is important), then 301 redirecting the old URL to the new one
  • 301 redirecting the page to another relevant page/category (if the page is linked from external websites)
  • Removing internal links pointing to the 404 page (if the page is not that important)
  • Fixing the page (if the issue was caused by server-end or coding errors)
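The decision points above can be sketched as a small helper that, given a URL’s response code plus your own judgment calls about the page, suggests which of the fixes to apply. This is illustrative only; the inputs and wording are my own:

```python
def crawl_error_fix(status, important=False, externally_linked=False):
    """Suggest a fix for a URL returning an error, mirroring the
    decision points above. `important` and `externally_linked` are
    judgment calls you make per page from your crawl-error export."""
    if status >= 500:
        return "fix server-side or coding errors"
    if status == 404:
        if important:
            return "revive the page and 301 redirect the old URL"
        if externally_linked:
            return "301 redirect to a relevant page or category"
        return "remove internal links pointing to the 404 page"
    return "review manually"
```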

HTML Improvements

Another feature of WMT that I think is mostly overlooked by its users is the HTML Improvements, which can be found under the “Optimization” tab.

This feature allows webmasters to see pages of their site that may cause problems for both user experience and search performance. This includes pages that have:

  • Duplicate meta descriptions
  • Long meta descriptions
  • Short meta descriptions
  • Missing title tags
  • Duplicate title tags
  • Long title tags
  • Short title tags
  • Non-informative title tags
  • Non-indexable content

The list for each potential page-level issue can guide you on what changes/improvements to implement for the pages that search crawlers might have found to be causing indexation problems for your site.
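If you want to pre-empt these reports, you can run rough length checks on your own titles and descriptions before Google flags them. The limits below are commonly cited rules of thumb, not official Google cutoffs (which are based on pixel width and change over time):

```python
def audit_meta(title, description):
    """Flag common HTML-improvement issues using rough character limits.
    These bounds are rules of thumb, not Google's exact thresholds."""
    issues = []
    if not title:
        issues.append("missing title tag")
    elif len(title) < 30:
        issues.append("short title tag")
    elif len(title) > 65:
        issues.append("long title tag")
    if not description:
        issues.append("missing meta description")
    elif len(description) < 70:
        issues.append("short meta description")
    elif len(description) > 160:
        issues.append("long meta description")
    return issues
```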

Site Speed

Gaining insights on how your site is performing when it comes to its pages’ loading time is also available with Google Webmaster Tools. Just go to “Labs”, and then choose “Site Performance”.

The performance overview that this feature provides will give you a better understanding of whether you need to optimize this aspect of your (or your client’s) website.

Site speed has been an important ranking factor for quite some time now, so using other tools like Google’s Page Speed or Pingdom is a good option to further flesh out your client recommendations. With those, you can pinpoint the specific areas/elements of the site that affect its loading time.

Search Queries

The “Search Queries” feature – which can be found under the “Traffic” tab – is also a great way to track the progress of your SEO campaign or to determine if the site has been hit by a penalty/algorithmic update.

The search queries graph in the image above is from a website that was affected by the first Penguin update (April 2012). With this feature, we were able to track the progress of our campaign in regaining the site’s search visibility.

Another great way to make use of the available data from this feature is by downloading the table of search queries (with each query’s SERP performance: CTR, impressions, average position and number of clicks) in Excel format.

This list can help you improve your campaign, in terms of optimizing, targeting and discovering high performing keywords (based on average search positions, number of impressions and click-through rate).

Structured Data

This new function in GWMT is also a great addition to your campaigns, especially for giving site recommendations to your clients. You can easily find this feature under the “Optimization” tab – choose “Structured Data”:

The Structured Data feature will also tell you if the site hasn’t used any type of schema/microdata or authorship markup on any of its pages. You can then suggest these to your clients to improve their website’s performance in search.

If the site has already implemented schema/microdata markup, clicking on each type listed in the “Structured Data” table will allow you to see all the pages that use that type of markup.

You can then test some of these pages using the Structured Data Testing Tool to see if their markups are working well, as well as to see how the pages’ snippets will most likely be seen in the search results.

Link Profile Analysis

The thing that I love the most about Google Webmaster Tools is the amount of site data available to be extracted. This includes a site’s full link data, which makes an efficient link profile analysis very doable.

What I usually do when using Google Webmaster Tools for link profile analysis is to download the entire list of external domains linking to the site.

You can start by going to “Traffic”, and then to “Links to your site”:

Check the full list of domains “who linked the most” to your site by clicking on “More”. Then download the entire list by choosing “download this table”:

You’ll now have the full list of domains linking to your site in Excel format:

Download Niels Bosma’s SEO Tools for Excel (unzip the file after downloading it, and drag the SEO Tools XLL add-in file into the spreadsheet that you’ve just downloaded from Webmaster Tools):

I use this tool to include more metrics for each domain listed in the Excel sheet, which helps me better understand the site’s entire link profile.

Next is to add the Alexa Reach scores for each listed domain (I just chose Alexa Reach so I can easily classify the listed domains in the latter part of this audit process – and the Alexa Popularity function doesn’t seem to work these days).

You can start by clicking on the fourth cell in the first domain’s row (D2), then select “Offpage” from the “SEOTools” tab, and choose “AlexaReach”:

After choosing “AlexaReach”, a pop-up window will be displayed. The next step is to simply click on the name of the domain (in cell A2) and hit “enter”.

The chosen cell will then give you the current Alexa Reach score of the first listed domain. Copy the formula in that cell (press Ctrl+C on D2), and paste it down to the last cell of that column (to automatically extract the Alexa Reach scores for every listed domain).

Note: An Alexa Reach score of 0 means the domain hasn’t been ranked by Alexa (N/A). This metric is pretty much the same as Alexa Popularity: the lower the number, the better (e.g. Google is ranked #1 and Facebook is ranked #2).

With this upgraded list, you can analyze many areas of a site’s link profile. For instance, you can easily see if your site is getting sitewide links from low quality domains, just by sorting the “number of links” from each domain from largest to smallest (and checking the Alexa Reach of the domains with the most links):

After sorting the second column of the spreadsheet, you’ll be able to see low quality domains that may contain sitewide links to your site:

Another way to utilize this list when performing link auditing is to use it to determine the ratio of low quality vs. high quality domains linking to your site.

What I usually do in this process of the audit is to sort the list by the listed domains’ Alexa Reach, from largest to smallest.

From there, I copy the entire D column (where all the Alexa Reach numbers are) and paste it into a new Excel worksheet. Then I segment it into 4 parts:

  • High Alexa Rank (1,000,000+)
  • Decent Alexa Rank (100,000 – 999,999)
  • Low Alexa Rank (1 – 99,999)
  • No Alexa Rank (0)

Then do a quick count of each segment and list the amount for each type of domain (preferably on a new tab of the worksheet):
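The same segmenting and counting can be done in a few lines of Python if you’d rather not do it by hand in Excel (the rank values in the usage example are hypothetical):

```python
def bucket_alexa(ranks):
    """Segment a column of Alexa Reach numbers into the four groups above.
    A value of 0 means Alexa has no data for the domain."""
    buckets = {
        "high (1,000,000+)": 0,
        "decent (100,000-999,999)": 0,
        "low (1-99,999)": 0,
        "no rank (0)": 0,
    }
    for r in ranks:
        if r == 0:
            buckets["no rank (0)"] += 1
        elif r >= 1_000_000:
            buckets["high (1,000,000+)"] += 1
        elif r >= 100_000:
            buckets["decent (100,000-999,999)"] += 1
        else:
            buckets["low (1-99,999)"] += 1
    return buckets
```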

That way, I can create a chart that depicts the types of domains linking to the site (you can easily create the chart by choosing the “Insert” tab, and then “Column”):

The list that you have created with the help of Webmaster Tools can also be used to prune links that pass little to no value to your site (or could be damaging its ranking performance).

Also, if you have created your own chart, you can easily assess whether the site has participated in low quality link schemes in the past – based on the ratio of low quality vs. high quality domains linking to it.

Lastly, the data you can extract from Webmaster Tools’ “most linked content” can also help you evaluate whether the site has been over-optimized, or is even being attacked by a negative SEO campaign (which actually happened to my blog months ago).

There are so many things that you can do and data you can explore with Google Webmaster Tools. And the best thing about it is that Google is continuously enhancing the toolset – so take advantage of it!

If you liked this post, you can subscribe to my feed and follow me on Twitter @jasonacidre.


How to Avoid Site Changes that Ruin User Experience and Conversions

This entry is a guest post by Tom Howlett, a Digital Marketing Executive at Koozai. You can follow him on Twitter: @Koozai_Tom

How do you make general and SEO website changes without compromising your site’s user experience and ultimately the number of conversions made through the website? When you start to tinker with content and navigation elements to optimise pages, this can have a profound impact on visitor perception – for better or worse.

So how can you make these website changes without spoiling the website experience you worked so hard to create? This post looks at a number of the most common and important considerations when making any new website changes.

Common SEO Site Changes

Page Titles

This may seem like an obvious consideration, but when you create a new page or go through the site with a fine-toothed SEO comb, it is important to create a unique title for each page. Each page of the site should have a purpose, and that purpose should be reflected in the title – therefore there should be no need for duplicate page titles.

The title element is also an important consideration for helping to improve conversions. When targeting users in organic search, ensuring that the title is relevant to the content of the destination page will also help influence the click. Consequently, it can help bring more visitors to the website, which should hopefully result in increased conversions.

Page Meta

Just like the page title element mentioned above, each page of the website should have a unique Meta Description. The description should provide more detail on what the page offers with reference to the page title.

The description is a powerful element; a good description can make the difference between a click through to your website and someone clicking through to a competitor offering a similar product or service.
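To audit uniqueness at scale, you can collect each page’s title and description and flag any values shared by more than one URL. A minimal sketch, where the page data structure is an assumption (in practice you would populate it from a crawl):

```python
from collections import Counter

def find_duplicates(pages):
    """Given {url: (title, description)}, return the set of titles and
    the set of descriptions that are shared by more than one page."""
    titles = Counter(t for t, _ in pages.values())
    descs = Counter(d for _, d in pages.values())
    return (
        {t for t, n in titles.items() if n > 1},
        {d for d, n in descs.items() if n > 1},
    )
```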

Headings

The heading elements (H1s, H2s, etc.) not only provide keyword relevance on the page, they also help break the content into easily manageable chunks. Users are able to skim to a section to find the information they are looking for quickly. Typically, if someone visits your website and cannot find what they are looking for within a few seconds, they are likely to go back to the search engine and look at the next result.

A good heading layout makes for a good user experience, and if you can keep visitors on the site, you have their attention, which you can use to persuade them to convert (whatever the conversion may be).

URLs

The URL element of a page is often overlooked by website owners; it is also a pain to change once an established URL structure is in place. Ideally you want your URLs to be as informative as possible and include information relevant to the page content. Try to avoid using a dynamic structure and aim for a fixed format similar to the examples below:

http://www.example.com/category

http://www.example.com/category/product

http://www.example.com/category/service

Descriptive and relevant URLs will help the site rank for the target keywords. A searcher will also be able to see the URL in the search results, which may further influence the click if it is highly relevant to the search query.
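A simple slugify routine illustrates how to generate this kind of fixed, descriptive URL format from page or product names. The domain and helper names are made up for the example:

```python
import re

def slugify(text):
    """Turn a page or product name into a clean, descriptive URL segment."""
    text = text.lower()
    text = re.sub(r"[^a-z0-9]+", "-", text)  # non-alphanumerics -> hyphen
    return text.strip("-")

def product_url(category, product):
    """Build a fixed /category/product URL like the examples above."""
    return f"http://www.example.com/{slugify(category)}/{slugify(product)}"
```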

Site Copy

Website text or copy is an important consideration. You want to include a certain level of content and also make sure it’s written to a high standard. Here are some important considerations for page content:

  • Ensure that it is natural and relevant
  • Make sure the content is unique to your website
  • Make sure there is enough content to justify the page purpose (this will vary for different types of pages and websites)
  • It should fit naturally into your site design – when adding content for SEO purposes, websites tend to throw it on the page without consideration for the user

If all the points above are considered when creating or adding content to your website, you should be able to enjoy the benefits of this. It is also worth pointing out that the copy should be useful to the user and help persuade them to take certain actions, especially if the page is designed to sell a product, service or even your brand.

Space for Content

As briefly mentioned in the previous section, when deciding on website content and copy, you will want to make sure that this fits naturally into the website design. There is no use throwing content in for the benefit of SEO; there should be a greater strategy that aims to create valuable content that fits well with the flow of the page and provides users the information they expect to see.

There should also be a content theme that fits with your design, a theme that is utilised on all similar pages of a website. One example would be product pages on an Ecommerce website, people expect the item descriptions, images, specifications etc. to be in the same location on every page. Changing this for each page will decrease the familiarity across the site and as a result you may lose out on valuable conversions.

Internal Linking

Internal linking is a valuable SEO strategy. Most commonly, a website owner will have read about the benefit of internal links and then work through the website whilst adding them whenever a keyword is mentioned on the page. This is an extreme example but it does happen.

Internal linking can be really valuable for the website if done effectively. If referencing a page or product within a page’s content, it makes sense to link to it, in case someone wants to navigate there without having to find it themselves. You don’t necessarily have to link using the exact keyword term either; you can benefit equally by linking with a longer phrase, like the example below.

Internal links can be really useful when referencing similar pages such as similar products (for example ‘you might also be interested in’) or similar Blog posts. This is a useful strategy for getting people to purchase more items or just to keep them on the website. They are much more likely to return to the site or convert if you manage to keep them on the site for longer.

 

Image Titles

Image titles may be the least utilised element mentioned here. For those who don’t know, the title element appears in a small box when hovering over an image or image link. This will usually provide more information regarding the contents of the image or describe where it leads if the image links to another page.

It is unlikely to be the element that makes the difference between a conversion and someone leaving your site, but it is useful to include this information and to make sure titles are optimised effectively. For image links, they can be particularly useful in persuading more people to use the link.

Having a Blog

A Blog is a useful addition for any website or online business; it provides a place to share news and write useful content that will bring in new visits to the site. Quite often the process of adding a Blog to the website is rushed and it is treated as a separate element of the site.

One huge benefit of having a Blog is that you have a great opportunity to grow your brand. With search engines now appearing to have a greater brand focus, this can have a positive impact on your rankings and overall visibility. It makes sense to integrate your Blog with the rest of your site; you will benefit from improved brand recognition among new visitors who have come into the site through the Blog. So make sure your Blog reflects your brand image and mimics the site design.

There should also be some solid consideration of the types of post that will be included on the Blog. There is no use creating content for the sake of having content on the Blog – you will want to focus on creating high quality content that will result in the site gaining more regular visitors and subscribers. This is when the number of brand mentions and social shares skyrockets, and you will see a much higher brand value and, as a result, a greater number of conversions (as long as the rest of the site is well optimised).

Social Sharing

Social sharing buttons and links to your social profiles offer a great way of getting people to interact with you online and making it easy for people to share your site or the information you provide within their social circles.

More often than not, the addition of these buttons and links appears to be an afterthought; they are just thrown on the page with little consideration as to how people are going to use them.

You will want to make it clear that you are on these platforms, so try to link to them from every page of the site. The links may be more prominent on the Home page and the main Blog page to encourage interaction; it is fine to include them within the header or footer of other pages, but try to keep them in the same place on all pages for familiarity.

Greater interaction with your brand on these networks will enable you to promote content and products to the greater fan base and increase the number of shares, likes and visits to your site, not to mention improve your conversion rate.

Please note: It is worth including the target=”_blank” attribute within the link to each of the profiles to open a new tab, so users are not taken away from the website (they may not return). The same goes for any external links.
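As a rough illustration, the sketch below adds target=”_blank” (note the leading underscore) to external links that don’t already have a target attribute. It uses a regex for brevity – for production HTML you would want a real parser – and the domain is a placeholder:

```python
import re

def open_external_in_new_tab(html, own_domain="example.com"):
    """Add target="_blank" to external <a> tags that lack a target
    attribute. Internal links (matching own_domain) are left alone.
    A regex sketch only; real HTML deserves a real parser."""
    def fix(match):
        tag = match.group(0)
        if own_domain in tag or "target=" in tag:
            return tag
        return tag[:-1] + ' target="_blank">'
    return re.sub(r'<a\s[^>]*href="https?://[^"]*"[^>]*>', fix, html)
```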

Tracking for New Pages

When adding new pages to the site, you will want to make sure that you remember to include the relevant code for tracking page visits and conversions. Whether you use Google Analytics or another package, this will give you lots of useful data to help you improve the page and website as a whole.

Common On-Page Changes and Elements

Logo Link and Positioning

How does the logo affect how users browse the site and aid conversions, you ask? This is more of a detail regarding convenience. Many people online are impatient and won’t hang around if they cannot find what they are looking for. From a usability perspective, visitors expect there to be a prominent logo within the site header at the top of the page. They may also expect that clicking the logo will send them back to the Home page. If this functionality does not exist, or there is no prominent logo available, you may lose visitors who get frustrated when browsing your website.

Navigation

The navigation is one of the most important features of a website; this is how the majority of your visitors will navigate to the pages of interest to them.

Firstly you will want to make sure you link to all the top level pages from the main navigation. Depending on how large the site is, you may have to split the site into sections and utilise a dropdown menu to fit all the links to the important pages.

Quite often the navigation will be designed to perfectly fit the number of pages the site has, but what happens when you add new pages and areas to the site? Can it easily fit into the current navigation? This is a consideration that should be taken into account when first designing a website. You want users to be able to easily navigate to every part of the website, even the newly added pages.

Most navigation problems can be solved by including dropdown menus, as new links can easily be added whenever required. This is where splitting your site into easily navigable categories is particularly useful, so at least users can take a good guess at where they are going to find what they are looking for. You may also want to think about highlighting new areas or pages of the site on the Home page; this can help improve familiarity for visitors, who will remember this upon returning to the site.

Breadcrumb Links

Breadcrumb links are useful, especially for an Ecommerce website where users can easily navigate back to previous sections of the site. This makes for a trouble free browsing experience and people are much more likely to stay on the site and convert.

If adding large sections of the site or an Ecommerce area, including breadcrumb links will help improve the online experience. Make sure to mirror the current online navigation and design and to include these in a prominent position (typically above the content) for a better experience.
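Breadcrumb trails are usually generated from the URL path. A minimal sketch, assuming the clean hyphenated URL structure discussed earlier (the labels and paths are illustrative):

```python
def breadcrumbs(path, site_name="Home"):
    """Build a breadcrumb trail of (label, url) pairs from a clean URL
    path such as /mens-shoes/vans-authentic."""
    trail = [(site_name, "/")]
    parts = [p for p in path.strip("/").split("/") if p]
    for i, part in enumerate(parts):
        label = part.replace("-", " ").title()
        trail.append((label, "/" + "/".join(parts[:i + 1])))
    return trail
```

Rendering each pair as a link, separated by “>”, gives the familiar Home > Category > Product trail.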

Keeping Information Above The Fold

I have mentioned a number of elements that should be included within the main content area such as text content and images. This content should be located above the fold where possible and aim to capture the attention of the user without them having to scroll.

When designing a website, this should be a consideration from the start. If making these changes to an existing website, careful consideration should be put into making sure that no other element of the website suffers as a result. Each page should also use the same layout so before making any changes it will be useful to create a template to work from.

Footer

The Footer can make a useful navigational feature by including links to the main top level pages. Again, this is a feature that is there for convenience – people are much more likely to stick around if they are able to navigate the site without any troubles.

When adding new content or categories to an existing website, you will benefit from making sure that these links are included within the site Footer if applicable.

The Footer can also be a useful place to link to the main social media platforms for greater interaction with your brand.

Sitemap

Like the Footer, a Sitemap page can provide a useful glossary of pages on the site that users can use to find the information they are looking for.

When adding new pages, it is useful to add these to the existing Sitemap page.

Making Contact

One element every site should have is a method of contact. Whether this information is displayed on every page or the site has a dedicated Contact page, it is important that users can easily find this information should they want to.

Consideration should be made towards this when creating new sections of the site or when making changes to the existing navigation. Always include a prominent link to the contact information.

Theming

Touched on briefly in this post already, the theming for a website can drastically affect the number of conversions your site generates. You want to maintain a brand image as well as creating a familiarity with your site and brand.

For every new page, category or section of the website that is created, care should be taken to make sure that there is a common theme with the rest of the site. Otherwise, regular visitors – as well as people who have navigated through the site – may believe they have landed on a different website.

Re-Design

If you want to completely re-design your website it is likely that the design will be quite different. There are certain elements that you may want to retain on the new site to maintain a familiar experience – this is more important if you already have a distinguished fan base.

The areas of the site you may wish to transfer are as follows:

  • Use the same page Titles and Meta Descriptions
  • Maintain a familiar navigation structure
  • If you have a clean URL structure, this is worth retaining, otherwise redirect the URLs
  • Transfer the Blog and all the previous Blog posts
  • Make sure the new site contains links to the social profiles whether it did or didn’t beforehand
  • The logo – you may wish to re-design the logo as well as the website. Generally if there is a familiarity in the logo this will help users know they are on the same website as before.

Redirects

If changing the site URL structure, you will want to make sure that each of the old URLs is redirected (301-redirect) to the new URL or the closest equivalent. This will ensure that any previous external links pass their value to the new pages as well as sending users to the right place.
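However you implement the redirects (server config, a CMS plugin, etc.), the underlying logic is a mapping from old paths to new ones, with a sensible fallback for unmapped URLs. A hypothetical sketch – the paths below are made up:

```python
# Hypothetical old-to-new URL mapping built during the migration.
REDIRECTS = {
    "/old-category/widget": "/products/widget",
    "/old-category": "/products",
}

def resolve(old_path, fallback="/"):
    """Find the 301 target for an old URL: the exact mapping if present,
    otherwise the closest mapped parent section, otherwise the fallback."""
    if old_path in REDIRECTS:
        return REDIRECTS[old_path]
    # Walk up the path looking for the nearest mapped section.
    parts = old_path.rstrip("/").split("/")
    while len(parts) > 1:
        parts.pop()
        parent = "/".join(parts) or "/"
        if parent in REDIRECTS:
            return REDIRECTS[parent]
    return fallback
```

Redirecting to the closest equivalent (rather than sending everything to the home page) is what preserves both link value and user intent.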

Is the Site Mobile Friendly?

A mobile-optimised website will ensure that users visiting the site on mobile devices still have a good browsing experience; this may well result in increased conversions from these devices.

When adding new pages and sections to the website, you will want to consider how well these work on the various mobile devices. This consideration is particularly important when adding a Blog; will mobile users be able to read and share the content? This is also relevant to Ecommerce websites; do you want people to be able to purchase from a mobile device? By considering these factors and taking appropriate measures you will be able to maintain conversions on these platforms.

Considerations towards mobile should also be taken when changing any other elements on the site. Changes to the navigation need additional thought on how you are going to transfer this information for mobile.

Summary

As you can see, there are lots of potential website changes, including SEO changes, that can be made to a site. These changes can affect how usable the site is, which in turn affects the number of conversions. By taking the above points into consideration, you can preserve – and in some cases improve – the number of conversions that the website generates.

You can find more tips in our Website Migration Whitepaper.

If you liked this post, you can subscribe to my feed or check out Koozai’s awesome blog for more of their awesome stuff.

4 Ways to Optimize Site Structure for a Solid Search Engine Ranking Strategy

There are many areas of a website that need to be optimized when it’s aiming to rank better on search engines, seeing that search engines use hundreds of ranking factors to assess whether a site is really worth displaying to searchers. A big portion of that, though, usually comes down to how well the site is structured for ease of use and accessibility to search crawlers.

I’ve always believed that on-site optimization is 80% about relevance, since relevance is a huge factor in making a site or its content useful to searchers (whether they find your content useful or relevant to what they are really looking for). If you ask me whether that belief has changed over my 2 years of practicing SEO, I’d definitely say no.

But aside from “relevance”, a lot of other factors are becoming more prevalent in the optimization process of a website, such as optimizing it for speed, sociability, usability and conversions.

So in this post I want to share some simple processes that can help you strengthen your overall search ranking strategy by optimizing your site structure for relevance, in order to improve its usability as well as conversions.

Information Architecture and Strategic Keyword Designation

Knowing which keywords fit certain pages of the site is very important when optimizing for both users and search engines. Your site needs to let users see that they are landing on pages that answer their queries.

In basic keyword mapping, different types of keywords are designated to different types of pages more often than not. For instance:

  • Industry terms are usually more effective when used on category or subcategory pages (especially for ecommerce sites, like “men’s shoes”).
  • Long tail or more specific keywords are commonly used on product and/or informational pages, as the page type will more likely give direct answers to the search query (ex. “vans authentic core classics”).

So in this process, it’s best to first map out the entire site to see which keywords should be used on its different pages. This will not just help those pages rank better on SERPs, but also ensure that the search-driven traffic they receive finds the page(s) relevant to the search terms used.

A fast way to do this is a method that I use – Excel with Niels Bosma’s free SEO Tools for Excel (you can download it for free here).

The first thing to do is to export all the site’s pages to an excel spreadsheet through the site’s XML sitemap.

Open an Excel workbook, go to “Data” > choose “From Web” > type in the URL of your site’s XML sitemap > and click on “import”.

Three more dialog boxes will show up (just hit on OK), then your site’s XML sitemap will be imported to the excel worksheet.
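If you’d rather skip Excel for this first step, the same thing can be sketched in Python: parse the sitemap XML and keep only the `<loc>` URLs. The sitemap content below is a made-up example; in practice you would fetch your live sitemap (e.g. with `urllib.request`) instead:

```python
import xml.etree.ElementTree as ET

# Made-up sitemap content standing in for a fetched XML sitemap.
SITEMAP = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/blog/</loc></url>
</urlset>"""

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text):
    """Return every <loc> URL listed in a sitemap document."""
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.findall(".//sm:loc", NS)]
```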

The next part of the method is to use Niels Bosma’s SEO Tools. After downloading the Excel add-in, unzip the downloaded file and drag-drop the “SEOTools.XLL” file onto the workbook you’re working on (for more detailed instructions, you can also check out the tool’s download page).

Click on “Enable this add-in for this session only”, and then you’ll see the tools added on your excel’s main tabs:

Once installed, you can start extracting all of your pages’ title tags. In my case I’ve deleted the 3 other columns, since I will only need the list of the URLs.

Go to the cell next to the first listed URL > Go to “Onpage” > Choose “HtmlTitle”

The formula will show up, where you can click on the space between the parentheses, and then click on the cell where the first listed URL is placed.

Click on “OK” to extract the title tags for each listed URL.
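The `HtmlTitle` step can also be reproduced outside Excel. Here is a minimal sketch using only Python’s standard library to pull the `<title>` out of a page’s HTML; the sample HTML is a stand-in for a fetched page:

```python
from html.parser import HTMLParser

class TitleExtractor(HTMLParser):
    """Collects the text inside the first <title> element."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title" and not self.title:
            self.in_title = True

    def handle_data(self, data):
        if self.in_title:
            self.title += data

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

def html_title(html):
    """Return the stripped title text of an HTML document."""
    parser = TitleExtractor()
    parser.feed(html)
    return parser.title.strip()

SAMPLE = "<html><head><title>Men's Shoes | Example Store</title></head><body></body></html>"
```

Run over every URL pulled from the sitemap, this gives you the same URL-plus-title list the Excel add-in produces.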

Note: There are also other functions and metrics that you can use with the SEO Tools for Excel (like amount of links, social shares, traffic, PageRank, etc…), though we’ll only be using the URLs and their titles for this task.

As soon as you’ve generated this list, it’ll be easier to assess if your pages are targeting the most relevant keywords for their content (based on the URL, level in site architecture and its title).

This list will also allow you to see if you’re cannibalizing some of your major keywords, or have used them predominantly more than once on other deeper pages of the site (this also occurs when targeting singular and plural form of keywords through separate pages – like having “yoursite.com/headset” and “yoursite.com/headsets”).

Make sure that you are applying the right keywords to their respective informational pages, transactional pages and/or categories to draw more action from search visitors.
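With the URL-to-keyword map in hand, spotting cannibalization is a simple grouping exercise. This sketch (the URLs and keywords are made-up examples) flags any keyword targeted by more than one page:

```python
from collections import defaultdict

def cannibalized_keywords(page_keywords):
    """page_keywords: {url: target keyword}. Return keywords mapped to 2+ URLs."""
    pages_by_keyword = defaultdict(list)
    for url, keyword in page_keywords.items():
        pages_by_keyword[keyword.lower()].append(url)
    return {kw: urls for kw, urls in pages_by_keyword.items() if len(urls) > 1}

PAGES = {
    "https://example.com/headset": "headsets",
    "https://example.com/headsets": "headsets",
    "https://example.com/speakers": "speakers",
}
```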

Descriptive Navigational Links, Permalink Structure and Page Titles for Relevance

These three areas can initially define to users and search engines what your site’s pages are about. Use descriptive keywords so that people can easily understand what the pages they’ll be visiting are about when clicking through your site’s navigation.

Apply it to your deeper pages, especially if you have a large site, because these navigational links can tremendously impact how your site is ranking on search engines.

Your site’s structure will eventually pass the link juice it’s getting from external websites through to your deeper pages.

This doesn’t just apply to large ecommerce websites, as any type of website can use this style to improve structure and relevance – like what Raven Tools has done on their site’s main and footer navigation:

This may partly explain why they rank very well on search engine result pages, knowing that many of their pages contain these navigation links:

Use your Unique Selling Point (USP) to present Content

An effective content, web, and even overall business marketing strategy is built on the principle that every page of the site should be treated as a landing page that can directly or indirectly help in revenue generation.

Leverage what separates your business from your competitors – your USP – through your content to instantly generate interest and actions from your visitors. Implementing this in your content strategy is a strong indicator of uniqueness of content, which is of course what search engines like these days.

Assess and identify your online brand’s strong points and exemplify them through your site’s content.

Look at Zappos, they offer free shipping and video reviews, and they’re using it as a part of their content strategy.

These kinds of web experiences make users stay longer on the site. Even if you’re just sharing or selling ideas through blog posts, the more your content compels people to read/share it, the better it’ll perform in terms of search rankings.

Building strong internal links and link passages

This is where your content marketing efforts will come in. Build content assets – strong pages with high potential to earn links (editorially or via outreach), attract social shares and/or rank better on SERPs for informational search phrases. Then use these pages to support your site’s important pages through internal linking, both to help them rank better on search engines and to make them more visible to your visitors.

You can also use Traffic Travis (you can download it for free) to check pages that are getting links and have good PageRank that you can pass around to your site’s other topically relevant and important landing pages.

This method is definitely a great substitute for manual link building, as the internal linking process can improve the site’s overall domain authority, and it also helps make your important landing pages more visible to your target audience, where you can easily trigger their interest (through your compelling content).

What’s next? Build Authority!

Start doing real company stuff (#RCS) to establish your site as an industry leader. Earn links, mentions, readership and customer loyalty by doing what real companies do!

There are so many things that you can do to really create an impact for your online marketing campaign in terms of promotion, whether you choose to do viral content marketing or build links that will really add value to your or your client’s business.

All of this – proving your brand is an authority – will eventually make your search engine ranking strategy prevail, especially when these authority-passing links/mentions start flowing through your site architecture, which will help all your pages rank competitively on search results.

If you liked this post, you can subscribe to my feed and follow me on Twitter @jasonacidre.

How to Optimize Great Content for Search Engines to easily Crawl, Index and Rank


Content optimization is getting trickier these days, as users’ behavior on the web continues to evolve over time, plus the fact that a lot of web content providers in different niches are pushing out tons of “great content” at a steadily growing rate.

This progression in web usage has given search engines more data they can use to assess which webpages should rank well on their search results (based on usability, social signals, authority and many other factors).

Miguel Salcido recently asked 17 experts on what on-site ranking factors they think are most important to optimize this year, which inspired me to write this post – and I also think that it’s worth checking out too.

Anyway, content marketing is trending, proving its power as a very effective marketing tool that can tremendously grow a brand’s audience, following and customer base. And empowering your content marketing efforts with basic know-how on optimizing your content for search is just practical if you want to yield more results from your campaign.

So in this post, I will mostly discuss content optimization techniques that you can implement before and after launching your piece to increase the likelihood of your content ranking better on search results, even without the help of link building.

Accurate Page Titles

Title tags still remain one of the most important ranking factors when it comes to on-site SEO, based on many experts’ experience. Using your target keywords in the content’s page title can help improve its search rankings.

It’s also important that the title of your web page meets – or if possible exceeds – your target audience’s expectations of the content they’ll see, to improve visitor retention on that page. Other helpful tips for constructing your content page titles:

  • Make the title actionable to increase click-through rate when targeting users who’ll see your content showing up on search engine result pages (don’t just use your targeted keywords to rank higher on search engines, but rather write your titles with the intent for humans to click on your listings).

  • Use a post headline that differs from the title tag to optimize the page for different semantically related or more long-tail search phrases.

  • You can also choose to use your content’s other target keywords in the page’s URL, as Google is also using URLs to display as the title on search results, when it’s found to be more relevant for search queries closely related to your content’s initially targeted keyword. AJ Kohn and Ruth Burr have discussed this recently here and here.

Use LSI keywords within the content

LSI or latent semantic indexing is a process being used by search engines to identify patterns in the relationships between terms and concepts contained in an unstructured collection of text (as defined by Wikipedia). Basically, this process allows search engines to evaluate the context and relevance of content to a search query based on the related terms being contained by a particular content.

The more a piece of content uses highly related/synonymous terms, the better search engines understand where the content should be tagged or categorized in their indices.

A good trick for determining which words you can use in your content, to make its context look more relevant and particularly targeted to your industry-specific keyword, is to use the tilde (~) operator when searching for your “root keyword” in Google search.
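The underlying idea can be sketched as a simple coverage check: given a hand-picked list of related terms (the ones below are made-up examples for a hypothetical “dog training” page), see how many of them your draft actually uses:

```python
# Made-up related terms for a hypothetical "dog training" page.
RELATED_TERMS = {"obedience", "leash", "commands", "puppy", "reward"}

def related_term_coverage(text):
    """Fraction of the related-term list that appears in the content."""
    words = set(text.lower().split())
    found = RELATED_TERMS & words
    return len(found) / len(RELATED_TERMS)
```

A low score suggests the copy leans entirely on the root keyword and could weave in more semantically related vocabulary.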

Authorship Markup

Soon enough, Google may start implementing AuthorRank into their list of ranking factors (or maybe they are already using it) in measuring if a page is worth being prominently displayed on their search results. That’s why building a strong author profile using Google+ is deemed necessary these days to prepare your strategy for future game-changing events.

Implementing authorship markup on your site/blog is not that hard actually, and there are several tutorials out there that you can check out, like those from AJ Kohn, Joost de Valk and from this blog as well.
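The on-page half of the markup is just one anchor tag. This sketch (the profile URL is a placeholder) generates the rel="author" link that points at a Google+ profile:

```python
def author_link_tag(profile_url, anchor_text="Google+"):
    """Build the rel="author" anchor that ties a page to a Google+ profile.
    The profile URL passed in is a placeholder, not a real account."""
    return f'<a rel="author" href="{profile_url}">{anchor_text}</a>'

TAG = author_link_tag("https://plus.google.com/1234567890")
```

The other half of the verification happens on the Google+ profile itself, where your site is added under “Contributor to”.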

Nikko Marasigan, one of our in-house SEO strategists (at Xight Interactive), also tested how far authorship markup can help in terms of rankings, which somehow proved how this simple optimization technique played a big role in making a site/page rank without links from other sources (with only a single Google+ profile linking to it). Just imagine how much more power it can produce if you have solid authors contributing content to your site.

Another advantage of having authorship markup applied in your site/blog is that it increases the click-through rate of your site’s pages on Google’s SERPs, as your listings will look more trustworthy and credible to searchers.

Length of Content

It’s undeniable that longer and in-depth documents are ranking very well on search results. Make your content more comprehensive than what your competitors are offering to push your rankings above them.

Though it is also important to ensure that the quality of information within the content isn’t over-saturated or devoid of substance from trying too hard to lengthen it. 800+ words will do.

Interactions and engagement in content (UGC)

Encourage your visitors to join the discussions happening within your site’s content (user generated comments, product reviews, star ratings, upvotes, etc…).

Building interactions within your content is very vital, as Google also depends on this factor in evaluating how useful your content is, especially if the amount of relevant comments in your content is high and offering value to the discussion.

There are many ways to improve interactions in your content, such as:

  • Incentivizing interactions, like organizing a comment contest or offering do-follow links to trusted commentators.
  • Making it a call-to-action at the end of your post(s), like adding “would love to hear your thoughts below, at the comment section”.
  • Responding to comments to initiate conversations.

Shareability and ability to collate social signals

As I’ve mentioned above, search engines are strongly dependent on social data these days in detecting pages that are relatively popular, authoritative and up-to-date. So if you can drive more social signals to your site’s content, you’ll also build a solid chance of it ranking better on search results.

Below are few tips on getting more social shares to your content:

  • Make sure that your content is really useful and share-worthy. You can check out this post on the common types of content that make it well on social media and on how you can effectively promote them as well.
  • Make social sharing very easy for your page visitors by making your social sharing buttons visible. You can place it above the fold or at the end of the content, as visitors are more receptive on taking actions from those areas of the page.
  • Do manual social outreach, particularly to people who are really interested and more likely to share your content.
  • For more tips, you can also visit my old post on how to generate more social shares.

Link out to other credible external sources

Linking out to other sites, particularly to known trusted sites (not just Wikipedia), is a strong signal that search engines can use to see if the page is topically relevant and if it is using/citing credible sources for its content.

This action builds up the trust search engines see in your content, since it somehow aligns itself, in terms of information, to the sources that it has mentioned.

Use of rich-media content

Including rich-media content like rich and visually attractive images, cinemagraphs, screenshots, videos, slide presentations and/or data visualizations alongside your text content can also improve its search rankings.

These page elements can help improve the depth of your content and can also enhance user experience, which is another factor that search engines look into in ranking webpages.

You can also optimize their attributes to make your content more relevant to its targeted search phrases, such as using your targeted keyword(s) in your rich-media content’s filenames, alternate text (for images) and descriptions.

Implementing Schema and Microdata

Schema could really play a big part in the future of search engine optimization, and many experts believe that this optimization process has already been giving some sites the advantage of getting better rankings than other websites that haven’t applied these markups on their sites yet.

Schemas are a set of tags that webmasters can add to their websites, made specifically for search engines, to help them better understand web-based documents and eventually provide better and more relevant search results to their users (since they’ll then have a better understanding of what a particular piece of content is about through the help of microdata/schemas).

Learning how to code schemas can be real tricky at first. But the good news is that Raven has recently launched a browser-based tool that allows its users to generate schemas called Schema-creator.org.

You can easily create Schema.org markups using this tool and copy the generated code to your website.
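For a sense of what such a generated block looks like, here is a sketch that emits a Schema.org Person snippet in microdata; all the values are placeholders, and this illustrates the markup format rather than the tool’s exact output:

```python
from html import escape

def person_microdata(name, job_title, url):
    """Build a Schema.org Person block using the itemscope/itemtype/itemprop
    microdata attributes. All inputs are placeholder values."""
    return (
        '<div itemscope itemtype="http://schema.org/Person">\n'
        f'  <span itemprop="name">{escape(name)}</span>\n'
        f'  <span itemprop="jobTitle">{escape(job_title)}</span>\n'
        f'  <a itemprop="url" href="{escape(url)}">{escape(url)}</a>\n'
        '</div>'
    )

SNIPPET = person_microdata("Jane Doe", "SEO Strategist", "https://example.com")
```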

Think Conversions

Every page of your site is a landing page that’s capable of generating you revenue. And Google is also looking at this kind of ability from your content to determine if it can satisfy your site’s visitors.

Implementing conversion-oriented optimization on your site’s content is definitely advantageous in so many ways – not just for rankings, but from a business perspective as well. Some of the most common ways to increase the chances of converting your content’s page traffic are:

  • Improving the page’s loading speed – you can use Google Lab’s Page Speed Online to see the elements that make your page load time slower and this web-based tool will also provide suggestions on how you can optimize these page or site-level elements.
  • Improving your content’s call-to-action – make sure that your site offers are visible to your new visitors (you can place them above the fold or right after the content, and continuously test where they work best). It’s also best to make your offers relevant to what your content is about (basing the products, services, newsletters, ebooks or webinars you want to upsell on the topic of the content).
  • Internally linking to other relevant and useful content of the site – promote your site’s other popular/authority content through internal links, and make these links very visible (by placing them within the content and using longer strings of texts for their anchor texts), as this can increase your site’s average page visits and site visit duration/time on site. The more your visitors see your other content, the more you can funnel them to subscribing or returning to your site.

Build strong internal links to your “great content”

Lastly, once you have published your new content, you can then start building internal links to it from your other strong content (old related blog posts/articles). Vary the anchor texts that you’ll use for internal linking and try to make the linking contextual to entice clicks from visitors on these pages.
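Finding which old posts should carry those internal links can be sketched as a keyword scan over your existing content; the URLs and text below are made-up examples:

```python
def internal_link_candidates(posts, keyword):
    """posts: {url: body text}. Return old posts that already mention the
    keyword and are natural hosts for a contextual internal link."""
    needle = keyword.lower()
    return [url for url, body in posts.items() if needle in body.lower()]

# Made-up archive of old posts.
POSTS = {
    "https://example.com/blog/keyword-mapping": "Notes on keyword mapping and site structure.",
    "https://example.com/blog/social-shares": "How to earn more social shares.",
}
```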

If you liked this post, you can subscribe to my feed and you can follow me on Twitter @jasonacidre and Google+.


What to do When Link Building is not enough to Outrank Competitors


Winning the SERPs is getting tougher these days, as search engines grow more intelligent through their continuous evolution, plus the fact that bigger sites/brands in most industries are investing more and trying to monopolize this area to intensify their marketing efforts and capacity to generate leads for their businesses.

Today’s version of Search Engine Optimization involves a lot of processes and factors, unlike before, when it was mostly focused on how easily the site could be understood by search engines, and on how search engines could assess the site’s popularity/authority based on its link graph – a process most of us call “Link Building”.

Link building has been a very vital practice, both for search marketers in achieving better search rankings for their sites and for search engines in determining the value and relevance of websites. But link building as a marketing practice has grown more competitive over the years, leaving smaller players/brands/link builders in situations such as:

  • Exhausting almost every link building method available out there just to acquire more links than their competitors have, which often results in grey/black hat dependency.
  • Continuously building quality links to a website for almost a year, but not seeing any major movements/improvements in terms of keyword rankings.
  • Competing with sites/pages that have 200+ links from .edu and .gov sites.

Competitive intelligence is something we SEOs use not just to determine the sites linking to our competitors (the amount, value and impact of those links on their rankings), but also to learn how we can outsmart them by knowing where their strong and weak points are.

Smart SEOs analyze their competitors’ weaknesses and strike where it will hurt the most. We all know that links are important in this business, but if you can’t beat their links, beat their content.

Become more relevant

Relevance of the content is one of the biggest ranking factors in SEO, knowing that the best way search engines can serve their users right is by returning highly relevant pages for their users’ search queries.

After seeing how your competitors are doing with their linking activities, it’s best to evaluate their content and take note of the things you feel are lacking from those pages, especially when it comes to information and presentation.

Outpacing your competitors and proving that you are an authority in your industry will require you to be more than a Wikipedia page. Few ideas that you can start with when optimizing your content for better search rankings could be:

  • Making the content of the page you’re trying to optimize more comprehensive, where you can choose to include more text details about the subject (information that isn’t available on your competitors’ pages), including videos and/or rich images that pertain to or complement the page’s target keyword.
  • Let the content host relevant links within it – to both external and internal sources/pages – using thematically related anchor texts (for instance, linking to a “dog training lessons page” from a “dog food” page).

Drive interactions

User-generated discussion/interaction is a great signal to search engines: the more interaction a certain piece of content gets, the more search engines will see the authenticity of that page.

There are several forms of interactions that you can try to build to your content (or to the page you want to achieve higher rankings on search results) such as:

  • Blog/user comments
  • User reviews
  • Social shares and votes (Google +1s, Facebook Likes, Stumbleupon views, etc…)
  • Embedded testimonials (you can check out my blog’s about page for samples).

Social signals have played a big role lately in helping search engines determine popular as well as authoritative pages on the web, and implementing a good social media promotion strategy for your landing pages can certainly help improve their search rankings.

Generate more activity

Think conversions. Search engines, particularly Google, apply usage-data (such as bounce rates, time on site, pageviews and return visits) as ranking factors to determine pages that are not just popular, but also proven to be useful to its users as well.

Test and analyze whether your pages are really converting your visitors (especially traffic from search engines) into doing more or staying longer within the site. Some of the elements that you can test and implement to improve user activity on your target landing page(s):

  • Including internal links that direct to other high-value pages of your website to entice new traffic to view more pages from your site.
  • Adding social buttons (Facebook like/share, Tweet, Google+, etc…) and making them very visible.
  • Improve middle of the funnel offerings, such as inviting your visitors to sign up to your newsletter, giving away free ebooks, putting out more expert content and/or attracting them to subscribe to your blog’s RSS feed.
  • Test and strengthen calls to action, making sure they don’t interrupt the users’ experience.

Improve your SERP listing’s click-through rate

The more people click through to your page from the search results, the better its chances of improving its rankings, considering that Google uses its own SERP click-through data to identify high-converting pages in its results – pages that get more clicks, no matter what their ranking positions are, signify relevance to the search query entered.

Below are several ways to optimize your page’s snippet on Google’s search results to effectively entice searchers into clicking your page from the results.

Page title and meta description

Optimize your titles and meta descriptions for “brand marketing”. Don’t just stuff these areas with your target keywords. Make them interesting enough for searchers to click on your page, while using your target keywords to describe what’s within the content.

Implement authorship markups

Apply authorship (rel=”author”) or publisher (rel=”publisher”) markup to your site. This can massively improve your page’s SERP click-through rate, as your page will show up prominently on search results and could possibly get and attract more clicks whatever its ranking position is.

You can check out this guide on how to implement Authorship markup to a site/blog.

Use schemas for rich snippets

As I’ve mentioned above, the more your listings display outstandingly on the search results, the more it can attract people to clicking it. There are many types of microdata tags that you can use to different types of pages (like for events, products, people, movies, videos, reviews, recipes, applications, places, and a lot more) and you can also use Google’s rich snippet testing tool to check if you have installed these markups correctly to your pages.

Improve site/page design for site preview

Having a visually appealing webpage design/layout can also help attract searchers into clicking through to your page (especially those who might hover over the result’s site preview button; it might get them more interested in what you have to offer).

Be on Google+

Searchers who are logged in to their Google accounts can also see pages that have been recommended by people in their circles, which can also influence their decision to click your page’s listing.

You can also check my post on how to build authority on Google+ to make the most out of this strategy.

Build Strong Internal Links through Support Pages

Create strong and highly linkable pages that can naturally generate incoming links and social signals, and make these pages internally link to and support the page you are optimizing to rank higher on search results.

You can use keyword-rich anchor texts when internally linking to your main landing page, as these votes from your internal support pages (that are receiving quality links and social mentions) will continuously grow the link value it passes through to your target landing page, and will help improve its search rankings.

Zappos is using the same strategy to make their categories rank for hardcore keywords, and Eric Siu, from Evergreen Search, wrote a great piece about the strategy.

If you liked this post, you can subscribe to my feed and follow me on Twitter @jasonacidre.

How to Innovate New SEO Strategies Using eBay

This entry is a guest post by Chris Warren from Batteries In A Flash. The views and opinions expressed on this post don’t necessarily reflect my views as an Online Marketer.

Why is SEO innovation crucial? Finding strategies and techniques that are not standardized pieces of SEO knowledge gives you a major tactical advantage. These are the kinds of advantages you can build a business on, because they create a barrier to entry between you and your competitors.

This is going to help you compete in super-competitive industries and create huge barriers for your opposition in less optimized arenas, because your results will be difficult to replicate. Testing a few ideas during a run-of-the-mill project is a great way to add some “home run” potential to your everyday work.

For example, we were having a debate in our office about what to change our home page title tag to; this led to hunting around our favorite sites to find examples. I ended up doing a Google search for “eBay” and I noticed something I was not expecting:

SERP eBay

A lot of eBay’s most important pages have title tags that are being truncated. Title tag best practices recommend keeping titles to fewer than 70 characters, and the “eBay” search was cutting off the titles at about 65 characters – yet several of eBay’s titles are significantly longer than 70.

Now eBay is not the kind of place that is doing something like this by accident and I am even more certain they are not making a mistake on three very important pages. Their eBay motors page is a great example coming in with a 102 character title:

“eBay Motors – Autos, Used Cars, Motorcycles, Boats, Trucks, Parts, Accessories, RVs and Other Vehicles”

Since we are dealing with eBay I decided to assume that there was a very good reason for writing the title like this and reverse engineer the rationale behind it. One of the things on my mind at the time was the Panda updates and how much Google has been pushing branded authority sites to the top of results. I put together a few searches pulling from that eBay motors page title and I ended up with:

“used truck accessories”

It turned out the Motors page was ranking in the #1 spot on Google for a phrase with 1,300 searches per month (broad match), according to the Google Keyword Tool. eBay was able to rank simply based on their site and page authority by including the individual words SOMEWHERE in the title tag.

“eBay Motors – Autos, Used Cars, Motorcycles, Boats, Trucks, Parts, Accessories, RVs and Other Vehicles”
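Both observations are easy to verify in a few lines. The word lists below are my own guesses at the useful combinations, not anything eBay published:

```python
from itertools import product

TITLE = ("eBay Motors – Autos, Used Cars, Motorcycles, Boats, Trucks, "
         "Parts, Accessories, RVs and Other Vehicles")

# Mix-and-match phrases built only from words that appear somewhere in the
# title; the word groupings are assumptions for illustration.
modifiers = ["used", "new"]
vehicles = ["auto", "car", "truck", "boat"]
part_words = ["parts", "accessories"]
phrases = [" ".join(combo) for combo in product(modifiers, vehicles, part_words)]

print(len(TITLE))  # 102 characters, well past the ~70-character display cutoff
```

Several of the generated phrases, like “used truck parts”, match the rankings listed below.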

Digging deeper showed that mixing and matching words from the title tag uncovered a lot of phrases the eBay Motors page was ranking for:

  • Rank 4: used autos
  • Rank 4: used vehicle parts
  • Rank 5: used truck parts
  • Rank 9: new boat parts
  • Rank 11: new auto accessories
  • Rank 11: boat accessories
  • Rank 15: new car accessories

Google Keyword Tool Searches Per Month (Broad Match)


Talk about a productive web page! The point is that by following the pack you can end up leaving a lot on the table. Having lots of experiments running and constantly testing even well-established “best practices” is going to give you an edge, not to mention how smart you get to look when you find something and show it off.

This eBay technique is a great example of a strategy I use: find something outside of the norm from a source that has demonstrated a high level of SEO savvy – SEO Book and Mashable, for example, are sites I have used like this in the past. Either you find something useful, or you can now write a blog post called, “Why I Am Better Than Aaron Wall At SEO.”

About the Author: Chris Warren is an in-house web strategist for Batteries In A Flash. Give him a shout out @BIAFgreen on Twitter.

If you liked this post, you can subscribe to my feed and/or follow me on Twitter @jasonacidre

10 On-page SEO Tactics for 2011

SEO is a constantly evolving field of modern marketing, given that search engines continually improve their algorithms’ ability to return high-quality, highly relevant pages for their users’ queries. Search engines – especially Google – use hundreds of factors to determine which pages deserve to show up in their top results.

There are two major processes in search engine optimization that enable websites and pages to rank for their targeted keywords on search results – on-page and off-page optimization.

Off-page optimization covers everything related to making a website and its pages popular (external citations from other websites, earned through link building and social media), while on-page optimization earns higher search rankings by making a page’s content relevant to its targeted search term or keyword.

As mentioned above, search engines use hundreds of factors to identify high-quality pages, and that count doesn’t stop there, as they keep finding more ways to improve their users’ search experience. So I have listed below some on-page optimization techniques that I believe are worth exploring and testing this year.

On-page Markup through Schemas

Google, Bing and Yahoo recently introduced Schema.org as the shared vocabulary that the three big search engines will use to better understand web pages’ content – it uses Microdata as its structured data markup.

Microdata schemas are a set of HTML attributes that specify essential information about a webpage’s content. They include numerous markup types (more than 100, as mentioned by Google) and new HTML attributes (such as “itemscope, itemtype and itemprop”) that make it much easier for search engines to determine and weigh page relevance. You can check the full list of markup types here.

Here’s a sample of what your page’s codes will look like if you use this markup:
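For instance, a page describing a person could be marked up like this (the name and details here are purely illustrative):

```html
<div itemscope itemtype="http://schema.org/Person">
  <!-- itemscope declares an item; itemtype names its schema.org type -->
  <span itemprop="name">Jane Smith</span>,
  <span itemprop="jobTitle">Web Strategist</span> –
  <a itemprop="url" href="http://www.example.com">example.com</a>
</div>
```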

The more markup you add, the better search engines will understand your content, and the better your pages can be presented in search results through rich snippets. Google does not yet consider markup usage a ranking factor, but it may eventually. In the meantime, rich snippets can already improve your pages’ SERP click-through rate and ability to attract traffic, since they stand out in the way they are displayed in search results, like this one:

(looks tasty, er, trustworthy to me)

Learn more about Microdata on Schema.org’s getting started page and Schema.org FAQ on Google.

Length of content

The length of a document can be a strong indicator of a webpage’s quality, especially since the changes Google made with the Panda update are strict about separating quality content from thin, low-value content.

In my experience, longer content performs well in search results; most of my blog posts run to thousands of words. Because they contain so much information, a single post/page can target multiple long-tail keywords, which in turn has allowed my blog to attract more organic visitors.

Sociability

A page’s ability to send strong social signals to search engines also improves its chances of earning higher search rankings. The pages best at doing this are purposely created to act as viral content – pieces made to draw attention or interest from a specifically targeted audience.

Visible social sharing buttons, along with displayed share counts, are a good way to encourage readers to share your content and link to it.

Building support pages that have strong social signals

Landing pages, particularly sales pages, are awkward and often hard to promote externally through link building, especially if you are aiming to acquire editorial links. However, creating high-quality pages on your site (pages able to attract links and social shares naturally) that support your important landing pages through internal linking can pass through a huge amount of link value, which can improve your landing pages’ search rankings.

You can also build support pages hosted externally through guest blogs, where you can contextually link to your landing pages. It’s best to offer interesting content that is still thematically related to your landing pages (e.g., if your site offers a dog training course, you can build support pages like “Top 100 dog trainers in the world” (widget), “Dog name generator” (tool), “List of 100 dogs that can easily be trained”, etc., and have those pages link to your dog training course sales page using highly descriptive or branded anchors).

Usability

This is perhaps the most significant ranking factor of the hundreds on search algorithms’ lists, as search engines are more likely to vouch for your site once it has proven itself useful to visitors. If you are aiming to rank for highly competitive search terms, it’s imperative to study your important pages’ conversion factors to identify areas of each page that need to be improved.

Knowing your visitors’ activity and behavior once they are on your important pages (by reviewing your site’s traffic performance in Analytics) can hint at what to implement to make those pages perform better. Here are some things you can do to improve your site’s traffic conversions:

  • Enhance site speed.
  • Include translation features if your site gets substantial traffic from non-English-speaking countries.
  • Lessen visual distractions like ads and other irrelevant site elements.
  • Invest in a visually attractive web design.
  • Simplify the delivery of your landing pages’ content and reduce irrelevant linking.
  • Make other thematically helpful pages easy to access.
  • Provide a site search feature.
  • Test your pages on different browsers.

Data to keep track of:

  • Top content
  • Top exit pages
  • Top landing pages
  • Languages

Improve domain authority and trust

Yeah, this may seem off topic, since improving a domain’s trust and authority scores is mostly done externally through link building. However, domain-level metrics such as domain authority and domain trust are strong factors that can really influence your web pages’ search rankings, especially for highly competitive keywords.

Websites with high domain authority and trust (which can be approximately measured through Open Site Explorer’s full link metrics) can also get their newly published pages into higher positions on SERPs in just a few days – and sometimes a few hours. Improving your site’s TrustRank and authority may take some time, since building them mostly requires long, sustained processes such as:

  • Link diversity – the variation in your incoming links’ anchor texts, the velocity of the site’s link growth over time, the ratio of links pointing to inner pages versus the homepage, and the diversity of methods used to acquire links, as assessed through the links’ placements (sidebar, footer, comment, editorial, etc.).
  • Quantity of well-placed backlinks – the number of high-value links the entire site has acquired from topically relevant authority sites, with high visibility (within the body) and high click-through rates.
  • Internal linking – staging a good internal linking structure that helps web crawlers find deep pages in your website, and supporting your important pages with internal links from other prominent pages on your site that carry a high amount of link juice (strong MozRank, PageRank, Page Authority, a high percentage of traffic entrances, etc.).
  • Quantity of strong pages hosted on the site – the number of popular pages that have acquired a good amount of high-quality links from other websites through social shares and editorial citations, that may also be authoritative in terms of PageRank, MozRank, Page Authority and PostRank (now owned by Google), and that have good search rankings.
  • Domain age – uhm… the years the site has been live?

John Doherty of Distilled wrote a good post about this subject recently and you might want to check it out.

Presence of links to high value external pages

It’s a myth that linking out to other websites/web pages within your content reduces that page’s PageRank. In fact, linking out to high-quality external pages can establish trust through the link relationship, given that you are likely referencing a trusted source.

The point is, linking to reputable websites can build trust, and trust eventually results in good ranking positions. Used appropriately, these external links might just help your site obtain good search rankings, as search engines become more likely to trust your content.

SERP Click-through rate

Traffic data that search engines gather through their own result pages, such as web pages’ click-through rates, does seem to affect search rankings, particularly for competitive search queries. Most experts also believe that CTR from search results to the page for its targeted search term improves the page’s ranking position, as mentioned in SEOmoz’s survey of search ranking factors for 2011 (page-level traffic data).

Improving your pages’ SERP click-through rates can be performed in ways such as:

  • Use strong, actionable words in the page’s title and meta description.
  • Use numbers in titles – they seem to work well on SERPs in terms of click-through.
  • Maintain good web design, as it may be shown to users through Google Instant Previews.
  • Aim for your pages to be displayed with rich snippets on Google by marking them up with Microdata, testing them with Google’s rich snippet testing tool, and submitting your snippets to Google.

Geotagging

Geotagging is geographical identification metadata – as described on Wikipedia – that allows users and search engines to see your business’ actual geographic location through latitude and longitude coordinates, which is a good way to establish trust with both users and search engines. This method may also benefit you through other means of search (GPS tracking), since mapping services can pick up your location and include it in their listings.

There are several formats used in geotagging a website, but the most widely used is Google Earth’s KML. Luckily, I found Geo Sitemap Generator, a free web-based application that automates generating a KML file and a Geo Sitemap, which you can upload to your site’s root directory after downloading them from the application.
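At its core, the generated KML file is just a Google Earth placemark. A stripped-down sketch (with hypothetical business details and coordinates) looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Placemark>
    <name>Your Business Name</name>
    <address>123 Example Street, Your City</address>
    <Point>
      <!-- KML lists longitude first, then latitude, then altitude -->
      <coordinates>-122.3321,47.6062,0</coordinates>
    </Point>
  </Placemark>
</kml>
```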

Humans.txt

Google recently announced that they’ll be supporting authorship markup, and Matt Cutts also mentioned AuthorRank as a new way to measure websites’ importance in an interview with Danny Sullivan at SMX Advanced Seattle.

Humans.txt is HTML5’s approach to authorship markup (rel=”author”). The text file is much like robots.txt, but it is intended for both users and web crawlers, letting them know who authored the content or website. It’s also a good way to establish trust and credit the creator of the content/site. Below is a sample of what a humans.txt file looks like:
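A typical humans.txt is plain text with simple labeled sections; a minimal example (all details here are placeholders) might look like:

```
/* TEAM */
Author: Jane Smith
Site: http://www.example.com
Twitter: @example
Location: Seattle, WA

/* SITE */
Last update: 2011/06/30
Standards: HTML5, CSS3
```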

Once you have created a humans.txt file and uploaded it to your site’s root directory, you can include an author tag in your site’s <head> section so web crawlers can access the file, like this:
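Assuming the file sits at your root (swap in your own domain), the `<head>` entry recommended by the humans.txt initiative is:

```html
<link type="text/plain" rel="author" href="http://www.example.com/humans.txt" />
```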

Google: We know that great content comes from great authors, and we’re looking closely at ways this markup could help us highlight authors and rank search results.

If you enjoyed this post, you may subscribe to my feed or follow me on my new Twitter account.

Image Credit: DpressedSoul