10 On-page SEO Tactics for 2011

By Jason Acidre on Jun 14, 2011 in Search | 30 comments


On-page SEO is a constantly evolving field of modern marketing, given that search engines continually improve their algorithms’ ability to return high quality, highly relevant pages for their users’ queries. Search engines – especially Google – use hundreds of factors to determine which pages deserve to show up in their top results.

There are two major processes in search engine optimization that enable websites and pages to rank for their targeted keywords in search results – on-page and off-page optimization.

Off-page optimization is the part of SEO that involves anything related to making a website and its pages popular (external citations from other websites, earned through link building and social media), while on-page optimization is the area where pages earn higher search rankings through the relevance of their content to the targeted search term or keyword.

As I mentioned above, there are hundreds of factors that search engines use to identify high quality pages, and the count doesn’t stop there, as they keep finding more ways to improve their users’ search experience. So I have listed below some of the on-page optimization techniques that I believe are worth exploring and testing this year.

On-page Markup through Schemas

Google, Bing and Yahoo recently introduced Schema.org, a shared vocabulary that these three big search engines will use to better understand web pages’ content – and which uses Microdata as its structured data markup.

Schemas like Microdata are sets of HTML attributes that can be used to specify essential information about a webpage’s content. They cover numerous markup types (more than 100, as mentioned by Google) and introduce new HTML attributes (itemscope, itemtype and itemprop) that make it much easier for search engines to determine and weigh page relevance. You can check the full list of markup types here.

Here’s a sample of what your page’s code will look like if you use this markup:
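For instance, a short bio block marked up with Schema.org’s Person type might look something like this (the names and values below are purely illustrative; the property names come from Schema.org’s Person vocabulary):

```html
<!-- A bio snippet marked up with Schema.org Microdata (illustrative values) -->
<div itemscope itemtype="http://schema.org/Person">
  <span itemprop="name">Jason Acidre</span>,
  <span itemprop="jobTitle">Marketing Consultant</span>,
  works at <span itemprop="affiliation">Xight Interactive</span>.
  Website: <a itemprop="url" href="http://kaiserthesage.com">kaiserthesage.com</a>
</div>
```

Notice that itemscope declares the item, itemtype names the vocabulary type, and each itemprop labels an individual piece of data inside it.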

The more markup you add, the better search engines will understand your content, and the better your pages can be presented in search results through rich snippets. Google is not yet counting usage of this markup as a ranking factor, but will eventually – and in the meantime it can improve your pages’ SERP click-through rate and ability to attract traffic, since they stand out in the way they are displayed in search results, like this one:

(looks tasty and trustworthy to me)

Learn more about Microdata on Schema.org’s getting started page and Schema.org FAQ on Google.

Length of content

The length of a document is a strong indication of a webpage’s quality, knowing that the changes Google rolled out with the Panda update are somewhat strict when it comes to separating quality content from crappy content.

In my experience, longer content appears to perform well in search results – most of my blog posts are thousands of words long. Given that they contain so much information, a single post or page is able to target multiple long-tail keywords, which in turn has allowed my blog to attract more organic visitors.


Strong social signals

A page’s ability to send strong social signals to search engines also improves its chances of earning higher search rankings. The pages most capable of doing this are purposely created to act as viral content – pieces of work made to draw attention or interest from specifically targeted audiences.

Making social sharing buttons visible, along with their share counts, is a good way to encourage readers to share your content and link to it.

Building support pages that have strong social signals

Landing pages, particularly sales pages, are awkward and sometimes hard to promote externally through link building, especially if you are aiming to acquire editorial links. However, creating high quality pages on your site (pages that attract links and social shares naturally) to support your important landing pages through internal linking can pass a huge amount of link value to them, which can improve those landing pages’ search rankings.

You can also build support pages hosted externally through guest blogs, wherein you can contextually link to your landing pages. It’s best to offer interesting content that is still thematic to your landing pages (e.g. if your site offers a dog training course, you can build support pages like “Top 100 dog trainers in the world” (widget), “Dog name generator” (tool), “List of 100 dogs that can easily be trained”, etc., and make these pages link to your dog training course’s sales page using highly descriptive or branded anchor text).


Usability and conversions

This is perhaps the most significant ranking factor of the hundreds on different search algorithms’ lists, as search engines are more likely to vouch for your site once it has proven itself useful to visitors. If you are aiming to rank for highly competitive search terms, it’s imperative to study your important pages’ conversion factors in order to identify the areas of each page that need to be improved.

Knowing your visitors’ activity and behavior once they are on your important pages (by examining your site’s traffic performance in Analytics) can hint at what to implement to make those pages perform better. Here are some of the things you can do to improve your site’s traffic conversions.

Data to keep track of:

Improve domain authority and trust

Yeah, this may seem off topic, since improving a domain’s trust and authority scores is mostly done externally through link building. However, domain-level metrics such as domain authority and domain trust are strong factors that can really influence your web pages’ search rankings, especially for highly competitive keywords.

Websites with high domain authority and trust (which can be approximately measured through Open Site Explorer’s full link metrics) are also able to get their newly published pages into higher SERP positions within days – and sometimes within hours. Improving your site’s TrustRank and authority may take some time, seeing as it mostly requires long, enduring processes such as:

John Doherty of Distilled wrote a good post about this subject recently and you might want to check it out.

Presence of links to high value external pages

It’s a myth that linking out to other websites/web pages within your content drains that page’s PageRank. On the contrary, linking out to high quality external pages does help establish trust through the link relationship, given that you may be referencing a trusted source.

The point is, having links to reputable websites can build trust, and trust eventually results in good ranking positions. Used appropriately, these external links might just help your site obtain better search rankings, as search engines are more likely to trust your content.

SERP Click-through rate

Traffic data that search engines gather through their search engine result pages – such as a web page’s click-through rate – does seem to have an effect on search rankings, particularly for competitive search queries. Most experts also believe that CTR from search engines to the page for its targeted search term improves the page’s ranking position, as mentioned in SEOmoz’s survey of search ranking factors for 2011 (page-level traffic data).

Improving your pages’ SERP click-through rates can be performed in ways such as:


Geotagging

Geotagging is the addition of geographical identification metadata – as described on Wikipedia – that lets users and search engines see your business’ actual geographic location through latitude and longitude coordinates, which is certainly a good way to establish trust with both users and search engines. This method may also benefit you through other means of search (such as GPS-based services), given that mapping services can track your location and include you in their listings.

There are several formats used for geotagging a website, but the most widely used is Google Earth’s KML. Luckily, I found Geo Sitemap Generator, a free web-based application that automates the process of generating a KML file and a Geo Sitemap, which you can download and then upload to your site’s root directory.
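For reference, a minimal KML file of the kind such generators produce might look like the sketch below (the business name, address and coordinates are placeholders; note that KML lists longitude before latitude in the coordinates element):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Document>
    <Placemark>
      <name>Your Business Name</name>
      <address>123 Example Street, Manila, Philippines</address>
      <!-- longitude,latitude (placeholder coordinates) -->
      <Point><coordinates>121.0437,14.6760</coordinates></Point>
    </Placemark>
  </Document>
</kml>
```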


Authorship markup through Humans.txt

Google recently announced that they’ll be supporting authorship markup, and Matt Cutts also mentioned AuthorRank as a new way to measure websites’ importance in an interview with Danny Sullivan at SMX Advanced Seattle.

Humans.txt is HTML 5’s approach to authorship markup (rel=”author”). This text file is much like Robots.txt, but it’s intended for both users and web crawlers, letting them know who authored the content or the website – which is also a good way to establish trust as well as to credit the creator of the content/site. Below is a sample of what a humans.txt file looks like:
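A humans.txt file is plain text following the convention suggested at humanstxt.org; the entries below are illustrative placeholders you would replace with your own details:

```
/* TEAM */
Author: Jason Acidre
Site: http://kaiserthesage.com
Twitter: @jasonacidre
Location: Manila, Philippines

/* SITE */
Last update: 2011/06/14
Language: English
Doctype: HTML5
```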

Once you have created a humans.txt file and uploaded it to your site’s root directory, you can then include an author tag in your site’s <head> section so that web crawlers can access the file, like this:
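Following the humanstxt.org convention, that tag is a link element pointing at the file (swap in your own domain):

```html
<link type="text/plain" rel="author" href="http://yourdomain.com/humans.txt" />
```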

Google: We know that great content comes from great authors, and we’re looking closely at ways this markup could help us highlight authors and rank search results.

If you enjoyed this post, you may subscribe to my feed or follow me on my new Twitter account.

Image Credit: DpressedSoul

Jason Acidre

Jason Acidre is Co-Founder and CEO of Xight Interactive, marketing consultant for Affilorama and Traffic Travis, and also the sole author of this SEO blog. You can follow him on Twitter @jasonacidre and on Google+.



    • Kaiserthesage

      June 15, 2011


      yeah, I was also excited when I first made my humans.txt file 😀

  1. Honestly, I really hate to see that Google is putting so much emphasis on page length. Don’t get me wrong, I understand why this is the case, but it seems that we will all just start diluting our content with words, making it more difficult to retrieve the concepts from a page simply because they are swimming in a sea of words put there to boost SERP rankings.

    With respect to the Markup Schemas…is there a WordPress plugin for this yet? 😉


    • Kaiserthesage

      June 15, 2011


I was actually thinking that people may have over-estimated the capabilities of search engines in determining high quality content, seeing as they still need new tags to help them understand what a piece of content is really about.

      I think a plugin for markups in WordPress will be available any time soon, it has been the craze of the SEO world recently, and I believe someone who wants to be rich this year is already up to it :)

• I hope you are right about the plugin Jason. The more I am reading about the schema markups, the more my eyes just sort of glaze over. A plugin to crawl your posts and suggest and implement the markups on a per page basis would be just fine by me :-)

The humans.txt file is very interesting. How do you think this would play out for guest posting? Perhaps this would be something that webmasters would screen for? Some sort of moz ranking for the author and the higher the ranking, the higher potential boost the content may receive, even if not posted on the author’s own site.

        Will be interesting to see how it plays out for sure.
        Thanks for the detailed post.

  2. Bob Kidman

    June 15, 2011


    Nice article Jason, clear, concise and original. I’ve been getting stuck into schema over the past few weeks because I think it will be quite important in the not too distant future along with author mark up.

    Look forward to hearing more from you


    • Kaiserthesage

      June 15, 2011


      Thanks Bob, and glad that you liked this post.

      I’ve been studying Microdata recently as well, and it really is time consuming, especially if you have lots of pages. Though I think this step is very practical in optimizing a site’s important landing pages, just to be prepared with what the future might bring :)

      I’m more into authorship markups now, I find it more interesting as I see it more powerful in the future (if played well by G).

  3. Zarko

    June 15, 2011


    Hi Jason,

    another great post :)

    I personally loved the schema update and since I was already working on a new design for our website it came in the right time to implement microdata into the code. Still looking at the benefits of humans.txt but so far I have to agree with Graywolf, it can be easily used to steal content as well, especially from sites that have a slow cache rate…

    • Kaiserthesage

      June 15, 2011


      Thanks Zarko!

      Looks like a draining process for a site redesign 😀 I’m also thinking of switching to a new design (HTML 5), been really inspired of what Tom Critchlow mentioned on your interview :)

  4. lawmacs

    June 17, 2011


    Hi Jason Thanks for the heads up on the human.txt file it seems great to me although fairly new to it i believe it has its place and should surely be added to ur sites mark up. On to the point about usability i could echo the words of Google take care of the users experience and the rest will follow. Thanks Jason a very informative post.

  5. These are really what you call an advantageous strategy, except for the Schema.org part. Is it really beneficial to SEO? I have read debates about this features and from there, I can conclude that many people are skeptical to use this new approach in on-page optimization.

    • Kaiserthesage

      June 24, 2011


      Wow, thanks for the awesome comment Chandan and for the future link love :) , really glad that you liked this post.

  6. So rel=author isn’t as new as everyone is making it sound? I like the humans.txt version better because it allows for a twitter badge. To me, that just adds more authority since Twitter is such a huge influence on Google these days. Thanks for the link to Lisa’s post on Sullivan with Cutts. That was an interesting lecture. Glad I got to read up on that.

  7. Devon

    August 18, 2011


    Excellent post I must say.. Simple but yet interesting and engaging.. Keep up the awesome work Jason!

Hey!

    Those were some really good tips. I pretty much knew about all of ‘em before reading this post — but I sure learned something new.

    As for me and my on-page SEO — I think I have some tips for you.

    First of all, always write long texts. Writing a 100 word article and thinking that Google will adore it is completely wrong. That will never happen. Search engines, and especially Google, breathe content, and that is exactly why it is so important. I always aim at about 500 words per article.

    Another tip which is relatively new thanks to the Panda Update is: don’t use lots of ad blocks on your website if you don’t have lots of content written.

    If you, say, have an article consisting of 100 words — don’t go with three AdSense Blocks. That is extremely unwise.

    Not only will your CTR be low — you’ll also be considered as a low-quality website by Google leading to you getting thrown out. I’ve experienced this myself before, so I believe I know what I’m talking about.

  9. These are some fantastic tips here, Jason. All of us webmasters and web owners definitely need to keep up with search engines as they will be changing all throughout the year. Constant updates and change in search algorithms are hard to keep up with. 2011 has been a very busy year so far. We all need to be reminded how important URL structures are. Well, there have been rumors that Google has been paying less attention to keywords in URLs. However, experience proves that keywords in URLs are still significantly critical.

  10. John

    September 8, 2011


    Great post! Have you seen any before/after examples from adding these tags?

    John@San Francisco Wedding Photographer

  11. Hi,

    Another great article, thank you.

    We have been sinking our teeth into Schemas, it’s all about keeping up after all! We hadn’t heard of the human.txt file, so thank you for the heads up.

  12. Lavoie

    September 22, 2011


    Good post!

    These are some fantastic tips here, Jason. All of us webmasters and web owners definitely need to keep up with search engines as they will be changing all throughout the year.

  13. Modi

    September 27, 2011


    Humans.txt was new to me too.

    Re linking out to trustworthy, authority sites. Although in theory it sounds good, is that something you’ve actually tested? It just sounds too good to be true. I just think if that was the case all SEOs would link out to PR9, PR10 sites, .gov, .edu etc. Any more hints?

  14. Betshoot

    October 18, 2011


Nice one, i guess humans.txt can help with the author highlighting in results, but it seems that it needs to be updated frequently, since the ‘Site: Updated’ tag needs to change every day if your website has a daily rhythm of posts and news.

    Probably can be done with a simple php function i guess.

    Thanks for your information :)

  15. Dan

    November 2, 2011


    Great list, I’m just working on my Schemas mark up on all of my sites, it might actually take forever but I’m hoping it’ll be worth it in the end.

    Never heard of the rel=author thing though, how much relevance do you think search engines will place on this?

  16. Sayed @webuildlink

    November 24, 2011


    Excellent Post,
    Jason you are Rocking !
    Learned two new things : a) Humans.txt b) Content Length is Matter to Google(you are writing article over thousands words 😉 )
    in Next Few days i will be busy with Schema.org Many things are waiting there to learn

    Thanks for the Valuable Information

  17. Hiren

    December 4, 2011


    Great Read, Just curious to know about humans.txt. How web crawler read this text file? We need to put this on the root folder. Let me know for the same. Thanks.

  18. Bill

    December 30, 2011


    Thanks for the Humans.txt file info. I will be implementing that on all my sites in the next few weeks. Good to know that trust can be built that way.

  19. David

    January 23, 2012


    thanks mate for sharing this as i was looking for some on page tactics too like you know its a bit techy as well but after reading this blog post i have opend my mind very much to understand things that will help me too now to improve me on page skills Thanks For Sharing it =)


