10 On-page SEO Tactics for 2011

by Jason Acidre on June 14, 2011


SEO is a constantly evolving field of modern marketing, as search engines keep improving their algorithms' ability to return high-quality, highly relevant pages for their users' queries. Search engines – especially Google – use hundreds of factors to determine which pages deserve to show up in their top results.

There are two major processes in search engine optimization that enable websites and pages to rank for their targeted keywords in search results: on-page and off-page optimization.

Off-page optimization covers everything related to making a website and its pages popular externally (citations from other websites earned through link building and social media), while on-page optimization is the work done on the page itself to earn higher rankings by making its content more relevant to the targeted search term or keyword.

As mentioned above, there are hundreds of factors that search engines use to identify high-quality pages, and the count keeps growing as they find more ways to improve their users' search experience. Below are some on-page optimization techniques that I believe are worth exploring and testing this year.

On-page Markup through Schemas

Google, Bing and Yahoo recently introduced Schema.org, the shared vocabulary that these three big search engines will be using to better understand web pages' content, with Microdata as its structured data markup format.

Microdata is a set of HTML attributes used to specify essential information about a webpage's content. It covers numerous item types (more than 100, as mentioned by Google) and new attributes (itemscope, itemtype and itemprop) that make it much easier for search engines to determine and weigh a page's relevance. You can check the full list of markup types here.

Here's a sample of what your page's code will look like if you use this markup:
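The snippet below is a minimal sketch of Person markup in Microdata, based on Schema.org's getting-started examples; the name, job title and URL are placeholder values:

    <div itemscope itemtype="http://schema.org/Person">
      <span itemprop="name">Jane Doe</span> –
      <span itemprop="jobTitle">SEO consultant</span> at
      <a itemprop="url" href="http://www.example.com">Example.com</a>
    </div>

A search engine that understands Microdata reads the itemtype to see that the block describes a person, and the itemprop values to pull out specific facts about that person.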

The more content you mark up, the better search engines will understand it, and the better your pages can be presented in search results through rich snippets. Google does not yet count markup usage as a ranking factor, but will likely do so eventually; in the meantime, rich snippets can still improve your pages' SERP click-through rate and ability to attract traffic, since marked-up results stand out in the way they are displayed, like this one:

(looks tasty – and trustworthy – to me)

Learn more about Microdata on Schema.org's getting-started page and in Google's Schema.org FAQ.

Length of content

The length of a document is a strong signal of a webpage's quality, especially now that the changes Google rolled out with the Panda update are stricter about separating quality content from thin, low-value content.

In my experience, longer content performs well in search results – most of my blog posts run to thousands of words. Because each one contains so much information, a single post or page can target multiple long-tail keywords, which in turn has allowed my blog to attract more organic visitors.

Sociability

A page's ability to send strong social signals to search engines also affects its chances of earning higher rankings. The pages best equipped to do this are purposely created to act as viral content – pieces built to draw attention and interest from a specifically targeted audience.

Making social sharing buttons visible, along with displaying existing share counts, is a good way to encourage readers to share your content and link to it.

Building support pages that have strong social signals

Landing pages, particularly sales pages, are awkward and often hard to promote externally through link building, especially if you are aiming for editorial links. However, creating high-quality pages on your site (pages that attract links and social shares naturally) and pointing them at your important landing pages through internal links can pass along a huge amount of link value, which can improve those landing pages' search rankings.

You can also build support pages that are hosted externally, through guest posts, where you can link contextually to your landing pages. It's best to offer interesting content that is still thematically tied to your landing pages (for example, if your site offers a dog training course, you can build support pages like "Top 100 dog trainers in the world" (widget), "Dog name generator" (tool), or "100 dog breeds that are easy to train", and have these pages link to your dog training course's sales page using highly descriptive or branded anchors).
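As a rough illustration, a contextual link inside a support page's body copy might look like this – the URL and anchor text are hypothetical:

    <p>Stubborn breeds usually respond better to a structured
      <a href="http://www.example.com/dog-training-course">dog obedience training course</a>
      than to ad-hoc correction.</p>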

Usability

This is perhaps the most significant ranking factor among the hundreds on search algorithms' lists, as search engines are more likely to vouch for your site once it has proven useful to visitors. If you are aiming to rank for highly competitive search terms, it's imperative to study your important pages' conversion factors so you can pinpoint the areas that need to be improved.

Understanding your visitors' activity and behavior on your important pages (by reviewing your site's traffic performance in Analytics) will hint at what to implement to make those pages perform better. Here are some of the things you can do to improve your site's conversions:

  • Enhance site speed.
  • Include translation features if your site gets substantial traffic from non-English-speaking countries.
  • Lessen visual distractions such as ads and other irrelevant site elements.
  • Invest in a visually appealing web design.
  • Simplify the delivery of your landing pages' content and reduce irrelevant linking.
  • Make other thematically helpful pages easy to access.
  • Provide a site search feature.
  • Test your pages on different browsers.

Data to keep track of:

  • Top content
  • Top exit pages
  • Top landing pages
  • Languages

Improve domain authority and trust

Yeah, this may seem off-topic, since improving a domain's trust and authority scores is mostly done externally through link building. However, domain-level metrics such as domain authority and domain trust are strong factors that can really influence your web pages' search rankings, especially for highly competitive keywords.

Websites with high domain authority and trust (which can be approximately measured through Open Site Explorer's full link metrics) are also able to get newly published pages into higher SERP positions within days – and sometimes within hours. Improving your site's TrustRank and authority may take some time, since building them mostly requires long, sustained processes such as:

  • Link diversity – the variation of your incoming links' anchor texts, the velocity of the site's link growth over time, the ratio of links pointing to inner pages versus the homepage, and the diversity of methods used to acquire links, which can be inferred from where the links are placed (sidebar, footer, comment, editorial, etc.).
  • Quantity of well-placed backlinks – the number of high-value links the site has acquired from topically relevant authority sites, with high visibility (within the body of the page) and high click-through rates.
  • Internal linking – building an internal linking structure that helps web crawlers find deep pages on your website, and supporting your important pages with internal links from other prominent pages that carry a high amount of link juice (strong MozRank, PageRank, Page Authority, a high share of traffic entrances, etc.).
  • Quantity of strong pages hosted on the site – the number of popular pages that have acquired a good amount of high-quality links from other websites through social shares and editorial citations, are authoritative in terms of PageRank, MozRank, Page Authority and PostRank (which is now owned by Google), and have good search rankings.
  • Domain age – uhm… the number of years the site has been live?

John Doherty of Distilled wrote a good post about this subject recently and you might want to check it out.

Presence of links to high value external pages

It's a myth that linking out to other websites or pages within your content drains that page's PageRank. In fact, linking out to high-quality external pages helps establish trust through link relationships, since you are likely referencing a trusted source.

The point is, links to reputable websites can build trust, and trust eventually translates into good ranking positions. Used appropriately, these external links might just help your site earn better search rankings, as search engines become more inclined to trust your content.

SERP Click-through rate

Traffic data that search engines gather through their result pages – such as web pages' click-through rates – does seem to affect search rankings, particularly for competitive search queries. Most experts also believe that CTR from search results to the page for its targeted search term improves the page's ranking position, as mentioned in SEOmoz's 2011 search ranking factors survey (page-level traffic data).

You can improve your pages' SERP click-through rates in ways such as:

  • Use strong, actionable words in the page's title and meta description.
  • Use numbers in titles – they seem to work well on SERPs in terms of click-through.
  • Maintain a good web design, since users may see it through Google Instant Previews.
  • Aim for your pages to be displayed with rich snippets on Google by marking them up with Microdata, testing the snippets with Google's rich snippets testing tool, and submitting them to Google (see the sketch below).
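As a rough illustration of markup that can earn a rich snippet, here is a hypothetical product rating marked up with Schema.org's AggregateRating type – the product name and figures are placeholders:

    <div itemscope itemtype="http://schema.org/Product">
      <span itemprop="name">Acme Dog Training Course</span>
      <div itemprop="aggregateRating" itemscope itemtype="http://schema.org/AggregateRating">
        Rated <span itemprop="ratingValue">4.5</span>/5
        based on <span itemprop="reviewCount">89</span> reviews
      </div>
    </div>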

Geotagging

Geotagging is geographical identification metadata – as described on Wikipedia – that lets users and search engines see your business's actual geographic location through latitude and longitude coordinates, which is certainly a good way to establish trust with both. This method may also benefit you through other forms of search (such as GPS-based services), since mapping services can pick up your location and include you in their listings.
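One simple, commonly used geotagging format is a set of geo meta tags placed in the page's <head>; a minimal sketch with placeholder region and coordinates:

    <meta name="geo.region" content="US-CA" />
    <meta name="geo.placename" content="San Francisco" />
    <meta name="geo.position" content="37.7749;-122.4194" />
    <meta name="ICBM" content="37.7749, -122.4194" />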

Among the several formats used for geotagging a website, the most widely used is Google Earth's geo.kml file. Luckily, I found Geo Sitemap Generator, a free web-based application that automates the process of generating a KML file and a geo sitemap; download both files from the application and upload them to your site's root directory.
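For reference, a minimal KML file of the kind such a generator produces might look roughly like this – the business name, address and coordinates are placeholders:

    <?xml version="1.0" encoding="UTF-8"?>
    <kml xmlns="http://www.opengis.net/kml/2.2">
      <Document>
        <Placemark>
          <name>Acme Dog Training</name>
          <address>123 Example St, San Francisco, CA</address>
          <Point>
            <coordinates>-122.4194,37.7749,0</coordinates>
          </Point>
        </Placemark>
      </Document>
    </kml>

Note that KML lists coordinates in longitude,latitude order.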

Humans.txt

Google recently announced that they'll be supporting authorship markup, and Matt Cutts also mentioned AuthorRank as a new way to measure websites' importance in an interview with Danny Sullivan at SMX Advanced Seattle.

Humans.txt is an HTML5-era take on authorship markup (rel="author"). The text file works much like robots.txt, but it's intended for both users and web crawlers, letting them know who authored the content or the website – which is also a good way to establish trust and credit the creator of the content or site. Below is a sample of what a humans.txt file looks like:
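The sketch below follows the conventions suggested on humanstxt.org; the names, dates and tools are placeholder values:

    /* TEAM */
      Author: Jane Doe
      Site: http://www.example.com
      Twitter: @janedoe
      Location: San Francisco, CA

    /* SITE */
      Last update: 2011/06/14
      Standards: HTML5, CSS3
      Components: jQuery
      Software: WordPress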

Once you have created a humans.txt file and uploaded it to your site's root directory, you can then include an author link tag in your site's <head> section so that web crawlers can find the file, like this:
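A sketch of that link tag, following the humanstxt.org suggestion – the domain is a placeholder:

    <link rel="author" type="text/plain" href="http://www.example.com/humans.txt" />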

Google: We know that great content comes from great authors, and we’re looking closely at ways this markup could help us highlight authors and rank search results.

If you enjoyed this post, you may subscribe to my feed or follow me on my new Twitter account.

Image Credit: DpressedSoul

Jason Acidre

Jason Acidre is Co-Founder and CEO of Xight Interactive, marketing consultant for Affilorama and Traffic Travis, and also the sole author of this SEO blog. You can follow him on Twitter @jasonacidre and on Google+.


Comments

Ivan Walsh June 15, 2011 at 12:57 am

Thanks for the tip on humans.txt – new one for me :)


Kaiserthesage June 15, 2011 at 8:58 am

yeah, I was also excited when I first made my humans.txt file :D


Mark@TheBitBot SEM Blog June 15, 2011 at 1:44 am

Honestly, I really hate to see that Google is putting so much emphasis on page length. Don’t get me wrong, I understand why this is the case, but it seems that we will all just start diluting our content with words, making it more difficult to retrieve the concepts from a page simply because they are swimming in a sea of words put there to boost SERP rankings.

With respect to the Markup Schemas…is there a Wordpress plugin for this yet? ;)

Mark


Kaiserthesage June 15, 2011 at 8:50 am

I was actually thinking that people may have over-estimated search engines' ability to determine high-quality content, seeing as they still need new tags to help them understand what a piece of content is really about.

I think a plugin for markup in WordPress will be available soon – it has been the craze of the SEO world recently, and I believe someone who wants to get rich this year is already on it :)


Damon@How to Get Out of Debt June 26, 2011 at 8:41 pm

I hope you are right about the plugin, Jason. The more I read about the schema markups, the more my eyes just sort of glaze over. A plugin to crawl your posts and suggest and implement the markup on a per-page basis would be just fine by me :-)

The humans.txt file is very interesting. How do you think this would play out for guest posting? Perhaps this would be something that webmasters would screen for? Some sort of Moz ranking for the author, where the higher the ranking, the bigger the potential boost the content may receive, even if not posted on the author's own site.

Will be interesting to see how it plays out for sure.
Thanks for the detailed post.


Bob Kidman June 15, 2011 at 2:32 am

Nice article Jason, clear, concise and original. I’ve been getting stuck into schema over the past few weeks because I think it will be quite important in the not too distant future along with author mark up.

Look forward to hearing more from you

Bob


Kaiserthesage June 15, 2011 at 8:54 am

Thanks Bob, and glad that you liked this post.

I’ve been studying Microdata recently as well, and it really is time consuming, especially if you have lots of pages. Though I think this step is very practical in optimizing a site’s important landing pages, just to be prepared with what the future might bring :)

I’m more into authorship markups now, I find it more interesting as I see it more powerful in the future (if played well by G).


Zarko June 15, 2011 at 5:05 am

Hi Jason,

another great post :)

I personally loved the schema update, and since I was already working on a new design for our website, it came at the right time to implement microdata into the code. Still looking at the benefits of humans.txt, but so far I have to agree with Graywolf – it can easily be used to steal content as well, especially from sites with a slow cache rate…


Kaiserthesage June 15, 2011 at 8:57 am

Thanks Zarko!

Looks like a draining process for a site redesign :D I'm also thinking of switching to a new design (HTML5) – I've been really inspired by what Tom Critchlow mentioned in your interview :)


Dennis Edell@ Direct Sales Marketing June 15, 2011 at 7:19 pm

Humans.txt sure looks interesting; I'll have to do a little more research on that one. Thanks for the tip-off. :)


lawmacs June 17, 2011 at 3:49 am

Hi Jason, thanks for the heads-up on the humans.txt file. It seems great to me – although I'm fairly new to it, I believe it has its place and should surely be added to your site's markup. On the point about usability, I can echo the words of Google: take care of the user experience and the rest will follow. Thanks Jason, a very informative post.


Connie@roofing contractor kalamazoo June 17, 2011 at 11:13 am

These are really what you'd call advantageous strategies, except for the Schema.org part. Is it really beneficial to SEO? I have read debates about this feature, and from those, I can tell that many people are skeptical about using this new approach to on-page optimization.


Anna@WhiteHatters June 18, 2011 at 12:39 am

What do you mean by domain authority? What does it have to do with SEO?


Chandan@Earn Money Online In India June 24, 2011 at 1:40 pm

I never get disappointed coming to your blog. Humans.txt was new to me, along with a lot of other things here. You have become like an SEO monk, giving everything away for free. You'll get a pingback soon from my end.


Kaiserthesage June 24, 2011 at 5:07 pm

Wow, thanks for the awesome comment Chandan and for the future link love :) , really glad that you liked this post.


Jon June 28, 2011 at 7:01 pm

great piece jason. i always learn something new when i come to your site. keep it up!


Sprinkler Buff July 8, 2011 at 9:31 am

So rel=author isn’t as new as everyone is making it sound? I like the humans.txt version better because it allows for a twitter badge. To me, that just adds more authority since Twitter is such a huge influence on Google these days. Thanks for the link to Lisa’s post on Sullivan with Cutts. That was an interesting lecture. Glad I got to read up on that.


Devon August 18, 2011 at 6:41 pm

Excellent post, I must say – simple, yet interesting and engaging. Keep up the awesome work, Jason!


DJ Trinity@New Songs 2011 August 20, 2011 at 10:45 pm

Hey!

Those were some really good tips. I pretty much knew about all of ‘em before reading this post — but I sure learned something new.

As for me and my on-page SEO — I think I have some tips for you.

First of all, always write long texts. Writing a 100 word article and thinking that Google will adore it is completely wrong. That will never happen. Search engines, and especially Google, breathe content, and that is exactly why it is so important. I always aim at about 500 words per article.

Another tip which is relatively new thanks to the Panda Update is: don’t use lots of ad blocks on your website if you don’t have lots of content written.

If you, say, have an article consisting of 100 words — don’t go with three AdSense Blocks. That is extremely unwise.

Not only will your CTR be low — you’ll also be considered as a low-quality website by Google leading to you getting thrown out. I’ve experienced this myself before, so I believe I know what I’m talking about.


Thomas Jackson@vanderbilt beach fl August 22, 2011 at 3:50 am

These are some fantastic tips here, Jason. All of us webmasters and web owners definitely need to keep up with search engines as they will be changing all throughout the year. Constant updates and change in search algorithms are hard to keep up with. 2011 has been a very busy year so far. We all need to be reminded how important URL structures are. Well, there have been rumors that Google has been paying less attention to keywords in URLs. However, experience proves that keywords in URLs are still significantly critical.


John September 8, 2011 at 11:52 am

Great post! Have you seen any before/after examples from adding these tags?

John@San Francisco Wedding Photographer


Rachelle@CeriCom Web Design September 14, 2011 at 8:28 pm

Hi,

Another great article, thank you.

We have been sinking our teeth into schemas – it's all about keeping up, after all! We hadn't heard of the humans.txt file, so thank you for the heads-up.


Lavoie September 22, 2011 at 8:10 pm

Good post!

These are some fantastic tips here, Jason. All of us webmasters and web owners definitely need to keep up with search engines as they will be changing all throughout the year.


Modi September 27, 2011 at 3:23 am

Humans.txt was new to me too.

Re linking out to trustworthy, authority sites. Although in theory it sounds good, is that something you’ve actually tested? It just sounds too good to be true. I just think if that was the case all SEOs would link out to PR9, PR10 sites, .gov, .edu etc. Any more hints?


Betshoot October 18, 2011 at 4:00 am

Nice one. I guess humans.txt can help with author highlighting in results, but it seems that it needs to be updated frequently, since the 'Site: Updated' tag would need to change every day if your website has a daily rhythm of posts and news.

It can probably be done with a simple PHP function, I guess.

Thanks for your information :)


Dan November 2, 2011 at 7:56 pm

Great list, I’m just working on my Schemas mark up on all of my sites, it might actually take forever but I’m hoping it’ll be worth it in the end.

Never heard of the rel=author thing though, how much relevance do you think search engines will place on this?


Sayed @webuildlink November 24, 2011 at 10:09 am

Excellent post, Jason – you are rocking!
Learned two new things: a) Humans.txt, b) content length matters to Google (you write articles of over a thousand words ;) ).
In the next few days I will be busy with Schema.org – many things are waiting there to learn.

Thanks for the valuable information.


Hiren December 4, 2011 at 11:04 pm

Great read. Just curious about humans.txt – how do web crawlers read this text file? Do we need to put it in the root folder? Let me know. Thanks.


Bill December 30, 2011 at 3:49 pm

Thanks for the Humans.txt file info. I will be implementing that on all my sites in the next few weeks. Good to know that trust can be built that way.


David January 23, 2012 at 3:29 am

Thanks, mate, for sharing this. I was looking for some on-page tactics too – it's a bit techy, but after reading this blog post my mind has opened up to things that will help me improve my on-page skills. Thanks for sharing it =)

