SEO is a constantly evolving field of modern marketing, as search engines continually improve their algorithms’ ability to return high-quality, highly relevant pages for their users’ queries. Search engines – especially Google – use hundreds of factors to determine which pages deserve to show up in their top results.
There are two major processes in search engine optimization that enable websites and pages to rank for their targeted keywords on search results – on-page and off-page optimization.
Off-page optimization is the part of SEO that involves anything related to making a website and its pages popular (through external citations from other websites via link building and social media), while on-page optimization is the area where pages earn higher search rankings through the relevance of their content to the targeted search term or keyword.
As I have mentioned above, there are hundreds of factors that search engines use to identify high-quality pages, and that count just doesn’t stop there, as they are still finding more ways to improve their users’ search experience. So I have listed below some of the on-page optimization techniques that I believe are worth exploring and testing this year.
On-page Markup through Schemas
Google, Bing and Yahoo recently introduced Schema.org, a shared vocabulary that these three big search engines will be using to help them better understand web pages’ content – and which uses Microdata as its structured data markup.
Schemas like Microdata are a set of HTML attributes that can be used to specify essential information about a webpage’s content. They include numerous types of markup (more than 100, as mentioned by Google) and new HTML attributes (such as itemscope, itemtype and itemprop) that make it much easier for search engines to determine and weigh page relevance. You can check the full list of markup types here.
Here’s a sample of what your page’s code will look like if you use this markup:
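A minimal sketch – the business name, address and phone number here are hypothetical placeholders:

```html
<!-- Illustrative Microdata markup for a local business page -->
<div itemscope itemtype="http://schema.org/LocalBusiness">
  <h1 itemprop="name">Joe's Dog Training School</h1>
  <p itemprop="description">Obedience classes for dogs of all breeds.</p>
  <div itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
    <span itemprop="streetAddress">123 Example St.</span>,
    <span itemprop="addressLocality">Springfield</span>
  </div>
  <span itemprop="telephone">(555) 555-0123</span>
</div>
```

The itemtype URL tells search engines which Schema.org vocabulary the block uses, while each itemprop labels a specific property within it.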
The more markup you add, the better search engines will understand your content, and the better your pages can be presented on search results through rich snippets. Google is not yet considering the usage of markup as a search ranking factor, but it eventually will be – and rich snippets can already improve your pages’ SERP click-through rate and ability to attract traffic, as they stand out in the way they are displayed on search results, like this one:
Content length

The length of a document is a powerful indication of a webpage’s quality, knowing that the changes Google employed with their Panda update are somewhat strict when it comes to separating quality content from crappy content.
In my experience, longer content appears to perform well in search results; most of my blog posts are composed of thousands of words. Given that they contain so much information, a single post/page is able to target multiple long-tail keywords, which in turn has allowed my blog to attract more organic visitors.
Social signals

A page’s ability to send massive social signals to search engines also impacts its chances of getting higher search rankings. Pages that are most capable of doing this are purposely created to act as viral content – basically, a piece of work made to draw attention or interest from specifically targeted audiences.
The visibility of social sharing buttons, as well as the number of social shares displayed, is a good way to encourage readers to share your content and link to it.
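As one example, Twitter’s official Tweet button can be embedded with a short snippet like the following (the data-url value is a placeholder for your own page’s address):

```html
<!-- Official Tweet button embed; replace data-url with your page's URL -->
<a href="https://twitter.com/share" class="twitter-share-button"
   data-url="http://www.example.com/my-post" data-count="horizontal">Tweet</a>
<script type="text/javascript" src="//platform.twitter.com/widgets.js"></script>
```

The data-count attribute displays the running share count next to the button, which doubles as social proof for new readers.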
Building support pages that have strong social signals
Landing pages, particularly sales pages, are kind of awkward and sometimes hard to promote externally through link building, especially if you are aiming to acquire editorial links. However, creating high-quality pages on your site (pages that are able to attract links and social shares naturally) that support your important landing pages through internal linking can pass through a huge amount of link value, which can improve your landing pages’ search rankings.
You can also choose to build support pages that are hosted externally through guest blogs, wherein you can contextually link to your landing pages. It’s best to offer interesting content that is still thematic to your landing pages (e.g. if your site is offering a dog training course, you can build support pages like “Top 100 dog trainers in the world” (widget), “Dog name generator” (tool), “List of 100 dogs that can easily be trained”, etc., and make these pages link to your dog training course’s sales page using highly descriptive or branded anchors).
Improve traffic conversions

This is perhaps the most significant ranking factor out of the hundreds on different search algorithms’ lists, as search engines are more likely to vouch for your site once it has proven its worth of being useful to visitors. If you are aiming to rank for highly competitive search terms, it’s imperative to study your important pages’ conversion factors so you can distinguish the areas of the page that need to be improved.
Knowing your visitors’ activity and behavior once they land on your important pages can hint at what to implement to make those pages perform better (by examining your site’s traffic performance through Analytics). Here are some of the things you can do to improve your site’s traffic conversions:
Enhance site speed.
Include translation features if your site is getting substantial visitors from non-English-speaking countries.
Lessen visual distractions like ads and other irrelevant site elements.
Invest in a visually attractive web design.
Simplify the delivery of your landing pages’ content and reduce irrelevant linking.
Make other thematically helpful pages easily accessible.
Include a site search feature.
Test your pages on different browsers.
Data to keep track of:
Top exit pages
Top landing pages
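If you track these through Google Analytics, the classic asynchronous snippet goes in your pages’ head section – UA-XXXXX-X is Google’s placeholder for your own property ID:

```html
<!-- Classic Google Analytics asynchronous tracking snippet -->
<script type="text/javascript">
  var _gaq = _gaq || [];
  _gaq.push(['_setAccount', 'UA-XXXXX-X']); // replace with your property ID
  _gaq.push(['_trackPageview']);
  (function() {
    // Load ga.js asynchronously so it doesn't block page rendering
    var ga = document.createElement('script');
    ga.type = 'text/javascript'; ga.async = true;
    ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www')
             + '.google-analytics.com/ga.js';
    var s = document.getElementsByTagName('script')[0];
    s.parentNode.insertBefore(ga, s);
  })();
</script>
```

Once it’s running, the Content reports surface top landing and exit pages directly.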
Improve domain authority and trust
Yeah, this may seem off topic, since improving a domain’s trust and authority scores is mostly done externally through link building. However, domain-level metrics such as domain authority and domain trust are strong factors that can really influence your web pages’ search rankings, especially for highly competitive keywords.
Websites with high domain authority and trust (which can be approximately measured through Open Site Explorer’s full link metrics) are also able to make their newly published pages earn higher positions on SERPs in just a few days – and sometimes in a few hours. Improving your site’s TrustRank and authority may take some time, seeing as their development mostly requires long and enduring processes such as:
Link diversity – this pertains to the variation of your incoming links’ anchor texts, the velocity of the site’s link growth over time, the ratio of links pointing to your site’s inner pages versus its homepage, and the diversity of the methods used to acquire links, which are assessed through the links’ placements (sidebar, footer, comment, editorial, etc.).
Quantity of backlinks with good placements – the number of high-value links that the entire site has been able to acquire from other authority sites that are topically relevant, have high visibility (within the body) and have high click-through rates.
Internal linking – staging a good internal linking structure that helps web crawlers find deep pages in your website, and supporting your important pages through internal links from other prominent pages on your site that carry a high amount of link juice (strong MozRank, PageRank, Page Authority, a high percentage of traffic entrances, etc.).
Quantity of strong pages hosted by the site – the number of popular pages on the site that have acquired a good amount of high-quality links from other websites through social shares and editorial citations, and that may also be authoritative in terms of PageRank, MozRank, Page Authority or PostRank (which is now owned by Google), and have good search rankings.
Domain age – uhm… the years the site has been live?
Linking out to trusted pages

It has been a myth that linking out to other websites/web pages within your content reduces that page’s PageRank. However, linking out to high-quality external pages does establish trust through link relations, given that you may well be referencing back to a trusted source.
The point is, having links to reputable websites can build trust, and trust eventually results in good ranking positions. With appropriate usage, these external links might just help your site obtain good search rankings, as search engines become more inclined to trust your content.
SERP Click-through rate
Traffic data that search engines gather through their search engine result pages – such as web pages’ click-through rates – does seem to have an effect on search rankings, for competitive search queries in particular. Most experts also believe that CTR from search engines to a page for its targeted search term improves the page’s ranking position, as mentioned in SEOmoz’s survey results of search ranking factors for 2011 (page-level traffic data).
Improving your pages’ SERP click-through rates can be performed in ways such as:
Use of strong and actionable words on the page’s title and meta description.
Titles that use numbers seem to work well on SERPs, in terms of click through.
Good web design, as your pages may be previewed by users through Google Instant Previews.
Geotagging

Geotagging is geographical identification metadata – as described on Wikipedia – that allows users and search engines to see your business’ actual geographic location through latitude and longitude coordinates, which is certainly a good way to establish trust with both users and search engines. This method may also benefit you through other means of search (GPS tracking), given that mapping services are able to track your location and include it in their listings.
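One lightweight way to expose your coordinates is through the commonly used geo and ICBM meta tags in your page’s head section – the coordinates and place name below are placeholders for illustration:

```html
<!-- Hypothetical geotagging meta tags; swap in your own coordinates -->
<meta name="geo.position" content="37.4222899;-122.0822035" />
<meta name="geo.region" content="US-CA" />
<meta name="geo.placename" content="Springfield" />
<meta name="ICBM" content="37.4222899, -122.0822035" />
```

Note that geo.position uses a semicolon between latitude and longitude, while ICBM uses a comma.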
There are several formats used in geotagging a website, but the most utilized is Google Earth’s KML (typically uploaded as a geo.kml file). Luckily, I found Geo Sitemap Generator, a free web-based application that can automate the process of generating a KML file and a Geo Sitemap, which you can upload to your site’s root directory after downloading those files from the free application.
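For reference, a minimal KML placemark – with a hypothetical business name and placeholder coordinates – looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Placemark>
    <name>Joe's Dog Training School</name>
    <address>123 Example St., Springfield</address>
    <Point>
      <!-- KML lists longitude first, then latitude -->
      <coordinates>-122.0822035,37.4222899</coordinates>
    </Point>
  </Placemark>
</kml>
```

Your Geo Sitemap then simply points crawlers to this KML file’s location on your server.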
Humans.txt

Humans.txt is HTML5’s approach to authorship markup (rel=”author”). This text file is pretty much similar to robots.txt, but is intended for both users and web crawlers to let them know the author of the content or website – which is also a good way to establish trust, as well as to credit the creator of the content/site. Below is a sample of what a humans.txt file looks like:
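The format is free-form text; this minimal example follows the sections suggested at humanstxt.org, with placeholder names and details:

```text
/* TEAM */
    Author: Jane Doe
    Site: http://www.example.com
    Twitter: @janedoe
    Location: Springfield, USA

/* SITE */
    Last update: 2012/01/15
    Language: English
    Doctype: HTML5
    Standards: HTML5, CSS3
```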
Once you have created a humans.txt file for your site and uploaded it to your site’s root directory, you can then include an author tag in your site’s <head> section to enable web crawlers to access the file, like this:
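This is the link tag convention documented at humanstxt.org, assuming the file lives at your domain root (example.com is a placeholder):

```html
<!-- Points crawlers and curious users to your humans.txt file -->
<link type="text/plain" rel="author" href="http://www.example.com/humans.txt" />
```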
Google: We know that great content comes from great authors, and we’re looking closely at ways this markup could help us highlight authors and rank search results.