Not all search engine optimization is keyword research and link building. The technical aspects of your website, from the structure of your content down to the cold, hard code, are as important as, or even more important than, building links and keyword optimization. If your site isn't search engine friendly, or doesn't adhere to Google's Webmaster Guidelines, it will be continually penalized and will never rank to its full potential. According to Moz's 2015 Ranking Factors Study, these are the top 10 technical SEO ranking factors in order of importance.

Hreflang Declaration

The hreflang declaration tag (seen as rel="alternate" hreflang="x" in HTML code) is an HTML tag that tells Google what language your site is written in. It signals to search engines which version of a page to serve, depending on the location and language of the searcher. For example, if you have an English and a Spanish version of a page, and the prospect is searching from a Spanish-speaking country, Google will choose the page tagged hreflang="es" (Spanish) over the page tagged hreflang="en", as the tag helps Google infer which version is more appropriate. This is what the snippet looks like for an English site in the United States (with a placeholder URL):

<link rel="alternate" hreflang="en-us" href="https://example.com/" />

Number of Internal Links

Internal links, or links from one page on your site to another page within your site, are important for SEO for several reasons. First, they allow you to pass authority from your highest-authority pages to your lower-authority ones. Second, they provide more paths through which Google can crawl your site. The more links from your main pages to your sub-pages, the easier it is for Google to discover these deeper pages and index them. Use this internal link to check out our blog on backlink tools.

URL Structure

URLs should be kept simple: short and free of excessive hyphens. That's what Google wants, as long URLs with excessive use of hyphens have been shown to perform worse than short, easy URLs.
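To make the "short and hyphen-light" guidance concrete, here is a minimal Python sketch of a URL audit check. The thresholds (75 characters, 3 hyphens) are illustrative assumptions, not official Google limits:

```python
from urllib.parse import urlparse

def audit_url(url, max_length=75, max_hyphens=3):
    """Flag a URL that is overly long or hyphen-heavy.

    The thresholds are illustrative assumptions, not rules
    published by Google.
    """
    issues = []
    if len(url) > max_length:
        issues.append("url longer than %d characters" % max_length)
    # Count hyphens in the path only, ignoring the domain name.
    if urlparse(url).path.count("-") > max_hyphens:
        issues.append("path uses more than %d hyphens" % max_hyphens)
    return issues

# A short, clean URL passes; a long keyword-stuffed one is flagged twice.
print(audit_url("https://example.com/blog/backlink-tools"))
print(audit_url("https://example.com/the-very-best-top-ten-seo-link-building-tips-and-tricks-for-2015"))
```

Running a check like this over a sitemap export is a quick way to spot pages whose URLs work against you.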
This makes sense, as Google continues to stress user-friendliness and the user experience, and a short, easy-to-remember URL fits that criteria.

Link to Content Ratio

Google likes content; we know this. Google also hates link spam; we know that too. That is why it makes sense that if you have a ton of links on your site but not much content, Google will think you're trying to pull some kind of link scheme and de-rank you. It's good to keep the link-to-content ratio low, to make sure you're not raising any red flags.

Code to Content Ratio

As with the link-to-content ratio, the code-to-content ratio is best kept low. Lots of code paired with little content will again raise spam flags with Google, as it makes it seem as though the site isn't being used. The excess code can also greatly hinder your page speed, which also negatively affects your rank.

Google Analytics Tracking Code

According to the study by Moz, websites with a Google tracking code installed performed better than those without. Perhaps this is a signal to Google that the website is run by a webmaster who is actively involved in monitoring it, and is therefore likely to be more trustworthy. For those of you who don't know, this is what the Google Analytics tracking code looks like in HTML:

var _gaq = _gaq || [];
_gaq.push(['_setAccount', 'UA-1337H@X0R-1']);
_gaq.push(['_trackPageview']);

(function() {
  var ga = document.createElement('script');
  ga.type = 'text/javascript';
  ga.async = true;
  ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
  var s = document.getElementsByTagName('script')[0];
  s.parentNode.insertBefore(ga, s);
})();

Robots.txt

The robots.txt file is important, as it tells search engine spiders like Googlebot how they should interact with the pages and files of your website. If there are pages, files, or images that you do not want Google to index, you can block them with the robots.txt file.
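To illustrate, a minimal robots.txt might look like this; the directory names here are placeholders, not recommendations:

```text
# Applies to all crawlers
User-agent: *
Disallow: /admin/
Disallow: /private-files/

# Point crawlers at your sitemap
Sitemap: https://example.com/sitemap.xml
```

The file lives at the root of your domain (e.g. example.com/robots.txt), and each Disallow line keeps well-behaved crawlers out of the listed path.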
Without a robots.txt file, Google will indiscriminately index everything on your site.

URL is HTTPS

Secure websites, or websites with SSL certificates, have been shown to do slightly better in the SERPs. This is likely a signal to Google that your site is safe, secure, and real. Once again, the better the user experience, the better you will rank.

XML Sitemap

A sitemap is essentially a map of the pages on your site. This map contains metadata and information about the organization and content of your site. Googlebot and other search engine web crawlers use sitemaps as a guide to crawl your site more intelligently. Having a sitemap can help your pages get indexed, and it allows you to highlight content that you want search engines to crawl.

Markup

Schema markup is a way to change the appearance of the meta information presented about your site in the search engine results pages. By using markup, the meta description under your search engine listing can be modified to present information like reviews, employee profiles, and so on. Having proper schema markup for certain information can even land your content in the Google answer box, which is a guaranteed way to drive traffic to your site.

So, if you're trying to rank your site and aren't seeing much success, these elements may be holding you back, as they are essential to earning Google's trust. Keep your site content-oriented, user-friendly, and easy for Googlebot to crawl, while ensuring your site is search engine friendly.
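As a closing illustration of the Markup section above, a minimal JSON-LD snippet for a product rating might look like this; every name and value here is a placeholder:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Product",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.5",
    "reviewCount": "27"
  }
}
</script>
```

A snippet like this, placed in the page's HTML, is what lets Google show star ratings and similar rich details under your listing.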