In my last blog post, we covered 9 Key Areas of Focus for On-page Keyword Placement, which focused on keyword research and implementation, two popular topics from last year. As we move into 2017, I thought I would offer some high-level insights into more technical optimizations that should be reviewed and monitored on a consistent basis to maximize a website's functionality and provide an optimal user experience.
The user experience (UX) of a website should, in my opinion, be a main point of focus for the marketing team of a company of any size. From an SEO's standpoint, there is nothing more frustrating than having solid keyword optimization, great visibility in the SERPs, and an increasing click-through rate, only to have the visitor "bounce" because of their experience once they reach the site.
A web page's load speed is a major factor in its usability, and it is increasingly important both for keeping users on your site and for determining a page's organic search ranking potential. Google has confirmed that it uses site speed as an organic ranking signal, though it has stated that the signal affects fewer than 1% of search queries.
Page Speed Facts
- Users expect web pages to load in 2 seconds or less.
- 40% of customers will abandon any site that takes longer than 3 seconds to load.
- For every 1 additional second of load time, conversion drops by 7%.
- For every 1 additional second of load time, user satisfaction drops by 16%.
As we review our clients' website analytics, we continue to see massive growth in mobile (smartphone) traffic. Google has also recently stated that it will focus on a mobile-first experience, indexing and ranking search listings based on the mobile version of a site's content. In light of this, and with the smartphone increasingly serving as the launching point for initial searches and an ever-more popular conversion platform, an optimal mobile experience is crucial to SEO.
Mobile Optimization Best Practices
The ideal solution for mobile sites is responsive design. Instead of serving content to mobile devices from a separate location (e.g. m.mywebsite.com or www.mywebsite.com/mobile/), a responsive site serves a single version of every page to all devices, using CSS media queries to adapt the layout to any screen size. This approach has become the standard in the UX community for its simplicity and dependability, and is rapidly becoming the standard in the SEO community as well, not only because UX informs SEO more and more over time, but also because:
- a responsive design precludes any risk of duplicate content and link dilution arising from serving two versions of a page at two different URLs
- Google rewards responsive sites with higher rankings in mobile search
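As a minimal sketch of the responsive approach described above (the class name and breakpoint are hypothetical), a stylesheet might define a mobile-first, single-column layout and then widen it for larger screens:

```css
/* Default (mobile-first) layout: content fills the screen */
.content-column {
  width: 100%;
}

/* On screens at least 768px wide, show two columns side by side */
@media (min-width: 768px) {
  .content-column {
    width: 50%;
    float: left;
  }
}
```

The same HTML is served to every device; only the CSS rules that apply change with the viewport width, which is what eliminates the need for a separate mobile URL.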
Few things threaten a web page's search visibility like another page on the web showing the same content under a different URL. When multiple versions of the same page live side by side in search engine indices, each version earns backlinks and social recommendations independently, diluting the potential authority of the content and fostering internal competition for the same rankings. Best practices dictate that a site be checked for all potential sources of duplicate content, and that any instances found be resolved with 301 redirects or the rel="canonical" tag, depending on circumstances.
Common Causes of Duplicate Content:
- pages resolving both with and without the WWW prefix (e.g. mywebsite.com v. www.mywebsite.com)
- pages resolving both with and without a trailing slash (e.g. www.mywebsite.com/cool-stuff v. www.mywebsite.com/cool-stuff/)
- pages resolving in both the http and https protocols
- pages with URL case variations resolving independently (e.g. www.mywebsite.com/cool-stuff/ v. www.mywebsite.com/Cool-sTuff/)
- desktop and mobile versions of pages being hosted at separate URLs but showing identical content
- staging sites remaining in search engine indices after the live site has launched
- homepages resolving both with and without index URI extensions (e.g. www.mywebsite.com v. www.mywebsite.com/index.php)
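As one illustration of the 301 fix mentioned above, the non-WWW-versus-WWW case can be resolved with a sitewide redirect. This sketch assumes an Apache server with mod_rewrite enabled, and the domain is a placeholder:

```apache
# .htaccess (Apache): 301-redirect all non-WWW requests to the WWW host
RewriteEngine On
RewriteCond %{HTTP_HOST} ^mywebsite\.com$ [NC]
RewriteRule ^(.*)$ https://www.mywebsite.com/$1 [R=301,L]
```

Where both versions of a page must remain live, a `<link rel="canonical" href="https://www.mywebsite.com/preferred-url/">` tag in the duplicate page's head tells search engines which URL should receive the ranking credit.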
Sitemaps are comprehensive lists of all the pages on a website. They come in two forms:
HTML sitemaps: user-facing pages that provide on-page links to every other page on the site, typically arranged in a way that indicates the site architecture
Dynamic XML sitemaps: search-engine-facing documents that list every page on the site in a structured format that search engines can parse reliably, typically bundling with each page's URL such auxiliary information as:
- the date of the page’s most recent content update (the “lastmod” line)
- the importance of the page relative to others (the "priority" line, on which the site owner assigns each page a value between zero and one)
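A minimal XML sitemap entry, following the sitemaps.org protocol, shows where the "lastmod" and "priority" lines sit (the URL, date, and priority value here are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.mywebsite.com/cool-stuff/</loc>
    <!-- date of the page's most recent content update -->
    <lastmod>2017-01-15</lastmod>
    <!-- relative importance, on a scale of 0.0 to 1.0 -->
    <priority>0.8</priority>
  </url>
</urlset>
```

A "dynamic" sitemap is simply one the site regenerates automatically as pages are added or updated, so the file never falls out of sync with the live site.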
Sitemap Best Practices:
Sites should be equipped with a dynamic XML sitemap, for the sake of search engines. HTML sitemaps are optional.
Robots.txt is a file that lives in a site's root directory and, by convention, is the first thing a search engine spider requests upon arrival. It establishes rules about which sections of a site search engines may and may not crawl, and which specific spiders, if any, are to be denied access outright. It can also be used to point spiders to the location of the site's XML sitemap, providing the quickest possible path to an overview of the site's entire information architecture.
Robots.txt Best Practices:
A robots.txt file should:
- use a Disallow instruction to exclude subdirectories that correspond to secure or private content
- not specify a crawl delay
- indicate the location of the XML sitemap
- be named simply “robots.txt” and be hosted in the root directory
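Putting those practices together, a robots.txt following the rules above might look like this (the disallowed directories are hypothetical examples of secure or private content):

```text
# robots.txt, hosted at https://www.mywebsite.com/robots.txt
User-agent: *
Disallow: /admin/
Disallow: /checkout/

Sitemap: https://www.mywebsite.com/sitemap.xml
```

`User-agent: *` applies the rules to all spiders; everything not explicitly disallowed remains crawlable, and the Sitemap line hands the spider the full site map immediately.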
404 Error Page:
A website should provide a friendly 404 error page in the event that a user or search engine follows a broken link or a mistyped URL.
404 Error Page Best Practices:
A 404 error page should:
- return a proper 404 status code at the URL entered, instead of redirecting to an all-purpose universal error page URL
- preserve site design and brand voice
- preserve navigation and footer
- explain that the page sought cannot be found
- provide on-page links to top-level pages (the homepage at the very least) to keep search engines from running into “dead ends”
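On an Apache server, the first practice above (serving the error page at the requested URL with a true 404 status) can be handled with a one-line directive; the file path is a placeholder:

```apache
# .htaccess (Apache): serve a custom 404 page at the requested URL.
# Use a local path (leading slash): a full http:// URL here would
# cause a 302 redirect instead of returning a true 404 status code.
ErrorDocument 404 /404.html
```

Because the browser's address bar keeps the mistyped URL and the server answers with a 404 status, search engines correctly drop the broken URL instead of indexing a redirect target.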
For help and guidance with the technical optimization of your website, Contact Anvil today.