
6 Technical Website Optimizations for SEO

In my last blog post, we covered 9 Key Areas of Focus for On-page Keyword Placement, which focused on keyword research and implementation, two popular topics from last year. As we move into 2017, I thought I would provide some high-level insights into the more technical optimizations that should be reviewed and monitored on a consistent basis to maximize a website's functionality and provide an optimal user experience.

In my opinion, the user experience (UX) of a website should be one of the main points of focus for the marketing team at a company of any size. From an SEO's standpoint, there is nothing more frustrating than having solid keyword optimization, great visibility in the SERPs, and a rising click-through rate, only to have visitors "bounce" because of their experience once they reach the site.

Page Speed
A web page's load speed is a significant usability factor, and it plays an increasingly important role both in determining a page's organic search ranking potential and in keeping users on your site. Google has confirmed that it uses site speed as an organic ranking signal, though it has said the signal affects fewer than 1% of search queries.
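To make that concrete (this example is mine, not from the original post), two common load-time improvements are compressing text responses and deferring non-critical JavaScript. The snippet below assumes an Apache server and a hypothetical script file:

    # .htaccess (Apache, mod_deflate): compress text-based responses before sending them
    AddOutputFilterByType DEFLATE text/html text/css application/javascript

    <!-- HTML: load a non-critical script without blocking page rendering -->
    <script src="/js/analytics.js" defer></script>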

Page Speed Facts

Mobile Optimization
As we review our clients' website analytics, we continue to see massive growth in mobile (smartphone) traffic. Google has also recently stated that it will focus on a mobile-first experience by indexing and ranking search listings based on the mobile version of a website's content. In light of this, and with the smartphone increasingly becoming the launching point for initial searches and an ever-more-popular conversion platform, an optimal mobile experience is crucial to SEO.

Mobile Optimization Best Practices
The ideal solution for mobile sites is responsive design. This is a web development approach whereby, instead of serving content to mobile devices from a separate location (e.g., m.mywebsite.com or www.mywebsite.com/mobile/), a site serves a single version of every page to all devices and uses CSS to adjust the page's layout to fit any screen size. This approach has already become the standard in the UX community for its elegant simplicity and dependability, and it is rapidly becoming the standard in the SEO community as well, not only because UX informs SEO more and more over time, but also because Google itself recommends responsive design as its preferred configuration for mobile sites.
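As a minimal sketch of how responsive design works in practice (the breakpoint, class names, and markup below are illustrative, not from the original post), a viewport meta tag plus a CSS media query lets a single page reflow to whatever screen it is viewed on:

    <!-- In the page <head>: tell mobile browsers to render at the device width -->
    <meta name="viewport" content="width=device-width, initial-scale=1">

    /* In the stylesheet: a hypothetical two-column layout that collapses
       to a single column on screens narrower than 768px */
    .content-column { width: 66%; float: left; }
    .sidebar-column { width: 33%; float: right; }

    @media (max-width: 768px) {
      .content-column,
      .sidebar-column { width: 100%; float: none; }
    }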

Duplicate Content
Few things threaten a webpage’s search visibility like the existence of another page on the web showing the same content under a different URL. When multiple versions of the same page live side-by-side in search engine indices, the different versions begin to earn backlinks and social recommendations independently, thus badly diluting the potential authority of the associated content and fostering internal competition for the same rankings. Best practices dictate that a site be checked for all potential sources of duplicate content, and that any found instances be resolved with 301 redirects or the rel=”canonical” tag, depending on circumstances.
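To illustrate the two fixes (the URLs and server configuration are hypothetical), a rel="canonical" tag declares the preferred version of a page, while a 301 redirect permanently forwards a duplicate URL to it; the redirect below is written as an Apache .htaccess rule consolidating the non-www hostname onto the www version:

    <!-- On each duplicate version of the page, point search engines to the preferred URL -->
    <link rel="canonical" href="https://www.mywebsite.com/preferred-page/">

    # Apache .htaccess (mod_rewrite): 301-redirect the non-www hostname to the www version
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^mywebsite\.com$ [NC]
    RewriteRule ^(.*)$ https://www.mywebsite.com/$1 [R=301,L]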

Common Causes of Duplicate Content:

Sitemap
Sitemaps are comprehensive lists of all the pages on a website. They come in two forms:
HTML sitemaps: user-facing pages that provide on-page links to every other page on the site, typically arranged in a way that indicates the site architecture
Dynamic XML sitemaps: search-engine-facing documents that list every page on the site in a standardized format that search engines can parse reliably, typically bundling with each page's URL auxiliary information such as its last modification date, expected change frequency, and relative priority.
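A minimal XML sitemap entry following the sitemaps.org protocol looks like the following (the URL and values are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.mywebsite.com/example-page/</loc>
        <lastmod>2017-01-15</lastmod>      <!-- date the page last changed -->
        <changefreq>monthly</changefreq>   <!-- how often it is expected to change -->
        <priority>0.8</priority>           <!-- relative importance within the site -->
      </url>
    </urlset>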

Sitemap Best Practices:
Every site should be equipped with a dynamic XML sitemap for the sake of search engines; an HTML sitemap is optional.

Robots.txt
Robots.txt is a plain-text file that lives in a site's root directory and is, by convention, the first thing a search engine spider requests when it arrives at the site. It establishes rules as to which sections of a site a search engine may and may not crawl, and which specific spiders, if any, are to be denied access outright. It can also be used to direct spiders to the location of the site's XML sitemap, providing the quickest possible path to an overview of the site's entire information architecture.
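A simple robots.txt illustrating these three jobs might look like this (the disallowed path and the blocked spider are hypothetical examples):

    # Allow all spiders to crawl everything except the /admin/ section
    User-agent: *
    Disallow: /admin/

    # Deny one specific spider access to the entire site (hypothetical bot name)
    User-agent: BadBot
    Disallow: /

    # Point spiders to the XML sitemap
    Sitemap: https://www.mywebsite.com/sitemap.xml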

Robots.txt Best Practices:
A robots.txt file should:

404 Error Page
A website should provide a friendly 404 error page in the event that a user or search engine follows a broken link or enters a mistyped URL.
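How the custom page gets served depends on the web server; on Apache, for instance, a single .htaccess directive is enough (the file path below is a placeholder). Whatever the setup, the page should still return a real 404 status code, not a 200, so search engines know the URL is invalid.

    # .htaccess (Apache): serve a custom, friendly page when a URL is not found
    ErrorDocument 404 /404.html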

404 Error Page Best Practices:
A 404 error page should:

For help and guidance with the technical optimization of your website, contact Anvil today.
