
Our Top 5 Technical SEO Tips for your ECommerce Site

If your ecommerce website is the lifeblood of your business, you’re doing things right – the majority of well-known retailers either trade or market online. With growing online competition, the race to the top of Google’s results is fierce, and only the strongest players will survive.

Debbie Rousseau, Account Manager
30th July 2020

Top 5 Technical Wins for ECommerce SEO

In the last few years, Google has released several algorithm changes to its search engine, making it increasingly difficult to rank number one for your key search phrases. We also need to remember that technical SEO is about more than just meta data: this specialist field of search marketing focuses on how well search engine spiders can crawl and index your content. Unfortunately, its importance is often underestimated.

We have put together a list of the 5 most common technical errors on ecommerce websites that can affect SEO:

1. Internal 301 Redirects

Your website should never have 301 redirects within the navigation, buttons and internal links. This is not to say that 301 redirects do not have their place on a site - they do, and are a crucial SEO method for handling broken or expired pages and out-of-stock items.

The problem occurs when a page has been redirected to a newer version and the navigation/site links have not been updated, making on-page links go through a redirect chain.

Why is this a concern? A redirect chain means that when Google visits your website, it has to work a lot harder to get through to other pages on your site. With a 301 redirect chain in place, Google spends more of its “crawl budget” and time allocation going through these chains and less time on the actual pages.

Google rewards easy to navigate websites – the easier it is for the bot to crawl your content, the more pages on your website it crawls during each visit. 
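To illustrate, here is a minimal Python sketch of how a redirect chain can be spotted from a crawl, modelling redirects as a mapping of internal URLs to their targets. The URLs are invented for the example:

```python
# Hypothetical crawl data: source URL -> redirect target (None = final 200 page).
redirects = {
    "/old-shoes": "/shoes",        # 301
    "/shoes": "/footwear/shoes",   # 301 -- so /old-shoes needs 2 hops
    "/footwear/shoes": None,       # final 200 page
}

def hops(url, redirects):
    """Count how many redirect hops a URL takes before reaching a final page."""
    count, seen = 0, set()
    while redirects.get(url) is not None:
        if url in seen:            # guard against redirect loops
            return None
        seen.add(url)
        url = redirects[url]
        count += 1
    return count

# Any internal link that takes more than one hop goes through a chain
# and should be updated to point straight at the final URL.
chained = [u for u in redirects if (hops(u, redirects) or 0) > 1]
```

In the example above, a link to `/old-shoes` takes two hops before Google reaches a real page; updating the internal link to point at `/footwear/shoes` directly removes the chain.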

 

2. Duplicate Content

Duplicate content is content that appears, identically or in very similar form, on other websites or on multiple pages of the same site, and it can impact search engine rankings. When there are multiple pieces of “appreciably similar” content in more than one location on the Internet, it can be difficult for search engines to decide which version is most relevant to a given search query.

We still see many websites that have not tackled the duplicate content problem. This is usually not intentional, and many brand owners and SEO teams are simply unaware of it.

Quick Tip:

If you are using manufacturer’s copy, or suspect your content may be duplicated, the easiest check is to highlight a passage of text on your website and carry out a Google search for it to see if it appears anywhere else. You can also use online plagiarism checkers; Copyscape and Siteliner are great tools to try.
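As a rough in-house alternative to the tools above, a simple word-shingle comparison can flag near-duplicate passages. This is purely an illustrative sketch of the idea, not how Copyscape or Siteliner actually work:

```python
# Compare two passages by the overlap of their word "shingles"
# (short runs of consecutive words). 1.0 = identical, 0.0 = no overlap.
def shingles(text, n=3):
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(a, b, n=3):
    """Jaccard similarity of the n-word shingles of two passages."""
    sa, sb = shingles(a, n), shingles(b, n)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)
```

Running `similarity()` on a product description and the manufacturer's original copy gives a quick signal of how close to verbatim your page really is.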

The main culprits for duplicate content are as follows:

  • Meta data (page titles, meta descriptions, page copy)
  • Filter and search pages
  • Using manufacturers’ copy
  • Product variants

In most cases, duplicate content can be easily addressed with best practice canonicalisation, meta level NOINDEX directives, and robots.txt crawl blocking.
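For reference, the first two fixes are small markup declarations in the page `<head>`. The snippet below is a generic example with an invented URL, not copied from any specific platform:

```html
<!-- Canonical tag: declares the master URL for this page (example URL) -->
<link rel="canonical" href="https://www.example.com/footwear/shoes/red-trainer" />

<!-- Meta-level NOINDEX: asks search engines not to index this page -->
<meta name="robots" content="noindex, follow" />
```

Crawl blocking is handled separately in robots.txt, for example a `Disallow: /search` line to keep internal search result pages out of the crawl.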

The first step though is finding out where your site has duplicate content. Again, this can be done using the main SEO software programs like Screaming Frog, as this gives a great visual breakdown of your duplicate content and will tell you if the duplicate content page is canonicalised, set to noindex, blocked by robots.txt, etc.

3. Mis-canonicalisation

Ecommerce sites often rely heavily on canonicalisation, not only to make sure that there is no undeclared duplicated content on a website, but also to place product pages within additional categories to make the consumer journey as seamless as possible.

Using canonicals is not a new thing, and nearly all ecommerce sites benefit from it. That said, not all these sites utilise it to optimal effect. If it is done wrong, you could effectively be blocking over 50% of your website.

Why are mismatched canonicals so dangerous for SEO? By using the canonical tag, you are declaring to Google that a specific page is not the master page for the site, and that it may be seen as duplicate content. In effect, you come close to telling Google to ignore that page. This is particularly dangerous when the master (canonical) URL is not part of the site’s hierarchy (an orphan URL). Many ecommerce websites canonicalise to a master product page with a unique, orphan product URL that can be used on all pages, as it is not tied to any category. In many cases this is the “easiest” and most common way to canonicalise, but it is not the best for SEO and organic rankings.

If you look at this from Google’s point of view, they want to find out which pages are the most important for you. And how does Google gather this insight? It looks at how many times you link to a page on your website. As mentioned above, canonical product URLs are often outside the website’s navigation, and therefore have very few, if any, links.

Addressing this issue requires a lot more than a quick look through SEO tools. The implementation of canonicals is often based on a historical internal decision, or on expert advice, and the subsequent implementation is often inconsistent.

To identify your master (canonical) page on your site, check the source code (right click on a webpage and click the option “view page source”), do a page search for “canonical”, and see what URL it shows. Is this the best URL to have ranking in Google for a given product? Is it consistent with the broader message across your site?
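The source-code check described above can also be scripted. A minimal Python sketch, using an invented example page, might look like this (a regex is fine for a quick audit, though a proper HTML parser is more robust for production use):

```python
import re

# Invented example page source; in practice this would be fetched from your site.
html = '''<html><head>
<link rel="canonical" href="https://www.example.com/footwear/shoes/red-trainer" />
</head><body>...</body></html>'''

# Pull out the href of the rel="canonical" link element.
match = re.search(r'<link[^>]+rel=["\']canonical["\'][^>]+href=["\']([^"\']+)', html)
canonical = match.group(1) if match else None
```

Comparing `canonical` against the URL the page actually lives at in your site hierarchy quickly shows whether you are canonicalising to an orphan URL.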

4. Page Click Depth & Page Visibility

It is generally considered best practice for every point on the customer’s journey to be no more than 3 clicks away from the homepage. This means that your site should be structured so that key products can be reached within 3 steps.

Although this is best practice, in reality it is not always followed. Some websites have a very convoluted navigation structure, making the whole journey a lot more difficult than it needs to be. On these sites, the trade-off is always between design and functionality.

Many web designers do not like a faceted/detailed navigation. They would rather have links within the pages, as they believe this is more representative of the customer’s journey. However, such a structure makes the journey a lot more difficult for the Google bot.

We need to make sure that all important pages are within the website’s navigation and have internal links throughout the site (this can be achieved through contextual links within category pages, breadcrumbs or related product boxes).
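Click depth is easy to measure once you have a map of your internal links. A small sketch, using an invented link graph, computes each page's minimum click depth from the homepage with a breadth-first search:

```python
from collections import deque

# Hypothetical internal link graph: page -> pages it links to.
links = {
    "/": ["/footwear", "/sale"],
    "/footwear": ["/footwear/shoes"],
    "/footwear/shoes": ["/footwear/shoes/red-trainer"],
    "/sale": [],
    "/footwear/shoes/red-trainer": [],
}

def click_depths(links, start="/"):
    """Breadth-first search: minimum number of clicks from the homepage."""
    depth = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for nxt in links.get(page, []):
            if nxt not in depth:
                depth[nxt] = depth[page] + 1
                queue.append(nxt)
    return depth

# Pages deeper than 3 clicks are candidates for extra internal links.
too_deep = [p for p, d in click_depths(links).items() if d > 3]
```

Crawling tools like Screaming Frog report this same metric, but a quick script over an exported link list can highlight the deep pages that need breadcrumbs or related-product links.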

5. Site Crawlability

Without a doubt, site crawlability is one of the most important areas of SEO. If Google cannot crawl your site efficiently, you will not rank as highly as you could, even if all your other areas of search marketing are in alignment.

On larger ecommerce sites, crawlability can be an issue. Very often, these sizable websites have multiple categories, filter pages, advanced search parameters, pagination and many other elements that Google needs to be able to crawl efficiently.

Google allocates a “crawl budget” to your website, which means that every time the bot visits the site, it will spend a portion of that budget over a 24-hour period. If Google spends time crawling pages that have little value to you, you have failed; if it spends time crawling pages that sit behind 301 redirect chains, you have failed; and if it encounters broken pages that dead-end the crawl, guess what, you have failed yet again.

Crawl efficiency and optimisation is not something that can be easily achieved. It involves a specialist technical SEO project to analyse all areas of your website and locate these inefficiencies.

What can be done in-house? How do you start to improve your site’s crawlability? At the most basic level, you need to make sure you have a detailed navigation with all top/sub-level categories, and ensure you have internal links throughout your site’s pages that link to other pages. One of the most important areas is an accurate sitemap that only contains the key pages.
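Generating a lean sitemap containing only the key pages is straightforward to automate. A sketch, with invented URLs standing in for your real page list:

```python
from xml.sax.saxutils import escape

# Invented key-page list; in practice this would come from your CMS or crawl,
# filtered down to indexable, canonical pages only.
key_pages = [
    "https://www.example.com/",
    "https://www.example.com/footwear/shoes",
]

def build_sitemap(urls):
    """Build a minimal sitemap.xml body from a list of URLs."""
    entries = "\n".join(
        f"  <url><loc>{escape(u)}</loc></url>" for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n</urlset>"
    )
```

The important part is the filtering before this step: redirected, noindexed and blocked URLs should never make it into the list, so that the sitemap tells Google only about the pages you actually want crawled.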

Finally, spring clean your website: if a page has had no traffic in the last 12-18 months, reconsider its value and its impact on the site’s overall success (check the metrics before deleting or noindexing it).

When it comes to technical SEO, there are a lot of areas that you need to get right. For more help and advice on technical SEO get in touch. Alternatively, request a complimentary 1-1 30-minute consultation - book your slot.