As we enter peak December trading, the time for hiding problems within your website is over.
If there is anything technically wrong with your website at this time of year, you and your marketing team will know about it firsthand (often from customers, not just Google).
Demand spikes by 300% in some sectors, servers are overloaded, and the warehouse team are working overtime to fulfil orders in time for Christmas.
Many brands will have recently experienced this, as Black Friday concentrates significant demand into the last Friday of November.
As a retail brand, to capitalise on the Christmas trading event you need stock to service the demand, you need a structured ecommerce SEO campaign guided by an experienced agency partner to drive the traffic, and you need not just a functioning website but a website experience that brings a smile to your customer base.
Oh, and on top of all that, you need a technically competent website so Google can decide if it likes you enough to give you consistent rankings.
During December we enter “Christmas index madness”, where rankings change hourly. Add to the mix that Google has seriously been losing the plot recently (I can’t remember the last time I refreshed a search page and was met with the same experience).
Google are testing so much. We know Google loves testing – but recently they have emulated a classic “beginner CRO approach”, which is surprising for an organisation as experienced at testing as they are.
“Pick a high-volume, high-demand term – serve different results, change the colour of filters, put more images in the results, shift Google Shopping paid listings around, put Google Shopping free listings at the bottom of page 1. Oh, and the cheekiest one of all – put blue-link paid sponsored listings mid page 1.”
Test all of those, then put up a survey link asking “how would you rate the experience?”. I mean, come on – that is the worst CRO/UX test I have ever seen, compounded with some of the direst search experiences I have encountered since 2012.
With all this “madness” we need calm and control. We can’t control what Google does, but we can make sure we capitalise on the organic opportunity by working through what I would call the quickest, hopefully non-developer technical updates your marketing team can make now to ensure you are in control of your own destiny this Christmas.
Before we get into the tips and tricks, one question is worth answering.
Why, as a performance marketing manager, should I care about technical SEO?
Technical SEO is more than simply meta data, and this specialised sector of SEO has recently had a rebirth, which is more than welcome given the state of technical SEO across ecommerce.
According to a recent Polaris study, over 98% of enterprise-level “big brand” ecommerce websites have readily fixable technical flaws that are impeding their SEO success. These companies either employ in-house SEO teams, pay for pricey digital marketing consultants, or have access to world-class resources. SEO is on their radar, but they continue to lag.
Many ecommerce websites are built on a variety of ecommerce platforms, are enormous by definition, and will already rank highly for key consumer queries (often on page 1 of Google). When you know where to look for these technical issues, you can correct them right away, and with these critical modifications results come swiftly.
This is why, as a marketing manager or ecommerce manager, you should care about technical SEO.
To help ecommerce websites, we’ve compiled a list of the five technical problem areas we see most frequently on ecommerce websites that affect SEO and, when addressed, can have a positive effect on your website’s ranking positions in search results.
We have also recently published another article on additional Christmas SEO quick wins for ecommerce brands (not just technical tweaks), which is recommended reading as an accompaniment to this article.
Proven SEO tips for ecommerce websites
1. Internal 301 Redirects
There is no excuse: your website should never have 301 redirects within the navigation, buttons and internal links. This is not to say that 301 redirects do not have their place on a website – they do, and they are a crucial SEO method for handling broken pages and out-of-stock items.
However, in this case the problem exists when a website page has been redirected to a newer version and the navigation/site links have not been updated. So, what happens is that your on-page links go through a redirect chain.
Why is this a concern? Well, a redirect chain means that when Google visits your website it will have to work a lot harder to get to other pages on your site. With a 301 redirect chain in place Google spends a lot more of its “crawl budget and time allocation” (more on this later) going through these chains, so next time it visits your site it may not spend so much time crawling.
Google rewards websites that make it easy to navigate: the easier the bot can crawl your site, the more pages it crawls on each visit. When you bear in mind that Google crawls some ecommerce sites 5 times a day, this is a key area to get right.
Another reason to mitigate 301 redirect links is that, as SEO professionals, we need to retain page equity: the higher the equity value of a website’s page, the more value it has to Google. When we have navigation 301 redirects, this link equity passes through the chain and dilutes the power of the link. This means that a 301 link from our homepage (the highest-equity page on a website) will be diluted by the time it reaches its final destination page, which on many occasions will have an impact on page-level ranking positions.
This area of technical SEO is a relatively quick win, and if you have SEO software such as Screaming Frog, Moz or any other crawling tool you prefer, you will be able to see where the 301 redirects lie within your site’s navigation. Once you have found these, make a note and contact your web development team to get the links fixed.
Quick tip: the key areas where these 301 redirects usually reside are the top/footer navigation, breadcrumbs, homepage links and the logo link. These can all be easily checked by clicking on the link and checking the HTTP status with a redirect-checking toolbar such as Moz’s (other redirect extensions are available for Chrome or Mac users).
Bonus tip: pick your high-performance pages and ensure that there are no redirects pointing to them from contextual links (links within the body copy). To do this you will need an internal link export from a crawling tool such as Screaming Frog or Sitebulb.
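If you would rather check for redirect chains in bulk than click each link by hand, a short script can do it. This is a minimal sketch using Python’s requests library; the URLs below are hypothetical placeholders, so swap in links pulled from your own crawl export:

```python
import requests

# Hypothetical internal links pulled from a crawl export
urls = [
    "https://www.example-store.co.uk/",
    "https://www.example-store.co.uk/christmas-gifts",
]

for url in urls:
    # allow_redirects=True makes requests follow the full chain;
    # response.history then records every hop along the way
    response = requests.get(url, allow_redirects=True, timeout=10)
    if response.history:
        hops = " -> ".join(f"{r.url} ({r.status_code})" for r in response.history)
        print(f"Chain: {hops} -> {response.url} ({response.status_code})")
    else:
        print(f"No redirect: {url} ({response.status_code})")
```

Any URL that prints a chain of two or more hops is a candidate for updating the internal link to point straight at the final destination.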
2. Duplicate content
After the 2012 Panda update, duplicate content was an area fiercely attacked by Google. In 2023/24 duplicate content should be obsolete; however, week in, week out we see many ecommerce websites that still have it. This is usually not intentional, and many brand owners/SEO teams are unaware of it. Be that as it may, this is still an area of SEO that gets overlooked and causes long-term issues with rankings.
The main culprits for duplicate content are as follows:
- Meta data (page titles, meta descriptions, page copy)
- Filter and search pages (facets are the number one culprit)
- Using manufacturer copy
- Product variants
The above website areas may look very simplistic, and many of you will be aware of them, so this will come as no big shock. But the surprising thing to note is that 9/10 ecommerce websites we have analysed just this year have had duplicate content in some form relating to the above.
Size is no safeguard: brands operating with £100m turnovers suffer from duplicate content.
The ultimate fix for duplicate content in the majority of the areas listed above can be easily applied with best-practice canonicalisation, meta-level noindex directives and robots.txt crawl blocking.
The first step, though, is finding where your site has duplicate content. This can again be done using the main SEO software programs. We would recommend Screaming Frog, as it gives a great visual breakdown of your duplicate content and will also state whether the duplicate page is canonicalised, set to noindex, blocked by robots, etc.
Quick tip: if you’re using manufacturer copy or suspect that your content may be duplicated, the easy check is to highlight a passage of text on your website and carry out a Google search to see if it appears anywhere else. You can also use online plagiarism checkers; great tools to try are Copyscape & Siteliner.
Bonus tip: check your facet/filter pages – compare the category content, meta data and URL structure against the main category. Does any of this content match? If so, you have a duplicate content issue. You can also check via a site: search.
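At scale, a crawl export makes the meta data check much quicker. Here is a minimal Python sketch that groups URLs sharing the same page title; it assumes a Screaming Frog-style CSV export with “Address” and “Title 1” columns, so adjust the file and column names to match your own export:

```python
import csv
from collections import defaultdict

# Group URLs by page title from a crawl export (column names assumed:
# "Address" and "Title 1", as in a typical Screaming Frog export)
pages_by_title = defaultdict(list)
with open("internal_html.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        pages_by_title[row["Title 1"].strip().lower()].append(row["Address"])

# The same title on more than one URL flags likely duplication
for title, urls in pages_by_title.items():
    if len(urls) > 1:
        print(f'Duplicate title "{title}" on {len(urls)} pages:')
        for url in urls:
            print(f"  {url}")
```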
3. Mis-canonicalisation
Ecommerce sites rely heavily on canonicalisation, not only to make sure that there is no duplicated content on a website but also to place product pages within other categories to make the consumer journey as seamless as possible.
Canonicalisation is not new and nearly all ecommerce sites use it; that being the case, not all of these sites use it well or to its full ability to maximise SEO for the website.
You need to be very careful using canonicalisation: done wrong, you could be blocking over 50% of your website with canonicals.
Why are mismatched canonicals so dangerous for SEO? By using the canonical tag, you are telling Google that this page is not the master page for the site and that it should therefore be ignored as duplicate content.
In essence you are notifying Google to ignore a page on your website, reducing your indexing and, often, your site’s relevance, as the canonicalised page will frequently be a product page that sits outside the website’s navigation (an orphan URL).
What page should you canonicalise to, then? Many ecommerce websites canonicalise to a master product page with a unique orphan product URL that can be used on all pages, as it is not tied to any category.
This is in many cases the “easiest” way to canonicalise and is used by the majority of ecommerce websites, but it’s not the best for SEO and organic rankings.
If we look at this from Google’s point of view, they want to find out “what are the most important pages on your website in your eyes?”. How does Google gather this insight? It looks at how many times you link to a page on your website. As mentioned above, canonical product URLs often sit outside the website’s navigation, and therefore you will be linking to them very little.
Addressing this issue requires a lot more than a quick look through SEO tools, as canonicalising to product pages will often have been a deliberate decision or based on expert advice.
However, if you are not sure what the master canonical page for a product within your inventory is, locate a product page on your site and check the source code (right-click on the page and choose “View Page Source”), then do a page search for “canonical” and see what URL it shows. If it shows a self-referring canonical (i.e., it matches the URL you are on), it’s fine; if it shows anything else, look into that URL and ask: is this the best URL to have ranking in Google for this product?
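To check more than a handful of products, a quick script can perform the same source-code check. This is a minimal sketch using Python’s requests and BeautifulSoup libraries (both need installing); the product URL is a hypothetical placeholder:

```python
import requests
from bs4 import BeautifulSoup  # pip install requests beautifulsoup4

def check_canonical(url: str) -> None:
    """Fetch a page and report whether its canonical tag is self-referring."""
    html = requests.get(url, timeout=10).text
    tag = BeautifulSoup(html, "html.parser").find("link", rel="canonical")
    if tag is None:
        print(f"{url}: no canonical tag found")
        return
    canonical = tag.get("href", "").rstrip("/")
    if canonical == url.rstrip("/"):
        print(f"{url}: self-referring canonical - fine")
    else:
        print(f"{url}: canonicalised to {canonical} - is this the URL you want ranking?")

# Hypothetical product URL for illustration
check_canonical("https://www.example-store.co.uk/gifts/red-widget")
```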
Quick tip: carry out a “site:” search for your website within Google and see what pages are indexed. Go through the first 5-10 pages of results and look at the product page URLs. Are these the pages that you want indexed? Check each URL – is it the correct page or an orphan URL?
Bonus tip: check Google Search Console’s page indexing report. If your non-indexed pages are on the rise, locate the patterns and canonicalise or block them from crawl as appropriate. Any pattern of 50% not indexed vs indexed is a cause for concern.
4. Page click depth & page visibility
It is generally considered best practice that a customer’s journey should only be 3 clicks from the homepage. This means that your site should be structured so that key products can be reached within 3 clicks.
Although this is best practice, in reality it is not always followed. Some websites have a very difficult navigation structure, making the whole journey a lot harder than it needs to be. The trade-off on these sites is always between design and functionality.
For example, many web designers do not like a faceted/detailed navigation as, in their words, it “looks ugly”. They would rather have links within the pages, as this is what they believe is more representative of a customer’s journey. However, a structure such as this makes the journey a lot more difficult for our main customer: Google.
Google is our main customer. If it cannot crawl the website through a convoluted navigation designed purely around the human customer journey, then it’s simple: the site will not rank highly within the SERPs for targeted key terms.
We also have to be aware that if a page is not in the navigation, it has a lower number of in-links and therefore low site visibility. So, in essence, what we are telling Google is “these pages not in the navigation are not important on our site”, when in reality they may be very important subcategories where the majority of our products lie.
How can this be fixed? We need to make sure that all important pages are within the site’s navigation and have internal links throughout the site (this can be achieved through contextual links within category pages, breadcrumbs or related product boxes).
Quick tip: review your website’s sitemap or URL export (this can be done within some CMS platforms) and see if the top-level and sub-level categories are in the navigation. An advanced tip would be to utilise the preferred tool of Screaming Frog again and carry out a crawl on the site, sort by “in-links” and check which pages have the highest number of in-links (these should be your top pages).
Bonus tip: make sure your core seasonal landing pages (highest demand, highest converting) are linked to from the homepage, main and sub-categories, along with product/category-themed blog content.
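To measure click depth against the 3-click rule at scale, you can walk the internal link graph outwards from the homepage. A minimal Python sketch follows; it assumes an internal link export with “Source” and “Destination” columns (as in a typical Screaming Frog “All Inlinks” export), and the homepage URL is a placeholder:

```python
import csv
from collections import defaultdict, deque

# Build the internal link graph from a crawl export (column names assumed:
# "Source" and "Destination", as in a Screaming Frog "All Inlinks" export)
links = defaultdict(set)
with open("all_inlinks.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        links[row["Source"]].add(row["Destination"])

# Breadth-first search from the homepage gives each page's click depth
homepage = "https://www.example-store.co.uk/"  # placeholder
depth = {homepage: 0}
queue = deque([homepage])
while queue:
    page = queue.popleft()
    for target in links[page]:
        if target not in depth:
            depth[target] = depth[page] + 1
            queue.append(target)

# Flag pages buried more than three clicks from the homepage
for url, d in sorted(depth.items(), key=lambda item: item[1]):
    if d > 3:
        print(f"{d} clicks deep: {url}")
```

Any key category or seasonal landing page that appears in this output is a candidate for extra links from the homepage or main navigation.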
5. Site crawlability
Without a doubt, site crawlability is one of the most important areas of SEO. If Google cannot crawl your site efficiently, you will not rank as highly as you could, even if all your other areas of SEO are in alignment.
On larger ecommerce sites, crawlability can be an issue. Many times, these large sites have multiple categories, filter pages, advanced search parameters, pagination and many other factors that Google needs to be able to crawl in a concise manner.
As stated previously, Google allocates a “crawl budget” to your website, which means that on every visit the bot will spend a portion of that budget over a 24-hour period. If Google is spending time crawling pages that have little value to you, you have failed; if it spends time crawling pages that are 301 redirect chains, you have failed; and if it encounters broken pages ending in a dead end for the crawl, guess what: you have failed.
Crawl efficiency and optimisation is not something that can be easily achieved; it involves a specialist technical SEO project to analyse all areas of your website and locate these inefficiencies.
Surely this only affects companies that have never done SEO? The answer to that question is a resounding no. Over 98% of websites in the UK have never been audited to this level and will have issues with the crawling and indexing of their site. Even ecommerce sites turning over £200m per year have these problems, which, unknown to them, will be limiting their ranking potential.
What can be done in-house? How do you start to improve your site’s crawlability? At the most basic level, make sure you have a detailed navigation with all top/sub-level categories, and make sure you have internal links throughout your site’s pages that link to other pages (as explained above, this can be achieved through breadcrumbs and related/similar products). One of the most important ways to increase a site’s crawlability is to make sure that you have a detailed, up-to-date sitemap that only contains key website pages. Finally, spring-clean your site: if a web page has had no traffic in 12-18 months, consider its value to the site (check metrics before deleting/noindexing).
Quick tip: don’t underestimate robots.txt. As a key system file, it tells Google which pages to crawl and which ones to ignore. Make sure your robots file blocks pages that have little value on your site (search parameters, currency variants, pagination, etc.). Carry out another site: check and see what pages are indexed; if you see any pages that have little to no value to a customer (or Google), make a note of these pages/URL parameters and consider adding them to the robots file.
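Before adding new rules, it is worth verifying what your current robots.txt already blocks. A minimal sketch using Python’s standard-library robotparser; the domain and URL patterns below are hypothetical placeholders:

```python
from urllib.robotparser import RobotFileParser

# Parse the live robots.txt (hypothetical domain for illustration)
parser = RobotFileParser("https://www.example-store.co.uk/robots.txt")
parser.read()

# Candidate low-value URL patterns spotted via a site: check
candidates = [
    "https://www.example-store.co.uk/search?q=gifts",
    "https://www.example-store.co.uk/gifts?currency=usd",
    "https://www.example-store.co.uk/gifts?page=7",
]

for url in candidates:
    # can_fetch() reports whether the named user agent may crawl the URL
    allowed = parser.can_fetch("Googlebot", url)
    print(f"{'crawlable' if allowed else 'blocked'}: {url}")
```

Anything printed as “crawlable” that you have already judged low-value is a candidate for a new Disallow rule.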
Bonus tip: an advanced-level check would be to review your log file data sets from the server (though this is complicated without an experienced partner). Something we would recommend is using a tool such as Screaming Frog with the Google Search Console API to check the indexing of core category/PLP and PDP pages – not only to confirm they are indexed but also to check that they are being crawled efficiently enough. This is still a little advanced, though. A quicker check would be to look through the Google Search Console “crawl stats” export and check whether core pages are being crawled. On ecommerce sites, core pages should be crawled daily.
Conclusion
When it comes to technical SEO, there is a lot hidden out of plain sight that marketers need to be aware of. Some of these factors can seriously hold back your website’s performance if left unaddressed – indexing, crawl, page experience, page load speed and duplicate content (with its conflicting cannibalisation) can all be hidden ailments that damage your website’s effectiveness within search.
With Christmas coming, we know many teams will say “let’s just fix this in Q1”.
Be under no illusion: a lot of these technical tweaks can be done quickly within the CMS and will have amazing effects on your website’s keyword positions in Google. By increasing crawl efficiency one December on an ecommerce website within the gift-giving space, we were able to move a brand from position 6 to position 1 for a high-volume (150,000 UK searches a month) transactional term.
As with everything, some quick wins can be done in-house within the team, but the greatest gain comes from working with an experienced SEO partner.
To capitalise within organic search this Christmas you need to act now.
Otherwise, you may just as well invest in paid media and burn through your marketing budget, as the paid index around Christmas is highly competitive, with CPC prices on transactional terms to match.
For more help or advice on technical SEO for e-commerce websites and for further insight as to how to go about checking your website for technical quick win areas, please feel free to contact our expert ecommerce consultancy team at Polaris.