Between the final week of September and the third week of October 2012, we noticed a considerable drop in rankings for one of our clients.

This was far more than the usual ranking fluctuation we expect to see across SEO client accounts, and it called for further investigation. Anyone else who has had problems with their organic rankings over the past six months will remember this period well: a Panda update, a Penguin update and the Page Layout 2 update (a Panda-style update) all hit within two weeks and wreaked havoc for many sites.

The first thing we had to do was diagnose the direct cause of the ranking drops.

It seemed more than likely that this was caused by one of the algorithm updates. Notice the use of “more than likely”. Unlike manual penalties, where you will be sent a notification, you can never say with absolute certainty that you have been the victim of an algorithmic penalty, so a judgement call is required. In this case, the ranking drop was significant across the site and followed very closely on the heels of Panda 20, with a further drop around the Penguin 3 / Page Layout 2 updates.

By reviewing our work on the site over the previous 12 months, we were able to rule out any effect from Penguin with a good degree of certainty (our link building had all been very clean), which left us fairly confident that factors on the site itself were falling foul of Google’s Panda criteria.

Then we had to take a look at the site and diagnose the root cause of the problem.

We looked across all pages of the site for the standard things Panda targets, such as duplicate content, poor user experience and over-optimisation of keywords. We found the following:

– The site was split in two, one half targeting offices and one targeting homes, with exactly the same pages showing on each side of the site. Almost every page was therefore duplicated.

– The text on many pages was poorly written and keyword-stuffed in places.

– The site used a splash page, which contained three enormous images and only five words of text.

– The social sharing buttons each used an individual code snippet. Although not the main problem, it looked messy, and the non-standard code used to display the buttons could have been picked up as multiple ad units.

A good rule of thumb when analysing a Panda site is to stop thinking like a search engine and ask yourself “is this site easy to use?” before getting to the technical stuff. Often this question alone will help you identify all of the elements that are causing problems in the eyes of Google.

Finally we had to implement the changes.

We removed the splash page, canonicalised the duplicate pages (using the rel=canonical tag), removed any duplicate pages that served no purpose, simplified the social sharing links to a single code block, and tidied up the on-page content. Within a week, the rankings began rising, and within a month they had recovered to their pre-Panda 20 positions.
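For reference, this is roughly what canonicalising a duplicate page looks like; it is a minimal sketch, and the URLs are hypothetical examples rather than the client’s actual pages:

```html
<!-- Placed in the <head> of the duplicate "offices" version of a page,
     telling Google that the "homes" version is the preferred copy.
     Example URLs only; not the client's real site structure. -->
<link rel="canonical" href="https://www.example.com/homes/storage-solutions/" />
```

With the tag in place, the duplicate still exists for visitors, but Google consolidates its ranking signals onto the canonical version instead of treating the two copies as duplicate content.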

We also recommended that the client build a new site. The changes detailed above served as a short-term fix for the main issues, but even with them in place the site did not provide a good user experience. It was very old, and some of the problems, such as the duplicate pages, were baked into the site’s architecture, which made any further Panda-compliant work very difficult. A few months later, their new site is live and, as predicted, it has helped us push their rankings up even further and capture more traffic. As a side benefit, the new website also has a much better conversion rate, so not only have we driven them more traffic, more of that traffic is converting, which in turn has made their PPC spend more profitable.

Overall this was a great success story. Although the initial shock of a rankings drop was far from ideal, it forced both us and the client into action, and ultimately their business is now in a stronger position across all online channels.
