
27.5.14

Panda 4.0 Analysis – Another Case Study


So, despite all evidence to the contrary (no major updates, plenty of hammocks in their offices), it turns out Google hasn’t actually been having a massive collective nap for the last few months.

[Image: Mozcast]

For those unfamiliar with Mozcast, it monitors several thousand search terms to measure how much ranking movement they see, with a view to forming a kind of early warning system for major Google algorithm updates. It was clear to anyone paying attention on Monday that something was coming. Then, on Tuesday, Google’s Head of Webspam, Matt Cutts, confirmed on Twitter that something was up.

Google is rolling out our Panda 4.0 update starting today.

— Matt Cutts (@mattcutts) May 20, 2014

This surprised everyone, because the last tweet on Panda had stated that they weren’t going to be tweeting about Panda updates anymore.

What’s a Panda?

Panda is Google’s on-page spam detection algorithm. It was introduced to stop junk content outranking good-quality writing. It does not specifically look at backlink profiles, although if a site is flagged by the Panda algorithm, that could have a secondary effect on the domains it links out to.

This month, eBay was the most notable big loser, and plenty of analysis has been done around the drop they saw. eBay are rather an exceptional case, though: they sell pretty much everything on the planet, and the sheer volume of URLs and the insanely high link and brand authority they have meant they tended to show up in just about every commercial-themed search, ever, regardless of relevancy. It’s incredibly rare to do some competitive landscape analysis in the ecommerce space and not see eBay (and Amazon, who interestingly don’t appear to be affected) appearing prominently.

Since I don’t really have anything to add to the conversation about eBay, I thought I’d take a look at a much smaller, more niche domain that we witnessed being affected – the sort of business a mid/large search agency might deal with day to day.

[Image: Searchmetrics data]

Source: Searchmetrics

DirectFerries are a price aggregator in the ferry space. Google has always had something of an interesting history with aggregators – back in 2011, they purchased BeatThatQuote, only to be forced into penalising their own property when it was pointed out how many of its rankings were supported by poor-quality backlinks.

Since aggregators usually want to appear for a really wide variety of terms (as opposed to, say, a ferry operator, who will only want to appear for the routes they service), they tend to try and scale out their content as aggressively as possible. That often means lots of slightly altered template copy or auto-generated results – exactly the sort of low-quality content Panda might look to flag.

How deep is that drop?

One interesting note is that, as sharp as that drop is from their previous high, this is not a complete, sitewide annihilation of their search visibility. Panda 4.0 doesn’t tend to characterise itself by the sharp, near-total loss of visibility that many sites saw with last year’s Penguin algorithm update. In fact, DirectFerries saw an increase on nearly as many keywords as they lost out on.

[Image: algorithm update]

However, amongst the visibility that was lost, there are some pretty major keywords in the list.

[Image: keyword list]

Two themes of lost visibility seem to emerge here:

  • Red Funnel, P&O, North Sea Ferries and Stranraer Ferry are all branded terms for ferry operators.
  • The rest of the major keywords are searches for routes.

My first thought was that part of the Panda algorithm might be about brand/entity recognition – we know Google has been moving away from keyword matching for search results since the Hummingbird update last year. Does Google now know that P&O is a brand, and has it moved the aggregators out of that space?

[Image: SERPs]

No.

Aferry, a competing aggregator, has remained visible in that branded space just fine. Google’s problem seems to be with DirectFerries’ content specifically, not with aggregator content in general.

So what could be the problem?

[Image: duplicate content review]

So, for starters, duplicated content seems to be a big issue. That is common on aggregators, where the scale of content needed means that auto-generating pages, as opposed to handwriting them, is economically tempting. In fact, on many pages, nearly the entire page is duplicated from other areas of the site.
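If you want a rough sense of how much of your own template copy is shared between pages, one quick way is to treat each page’s body copy as a set of word shingles and score the overlap. The Python sketch below is just an illustration of that idea, not a description of how Panda works; the URLs, the example copy and the 50% threshold are all made up for the example.

```python
# Rough duplicate-content check: treat each page's body copy as a set of
# word 5-grams ("shingles") and report the Jaccard overlap between pairs.
# The URLs and copy below are invented purely for illustration.
from itertools import combinations

def shingles(text, n=5):
    """Lowercased word n-grams for a block of body copy."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap(a, b):
    """Jaccard similarity between two shingle sets (0 = unique, 1 = identical)."""
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

pages = {
    "/ferries/route-a": "Compare cheap ferry crossings and book tickets online with all major operators on this route",
    "/ferries/route-b": "Compare cheap ferry crossings and book tickets online with all major operators on this route",
}

for (url_a, text_a), (url_b, text_b) in combinations(pages.items(), 2):
    score = overlap(text_a, text_b)
    if score > 0.5:  # arbitrary threshold for "mostly boilerplate"
        print(f"{url_a} vs {url_b}: {score:.0%} shared shingles")
```

Run across a full crawl, a report like this makes it very obvious which templates are generating near-identical pages at scale.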

[Image: duplicate content]

Every highlighted word appears on multiple pages…

[Image: SERPs]

Yeah.

As Rishi noted in his (well worth a read) overview of eBay’s drop, many of eBay’s losses occurred in a folder named ‘/bhp/’. Every word on these pages also appears on the individual listings results. There are no original words on this page, either.

[Image: eBay results]

The Takeaway…

There is no new advice here; duplicate content has always been a no-no. But it looks like Google is getting more serious about killing off boilerplate copy and category pages that offer no unique value.

If you’re reliant on content-less product category pages and filters for traffic, you should consider how you can add original content that genuinely enhances the user’s experience. What are the FAQs about the products? Are there any specific reviews? Canonicalisation and pagination handling are also important, to prevent you creating a whole new duplicate content issue.
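As a quick starting point for auditing that, the sketch below fetches a handful of category URLs and flags any that are missing a rel="canonical" link or rel="prev"/"next" pagination hints. It assumes the requests and BeautifulSoup libraries are available, and the URLs are placeholders standing in for your own page list.

```python
# Minimal canonical/pagination audit for a list of category URLs.
# Assumes `requests` and `beautifulsoup4` are installed; the URLs below
# are placeholders for your own crawl list.
import requests
from bs4 import BeautifulSoup

urls = [
    "https://www.example.com/ferries/dover-calais/",
    "https://www.example.com/ferries/dover-calais/?page=2",
]

for url in urls:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    canonical = soup.find("link", rel="canonical")
    prev_link = soup.find("link", rel="prev")
    next_link = soup.find("link", rel="next")

    print(url)
    print("  canonical :", canonical.get("href") if canonical else "MISSING")
    print("  pagination:", "rel=prev/next declared" if (prev_link or next_link) else "none declared")
```

It’s deliberately crude – a proper crawl tool will do this and more – but it’s enough to spot whole sections of filtered or paginated URLs that are presenting duplicate copy to Google with nothing to consolidate them.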

Mike is one of our SEO analysts, with over five years’ experience creating complex analyses of user behaviour to drive informed search campaigns. Gets perhaps slightly too excited about pivot tables.


