A few weeks ago Google launched Penguin 4.0, the latest version of the algorithm that evaluates backlinks. The rollout caused massive fluctuations in Google rankings: some sites were hit hard, while others showed astonishing recoveries. I experienced this myself, so let me share the story with you.
Let's first quickly cover the background of this story. In the early days of SEO, a site's reputation was largely established by evaluating how it related to other sites on the web. Initially this was done by simply counting links: a site with many backlinks would rank much higher in Google than an otherwise similar site. This invited spammy techniques, where people bought links by the thousands or built dozens of small sites solely to link to their main site. At some stage it became clear that such links did not indicate quality at all, so in 2012 Google rolled out the so-called Penguin algorithm to fight these practices. The result was a massacre among sites that participated in them. In later years Google fine-tuned the algorithm, also targeting less obvious practices like link schemes and press-release sites that allowed backlinks (often their main purpose). Even sites that had merely exchanged a few links with unrelated sites could be affected, though usually not as badly as outright malicious ones.
Penguin wasn't real-time
Affected sites could recover (more on that later), but the problem was that the algorithm wasn't real-time: if your site was hit and the next algorithm update only came a year later, your rankings stayed down until that update. Sites hit in 2014 or later had a particularly big problem, because the next update took a very long time: it only arrived in October 2016. That update also made the algorithm real-time, so bad links now cause fairly swift penalties, and recovery actions should produce similar rises in ranking within weeks or months.
One of my sites was a really popular Dutch site with around 50,000 monthly visitors, part blog and part webshop. At the time I was experimenting a lot with SEO, and mostly successfully: the site reached top rankings for all of the keywords I aimed for. Most of the techniques I used were perfectly legitimate, like writing great content and metadata, technical SEO and nice images. However, I also worked on my links for some time. I never bought any, but I sometimes asked for link exchanges and also listed my site in free business directories.
In October 2014 I also rolled out a redesign of the site, making it responsive and moving a lot of content from the (rather crowded) homepage to dedicated pages: perfectly legitimate actions. However, a few months later a serious issue emerged:
The timing, right after my redesign, made me think I had done something wrong there, but I had no clue what. I also started to suspect that the quality of my links might have something to do with it, so I looked into what I could do about that.
How to fix your link-profile
The general procedure would be the following:
- Collect all your links. This is not easy: Search Console shows you a lot, but it is in no way complete, so you will probably need link-research tools to find them all.
- Evaluate each link: you need to separate the bad ones from the good ones, and this is the really hard part.
- Remove bad links where possible. If you control them, remove any bad links you created yourself. Many links, however, live on sites you don't control. For those, you can list them in a plain text file and upload it to Google Search Console using the disavow tool, which tells Google to stop taking those links into account. It is a risky procedure, and when you access the tool Google asks you repeatedly whether you really, really want to use it.
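For reference, the disavow tool expects a plain UTF-8 text file with one URL or domain per line; lines starting with `#` are comments. A minimal example (the domains below are made up):

```
# Links I asked to have removed, without success
http://spammydirectory.example/listing?id=123
http://linkexchange.example/partners.html

# Disavow every link from an entire domain
domain:linkfarm.example
```

Using `domain:` is the blunter instrument: it covers every page on that site, which is usually what you want for obvious link farms.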
Now this is exactly what I did. For the link-evaluation part I used the Cognitive SEO tool. It builds a link profile of all the links you have, rating each as Good, Average or Bad. It does so based on the reputation of the linking sites, but also on the anchor text used. As an example, a genuine forum user who wants to send his friends to your great site simply posts the URL as a link (like http://example.com/url-a-or-be), whereas a bad SEO person crafts a nice anchor text with the exact keywords in it, thinking Google will like that. A high percentage of such keyword-rich anchors among your backlinks therefore looks unnatural. The Cognitive SEO tool builds a general profile like this:
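To make the anchor-text idea concrete, here is a small sketch (not how Cognitive SEO works internally, just an illustration with made-up sample data) that estimates what share of backlinks use keyword-rich anchor text instead of a raw URL:

```python
def is_raw_url_anchor(anchor: str) -> bool:
    """True if the anchor text is just the URL itself."""
    return anchor.strip().lower().startswith(("http://", "https://", "www."))

def keyword_anchor_share(backlinks: list[dict]) -> float:
    """Fraction of backlinks whose anchor text is not a raw URL."""
    if not backlinks:
        return 0.0
    keyword_rich = sum(1 for link in backlinks
                       if not is_raw_url_anchor(link["anchor"]))
    return keyword_rich / len(backlinks)

# Made-up sample: (source page, anchor text) pairs.
sample = [
    {"source": "forum.example.org/thread/1", "anchor": "http://example.com/url-a"},
    {"source": "blog.example.net/post",      "anchor": "cheap blue widgets"},
    {"source": "dir.example.info/listing",   "anchor": "buy widgets online"},
    {"source": "news.example.com/article",   "anchor": "https://example.com/"},
]

print(f"{keyword_anchor_share(sample):.0%} of anchors are keyword-rich")
# → 50% of anchors are keyword-rich
```

A real tool weighs many more signals (site reputation, link velocity, topical relevance), but an unusually high keyword-rich share is one of the simplest red flags.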
Besides the automated detection, the tool lets you manually go over the list in a convenient way. As the person most likely responsible for the links, you probably know them best and can correct any mistakes in the automated classification.
Massive recovery ;)
After evaluating these links, I had a collection of probably-bad links that I submitted to the Google disavow tool. And then the waiting game started: I really needed a Penguin update to happen, but the folks at Google took a really long time. In early October 2016 it finally came, and the algorithm became real-time as well. Initially nothing happened, probably because Google.nl does not adapt as quickly as Google.com, but then things got very nice:
My daily number of visitors increased by over 150% in two weeks' time! Of course, I will have to see whether this holds, but I am pretty confident it will.
Maybe your site shows a similar recovery. Or maybe you see a big drop in numbers: then you know what the issue could be. And if you think you had a bad link profile but the recovery did not come, consider this: maybe your good rankings once rested on (almost) 100% bad links. Even after cleaning them up you have no links left, and you still need good links to recover.