Google Officially Updates Penguin Algorithm – Here’s What You Should Know

September 30th, 2016 | SEO

Last week Google officially updated its Penguin algorithm again. First released in April of 2012, the algorithm code-named Penguin has the primary function of decreasing the search engine rankings of websites found to violate Google’s Webmaster Guidelines.

Specifically, Penguin focused on penalizing excessive use of keyword-rich links that were typically built unnaturally in bulk, or sites owned by the marketers themselves to maliciously bolster rankings above competitors. These techniques are now affectionately called Black Hat SEO, or spamdexing: in layman’s terms, the deliberate manipulation of search engine indexes.

The History of Penguin

Back on April 24th, 2012, Google shook the SEO landscape by introducing the first revision of the Penguin algorithm. This was a critical time when SEO was much simpler than it is today, and a large number of links pointed at a target website would bolster its rankings and funnel traffic past competitors.

This initial change impacted 3.1% of queries that were manipulated by spammy tactics. In contrast to the updates that followed, the initial launch of Penguin was instrumental in setting the tone for what was yet to come.

The algorithm was updated a few times over the years. A data refresh in October of 2012, sometimes called Penguin #3, affected approximately 0.3% of search engine queries, and Penguin 2.0, released in May of 2013, affected 2.3% of queries.

Penguin 3.0 was speculatively released on October 18th, 2014 – with Google confirming on the 21st of that month that the Penguin update was only a “refresh” without adding new signals.

These long lulls between Penguin updates left many business owners high and dry when trying to resolve problems with their domains, with many waiting upwards of two years in hopes of returning to their former glory.

The algorithmic penalty applied in each milestone update would penalize entire sites for utilizing spammy tactics. Even after performing immediate cleanups with genuine intent to correct the problem, webmasters remained penalized until Google refreshed the algorithm’s data.

Google released Penguin 4.0 in late September 2016, and it introduces a number of new features and capabilities: most importantly, it now operates in real time and is one of Google’s core ranking algorithms.



What’s Changed?

Penguin Now Operates in Real Time

Yes, that’s right – this new update to Penguin operates in real time, meaning that SEOs and marketers don’t have to wait for refreshes and updates to see the results of a cleanup to a backlink profile.

Because data is refreshed so much faster, corrections to your backlink profile become visible much sooner. Recovery isn’t instant just yet, though: it’s still necessary to wait for Google to recrawl and reindex the pages that link back to you. Page rankings are re-evaluated each time Google recrawls a page, so the impact of any incoming spam and links is adjusted almost immediately afterwards.

With the algorithm now operating in real time, this probably spells the end of public announcements from Google about updates, so don’t expect any more comment from Google’s head honchos on future revamps or refreshes.

Site owners will be able to see results much more quickly in Google’s search results as links are disavowed and negated. You’ll still have to wait for pages to be recrawled, mind you, but the end of scheduled refreshes is what now separates Penguin from Panda, which still operates on a long cycle.

What Has Changed in How Google Penalizes Bad Links?

This real-time behavior can cut both ways for sites with bad link issues. Penguin has been made part of the core Google ranking algorithm, along with Panda, meaning sites may discover a potential new Penguin issue just as quickly.

Google’s Gary Illyes says that the Penguin update “managed to devalue spam instead of demoting.”

This means that spammy link tactics will be devalued on the pages where they are applied, instead of penalizing a brand as a whole, a practice that has sparked heated lawsuits in the past and, in some cases, driven businesses into bankruptcy.

Illyes also noted publicly that Google still recommends using the disavow file to recover from Penguin issues, although you may not technically need to if you instead focus on correcting the problem at its origin.

“If you still see the crap, you can help us help you by using it,” Illyes said on a public Facebook post.  “There’s less need… Also, manual actions are still there so if we see that someone is systematically trying to spam, the manual actions team might take harsher action against the site.”
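For reference, the disavow file Illyes refers to is a plain UTF-8 text file uploaded through Search Console’s Disavow Links tool: one URL or `domain:` entry per line, with `#` marking comments. A minimal sketch (the domains and URLs below are placeholders, not real offenders):

```text
# Spammy directory that ignored our removal requests
domain:spammy-directory.example

# Individual paid-link pages we could not get taken down
http://blog.example.net/cheap-links-post/
http://forum.example.org/thread?id=123
```

A `domain:` line disavows every link from that domain, while a bare URL disavows only links from that specific page.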

As a result, many SEOs and webmasters interpret this to mean that Penguin no longer penalizes, to an extent: it now seems to simply ignore and devalue spammed links instead, adjusting rankings accordingly.

Core Ranking

Penguin is now part of Google’s core ranking algorithm, joining Panda as one of the company’s strongest spam-fighting tools. Penguin is now “baked into” Google’s larger search algorithm, which encompasses over 200 signals.

The big change here is that Penguin is now part of the main search algorithm, rather than just a filter refreshed only during core algorithm updates after the extended waits outlined above in Penguin’s history.

Penguin will also be implemented in all languages worldwide, so searchers around the globe will begin to notice these changes. This will, of course, take a couple of weeks, so if you haven’t noticed updates or changes yet – they’re coming.

More “Granular”

Google themselves call this Penguin update “more granular” than its predecessor. But what does this actually mean for webmasters and marketers?

Previously, Penguin was usually applied as a site-wide penalty that suppressed traffic until the low-quality links pointing at the penalized site were corrected, and even then the corrections were recognized only once a new Penguin “refresh” had rolled out.

So when asked what “granular” meant, Google responded by saying: “It means it affects finer granularity than sites. It does not mean it only affects pages.”

The takeaway from these insights is that Google will focus on suppressing results for offending pages instead of entire websites. Sections of a website will experience reduced traffic and rankings rather than the whole site.

A good interpretation of this explanation is that Penguin may impact specific pages or sections of a website, while other pages remain untouched.

The good news is that the positive outcome of corrections and improvements to these areas can be realized in months, rather than years.

With the calculations happening in real time, marketers can now correct, measure, improve and repeat. This change ultimately puts site owners back in the driver’s seat, letting them amend offenses against Google’s guidelines in a timely manner.

What Does the Future of Penguin Look Like?

This could represent the beginning of a more forgiving, collaborative Google.

Penalizing actions, and not entire brands, means supporting site owners who may have unknowingly fallen victim to poor marketing tactics.

RankBrain, a machine-learning artificial intelligence technology Google uses to help process its search engine results, could be weighing these new link factors now that Penguin is part of the core algorithm. If RankBrain is involved, we may see Google learn to counter spam more effectively over time, as machines learn these patterns themselves rather than relying on humans to program detailed rules.

It’s business-as-usual for SEO best practices. Continue to avoid heavy use of spammy keywords in links like “Buy Red Shoes,” and continue to focus on branded links and naturally occurring contextual links.
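As an illustration of the difference (the URLs below are hypothetical), compare an exact-match, keyword-rich anchor with the safer branded and natural alternatives:

```html
<!-- Risky: exact-match, keyword-rich anchor text built in bulk -->
<a href="https://example-shoes.com/red-shoes">Buy Red Shoes</a>

<!-- Safer: branded anchor text -->
<a href="https://example-shoes.com">Example Shoes</a>

<!-- Safer: naturally occurring contextual link -->
We compared several retailers, including
<a href="https://example-shoes.com/red-shoes">this pair from Example Shoes</a>.
```

The first pattern, repeated across many low-quality sites, is exactly the kind of signal Penguin now devalues; the latter two read as natural editorial links.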
