How Algorithms Work
Google's search algorithms rely on signals to help users find exactly what they are looking for.
The practices these algorithms reward are described in the Google Webmaster Guidelines, which aim to help sites earn top rankings.
Over 200 signals are weighed simultaneously to improve the relevance of the results when you search.
Some of the signals used to surface the best results include:
- how many times a certain keyword appears on a page
- the region the content originated from
- how recent the information is
- whether the page links to other, similar content the user can follow if the page doesn't have exactly what they are looking for
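To make the idea of combining signals concrete, here is a toy sketch in Python. The signals, weights, and cap are invented for illustration; Google's actual formula is unpublished.

```python
from dataclasses import dataclass

# Toy model of combining ranking signals into a single score.
# The weights and signal definitions below are illustrative
# assumptions, not Google's actual (unpublished) formula.

@dataclass
class Page:
    keyword_hits: int    # times the query keyword appears on the page
    age_days: int        # how old the content is
    related_links: int   # links to similar content on the page

def score(page: Page) -> float:
    keyword_signal = min(page.keyword_hits, 10)          # cap discourages keyword stuffing
    freshness_signal = 1.0 / (1.0 + page.age_days / 30)  # decays as content ages
    link_signal = min(page.related_links, 5)
    return 2.0 * keyword_signal + 5.0 * freshness_signal + 1.0 * link_signal

# Rank a handful of candidate pages by their combined score.
pages = [Page(3, 10, 2), Page(50, 400, 0), Page(4, 2, 3)]
ranked = sorted(pages, key=score, reverse=True)
```

Note the cap on the keyword signal: repeating a keyword fifty times scores no better than ten, which loosely mirrors why keyword stuffing stopped paying off.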
Penguin has been included as part of the core algorithm.
Penguin was originally designed to be a filter on search results.
It was created to identify sites that spammed their way to the top of the search results even though they didn't necessarily have the content users were searching for.
Essentially, it was a spam filter.
It evaluated the quality of the inbound links pointing to a site, determining whether each link was a genuine resource, merely less relevant, or outright spam.
Google encourages site owners to continually monitor the links pointing to their sites, since link quality is weighed by multiple Google algorithms and significantly impacts a site's rankings.
Google offers a tool for monitoring inbound links: Google Search Console.
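As a rough illustration of the quality/relevance/spam distinction described above, the toy heuristic below buckets inbound links into those three tiers. The thresholds and inputs are invented for this sketch; Google's real signals are not public.

```python
# Toy classifier for inbound links, loosely in the spirit of Penguin's
# quality / less-relevant / spam distinction. All heuristics here are
# invented assumptions, not Google's actual signals.

def classify_link(source_authority: float, topical_relevance: float,
                  exact_match_anchor: bool) -> str:
    """Bucket an inbound link as 'quality', 'less relevant', or 'spam'."""
    if source_authority < 0.2 and exact_match_anchor:
        # Low-authority pages pushing exact-match anchor text look manipulative.
        return "spam"
    if topical_relevance < 0.3:
        return "less relevant"
    return "quality"

links = [
    (0.9, 0.8, False),  # reputable source, on-topic
    (0.1, 0.9, True),   # low authority with exact-match anchor
    (0.7, 0.1, False),  # reputable source but off-topic
]
labels = [classify_link(*link) for link in links]
```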
There are also third-party tools available online for evaluating the quality of an inbound link.
These tools can be helpful for analyzing a site's rankings and the reasons those rankings change.
However, the reasons behind small ranking shifts are expected to become harder to pinpoint now that the algorithm updates continuously.
Once a site spamming the search results was identified, it would be penalized.
Penguin originally ran only as a periodic update.
As a result, offending sites had long stretches between updates in which to spam, leading millions of users to pages that weren't relevant to their searches.
A side effect was that penalized sites often sat through a waiting period even after making the changes needed to lift the penalty.
They had to wait until the next Penguin update before being restored to regular status.
Before going real time, Penguin had last been updated in October 2014.
That means penalized sites waited nearly two years, regardless of how long ago they cleaned up, for the penalties imposed upon them to be lifted.
Penguin's release history:
- Penguin 1.0 on April 24, 2012
- Penguin 1.1 on May 26, 2012
- Penguin 1.2 on October 5, 2012
- Penguin 2.0 on May 22, 2013
- Penguin 2.1 on October 4, 2013
- Penguin 3.0 on October 17, 2014
- Penguin 4.0 & real-time on September 23, 2016
Google did not release the number of queries impacted by the newest update, since that number will be constantly changing.
For previous updates, it had released figures on the percentage and number of queries affected.
The Update: Penguin is now real time
The latest change to Penguin is that it now operates in real time.
This means sites will no longer be stuck in their penalties between refreshes.
It also means this was the last Penguin update Google will announce, since the data now refreshes continuously.
When Google recrawls and reindexes a site's pages, the site can be freed from a penalty, or incur one, in real time.
Sites that clean up can therefore recover almost instantaneously, restoring proper rankings as quickly as possible.
Many of Google's other algorithms already work this way: once changes are made, the information updates automatically, and any penalties are applied or lifted immediately.
The Penguin update rolled out in all languages, so all sites are affected by the latest changes.
It was a long time coming, but users seem very happy that Google is keeping up with the times and setting the bar high for other search providers.
Another important factor to note is that previously, when a site was penalized for spamming, the penalty affected the entire site and its rankings rather than just the page hosting the spam content, creating long-lasting problems for some site owners.
As Google put it: "Penguin is now more granular.
Penguin now devalues spam by adjusting ranking based on spam signals, rather than affecting the ranking of the whole site."
In other words, the changes no longer impact the entire site; they affect the offending page, and possibly certain sections or groups of pages on a site.
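The granular approach can be illustrated with a small sketch. This is an invented toy model, not Google's implementation: it discounts each page's score by that page's own spam signal, leaving the rest of the site untouched.

```python
# Toy illustration of "granular" devaluation: spam signals discount the
# affected page's score instead of penalizing every page on the site.
# Page names, scores, and signal values are invented for illustration.

def devalue(page_score: float, spam_signal: float) -> float:
    """Scale a single page's score down by its own spam signal (0..1)."""
    return page_score * (1.0 - spam_signal)

# One site with three pages; only one page carries spammy inbound links.
site_pages = {"home": 8.0, "spammy-landing": 9.0, "blog": 6.0}
spam_signals = {"home": 0.0, "spammy-landing": 0.8, "blog": 0.0}

adjusted = {url: devalue(s, spam_signals[url]) for url, s in site_pages.items()}
# Only "spammy-landing" is discounted; the other pages keep their scores.
```

Under the old site-wide model, every page would have been scaled down together; here the clean pages are unaffected.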
In previous versions of Penguin, Google confirmed each update.
Since the updates are now constant, Google will no longer confirm them.
The update is not fully live yet, and Google has not given a specific date for when it will be.
Observers speculate it may take a few more weeks for the changes to roll out in their entirety.