1. Next Generation of Penguin – Penguin 2.0
This update targets black hat web spam more aggressively. Penguin 2.0, the name Google uses internally for the next-generation Penguin, will be much more comprehensive than Penguin 1.0: it will go deeper and have a larger impact than the original.
2. Advertorials That Violate Quality Guidelines
Many advertorials (a.k.a. native advertising) violate Google's quality guidelines. More importantly, they should not flow PageRank.
Google plans to enforce its rules much more strictly against paid links and advertising disguised as "advertorials". Cutts did clarify that there is nothing wrong with advertorials as such; Google simply doesn't want them abused for PageRank and linking purposes. If you use advertorials, Cutts suggested they be clearly marked so it is obvious they are paid advertising.
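In practice, one common way to keep an advertorial from flowing PageRank while keeping the paid placement obvious is to label the content and add rel="nofollow" to the paid link. The snippet below is an illustrative sketch, not markup quoted from Cutts; the class name, label text, and URL are assumptions.

```html
<!-- Visible disclosure makes the paid placement obvious to readers -->
<div class="advertorial">
  <p><strong>Sponsored content</strong></p>
  <!-- rel="nofollow" asks search engines not to pass PageRank through this paid link -->
  <p>Read more at
    <a href="https://example.com/product" rel="nofollow">Example Product</a>.
  </p>
</div>
```

The visible label addresses the "clearly marked" requirement for readers, while the rel attribute addresses the PageRank concern for crawlers; doing only one of the two is what tends to draw penalties.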
3. “Payday Loans” in .co.uk
Cutts mentioned that this is a problematic search query, and there are others like it, so Google is tackling it in a couple of different ways. Those who play in that space, however, are out of luck: Cutts isn't revealing exactly how Google will deal with it, just that it will be happening.
He said that they are targeting specific areas (another example he included was porn queries) that have traditionally been more spammy.
4. Devaluing Upstream Linking
Again, Cutts isn't going into details here, but Google is working on making link buying less effective and has a couple of ideas for detailed link analysis to tackle the issue.
5. Hacked Sites
They want to roll out the next generation of hacked-site detection and improve how they notify webmasters. They would like to point webmasters to more specific information, such as whether they are dealing with malware or a hacked site, so that site owners can clean it up.
6. Authority Sites Will Rank Higher
If Google's algorithms determine that you or your site is an authority in a particular area, Google wants to make sure that site ranks a little higher than other sites.
7. Softening of Panda
Google is looking for additional signals for sites in the "gray area" or "border zone", signals that suggest a site truly is high quality. This should help sites that were previously impacted by Panda.
8. Changes to Cluster of Results From the Same Site
If you’re doing deep searches in Google, and going back 5, 6 or more results pages deep, you can see the same site popping up with a cluster of results on those deep pages.
Google is looking into a change whereby, once you have seen a cluster of results from one site, you become less likely to see further results from that site as you go deeper. Cutts mentioned that this change came specifically from user feedback.