Google Penguin is a new algorithm from the Google search engine. What is Google Penguin?


We all know very well, dear friends, how carefully search engines tune their algorithms, filters, and so on, in order to give ordinary Internet users exactly what they are looking for. At the same time, those algorithms clean the network of spam and low-quality resources that, by hook or by crook, somehow ended up in the TOP. "It won't work like that," the minds at the search giants seem to think, and once again the folks at Google have acted on that thought.

So for the second week now, the webmaster corner of the network has been buzzing, and the majority are indignant: "Google rolled out something, and this hellish mixture cut our positions, which in turn reduced traffic." Indeed, while analyzing the competition in the niches that interest me, I noticed serious changes: traffic on many competitive resources was cut by 10, 20, 50 or even more percent. You don't have to look far; check some SEO blogs, where it is unusual to see traffic of 150-200 users per day.

So, what did Google come up with...

On April 20, 2012, a message from Google's developers appeared on the Internet, approximately as follows:

"In the coming days we are launching an important algorithm change targeting webspam. The changes will reduce the rankings of sites that we believe violate Google's site quality requirements."

On the night of April 24-25, the new Google algorithm was rolled out: Google Penguin. Google's love for animals generated a lot of buzz. On the Serch forum alone, several threads totaling a huge number of pages (more than 500) sprang up discussing the new Penguin algorithm. As always, the dissatisfied far outnumbered the satisfied, because the satisfied sit quietly and don't show off their gains, while everyone else's achievements were gobbled up by the Google Penguin with a cheerful "Hurrah".

Let's first get acquainted with the very basic requirements for website quality that Google puts forward:

  1. Do not use hidden text or hidden links. Google, and not only Google, has long marched under the banner of putting people first. Anything a person cannot see is understood by search engines as an attempt to influence the search results; this counts as manipulation and is suppressed through pessimization or some other "damage". I remember that at one time hidden text was very fashionable, of course, because it worked: black text on a black background plus a certain amount of optimization, and you would land on a page and wonder where the content the search engine was showing had gone.
  2. Do not use cloaking or sneaky redirects. There is no need to serve one version of the content to the search robot and another to the user; the content should be the same for everyone. As for redirects, Google decided to disappoint those who like to make money on redirects or sell mobile traffic.
  3. Do not send automated queries to Google.
  4. Don't overload your pages with keywords. Yandex has long enforced this principle, and its stance against over-optimization clearly killed the work of adherents of stuffing sites with keys. If earlier you could stuff articles with keywords and enjoy traffic from Google, now, as you can see, it is not so simple. But don't forget about user factors, the so-called behavioral factors. If they are at a good level, then even slight over-optimization is not a problem, because behavioral factors have always been, are, and most likely will remain a priority. It's just that everything is somewhat more complicated here: think for yourself what you need to offer, and in what volume, for those behavioral factors to be at their best. This means the content really must be high-level and interesting, not a light rewrite of a competing site or niche leader.
  5. Do not create pages, domains or subdomains that substantially duplicate the content of your main site or any other site. Under this point Google bundles its views on affiliates, site networks, doorways, copy-paste and low-quality rewriting.
  6. Don't create malicious pages, such as phishing pages or pages containing a virus, Trojan horse or other malicious software. This point hardly needs explaining: in a word, fight viruses.
  7. Don't create doorways or other pages designed just for search engines.
  8. If your site works with affiliate programs, then make sure that it provides value to the network and users. Provide it with unique and relevant content that will attract users to you in the first place.

Google identified these 8 principles as the main ones. But there are also 4 more that were especially noted by them:

  1. Create site pages primarily for users, not search engines. Do not use cloaking or other schemes; be completely transparent.
  2. Don't try to use tricks to increase your site's ranking in search engines.
  3. Do not participate in link schemes designed to increase your site's ranking in search engines or Google PageRank. In particular, avoid links that look like spam, link dumps, and bad neighbors on the server (I draw your attention to analyzing the neighboring sites if yours is hosted on ordinary shared hosting).
  4. Do not use unauthorized software to automatically query Google. The developers specifically call out tools such as WebPosition Gold™.

All of this, however, was already familiar to anyone who had looked into the issue before.

What surprised me in the work of the new Google Penguin algorithm was its strict enforcement of the rule about the "advertising load" of the first page. Remember, it was said that the home page should not be overloaded with advertising. Google Penguin now clearly enforces this: it cut traffic for sites with several large advertising blocks on the main page, even when those blocks were Google's own AdSense. 🙂

Observations on the work of Google Penguin

Now I would like to list a number of my own observations. I understand they are approximate, represent my personal data, and the figures should be taken as relative, but they have a right to exist. I analyzed the performance of 20 sites, both mine and my partners'. The analysis took me 3 days, and I assessed many indicators. As you understand, the sites were promoted using different schemes and algorithms and had completely different indicators, which made it possible to draw a number of conclusions.

1. Exact-match anchors. If earlier, to sit pretty in Google, you needed a lot of exact-match entries, now with Google Penguin everything is exactly the other way around. This may be precisely the fight against webspam. Yandex has long favored diluted anchors, and now it's Google's turn.

  • pages whose external links used 100% exact-match anchors: a drawdown of 75-90%, an average drop of approximately 38 positions;
  • pages with 50% exact-match anchors: a drawdown of 15-20%, an average drop of approximately 9 positions;
  • pages with less than 25% exact-match anchors: growth of 6-10% was noticed, an average rise of 30 positions.

From these data I drew a conclusion: dilute your anchors, and dilute them as creatively and deeply as possible.
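The dilution rule above can be checked mechanically. Below is a minimal Python sketch, my own illustration rather than any official tool; the anchor list and key phrase are made-up examples:

```python
# Sketch: estimate what share of a page's backlink anchors are exact-match.
# The data below is invented for illustration, not taken from a real profile.

def exact_match_share(anchors, key_phrase):
    """Return the fraction of anchors that are exact occurrences of the key phrase."""
    if not anchors:
        return 0.0
    exact = sum(1 for a in anchors if a.strip().lower() == key_phrase.lower())
    return exact / len(anchors)

anchors = [
    "gogetlinks",            # exact entry
    "gogetlinks",            # exact entry
    "review of gogetlinks",  # diluted entry
    "here",                  # generic anchor
]
share = exact_match_share(anchors, "gogetlinks")
print(f"exact-match share: {share:.0%}")  # prints "exact-match share: 50%"
```

By the numbers above, a profile scoring near 100% would be the risky kind, while one under 25% matched the pages that actually grew.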

A striking example is the query "gogetlinks" on this blog. Exact occurrences greatly outnumber diluted ones, and this is the result:

2. Purchased temporary links. On all the analyzed resources I bought temporary links, either for topping up or for quick results, through different systems: Sape, Webeffector, Seopult and ROOKEE.

The automatic promotion aggregators Webeffector and ROOKEE gave approximately the same results. Practically no drawdown was noticed, only a small one on Webeffector, but it is insignificant and relates more to anchor dilution. Elsewhere there is even growth; what can I say, here is a screenshot of the campaign (clickable):

As for Sape, the picture is completely different: every project that bought links from Sape sank. All the queries promoted through this exchange flew out of the TOP 100, and even collecting statistics on where exactly they landed became so depressing that in the end I didn't bother.

Analyzing the impact of Google Penguin on promotion with Sape, I concluded that Google now perceives the placement of links on this exchange as unnatural.

At this point I started actively removing links. But rather than only giving my own examples, it makes more sense to show vivid ones from our own niche. Let's take my friend Sergey Radkevich's blog, upgoing.ru. The man worked with Sape for several months and was happy with the increase in traffic until Google Penguin came along. Let's look:

It's also worth looking at the search traffic sources chart:

As you can see, Google Penguin has reduced traffic from Google by more than 7 times.

The conclusion on this point: when working with temporary links, you still need filters of some kind and some algorithm for placement and purchasing. Automated services, unlike Sape, work according to certain schemes, and the results, as you see, speak for themselves.

Sites promoted with Seopult even unexpectedly improved their positions. Here I used the Seopult Max algorithm intended for Yandex, but as I can see, it now works for Google too.

3. Over-optimized content. A decline was also noticed here, but not as significant as for the previous parameters: only 10-15% of over-optimized articles lost ground, within about 10 positions.

My conclusion here is that slight over-optimization is not so scary and leaves you some margin, and the lost impressions can be made up by purchasing links.

4. Eternal links. Queries promoted with permanent ("eternal") links of normal, natural appearance only strengthened their rankings. Some high-frequency, highly competitive queries climbed into the TOP 20 without any manipulation, thanks to the noticeable decline of most competitors. Here I once again conclude that my focus on promotion ONLY with eternal links is correct.

1. Take all the information above into account and work both on the content of your site and on its promotion.

2. Check for notifications about spam activity on your resource in Google Webmaster Tools. To do this, go to http://www.google.com/webmasters/ and log in. Then open your site's statistics and go to the messages section (clickable):

If there are messages, you will need to take measures to solve the problems they describe. This may include removing links... Not all yoghurts are equally healthy... 😉

3. Check the site for malware in Google Webmaster Tools:

The solution is the same as in the previous point: identify the programs and files Google marks as malicious, find them on the server, and replace or delete them.

If everything is really bad and you have given up, fill out and sign the petition to Google against Google Penguin. This, of course, guarantees you nothing, but at least a moment of small self-satisfaction and the feeling of "I did everything I could" will definitely come. On the same note, you can use the feedback form to contact the algorithm's developers.

Personally, I lost only a little, since my main emphasis was on genuinely people-oriented sites (SDL) and promotion with eternal links. I shook off the negative effects within 2 days and resumed growing at the same rate. Well, those who liked to do everything quickly and easily are mostly left licking their wounds.

As one of my friends said: "Now the hackwork will show. Big changes await those who love things fast, cheap and easy. Those who charge mere mortals serious money for promotion while doing blatant hack jobs will suffer a fiasco and hear plenty of complaints from clients"...

You can post your thoughts about Google Penguin or the results of the changes in the comments.

Happy and quality changes to everyone!

At school:
- Children, make up a sentence with the expression "almost".
Kolya:
- Our Olya almost became a beauty queen!
Peter:
- On Saturday, my mother and I almost missed the train...
Vovochka:
- This morning Lyokha and I almost died from a hangover, but we still had a little left...

November 7, 2017

In 2012, Google officially launched its "webspam algorithm update", aimed at spam links and link manipulation practices.

This algorithm later became officially known as Google Penguin after a tweet by Matt Cutts, then the head of Google's webspam team. Although Google officially adopted the name Penguin, there was no official comment on where the name came from.

The Panda algorithm got its name from an engineer who worked on it. One theory about the origin of the name Penguin is that it references the Penguin, the villain from DC Comics' Batman.

Before Penguin was introduced, the sheer volume of links played a significant role in how Google's crawlers assessed web pages.

This meant that when pages were ranked by these scores in search results, some low-quality sites and pieces of content appeared in undeservedly high positions.

Why was the Penguin algorithm needed?

Google's war on low-quality search results began with the Panda algorithm; Penguin became an extension of, and addition to, that arsenal.

Penguin was Google's response to the increasingly widespread manipulation of search results (and rankings) through spam links.

The Penguin algorithm processes only links pointing to a site. Google analyzes incoming links and does not consider the site's outgoing link mass.

Initial launch and influence

When first launched in April 2012, the Penguin filter affected more than 3% of search results, according to Google's own estimates.

Penguin 2.0, the fourth update of the algorithm (counting the original release), came out in May 2013 and affected approximately 2.3% of all search queries.

Key changes and updates to the Penguin algorithm

There have been several changes and updates to the Penguin algorithm since its launch in 2012.

Penguin 1.1: May 25, 2012

This was not a change to the algorithm itself but the first refresh of the data inside it. Sites initially hit by Penguin that had since gotten rid of low-quality links saw some improvement in their rankings, while some sites that had escaped the first launch now felt its impact.

Penguin 1.2: October 5, 2012

This was another data refresh. It affected English-language queries as well as international ones.

Penguin 2.0: May 22, 2013

A technically more advanced version of the algorithm that changed the degree to which it influenced search results. Penguin 2.0 affected approximately 2.3% of English-language queries and roughly the same share of queries in other languages.

It was also the first Penguin update to look beyond the home page and top-level pages for evidence of spam links.

Penguin 2.1: October 4, 2013

The only update to the Penguin 2.0 algorithm (version 2.1) was released on October 4 of the same year. It affected about 1% of search queries.

Although Google offered no official explanation for the update, the statistics suggest that it increased the depth of page analysis and added further checks for spam links.

Penguin 3.0: October 17, 2014

This was another data refresh. It allowed many sanctioned sites to regain their positions, while others that had abused spam links but slipped past previous versions of Penguin finally felt its impact.

Google employee Pierre Far confirmed this and noted that the update would need "a few weeks" for full rollout, and that it affected less than 1% of English-language search queries.

Penguin 4.0: September 23, 2016

Almost two years after 3.0, the final change to the algorithm was released. With it, Penguin became part of Google's core search algorithm.

Now, working as part of the core algorithm, Google Penguin 4 evaluates sites and links in real time. This means that changes in your external links are reflected in your site's Google rankings relatively quickly.

The updated Penguin also stopped being strictly punitive: instead of penalizing spam links, it devalues them. This is the opposite of previous versions of Penguin, which punished bad links. Research shows, however, that algorithmic sanctions based on external links are still applied today.

Algorithmic downgrades in Penguin

Shortly after the launch of the Penguin algorithm, webmasters who practiced link manipulation began to notice a decrease in search traffic and search rankings.

Not all downgrades triggered by the Penguin algorithm affected entire sites. Some were partial and touched only certain groups of keywords that had been aggressively spammed with links or "over-optimized".

One site sanctioned by Penguin took 17 months to regain its positions in Google's search results.

Penguin's influence also carries across domains, so changing your domain and redirecting the old one to the new can create even more problems.

Research shows that using 301 or 302 redirects does not cancel the effect of the Penguin algorithm. On the Google webmasters forum, John Mueller confirmed that using a meta refresh redirect from one domain to another can also lead to sanctions.

Recovering from the effects of the Penguin algorithm

The Disavow Links tool has been useful for SEO specialists, and that hasn't changed now that Penguin functions as part of Google's core algorithm.

What to include in a disavowal file

A disavow file contains links that Google should ignore, so that low-quality links will not drag down the site's ranking through the Penguin algorithm. But this also means that if you mistakenly include high-quality links in the file, they will no longer help the site rank higher.

You don't need to include any comments in Disavow unless you want them yourself.

Google doesn't read the comments you add to the file, because it is processed automatically. Some people simply find it convenient to keep notes for later reference, for example the date a group of links was added, or notes about attempts to contact a site's webmaster to remove a link.

After you upload a disavow file, Google sends you a confirmation. But although Google processes the file immediately, it does not disavow the links at that same moment, so you will not be able to restore your site's positions instantly.

There is also no way to determine which links have been disavowed and which have not, since Google will still include both in the external link report available in Google Search Console.

If you previously submitted a disavow file to Google, the new file replaces it rather than being appended to it, so make sure the new file also contains the old links. You can always download a copy of the current file in Google Search Console.
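Because an upload replaces the previous file, it helps to merge old and new entries before re-uploading. Here is a rough Python sketch of that merge; the file contents are invented examples, though the `domain:` prefix and `#` comment syntax follow the disavow file format Google documents:

```python
# Sketch: merge an existing disavow file with newly found spam domains,
# so the re-uploaded file still contains the old entries.
# Comments are dropped during the merge, since Google ignores them anyway.

def merge_disavow(old_lines, new_domains):
    """Combine existing disavow entries with new domain-level entries, without duplicates."""
    entries = [ln.strip() for ln in old_lines
               if ln.strip() and not ln.strip().startswith("#")]
    for d in new_domains:
        entry = f"domain:{d}"
        if entry not in entries:
            entries.append(entry)
    return entries

old = [
    "# spam links found 2016-10-01",   # note to self; Google ignores this line
    "domain:bad-directory.example",
    "http://spam.example/page.html",
]
merged = merge_disavow(old, ["link-farm.example", "bad-directory.example"])
print("\n".join(merged))
```

Running this keeps both old entries and adds only `domain:link-farm.example`, since the other new domain was already disavowed.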

Disavow individual links or entire domains

To recover from Google Penguin, it is recommended to disavow links at the domain level rather than disavowing individual URLs.

With a domain-level entry, Google's crawler needs to visit only one page of that site for all links coming from it to be disavowed.

Disavowing at the domain level also means you don't have to worry about whether links are indexed with or without www.

Detecting your external links

If you suspect your site has been negatively impacted by the Penguin algorithm, you should audit your external links and disavow low-quality or spam links.

Google Search Console lets site owners download a list of external links. Note, however, whether your disavow file includes links marked "nofollow": a link carrying the nofollow attribute has no impact on your site. Keep in mind, though, that the site hosting the link can remove the nofollow attribute at any time without warning.

There are also many third-party tools that show links to your site. But because some webmasters block third-party bots from crawling, such tools may not show all incoming links. That same blocking can also be used by some resources to hide low-quality links from detection.

Monitoring external links is also necessary to protect against "negative SEO" attacks, in which a competitor buys spam links pointing to your site.

Many people blame negative SEO when their site is penalized by Google for low-quality links. But Google claims that since the Penguin updates the search engine recognizes such attacks well.

A survey conducted by Search Engine Journal in September 2017 found that 38% of SEO specialists had never disavowed external links. Reviewing external links and thoroughly researching each linking domain is no easy task.

Requests to remove links

Some site owners demand a fee to remove a link. Google recommends not paying: simply add the bad link to your disavow file and move on to the next one.

Although such removal requests are an effective way to restore a site's positions after link-based sanctions, they are not always mandatory. The Penguin algorithm considers the external link portfolio as a whole, that is, the ratio of high-quality, natural links to spam links.

Some sites even include, in their terms of use, "conditions" for placing external links leading to their resource:

We support only "responsible" placement of external links to our web pages. This means that:

  • If we ask you to remove or change a link to our sites, you will do so promptly.
  • It is your responsibility to ensure that your use of links will not damage our reputation or result in commercial gain.
  • You may not create more than 10 links to our websites or webpages without obtaining permission.
  • The site on which you place a link leading to our resource must not contain offensive or defamatory materials. And also must not violate anyone's copyright.
  • If someone clicks a link you posted, our site should open in a new page (tab), not inside a frame on your site.

Link quality assessment

Don't assume that an external link must be high quality just because it sits on a .edu domain. Many students sell links from personal sites registered in the .edu zone, so such links count as spam and should be disavowed. In addition, many .edu sites carry low-quality links because they have been hacked. The same applies to every top-level domain.

Google representatives have confirmed that a site's presence in a particular domain zone neither helps nor harms its position in search results. But you need to judge each specific site on its own merits, keeping the Penguin updates in mind.

Beware of links from obviously high-quality sites

When reviewing your list of external links, don't assume a link is high quality just because it appears on a particular site, unless you are 100% sure of that site's quality. A link from a reputable site like the Huffington Post or the BBC is not automatically high quality in Google's eyes: many such sites sell links, some disguised as advertising.

Promo links

Examples of promotional links that Google treats as paid include links placed in exchange for a free product for review or a discount on a product. While such links were acceptable a few years ago, they must now be marked "nofollow". You can still benefit from placing them: they help raise brand awareness and drive traffic to your site.

It is essential to evaluate every external link. Bad links must be removed, because through the Penguin algorithm they hurt the resource's position in search results and may even lead to manual sanctions. Good links should not be deleted, because they help you rank higher.

What if you can't recover from Penguin's sanctions?

Sometimes, even after webmasters do a lot of work cleaning up inbound links, they still don't see any improvement in traffic volume or rankings.

There are several reasons for this:

  • The initial growth in traffic and rankings that preceded the algorithmic penalty came from bad inbound links in the first place.
  • After the bad external links were removed, no effort was made to attract new high-quality links.
  • Not all spam links have been disavowed.
  • The problem is not with external links.

When you recover from penalties imposed by the Penguin algorithm, don't expect your former search rankings to return instantly.

Add to this the fact that Google is constantly changing the Penguin filter, so factors that helped in the past may carry less weight now.

Myths and misconceptions associated with the Penguin algorithm

Here are a few myths and misconceptions about the Penguin algorithm that have arisen in recent years.

Myth: Penguin is a punishment

One of the biggest myths about the Penguin algorithm is that people call it a penalty (what Google terms a manual action).

Penguin is strictly algorithmic in nature; it cannot be applied or removed manually by Google specialists.

Although both the algorithm and manual sanctions can cause a large drop in search results, there is a big difference between them.

Manual sanctions occur when Google's webspam specialist responds to a complaint, conducts an investigation, and determines that a domain should be sanctioned. In this case, you will receive a notification in Google Search Console.

When manual sanctions are applied to a site, you need to not only analyze external links and send a Disavow file containing spam links, but also send a request for a review of your “case” by the Google team.

Ranking downgrades by the algorithm happen without any intervention from Google specialists; the Penguin algorithm does everything itself.

Previously you had to wait for an algorithm change or update, but now Penguin works in real time, so recovery can happen much faster (provided the cleanup was done properly).

Myth: Google will notify you if the Penguin algorithm affects your site

Unfortunately, this is not true. Google Search Console will not notify you that your site's search positions have deteriorated because of the Penguin algorithm.

This again shows the difference between the algorithm and manual sanctions: you are notified only when a site receives a manual penalty. The recovery process, however, is similar in both cases.

Myth: Disavowing bad links is the only way to recover from the effects of the Penguin algorithm

While this tactic can remove many low-quality links, it is not enough on its own. The Penguin algorithm looks at the proportion of good, quality links relative to spam links.

So instead of focusing all your energy on removing low-quality links, you should focus on increasing the number of quality inbound links. This will have a positive effect on the ratio that the Penguin algorithm takes into account.

Myth: You won't recover from the effects of the Penguin algorithm.

A site can be freed from Google Penguin, but it takes some experience in dealing with Google's ever-changing algorithms.

The best way to shed Penguin's negative influence is to write off all existing external links to the site and start earning new ones. The more quality inbound links you accumulate, the easier it will be to free your site from the grip of the Google Penguin algorithm.

This publication is a translation of the article "A Complete Guide to the Google Penguin Algorithm Update", prepared by the friendly project team


Google Penguin is a search engine algorithm aimed at eliminating web spam. Launched in April 2012 to combat purchased links, it has undergone several changes. The September 2016 release made the Penguin filter part of Google's core algorithm.

Principle of operation

Google Penguin has worked in real time since late 2016; before that, the filter's data was refreshed only periodically. If previously you had to wait for an update, which could take a long time, now you can get out from under the filter almost immediately after fixing the problems and re-indexing.

Sanctions in the form of demotion in ranking apply to individual pages, and not the entire site as a whole.

How to determine if a site is under a filter

Signs:

  • positions dropping by 30-50 places;
  • significant reduction in traffic from Google.

If the restriction was imposed as a result of a manual review, you can see a message about penalties for webspam in Search Console under the Manual Actions section.

There is no notification for the automatic filter. To find out whether a site was hit by Google's Panda or Penguin, check the texts and the incoming links: if the problem is the content, it's Panda; if bad backlinks are evident, it's Penguin.

How to remove a site from a filter

The site is lowered in search results because of a low-quality link mass, so you need to audit your purchased links and remove the spammy ones, those from satellites, irrelevant resources, link farms, etc. At the same time, it is worth increasing the number of natural links.

Reasons for falling under the Google Penguin filter include:

  • an abundance of spam anchors: if most of the anchor list consists of direct occurrences of promoted commercial queries, the risk is very high;
  • poor link growth dynamics: sharp jumps and mass removals signal unnaturalness;
  • a mismatch between the topics of the donor and acceptor: the higher the relevance, the more natural the placement looks;
  • spammy placement of the backlink (text in the background color, tiny print, a single-pixel image, placement in an odd spot, etc.);
  • donor spam: many outgoing links from a page or from the site as a whole raise suspicions of link selling;
  • low quality of the referring sites, understood in a broad sense.
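The warning signs above can be turned into a rough automated pre-check before deciding what to disavow. The following Python sketch is purely illustrative: the field names, thresholds and sample records are my own guesses, not Google's actual criteria:

```python
# Sketch: flag risky backlinks using the warning signs listed above.
# Thresholds and sample data are illustrative assumptions, not Google's rules.

def risky(link):
    """Return a list of risk flags for one backlink record (a dict)."""
    flags = []
    if link.get("exact_match_anchor"):
        flags.append("spam anchor")
    if not link.get("topic_relevant", True):
        flags.append("off-topic donor")
    if link.get("donor_outgoing_links", 0) > 100:
        flags.append("donor looks like a link seller")
    if link.get("hidden_placement"):
        flags.append("spam placement")
    return flags

profile = [
    {"url": "http://blog.example/post", "exact_match_anchor": False, "topic_relevant": True},
    {"url": "http://farm.example/links", "exact_match_anchor": True,
     "donor_outgoing_links": 450, "hidden_placement": True},
]
for link in profile:
    print(link["url"], "->", risky(link) or "looks natural")
```

Links collecting several flags are the candidates for removal or disavowal; the rest can be left alone.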

All identified low-quality backlinks must be removed or disavowed using the dedicated tool in Search Console.

It may take anywhere from a week to several months for Google Penguin to remove ranking restrictions.

How to protect yourself

  • be sure to dilute the anchor list so that it contains not only direct and diluted occurrences of keys but also generic anchors like "here", "over here", etc.;
  • ensure a smooth increase in the link mass;
  • control the ratio of purchased and natural links;
  • strive for a diversity of donors so that there is not an unnaturally large number of backlinks from one resource;
  • point links at different pages of the site, not just the home page;
  • monitor the quality of each donor.

The best way to protect yourself from the risk of being targeted by Google Penguin is to increase the quality of links, not their quantity.

Did you know that Google's search algorithms take into account more than 200 ranking factors, and the weight of each one is individual to each site? Did you know that over the course of a year Google makes more than 500 changes to its algorithms in total, and that before full rollout each changed version is tested on a small sample of search engine users?

Today we will talk about perhaps the most famous Google algorithm – Penguin (“Penguin”).

What is Google Penguin?

Google Penguin is an algorithm created to combat low-quality sites and web spam. April 24, 2012 is considered the date of its “birth”. To avoid falling under the influence of the algorithm, Google provides a number of recommendations that must be taken into account when promoting a site in this search engine. Here are some of them:

    the site should not contain automatically generated content;

    the site should not contain pages with unoriginal or duplicate content;

    the site should not serve different content or different URLs to users and to search engines;

    redirects to another URL should not be used to deceive the search engine;

    it is prohibited to use hidden text and hidden links to influence the site's ranking in Google search results;

    on sites participating in affiliate programs, your own content should prevail over the content supplied by the partner platform;

    the keywords on a page must fully correspond to its content;

    it is prohibited to create malicious pages for phishing or for installing viruses, Trojans, or other malware;

    misusing rich snippet markup for extended descriptions of web pages is prohibited;

    sending automated queries to Google is prohibited.

If your site violates any item on this list and has fallen under Penguin sanctions, it will be possible to get out from under them only with the next update of the algorithm, and only if the errors on the site that led to the penalty have been corrected. With each new version of Penguin the update frequency increases, which means the speed of exiting sanctions increases as well.

What does Google Penguin do?

The algorithm affects the ranking of the site. Some say that this is not a demotion in search results but merely a return of the site to the position it should actually occupy. This is not entirely accurate: the drop in rankings is tied to the presence and weight of the site's negative factors, and, as we wrote above, the degree of influence of those factors is individual for each site, so this is already a penalty.

Algorithm and manual moderation

As a reminder, Penguin is an automated algorithm and will always operate as one. But there is also “manual moderation” of a site, which can likewise lead to pessimization of positions in search results. Both algorithmic and manual sanctions can apply to your site simultaneously. It is worth noting that exiting the sanctions of the Penguin algorithm happens automatically, but to get out of sanctions imposed in manual moderation mode you need to contact Google support and report what steps have been taken to correct the errors on the site.

Problem Definition

To determine the reasons a site was penalized by the Penguin algorithm, check internal factors first, such as excessive keyword “nausea” (keyword density) on a page. Text spammed with keywords and other on-page manipulations are what need to be checked and corrected first of all if the site falls under Penguin sanctions.

There is another common reason for penalization: spammy external links pointing to your resource. As a rule, they appear through attempts at do-it-yourself external SEO, during which some site owners submit the site to free directories, bulletin boards, and similar low-quality resources. To avoid such problems, it is best either to engage proven SEO specialists or to rely on automated promotion systems, such as SeoPult, which have their own tuned filters and screen out low-quality sites before they can link to client sites.

If you did SEO yourself and received links from spam sites, it is worth conducting a link audit to determine which links negatively affect your site and which of them should be removed or excluded from search engine consideration (using the Google Disavow Tool). You can get a list of external links for the audit from Google Webmaster Tools. But since Google may not see all the links, it is worth supplementing the list with third-party tools such as Bing Webmaster Tools, Ahrefs, Majestic, Open Site Explorer, and SEMrush. Each has its own link index that can complement the Google Webmaster Tools list.
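Merging the exports from several tools is mechanical set arithmetic. The sketch below assumes each tool's export has been reduced to a set of backlink URLs; the URLs themselves are made up for illustration:

```python
# Hypothetical link lists exported from different tools; Google Webmaster
# Tools may miss links that third-party indexes have found.
google_links = {"http://blog-a.example/post1", "http://dir-b.example/cat"}
ahrefs_links = {"http://blog-a.example/post1", "http://spam-c.example/page"}
majestic_links = {"http://spam-c.example/page", "http://forum-d.example/t/1"}

# Union gives the full audit list; set difference shows Google's blind spots.
all_links = google_links | ahrefs_links | majestic_links
missing_from_google = all_links - google_links

print(f"total unique backlinks: {len(all_links)}")
print("not reported by Google Webmaster Tools:")
for url in sorted(missing_from_google):
    print(" ", url)
```

With the sample sets above, the merged list has 4 unique backlinks, 2 of which Google's own export would have missed.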

Disavow tool (disavow links)

As soon as you have the full list of external links and have determined which of them caused the penalization of your resource, you can exclude them from Google's consideration. There is a tool for this, the Disavow Tool, which accepts a list of links whose influence on your site Google should stop taking into account. Google's John Mueller says the tool works similarly to the nofollow tag, but is applied to links hosted on external resources. You can disavow both individual external links and all links coming from a specific domain. John says that adding links to Disavow is more than enough to get out of Penguin's sanctions. If you are under manual moderation (penalization), you should record all the actions you take to get out of sanctions and later report the changes to the support service.
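For reference, the file uploaded to the Disavow Tool is a plain text file (UTF-8) with one entry per line: either a full URL or a `domain:` entry that rejects every link from that domain; lines starting with `#` are comments. The domains below are purely illustrative:

```text
# Removal requests sent 2013-02-15, no response: disavowing.
http://spam-directory.example/catalog/page1.html
http://spam-directory.example/catalog/page2.html

# Disavow every link from this domain:
domain:link-farm.example
```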

Re-verification request

In the case of a manual penalty, the only way out of sanctions is to submit a reconsideration request in your Google Webmaster Tools account. Send it only after you have, from your point of view, corrected everything that could have led to the site being penalized. The request must indicate which errors you identified and what steps you took to correct them. The more detailed the description of your actions, the higher the likelihood of a quick exit from Google sanctions.

High-quality links that will help you get out of Penguin sanctions

According to Google's John Mueller, already familiar to us, a healthy ratio of a large number of high-quality external links to a small number of spam links can help a site get out of Penguin sanctions with the least loss of time and effort, or avoid them altogether. Unfortunately, no one knows the exact ratio of high-quality to low-quality links, so check the quality of your link mass systematically and, whenever you detect web spam, immediately remove the offending links or add them to Disavow.

Black SEO

Black-hat SEO is a way of fighting competitors that can sharply lower a competing site's position in search results. One such method, for example, is a “link explosion”. But it is worth remembering that black-hat SEO tools are not always effective, especially when applied to sites with high trust and a high-quality link mass. In some cases a link explosion can even benefit the competitor's site, since new links (even low-quality ones) can significantly “weight” the accumulated high-quality link mass and, instead of a negative effect, lift the site to higher positions (relative to the previous ones). It is also worth noting that Google is learning to recognize competitors' manipulations and to protect the sites they are aimed at.

Conclusion

Determining on your own whether a site is under Google sanctions is quite difficult, especially if they are algorithmic rather than manual. On Google Moderator, a proposal has already been made to add to the webmaster's account a tool that would show whether “Penguin” or “Panda” sanctions have been imposed on a particular site. If this feature is added, it will become much easier to diagnose and correct site errors flagged by specific algorithms.

Google launched a new algorithm called “Penguin”. All webmasters who had automated the process of website promotion as much as possible felt the filter's effect as a sharp drop in traffic. What has happened this year, and what is the current situation with Penguin?

Brief Introduction

Literally a week after the launch of the new algorithm, analytics appeared on MicrositeMasters, whose main conclusions were as follows. The sites that suffered most were those whose backlinks had:

  • Many direct occurrences of keywords.
  • Few occurrences in the form of a site address (URL).
  • Few thematic donors.

It is worth noting that Penguin affected only 3-5% of search queries (Panda affected 12%), so it did not touch every topic, but its influence was hard to miss.

No quick fixes

Since many companies began to lose part of their business because of the new algorithm, everyone was looking for a quick way out of the filter. One workaround was a 301 redirect to a new domain (which at the time did not carry the filter over, but did carry over all the positive qualities of the links). However, in August 2012 the Google webspam team closed this loophole, and practically all sites that had temporarily escaped Penguin dropped out of the TOP again.

Currently, 301 redirects do not work to escape the filter.

Moreover, since Google finds mirrors not only by redirects but also by similar content, even changing the site (domain) did not help if the content stayed the same: Google understood that this was the same site with its old history. Matt Cutts states that it will take time for such a site to regain trust.

Over the year there were only two updates to the Penguin algorithm: May 25 and October 5, 2012. And as stated at the SMX West conference, the largest Penguin update is expected this year (exactly when is not yet known; it could land right on the anniversary). In all this time, documented cases of escaping the filter number only a handful. Some of them are Western; Russian-language cases are hard to find in public, as they mostly circulate privately among specialists from different companies. But all of them rest on careful, painstaking cleanup of spam, not only in links but also in the content and code of the pages.

What do we know now?

Now we know that there are 3 main reasons for applying the Penguin filter:

Low quality links
This includes spamming anchors with direct occurrences of keywords, placing links on low-quality sites, exchanging links to inflate each other's PageRank, and participating in link schemes that manipulate search algorithms.

Selling links from the site
This includes the presence of non-thematic outgoing links on the site, as well as general spamming of the site with commercial links.

Template content
Many duplicates in the search index are a sign of template content. Under the filter, the site's rankings sag for many low-frequency queries.

In general, Google places heavy emphasis on evaluating internal and external factors together. If the search engine forms a poor opinion of a site based on a number of factors, the link disavow tool alone will not remove the filter.

It is difficult to give recommendations while positive cases are in short supply, but the following can be said. Since Google pays a lot of attention to links, first conduct a link audit, identify bad donors, and get rid of them or add them to Google's link disavow tool.

Information from a recent video meeting with Google.

Rinat Safin (Google): “Dispel a myth: they say that ‘penguinized’ sites can no longer be pulled out of the depths of the filter; is it easier to quit and build a new one?” asks Semyon from Kirov.

Vladimir Ofitserov (Google): Well, I'll put it this way: if it is easier for you to abandon a site on which you spent some part of your life, then the site was not very valuable. Probably not. Penguin or no Penguin, any algorithm we implement will sooner or later be updated. We try to do it as often as possible, but it is not always possible to do it every day. Accordingly, if there are no paid links to your site, if you have removed the remaining paid links through the disavow tool, and if there are no manual actions against you, your site will return to where it deserves to be. And then, strictly speaking, it all depends on how much users like your site and the content provided there, and, accordingly, how interesting it is.

Use an integrated approach, don't stop working on cleaning up your links, and improve the site as much as possible for the user. And be sure to share your positive cases and other observations in the comments.






