Myths and misconceptions associated with the Penguin algorithm. Automatic filter for backlinks


If you are wondering how to get a site out from under Penguin, or how to remove a Google manual penalty, this guide will help you solve these problems and regain your rankings and traffic.

Google's search engine has dozens of well-known filters that can seriously affect promotion, as well as hundreds that few people know about.

Today we will talk about the most common and important of them, namely:

  1. Manual Google filter for artificial incoming links
  2. Automatic Google filter for backlinks

Both of them relate, in one way or another, to the algorithm called Google Penguin, which came into force more than a year ago and has already made plenty of noise.

Symptoms of these filters

  1. collapse of positions
  2. sharp decline in website traffic

In practice it will look like this:

Not a pleasant situation, especially when search traffic is your main source of visitors.

Now let’s take a closer look at each type of filter for backlinks.

Manual Google filter for artificial incoming links

Often it all starts with a message in the Google Webmaster Tools panel. It looks like this:

The notification contains a message of this nature:

Messages can be different, for example:

To find manual filter messages, do this:

After receiving a notification about artificial incoming links, the usual consequences are:

A) Within 2-3 weeks, positions drop significantly, after which traffic from Google search disappears

B) Positions immediately disappear and traffic drops

Reasons for Google Manual Filter

The main signal that triggers such a notification is anchor text: over-spammed link anchors.

The example shows the anchor text of backlinks to one of the sites hit by the filter; the main reason for the filter was anchor spam.

Another example:

If you look at unique domains, then we get the following picture:

Using a large share of backlinks with commercial or other keyword anchors leads to a manual penalty and to the loss of rankings and traffic.

So what should you do?

The solution is extremely simple: do not let links with keyword anchors make up a large percentage of your link profile.
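
Before removing or disavowing anything, it helps to measure how skewed the anchor profile actually is. Below is a minimal sketch of such an audit, assuming a CSV export from a backlink service such as Ahrefs; the column names (`anchor`, `url`) and the keyword list are illustrative placeholders, not a real export format:

```python
import csv
from collections import Counter

# Hypothetical column names; adjust them to match your export (Ahrefs, Majestic, etc.)
ANCHOR_COL = "anchor"
URL_COL = "url"

# Illustrative "commercial" keywords; substitute your own money-keyword list
COMMERCIAL_KEYWORDS = ["buy", "cheap", "price", "order"]

def audit_anchors(path):
    total = 0
    money_links = []          # links whose anchor contains a commercial keyword
    anchor_counts = Counter() # anchor frequency, to spot over-used anchors
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            anchor = row[ANCHOR_COL].strip().lower()
            anchor_counts[anchor] += 1
            total += 1
            if any(kw in anchor for kw in COMMERCIAL_KEYWORDS):
                money_links.append(row[URL_COL])
    share = 100.0 * len(money_links) / total if total else 0.0
    print(f"{total} backlinks, {len(money_links)} with keyword anchors ({share:.1f}%)")
    print("Most frequent anchors:", anchor_counts.most_common(10))
    return money_links        # candidates for removal or for the disavow file

if __name__ == "__main__":
    audit_anchors("backlinks.csv")
```

Running this over an export gives the share of keyword anchors, which is exactly the figure to push down before requesting a review.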

Step-by-step instructions for removing Google manual filter

  1. Google Notification
    Check whether there is a notification in Google Webmaster Tools. If there is, move on to the next point.
  2. Request for review
    In the first review request, it is important to ask which links violate the search guidelines and what needs to be done to have the manual sanctions removed.
  3. We get the answer

    In most cases, the response points out the links that, in Google's view, violate the search guidelines. For example:

    As a result, we can determine which links Google pays attention to and considers spam.

  4. We carry out the indicated actions

    What you do next depends heavily on your link profile.

    Situation 1

    There are a large number of rented links to your site, and in most cases these are direct anchors that contain keywords.

    In this case, it is necessary:

    1. remove the keyword-anchor links (including those from Google's example)
    2. move on to the next point - new request for review

    If you really remove most of these links, the manual filter can be lifted after one or two review requests.

    Situation 2

    You cannot remove most of the unwanted links yourself, for example because they sit on sites you have no control over.

    In this case, it is necessary to:

    1. review all backlinks (you can use backlink checking services such as Ahrefs, MajesticSeo, LinkPad)
    2. make a list of the unwanted links (main criteria: the anchor text and the low quality of the linking site)
    3. add the links to the Disavow Tool - https://www.google.com/webmasters/tools/disavow-links-main?/
    4. wait 1-2 weeks (in practice, this is how long it takes for the links to be re-crawled)
    Next, move on to the next point and submit a request for review.
  5. Submitting a new request for review

    Your request for reconsideration must:

    1. clearly describe what you did
    2. be simple and clear
    3. ask what else needs to be done to remove the filter

    After that, Google needs some time to re-review your site. This can take from a couple of days to 3-4 weeks.

  6. Intermediate answer

    The intermediate response usually contains the following content:

    This notification from Google comes almost immediately after you submit your review request.

  7. We are waiting for a decision

    The answer usually comes within 1-2 weeks, sometimes longer, and sometimes faster.

    It either contains a negative answer, for example:

    If the answer is negative, then:

    1. review the links again
    2. add them to Google Disavow
    3. or simply remove the links
    4. in any case, take action
    5. submit a new request for review

    But after the steps above, the answer may also be positive, for example:

    In this case, congratulations! The Google manual filter has been removed.

In our example, there was no quick way to remove the site from Google's manual filter; here is the chronology of events:

Important points when removing the filter yourself:

  1. do not give up
  2. take action (not just send review requests)
  3. if you do everything as described above, the manual filter will be removed, and your rankings and traffic will return

Here is information that will complement this material and help you successfully remove the filter:

  1. USA themes
  2. Case study: how to remove manual Google sanctions using the Disavow Tool and return positions and traffic
  3. Case study: how to remove Google’s manual sanctions and return positions and traffic

There is nothing wrong with a manual filter; the most important thing is not to be lazy and do a series of actions that will lead to the desired result.

The result is traffic returning to its previous level, for example:

This example covers a couple of sites whose filters were removed at different times.

Filter removal times:

  1. the record: 4 days
  2. the longest period: 3.5 months

But in any case, you need to act promptly, and also take into account the points described above.

Automatic filter for backlinks

With Google's automatic filter, everything is much more complicated, since there is no notification about it.

Important: automatic sanctions can only be lifted during an official update of the Google Penguin algorithm, which happens on average once every 2-4 months.

If these are automatic sanctions, then no matter what you do, nothing will change until the algorithm is updated.

We've removed the automatic filter more than once, but things don't always happen quickly.

Signs of the filter

Everything is the same as with the manual one, but no notification comes.

  1. rankings first drop by 30-50 positions
  2. traffic to the site drops

And then it does not come back for a long time.

Main reasons:

  1. Exact-match link anchors
  2. Low-quality backlinks

Step-by-step guide to removing the automatic filter

In this case, unlike the manual filter, everything takes an order of magnitude longer and is more complicated.


In conclusion about the Google Penguin filter

The algorithm was launched in 2012 and is constantly being improved. Its main task is to combat spam and artificial manipulation of search results.

To avoid falling under the filter, you need to:

  1. keep direct-match anchors to a small share of your anchor list, or do not use them at all
  2. attract quality links to the site
  3. try to focus on the product and its quality in order to receive natural links (yes, this can be done even in the realities of the RuNet)
  4. maintain dynamics - Google likes constant, steady growth

Then you will have no problems with filters and your traffic will grow constantly. Naturally, provided you work on:

  1. site content
  2. attracting natural links (not purchased)

Even if you have already made these mistakes and received sanctions from Google, do not despair - there is always a solution.

I hope this step-by-step guide will help solve your problem and restore your rankings and traffic from Google search.

Important: This article only covers filters related to the Google Penguin algorithm and backlinks.

If your rankings and traffic have dropped sharply, the reason may not be links at all but, for example, content; the Google Panda algorithm is responsible for that.

Good luck to you! See you soon on the pages of this blog.


We all know very well, dear friends, how search engines labor over their algorithms, filters and so on, in order to give ordinary Internet users exactly what they are looking for. At the same time, those same algorithms clean the network of spam and low-quality resources that made it into the TOP by hook or by crook. "That won't do," the minds at the search giants decide, and once again the folks at Google have acted on that thought.

For the second week now, the webmaster side of the network has been buzzing, with most people indignant: "Google rolled out something or other, and this hellish mixture cut our positions, which in turn reduced traffic." Indeed, while analyzing the competition in the niches that interest me, I noticed serious changes: traffic on many competitive resources was cut by 10, 20, 50, or even more percent. No need to look far - check some SEO blogs, where it is now unusual to see traffic of 150-200 users a day.

So, what did Google come up with...

On April 20, 2012, a message from Google developers appeared on the Internet with approximately the following content:

"In the coming days we are launching an important algorithm change targeting webspam. The change will reduce the rankings of sites that we believe violate Google's site quality requirements."

On the night of April 24-25, the new Google algorithm, Google Penguin, was rolled out. Google's love for animals generated a lot of buzz. On the Searchengines forum alone, several threads with a huge total page count (more than 500) sprang up discussing the new algorithm. As always, there were far more dissatisfied than satisfied, because the satisfied sit quietly and do not give away their successes, while everyone else's achievements had just been gobbled up by Google Penguin with a cheerful "Hurrah".

Let's first get acquainted with the very basic requirements for website quality that Google puts forward:

  1. Do not use hidden text or hidden links. Google, and not only Google, has long marched under the banner of putting people first. Anything a person cannot see is understood by search engines as an attempt to influence the search results; that counts as manipulation and is suppressed through pessimization or other "damage". I remember when hidden text was all the rage, and of course it worked: black text on a black background with certain optimization - you landed on the page and wondered where the content the search engine had promised actually was.
  2. Do not use cloaking or sneaky redirects. There is no need to serve the search robot one version of the content and the user another; the content should be the same for everyone. As for redirects, Google decided to disappoint those who like to make money from redirects or from selling mobile traffic.
  3. Don't send automated queries to Google.
  4. Don't overload your pages with keywords. This principle has long been enforced by Yandex, whose stance against over-optimization clearly killed the work of those devoted to stuffing sites with keys. If earlier you could stuff articles with keywords and enjoy traffic from Google, now, as you can see, it is not so simple. But don't forget about user factors, the so-called behavioral factors. If they are at a good level, even slight over-optimization is not a problem, because behavioral factors have always been, are, and most likely will remain a priority. Only here everything is somewhat more complicated: imagine what you have to offer, and in what volume, for those behavioral factors to be at their best. This means the content really must be high-level and interesting, not a light rewrite of a competing site or niche leader.
  5. Do not create pages, domains or subdomains that substantially duplicate the content of your main site or any other site. At this point, Googlers immediately put together their views on affiliation, site networks, doorways, copy-paste, and low-quality rewriting.
  6. Don't create malicious pages, such as phishing pages or pages containing a virus, Trojan horse or other malware. This point needs no elaboration: in a word, fight malware.
  7. Don't create doorways or other pages designed just for search engines.
  8. If your site works with affiliate programs, then make sure it provides value to the network and users. Provide it with unique and relevant content that will attract users to you in the first place.

Google identified these 8 principles as the main ones. But there are also 4 more that were especially noted by them:

  1. Create site pages primarily for users, not search engines. Do not use cloaking or other deceptive schemes; be extremely transparent.
  2. Don't try to use tricks to increase your site's ranking in search engines.
  3. Do not participate in link building schemes designed to increase your site's ranking in search engines or Google Page Rank. In particular, avoid links that look like spam, link dumps, or bad neighbors on the server (I draw your attention to analyzing and viewing neighboring sites if your site is hosted on a regular shared hosting).
  4. Do not use unauthorized software solutions to automatically query Google. The developers themselves single out tools such as WebPosition Gold™.

But all of this was already known to anyone who had looked into the issue before.

In the work of the new Google Penguin algorithm, I was surprised by the strict enforcement of the rule about "advertising load" on the first page. Remember, it was said that the first (home) page should not be overloaded with advertising. Google Penguin has clearly begun enforcing this: it cut traffic for sites with several large advertising blocks on the main page, even when those blocks were Google's own AdSense. 🙂

Observations on the work of Google Penguin

Now I would like to share a number of my own observations. I understand that they are approximate and represent my personal observations and data (the indicators should be treated as relative), but they have a right to exist. I analyzed the work of 20 sites, both mine and my partners'. The analysis took me 3 days, and I assessed a lot of indicators. As you understand, all the sites were promoted using different schemes and algorithms and had completely different indicators, which made it possible to draw a number of conclusions.

1. Exact-match anchors. If previously, to be sitting pretty in Google, you needed a lot of exact-match anchors, with Google Penguin it is now exactly the opposite. This may be precisely the fight against webspam. Yandex has long favored diluted anchors, and now it is Google's turn.

  • pages whose external links had 100% exact-match anchors: a drawdown of 75-90%, with an average drop of about 38 positions;
  • pages whose external links had 50% exact-match anchors: a drawdown of 15-20%, with an average drop of about 9 positions;
  • pages whose external links had less than 25% exact-match anchors: a rise of 6-10%, with an average gain of 30 positions.

Based on this data, I drew a conclusion: dilute your anchors, and dilute them as creatively and deeply as possible.

A striking example is the query "gogetlinks" on this blog. Exact-match occurrences greatly outnumber diluted ones, and here is the result:

2. Buying temporary links. I bought temporary links, whether to push rankings or for quick results, on all the analyzed resources and through different systems: Sape, Webeffector, Seopult and ROOKEE.

The automatic promotion services Webeffector and ROOKEE gave approximately the same results. There was practically no drawdown at all, only a small one on Webeffector, but it is insignificant and relates more to anchor dilution. In places there is even growth; what can I say, here is a screenshot of the campaign (clickable):

As for Sape, the picture is completely different. All the projects for which links were purchased on Sape sank. Every query promoted through this exchange flew out of the TOP 100, and collecting statistics on where exactly they landed became so depressing that in the end I did not bother.

Analyzing the impact of Google Penguin on promotion with Sape, I concluded that Google now perceives the placement of links on this exchange as unnatural.

Here I started actively removing links. But it makes more sense to give examples when there are vivid ones from our own niche. Let's take my friend Sergey Radkevich's blog, upgoing.ru. He worked with Sape for several months and was happy with the growth in traffic until Google Penguin arrived. Let's look:

It's also worth looking at the chart of search traffic sources:

As you can see, Google Penguin has reduced traffic from Google by more than 7 times.

The conclusion on this point: when working with temporary links, you still need filters and some kind of placement and purchasing algorithm. Automatic services at least work according to defined schemes, unlike the same Sape. The results, by the way, speak for themselves.

Sites promoted with Seopult even showed growth in positions. Here I used the Seopult Max algorithm for Yandex, but as I can see, it now works for Google as well.

3. Over-optimization of content. A decline was also noticed here, but not as significant as with the previous parameters. Only 10-15% of over-optimized articles fell, and within 10 positions.

Here I conclude that slight over-optimization is not so scary if you leave yourself some margin, and lower positions can be compensated for by purchasing links.

4. Eternal links. Queries promoted with normal-looking, natural eternal (permanent) links only strengthened their rankings. Some high-frequency, highly competitive queries climbed into the TOP 20 without any manipulation, thanks to the noticeable decline of most competitors. Here I once again conclude that my approach of promoting ONLY with eternal links is correct.

1. Take all the information above into account and work on both the content of your site and its promotion.

2. Check for notifications from Google in Google Webmaster Tools about spam activity on your resource. To do this, go to http://www.google.com/webmasters/ and log in. Then go to your site's statistics and open the messages section (clickable):

If there are messages, you will need to take measures to fix the problems indicated in them. This may include removing links... Not all yoghurts are equally healthy... 😉

3. Check the site for malware in Google Webmaster Tools:

The solution is the same as in the previous point: we identify the programs and files that Google marked as malicious, find them on the server, and replace or delete them.

If everything is really bad and you have given up, fill out the form and sign the petition to Google against Google Penguin. This, of course, guarantees you nothing, but at least a moment of small self-satisfaction and the feeling of "I did everything I could" will definitely come. On the same topic, you can use the feedback form to contact the algorithm's developers.

Personally, I lost only a little, since the main emphasis was on genuinely people-first sites and promotion with eternal links. I got rid of the negative effects within 2 days and resumed growing at the same rate. As for those who liked to do everything quickly and easily, they are mostly left licking their wounds.

As one of my friends said: "Now the hack jobs will show. Big changes await those who love fast, cheap and easy. Those who charge ordinary mortals a fortune for promotion while openly cutting corners will suffer a fiasco and hear plenty of complaints from clients"...

You can post your thoughts about Google Penguin or the results of the changes in the comments.

Happy and quality changes to everyone!

At school:
- Children, come up with a sentence with the expression "almost".
Kolya:
- Our Olya almost became a beauty queen!
Petya:
- On Saturday, my mother and I almost missed the train...
Vovochka:
- This morning Lyokha and I almost died from a hangover, but we had a little left...

Despite the fact that changes in Google algorithms are one of the hottest topics in the field of SEO, many marketers cannot say with certainty how exactly the Panda, Penguin and Hummingbird algorithms affected the ranking of their sites.

Moz specialists have summarized the most significant changes to Google's algorithms and laid out, point by point, what each update is responsible for.

Google Panda – Quality Inspector

The Panda algorithm, whose main goal is to improve the quality of search results, was launched on February 23, 2011. With its appearance, thousands of sites lost their positions, which stirred up the entire Internet. At first, SEOs thought Panda penalized sites participating in link schemes. But, as later became known, fighting unnatural links is not in this algorithm's mandate. All it does is assess the quality of a site.

To find out if you are at risk of falling under Panda's filter, answer these questions:

  • Would you trust the information posted on your website?
  • Are there pages on the site with duplicate or very similar content?
  • Would you trust your credit card information to a site like this?
  • Do the articles contain spelling or stylistic errors or typos?
  • Are articles for the site written taking into account the interests of readers or only with the goal of getting into search results for certain queries?
  • Does the site have original content (research, reports, analytics)?
  • Does your site stand out from the competitors that appear alongside it on the search results page?
  • Is your site an authority in its field?
  • Do you pay due attention to editing articles?
  • Do the articles on the site provide complete and informative answers to users' questions?
  • Would you bookmark the site/page or recommend it to your friends?
  • Could you see an article like this in a printed magazine, book, or encyclopedia?
  • Does advertising on the site distract readers' attention?
  • Do you pay attention to detail when creating pages?

Nobody knows for sure what factors Panda takes into account when ranking sites. Therefore, it is best to focus on creating the most interesting and useful website. In particular, you need to pay attention to the following points:

  • "Insufficient" content. In this context, the term “weak” implies that the content on your site is not new or valuable to the reader because it does not adequately cover the topic. And the point is not at all in the number of characters, since sometimes even a couple of sentences can carry a considerable semantic load. Of course, if most of your site's pages contain only a few sentences of text, Google will consider it low quality.
  • Duplicate content. Panda will consider your site low quality if most of its content is copied from other sources or if the site has pages with duplicate or similar content. This is a common problem for online stores that sell hundreds of products differing in only one parameter (for example, color). To avoid this problem, use the canonical tag (see the short example after this list).
  • Low-quality content. Google loves sites that are constantly updated, so many SEOs recommend publishing new content daily. However, if you publish low-quality content that provides no value to users, this tactic will do more harm than good.
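
For the duplicate-content point above, the canonical tag is a one-line fix. A minimal illustration; the URL and paths are placeholders:

```html
<!-- In the <head> of every near-duplicate variant page (for example, the red
     and blue versions of a product), point search engines to the primary URL. -->
<link rel="canonical" href="https://example.com/catalog/t-shirt" />
```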

How to get out from under the Panda filter?

Google updates the Panda algorithm monthly. After each update, search robots review all sites and check them for compliance with established criteria. If you fell under the Panda filter and then made changes to the site (changed insufficient, low-quality and non-unique content), your site’s position will improve after the next update. Please note that you will most likely have to wait several months for your positions to be restored.

Google Penguin – Link Hunter

The Google Penguin algorithm was launched on April 24, 2012. Unlike Panda, this algorithm aims to combat unnatural backlinks.

The authority and significance of a site in the eyes of search engines largely depends on which sites link to it. Moreover, one link from an authoritative source can have the same weight as dozens of links from little-known sites. Therefore, in the past, optimizers tried to get the maximum number of external links in every possible way.

Google has learned to recognize various manipulations with links. How exactly Penguin works is known only to its developers. All SEOs know is that this algorithm hunts for low-quality links that are manually created by webmasters in order to influence a site's rankings. These include:

  • purchased links;
  • exchange of links;
  • links from irrelevant sites;
  • links from satellite sites;
  • participation in link schemes;
  • other manipulations.

How to get out from under the Penguin filter?

Penguin is a filter, just like Panda. This means it is regularly updated and re-reviews sites. To get out from under the Penguin filter, you need to get rid of all unnatural links and wait for an update.

If you conscientiously follow Google's guidelines and don't try to gain links through unfair means, you can regain favor with the search engines. However, to regain the top positions, it will not be enough for you to simply remove low-quality links. Instead, you need to earn natural editorial links from trusted sites.

Google Hummingbird is the most “understanding” algorithm

The Hummingbird algorithm is a completely different beast. Google announced the launch of this update on September 26, 2013, but also mentioned that the algorithm had already been in effect for a month. This is why site owners whose rankings fell in early October 2013 are mistaken in believing they fell under the Hummingbird filter; if that were really the case, they would have felt its effect a month earlier. What, then, caused the drop in traffic? Most likely it was another Penguin update, which took effect on October 4 of the same year.

The Hummingbird algorithm was developed to better understand user requests. Now, when a user enters the query “What places can you eat deliciously in Yekaterinburg,” the search engine understands that by “places” the user means restaurants and cafes.

How to raise yourself in the eyes of Google Hummingbird?

Since Google strives to understand users as best as possible, you should do the same. Create content that will provide the most detailed and useful answers to user queries, instead of focusing on keyword promotion.

Finally

As you can see, all search algorithms have one goal - to force webmasters to create high-quality, interesting and useful content. Remember: Google is constantly working to improve the quality of search results. Create content that will help users find solutions to their problems, and you will be guaranteed to rank first in search results.


07.11.17

In 2012, Google officially launched its "webspam algorithm update", aimed at spam links and link manipulation practices.

This algorithm later became officially known as the Google Penguin algorithm, after a tweet by Matt Cutts, the head of Google's webspam team. Although Google officially named the algorithm Penguin, there has been no official comment on where the name came from.

The Panda algorithm was named after one of the engineers who worked on it. One theory about the origin of the name Penguin is that it references the Penguin, the villain from DC Comics' Batman.

Before the introduction of Penguin, link count played a significant role in how Google's crawlers rated web pages.

This meant that when sites were ranked by these scores in search results, some low-quality sites and pieces of content appeared in higher positions.

Why was the Penguin algorithm needed?

Google's war on low-quality search results began with the Panda algorithm, and the Penguin algorithm has become an extension and addition to the arsenal.

Penguin was Google's response to the increasingly widespread practice of manipulating search results (and rankings) with spam links.

The Penguin algorithm processes only links pointing to a site. Google analyzes the links leading to the site and does not consider its outgoing link mass.

Initial launch and influence

When first launched in April 2012, the Google Penguin filter affected more than 3% of search results, according to Google's own estimates.

Penguin 2.0, the fourth update of the algorithm (counting the original version), was released in May 2013 and affected approximately 2.3% of all search queries.

Key changes and updates to the Penguin algorithm

There have been several changes and updates to the Penguin algorithm since its launch in 2012.

Penguin 1.1: May 25, 2012

This was not a change to the algorithm itself, but the first refresh of the data within it. Sites initially affected by the Penguin algorithm that had gotten rid of their low-quality links saw some improvement in positions, while other sites untouched at the first launch felt some impact.

Penguin 1.2: October 5, 2012

This was another data refresh. It affected English-language queries as well as international ones.

Penguin 2.0: May 22, 2013

A more technically advanced version of the algorithm, which changed the degree of its influence on search results. Penguin 2.0 affected approximately 2.3% of English-language queries and roughly the same share of queries in other languages.

It was also the first Google Penguin algorithm update to look beyond the homepage and top-level pages for evidence of spam links.

Penguin 2.1: October 4, 2013

The only update to the Penguin 2.0 algorithm (version 2.1) was released on October 4 of the same year. It affected about 1% of search queries.

Although Google gave no official explanation for the update, statistics suggest it increased the depth of page crawling and introduced additional analysis for spam links.

Penguin 3.0: October 17, 2014

This was another data refresh that allowed many sanctioned sites to regain their positions, while other sites that abused spam links but had slipped past previous versions of Penguin felt its impact.

Google employee Pierre Far confirmed this and noted that the update would need "a few weeks" to roll out fully, and that it affected less than 1% of English-language search queries.

Penguin 4.0: September 23, 2016

Almost two years after update 3.0, the last change to the algorithm was released. With it, Penguin became part of Google's core search algorithm.

Now, working alongside the core algorithm, Google Penguin 4 evaluates sites and links in real time. This means you can see the relatively quick impact of changes to your external links on your site's rankings in Google search results.

The updated Penguin also stopped being strictly punitive: instead of penalizing, it devalues spam links. This is the opposite of previous versions of Penguin, which punished sites for bad links. That said, research shows that algorithmic demotions based on external links are still applied today.

Algorithmic downgrades in Penguin

Shortly after the launch of the Penguin algorithm, webmasters who practiced link manipulation began to notice a decrease in search traffic and search rankings.

Not all of the demotions triggered by the Penguin algorithm affected entire sites. Some were partial and affected only certain groups of keywords that had been actively spammed with links or "over-optimized".

In one known case, a site sanctioned by Penguin took 17 months to regain its positions in Google's search results.

Penguin's influence also extends across domains. Therefore, changing the domain and redirecting from the old to the new can lead to even more problems.

Research shows that using 301 or 302 redirects does not cancel the effect of the Penguin algorithm. On the Google webmasters forum, John Mueller confirmed that using meta refresh redirects from one domain to another can also lead to sanctions.

Recovering from the effects of the Penguin algorithm

The Disavow Links Tool has been useful to SEO specialists, and that hasn't changed even now that the Penguin algorithm functions as part of Google's core algorithm.

What to include in a disavow file

The Disavow file contains links that Google should ignore. Thanks to this, low-quality links will not reduce the site’s ranking as a result of the Penguin algorithm. But this also means that if you mistakenly included high-quality links in your disavow file, then they will no longer help the site rank higher.

You don't need to include comments in the disavow file unless you want them for your own reference.

Google doesn't read the comments you put in the file, because it is processed automatically. Some people simply find it easier to add explanations for later reference: for example, the date when a group of links was added, or notes about attempts to contact a site's webmaster to have a link removed.
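
For reference, a disavow file is plain UTF-8 text: lines starting with `#` are comments (ignored by Google), `domain:` entries disavow every link from a domain, and bare URLs disavow a single page. A small illustrative sample with placeholder domains:

```text
# Spam links found during the October audit
# 10/15: asked the webmaster to remove the links, no response
domain:spam-directory-example.com
domain:link-farm-example.net

# One bad page, rather than the entire domain
https://forum-example.org/threads/12345
```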

After you upload your disavow file, Google will send you a confirmation. But although Google processes the file immediately, it does not disavow the links at that same moment, so you will not see an instant recovery of your site's positions in search results.

There is also no way to determine which links have been disavowed and which have not, since Google will still include both in the external link report available in Google Search Console.

If you previously uploaded a disavow file to Google, the new one will replace it rather than be appended to it. Therefore, it is important to make sure the new file includes the old links. You can always download a copy of the current file in your Google Search Console account.
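
Since the upload replaces the previous file rather than merging with it, it is safer to combine the files mechanically than by hand. A minimal sketch (the file names are placeholders) that merges the current file downloaded from Search Console with newly collected entries, dropping duplicates:

```python
def merge_disavow(old_path, new_path, out_path):
    """Merge two disavow files, keeping comments and dropping duplicate entries."""
    seen = set()
    merged = []
    for path in (old_path, new_path):
        with open(path, encoding="utf-8") as f:
            for line in f:
                entry = line.strip()
                if entry and not entry.startswith("#"):
                    if entry in seen:
                        continue  # skip duplicate domain:/URL entries
                    seen.add(entry)
                merged.append(line.rstrip("\n"))
    with open(out_path, "w", encoding="utf-8") as f:
        f.write("\n".join(merged) + "\n")

# disavow_current.txt: the copy downloaded from Search Console
# disavow_new.txt: newly found spam links (placeholder names)
merge_disavow("disavow_current.txt", "disavow_new.txt", "disavow_upload.txt")
```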

Disavowing individual links or entire domains

To free a website from Google Penguin, it is recommended to disavow links at the domain level rather than disavowing individual links.

That way, Google's crawler only needs to visit one page on a site to disavow all the links leading from it.

Disavowing at the domain level also means you don't need to worry about whether links are indexed with or without www.

Detecting your external links

If you suspect your site has been negatively impacted by the Penguin algorithm, you should audit your external links and disavow low-quality or spam links.

Google Search Console lets site owners download a list of external links. Note, however, that this list also includes links marked "nofollow". A link with the nofollow attribute has no effect on your site, but keep in mind that the site hosting it can remove the "nofollow" at any time, without any warning.

There are also many third-party tools that show links to your site. But because some webmasters block their sites from being crawled by third-party bots, such tools may not show all incoming links. Some resources can also use this kind of blocking to hide low-quality links from detection.

Monitoring external links is a necessary task for protecting against "negative SEO" attacks, in which a competitor buys spam links pointing to your site.

Many people use negative SEO as an excuse when Google penalizes their site for low-quality links. But Google claims that after the Penguin updates, the search engine recognizes such attacks well.

A survey conducted by Search Engine Journal in September 2017 found that 38% of SEO specialists had never disavowed external links. Reviewing external links and thoroughly researching each linking domain is not an easy task.

Requests to remove links

Some site owners demand a fee to remove a link. Google recommends not paying: simply add the bad external link to your disavow file and move on to the next one.

Although removal requests are an effective way to restore a site's positions after link sanctions, they are not always mandatory. The Google Penguin algorithm evaluates the external link portfolio as a whole, that is, the ratio of high-quality, natural links to spam links.

Some even include "conditions" for placing external links to their resource in their site's terms of use:

We only support "responsible" placement of external links to our web pages. It means that:

  • If we ask you to remove or change a link to our sites, you will do so promptly.
  • It is your responsibility to ensure that your use of links will not damage our reputation or result in commercial gain.
  • You may not create more than 10 links to our websites or webpages without obtaining permission.
  • The site on which you place a link leading to our resource must not contain offensive or defamatory materials. And also must not violate anyone's copyright.
  • If someone clicks on a link you post, it should open our site in a new page (tab) and not inside a frame on your site.

Link quality assessment

Don't assume that an external link is necessarily high quality just because it sits on a site with a .edu domain. Many students sell links from their personal sites registered in the .edu zone, so such links count as spam and should be disavowed. In addition, many .edu sites contain low-quality links because they have been hacked. The same applies to all top-level domains.

Google representatives have confirmed that a site's presence in a particular domain zone neither helps nor harms its position in search results. But you need to make a decision for each specific site, with Penguin updates in mind.

Beware of links from seemingly high-quality sites

When reviewing a list of external links, don't assume they are high quality just because of the site they are on, unless you are 100% sure of its quality. A link from a reputable site like the Huffington Post or the BBC is not automatically high quality in Google's eyes: many such sites sell links, some of them disguised as advertising.

Promo links

Examples of promotional links that Google considers paid include links placed in exchange for a free product for review or a discount on a product. While links of this kind were acceptable a few years ago, they must now be marked "nofollow". You can still benefit from placing them: they help increase brand awareness and drive traffic to your site.
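
In HTML, marking such a link looks like this; the URL and anchor text are placeholders:

```html
<!-- rel="nofollow" tells Google not to pass ranking weight through the link -->
<a href="https://example-shop.com/product" rel="nofollow">my review of the product</a>
```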

It is extremely important to evaluate every external link. Bad links need to be removed, because through the Penguin algorithm they hurt the resource's position in search results, or may lead to manual sanctions. Good links should not be deleted, because they help you rank higher.

What if you can't recover from Penguin's sanctions?

Sometimes, even after webmasters do a lot of work cleaning up inbound links, they still don't see any improvement in traffic volume or rankings.

There are several reasons for this:

  • The initial traffic growth and ranking improvements seen before the algorithmic penalty were themselves produced by the bad inbound links.
  • Once the bad external links were removed, no effort was made to earn new high-quality links.
  • Not all spam links were disavowed.
  • The problem is not external links at all.

When recovering from Penguin's penalties, don't expect your search rankings to return to their previous level, or for the recovery to happen instantly.

Add to this the fact that Google is constantly changing the Penguin filter, so factors that helped in the past may not have as much impact now.

Myths and misconceptions associated with the Penguin algorithm

Here are a few myths and misconceptions about the Penguin algorithm that have arisen in recent years.

Myth: Penguin is a punishment

One of the biggest myths about the Penguin algorithm is that people call it a penalty (what Google calls a manual action).

Penguin is strictly algorithmic in nature. It cannot be applied or removed manually by Google specialists.

Although both the algorithm and manual sanctions can cause a big drop in search results, there is a big difference between them.

Manual sanctions occur when a Google webspam specialist responds to a complaint, conducts an investigation, and decides that the domain should be sanctioned. In that case, you receive a notification in Google Search Console.

When manual sanctions are applied to a site, you need not only to analyze the external links and submit a disavow file containing the spam links, but also to send a request for Google's team to review your case.

An algorithmic demotion in search results happens without any intervention from Google specialists; the Penguin algorithm does everything itself.

Previously, you had to wait for an algorithm change or update, but now Penguin works in real time, so positions can recover much faster (provided the cleanup was done well).

Myth: Google will notify you if the Penguin algorithm affects your site

Unfortunately, this is not true. Google Search Console will not notify you that your site's search rankings have deteriorated as a result of the Penguin algorithm.

This again shows the difference between the algorithm and manual sanctions: you are notified only when a site receives manual sanctions. The process of recovering from the algorithm's effects, however, is similar to recovering from manual sanctions.

Myth: Disavowing bad links is the only way to recover from the effects of the Penguin algorithm

While this tactic can neutralize many low-quality links, it is not enough on its own. The Penguin algorithm looks at the proportion of good, quality links relative to spam links.

So instead of focusing all your energy on removing low-quality links, you should focus on increasing the number of quality inbound links. This will have a positive effect on the ratio that the Penguin algorithm takes into account.

Myth: You cannot recover from the effects of the Penguin algorithm

You can free a site from Google Penguin, but it requires some experience in dealing with Google's ever-changing algorithms.

The best way to shed Penguin's negative influence is to write off all the existing external links leading to the site and start earning new ones. The more quality inbound links you collect, the easier it will be to free your site from the grip of the Google Penguin algorithm.

This publication is a translation of the article "A Complete Guide to the Google Penguin Algorithm Update", prepared by the friendly project team.


The Google Penguin filter is one of the latest algorithms that the company uses to rank sites in search results.

Today, Google takes into account more than two hundred factors when ranking sites. To take them all into account, one algorithm is not enough; several are needed, each of which will solve its own problems.

The main task of the Penguin filter is to identify and demote sites that use dishonest promotion methods, chief among them the purchase of link mass. The algorithm is constantly being improved, and today the Google Penguin filter is updated almost continuously.

History of the development of the Penguin algorithm

Google Penguin was released to the world in April 2012. Over the next two months it was updated twice, as the developers adjusted the first version of the filters. The second version of the algorithm appeared almost a year later; the updated Penguin acted more subtly and took into account not only the level of link spam but also the overall quality of the page.

In the fall of 2014, the algorithm was updated again. At that time it worked in such a way that sites caught by its filters had to wait, even after fixing their problems, for the release of the next update in order to be re-checked. The situation changed in 2016 with the release of Google Penguin 4.0, which operates in real time and is updated continuously. The latest versions of the algorithm act very gently: the site's level and page quality are taken into account, and low-quality links are devalued without the entire site being banned.

What does Google Penguin punish for?

Experts believe that the Penguin algorithm should complement the Google Panda algorithm, which is responsible for checking website content. To prevent your resource from falling under the Google Penguin filter, you need to carefully work with external links to the site and avoid what experts call link manipulation. The main methods of such manipulation are:

  • “Trading” links, when the site owner publishes links to other people’s sites on his resource for money or other payment.
  • Obviously artificial link exchange, when sites link to each other due to collusion of owners, and not because of the quality of the content.
  • Using a large number of texts on the site, which contain many “far-fetched” anchors and keywords.
  • Using services that automatically generate links to the site.
  • The presence on the site of links that have direct keywords in the anchor.
  • Using cross-links with anchor keywords in the sidebar and footer of the site.
  • Comments on site materials with links to spam resources.
  • An excessive amount of contextual advertising on the site's home page.

For using such dishonest link schemes, the Google Penguin filter will quickly and reliably drop your site many pages down in the search results. Moreover, regaining positions used to be very difficult, since before going real-time Penguin re-checked sites only about twice a year.

How to find out if Google Penguin has applied sanctions

Unlike Panda, which works only in automatic mode, Penguin also involves manual moderation. If you notice a sharp drop in traffic, go to Google Webmaster Tools, open the "Manual Actions" section, and check whether there are messages from the moderators there.

If there is a letter, all you have to do is correct the shortcomings indicated in it and send a request for a new check.

However, most often the algorithm works automatically. In that case, it is worth going to Moz.com and checking whether there have been any recent Penguin updates. If there have been, the diagnosis is likely correct, and it is time to start "treating" the site. You can also check this correlation with the Panguin Tool on the Barracuda website. True, you will have to give the service access to your Google Analytics account so that it can compare the period of the traffic drop with the release dates of updates. The result of the comparison will help you understand whether you have been caught by Penguin's filters or not.
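
The same cross-check can be done by hand: compare the date the traffic drop began with the Penguin update dates listed earlier in this article. A small sketch of that comparison:

```python
from datetime import date

# Penguin release/update dates mentioned in this article
PENGUIN_UPDATES = {
    "Penguin 1.0": date(2012, 4, 24),
    "Penguin 1.1": date(2012, 5, 25),
    "Penguin 1.2": date(2012, 10, 5),
    "Penguin 2.0": date(2013, 5, 22),
    "Penguin 2.1": date(2013, 10, 4),
    "Penguin 3.0": date(2014, 10, 17),
    "Penguin 4.0": date(2016, 9, 23),
}

def nearest_update(traffic_drop):
    """Return the update closest to the date the traffic drop began."""
    name, when = min(PENGUIN_UPDATES.items(),
                     key=lambda kv: abs((kv[1] - traffic_drop).days))
    gap = (traffic_drop - when).days  # positive: the drop came after the update
    return f"{name} ({when}), {gap:+d} days from the drop"

# Example: a drop that began on October 7, 2013 points at Penguin 2.1
print(nearest_update(date(2013, 10, 7)))
```

A drop that starts within a few days of a known update is a strong hint that Penguin, rather than something else, is the cause.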

What to do if Google Penguin caught you

If you fall under this algorithm's filters, the worst thing you can do is panic and delete all your links. That would finish the resource off completely.

A site the search engine has deemed low quality needs a calm and thoughtful reorganization. Google itself suggests building up link mass anew: slowly, naturally, and mainly through the creation of unique content.

The first thing to do to get out from under the filter is analyze the site's link profile. You need to understand which links come from quality sites, that is, useful, interesting and well-visited ones, and which come from spam sites. You can do this using the Majestic SEO service. Links to spam sites placed on your own pages (internal links) should be neutralized with noindex and nofollow bans, which block "bad" links from indexing and prevent clicks through them. To neutralize external links, you will need Google's link disavowal service, the Disavow Tool; Penguin simply does not take the links included in it into account.

The second step is changing link anchors. This is done in two ways. The first is to change a link into a non-anchor (URL-only) link, which only an experienced webmaster can do. The second is to build up the link profile by creating new non-anchor links.

The third step is to expand your base of link donors, that is, make sure that the links come from different sources: from forums, from social networks, from directories, from blogs and media such as online magazines and news portals. Complete site sanitization and removal from filters usually takes 3-4 months.

To avoid being hit by Google's Penguin filter, you need to attract only high-quality links, maintain constant growth in your link profile, and avoid direct keyword anchors in your links. Quality content and natural link building from different sources will protect you from search engine sanctions better than any specialist.






