How I got out of Google Penguin's manual filter for unnatural links (the Panda, though, is still with me)


After Google Panda punished me for duplicate content in February 2012 and my traffic sank through the floor for more than two months (I showed this in an earlier article), I really gave up and didn't want to promote the blog at all.

I stopped writing articles, doing link promotion, running contests, and so on with my old enthusiasm. Honestly, there was a bit of a depression. How else could it be: you huff and puff, everything climbs the way it should, and then BAM, traffic sags and you have no idea what's going on.

And it was exactly this decline in activity that let Google Penguin creep up behind me and punish me on top of everything. For what? For suspected unnatural link promotion. And probably not even for the unnatural incoming links themselves, but for the drop in their number.

Now, in 2014, I have finally figured out what was wrong (why traffic dropped sharply back in February): it was the DUPLICATES. Unfortunately, by then it was too late and Penguin had applied its filter. Thank God everything ended well, and I successfully got out from under it.

To make everything clear to beginners, I'll start from the very beginning and go in order. Here's what you'll learn from this article:

Google Penguin and Panda filters - what they punish for

Let's start by understanding what Google mainly punishes blogs for when it imposes the so-called Panda and Penguin filters. We'll look at Panda first, since it punished me first.

The Google Panda algorithm penalizes content, that is, mistakes in the content side of your blog. And believe me, we all have plenty of those. I've highlighted several points here:

1. The content itself. If most of your blog's content is not unique (rewritten or, even worse, copy-pasted, stupidly copied word for word), a filter is guaranteed. The content on a blog must be unique, period. Remember that.

2. Content over-optimization. If your articles are stuffed with keywords, if key phrases are endlessly bolded, underlined, and italicized, if there are too many keywords in h1, h2, h3, and overspam in title, description, and keywords, then you can get smacked for that too. Not just "can" - you definitely will.

If you highlight something, do it for the user's convenience first of all, not to game the optimization. Do you know the old joke about an SEO specialist? Listen:

An SEO specialist walks into a bar, restaurant, buy alcohol, clubs, best bars in Moscow, order a banquet...

This shouldn't happen to you!)))

3. Duplicate content. My favorite topic of the last two articles. Google's index should contain only your main page, static pages, and posts (articles). That's it. There should be no other pages with the same content in the search results. As practice shows, more than 90% of bloggers suffer from the duplicates disease. The solutions are in those two articles.

4. Too much advertising. If there is a lot of advertising on your blog, Google Panda sees this too and lowers your blog in the results for key queries. Panda especially dislikes advertising at the beginning of articles. And again, 90% of bloggers have Google or Yandex ads at the very start of their articles.))

This is very bad. I personally ran an experiment on my blog: I wanted to know how advertising affects positions. First I removed the ads at the beginning of articles, and traffic grew a little. Then I collected the articles that get the most search visitors, COMPLETELY removed advertising from them and...

After 1-1.5 weeks, traffic to them almost doubled. How about that. In short, think about it: don't turn your blog into a traffic light of ads; place advertising so it doesn't get in the visitor's way.

5. Bounce rate. What is this? In plain terms, it's when a visitor lands on your blog for some article and closes the tab after 2-3 seconds, or 5, or 10. The higher the rate, the worse. If 100 people came to an article and 80 left immediately, that suggests 80% of people found it uninformative.

That is, people did not find what they were looking for, so it makes sense to lower the article in the results. What conclusions can you draw? Maybe the title is wrong, maybe there isn't enough content, maybe the design scared people off, or maybe some music starts playing in the background (that one is just infuriating; I'd filter such sites on the spot).
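To make the arithmetic concrete, here is a minimal Python sketch of the calculation described above (the numbers come from the example in the text, not from any analytics API):

```python
def bounce_rate(total_visits: int, bounced_visits: int) -> float:
    """Share of visitors who left the page almost immediately, in percent."""
    if total_visits == 0:
        return 0.0
    return 100.0 * bounced_visits / total_visits

# The example above: 100 visitors came, 80 left right away.
print(bounce_rate(100, 80))  # -> 80.0
```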

If you take all five of these points together, you probably already understand what kind of article becomes Google Panda's favorite. It's a cool, interesting, unique article with a catchy, informative title, so that the visitor reads the title and thinks: "That's it, this is what I need, I found what I was looking for."

An article with no copy-paste, not stuffed with keywords, not over-optimized, with no paid links and no pile of advertising... An article that has no twin brothers and that gets read cover to cover, not in 5-10 seconds but over 1-5 minutes, after which the visitor wants to read other articles too. =)

That's what we should strive for, comrades. So much for Google Panda. This little animal came to me in February 2012, looked at the thousands of duplicates that had appeared out of nowhere, and imposed sanctions on the entire blog.

Until that moment traffic had reached 6,200 visitors a day and kept growing by leaps and bounds. Well, I screwed up.))

Now the duplicates are slowly disappearing and I see an increase in traffic from Google:

It used to be a steady 150-170 people a day from Google, but now it's already 280-330. The process has begun. I have a feeling that when the duplicates are gone, traffic from Google will even overtake Yandex. =)

So what is a good blog? It's a blog that constantly evolves and gives people high-quality, unique, and useful content, right? Right. But how does Google determine whether it is high quality or not? Interesting or not?

Very simple. By behavioral factors, for one. And by the incoming link mass, for two. The first is clear; I won't say much about behavioral factors here, since I have articles about them on this blog and there is plenty on the internet.

Let's analyze the incoming link mass. If a blog is evolving and constantly producing useful and interesting content, then other sites and blogs will naturally link to it, right? That means there should be constant growth of the link mass. Links should keep coming and coming...

From websites and blogs, from social networks, bookmarking sites, forums, and so on. BUT! The links must be natural and high quality, for one, and must keep arriving constantly, for two. You can read about natural and unnatural links on the internet; I'll say more about quality later.

Briefly, about natural and unnatural links. Natural links are ones other people place without you asking: they liked your post and linked to it. Unnatural links are purchased, or placed somewhere by you specifically so that a link points to your blog and lifts your positions in the search engines. Something like that.

If Google Penguin sees links to you from other sites and blogs, but most of them are unnatural, it will punish you. If Google Penguin sees natural links to you, great; but if they kept coming and then suddenly stopped, it will punish you too. For what?

Because your blog has stopped developing. It's that simple. It's a signal to Google that either your blog has really faded (visitors used to love your content and share links to it, and then stopped), or you were buying links and then stopped.

For example, there used to be 200 incoming links to you, and now there are 100. That means you cheated somewhere: most likely you bought temporary links, or the sites and blogs that linked to you simply died.

Google understands all this; it watches the dynamics of link mass growth. If it sees a collapse, sanctions may be imposed on you. Not "may" - they will be.

Conclusion: the blog must be developed constantly. Keep writing useful and interesting content, and thereby keep the incoming link mass CONSTANTLY growing.
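As a rough illustration of what "watching the dynamics" might look like, here is a minimal Python sketch that flags a collapse in monthly link counts. The 30% threshold is my own illustrative assumption, not a number Google publishes:

```python
# Hypothetical monthly counts of live incoming links, oldest month first.
monthly_links = [120, 150, 180, 200, 100]

def link_mass_collapsed(series, drop_threshold=0.3):
    """Return True if any month-over-month drop exceeds drop_threshold."""
    for prev, cur in zip(series, series[1:]):
        if prev > 0 and (prev - cur) / prev > drop_threshold:
            return True
    return False

print(link_mass_collapsed(monthly_links))  # True: 200 -> 100 is a 50% drop
```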

OK. We understand what Google Penguin punishes for, but how do you find out whether Penguin sanctions have been imposed on your blog? To determine this, you first need to know what types of filters Penguin has.

What types of filters Google Penguin has

Penguin has 2 types of filters:

1. Manual filter
2. Automatic filter

How do you find out which filter your blog is under, or whether it's under one at all? Let's start with the manual filter. I'll tell you how my blog came under it and how I got out from under it quite quickly, even though I had been under it for almost 2 years.

Actually, the issue itself could be resolved quickly, but my unwillingness to deal with it delayed the exit from the manual filter by a whole 2 years. In short, I didn't take it seriously and, figuratively speaking, only pretended that I wanted to get out from under it.

So! On May 4, 2012, this letter arrived in my Google Webmaster Tools panel (section: Search Traffic → Manual Actions):

In it, Google tells me that it has found unnatural links pointing to my blog and is imposing a manual filter. Why manual? Because I received a letter. When an automatic filter is applied, no letter arrives.

Well, in short, I looked at this letter, nodded, smiled, and went about my business. Since traffic had already dropped in February because of the duplicates (which I didn't know about at the time), I honestly couldn't care less about filters - pandas, penguins, whatever.

I gave up on everything. I just wrote posts occasionally, was busy moving to a new house, and so on. In short, my activity declined badly. A WHOLE YEAR passed!)))) And only then did I think: isn't it time to figure out how to remove this filter?)))

Only on March 3, 2013 did I send my first reconsideration request to Google, asking them to re-examine my blog for unnatural incoming links and lift the filter:

I tell them what an innocent lamb I am, that I'll never do it again, that I've changed the blog design, all the images are unique, I write all the articles myself, la la la... There's a button for this in the same section:

Sent. Three days later I receive an answer:

So they checked my blog, and it supposedly still violates the quality guidelines; the links to me are still unnatural. OK. I send the request again, this time more indignant: how come, I didn't violate anything, it's not my fault, and so on.

Again I get the answer: no, you did violate them... Guilty...))) In other words, they won't remove the filter. OK! I send the request yet again, now openly fuming: come on, show me the links - there are already 1,500 pointing at me, I'm not a magician, I can't tell which are good and which are bad.

I get this response:

Once again they say the site violates the guidelines, but this time they even showed 3 links as examples. I looked at those links and thought: you've got to be kidding me. How can they be considered unnatural if people placed them on their blogs themselves??? OK, I'll concede one of them - I ran a contest - but what are the other two for?

I write to Google again: comrades, have your algorithms gone mad? You're calling natural links unnatural. People placed them themselves. I get the same answer - your site still violates the guidelines - along with more example links.

I look at those and again see they are all natural. I didn't buy them; people placed them themselves. And so it went: I sent reconsideration requests 9 times, all useless. I was ready to send the support team some choice words, but realized that would lead nowhere good, so I went to the Google forum instead.

I decided to ask there. I explained the situation: Google keeps sending letters saying my blog violates the guidelines - how can that be, when all the links are natural? I got a pile of useless answers and eventually left.

My conclusion was clear: Google unfairly put a filter on me.

In fact, guys, I believe Google imposed the filter on me not for unnatural links at all, but for the decline in the total number of incoming links. I relaxed after Panda, stopped attracting quality incoming links, and that's when Penguin slapped me with its flipper. =)

How I got out of Google Penguin's manual filter

But I wasn't going to sit there crying into a handkerchief and leave everything as it was. At the beginning of 2014 I decided to pull myself together and get rid of this scourge, this Google filter. I started digging through everything on filter removal: read a pile of articles and case studies, watched videos, attended SEO webinars, went through courses, etc.

As a result, I got a clear picture of what I needed to do to get the blog out of the filter. I had to do the following:

1. Collect a complete list of all incoming links to my blog
2. Identify the bad links in the list and disavow them in the Disavow Tool
3. Remove the links (the ones Google showed me) from the other sites
4. Start attracting quality natural links

Throughout January and February I did exactly that. On February 22 I sent a reconsideration request, and on March 2 - HURRAY!!! - I received a letter from Google saying the filter had been removed:

After reading that letter I jumped out of my chair, ran around the room in my shorts shouting "Google, I love you!!!", then ran to the store (still in my shorts), bought a bottle of cognac and got drunk as a beast =))) I'm kidding, of course... But my joy knew no bounds.

So there you go! What did I actually do? First, I asked the webmasters to remove the links Google had pointed me to. Almost all of them were removed without any problems - many thanks to those people! But there were also links I couldn't get removed. What do you do with those?

We paste the whole list into a simple txt document:

OK. Now we have a complete list of all domains linking to our blog. Next we need to identify all the bad links in this list and tell Google not to take them into account at all when ranking.
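To automate the step above - turning a raw backlink export into that txt list of referring domains - a short Python sketch could look like this. The file names are placeholders, not anything a specific service produces:

```python
from urllib.parse import urlparse

# Placeholder file: one backlink URL per line, exported from any checker.
with open("backlinks_raw.txt", encoding="utf-8") as f:
    urls = [line.strip() for line in f if line.strip()]

# Deduplicate by referring domain, keeping the original order.
seen, domains = set(), []
for url in urls:
    host = urlparse(url).netloc.lower().removeprefix("www.")
    if host and host not in seen:
        seen.add(host)
        domains.append(host)

with open("incoming_domains.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(domains))

print(f"{len(urls)} links -> {len(domains)} unique domains")
```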

Google has a special tool for this called the Disavow Tool. We upload a list of links that Google should not count, and after some time (a few updates later) those links stop being taken into account.
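For reference, the disavow file itself is just a UTF-8 .txt file with one entry per line: either a full URL, or a whole domain prefixed with domain:. Lines starting with # are comments. A minimal example (the domains are made up):

```
# Links I could not get the webmasters to remove
http://spam-site.ru/some-page.html
http://another-site.ru/links.html

# Entire domains that are pure link dumps
domain:bad-directory.ru
domain:links-dump.com
```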

I'll show you how this is done below, but first the question: how do you work out which links in the list are good and which are bad? There are 2 options here:

1. Go through every link by hand and judge each resource yourself
2. Run the whole list through the Checktrust service

The first option works, of course, but it's not very effective. Why? Because you can look at a resource and decide it seems normal when it is actually low quality. Quality is, above all, TRUST and SPAM metrics. More on that later.

In other words, you keep a link, thinking it's good, when it's actually bad. The first option is also bad because it eats a lot of time. I have 1,500 incoming links here. Imagine going through every single one by hand. That's rough. =)

The second option is great because the Checktrust service itself crawls all the links you feed it, picks out all the bad, untrusted, spammy ones, and gives you a READY-MADE FILE that you just upload to the Disavow Tool - and that's it.

I strongly recommend this option, as it is a 100% solution. I sent Google reconsideration requests 9 times, and only on the 10th - when I used this service, got the ready file and sent it to the Disavow Tool - was the filter removed.

To use the Checktrust service you only need to:

1. Register
2. Add the list of links to check to the BackLinks Checker tool
3. Get the finished file with the bad links
4. Send it to the Disavow Tool

Let's begin. You'll have no trouble registering. Next, in your account, create a new BackLinks Checker project:

Click "Next", then "Confirm and place the project". That's all. Within 5-10 minutes, often sooner, the service will process all the links and give you a ready-made file to upload to the Disavow Tool to reject the bad links.

As soon as Checktrust has finished, go to "Home", click the name of the project, and a txt file with all the bad links downloads automatically:

With this file we go to the Disavow Tool. Click "Disavow links":

Click "Disavow links" once more and upload the file Checktrust gave us:

Done! The bad links are disavowed! But that alone wasn't enough to get out of the filter. I also had to restore the growth dynamics of my link mass - that is, get various sites and blogs linking to me again.

In short, I needed to start improving my link profile, which is what I did. What a link profile is and how to analyze and improve it, I explained in my course "How to Become a Blogger-Thousandaire 3.0".

Naturally, the most important part of improving a link profile is a constant influx of new, high-quality links. New incoming links have to keep arriving, and here too the Checktrust service helped me - and still does.

It is now genuinely the number one tool in my blog promotion. I also explained how to use it when buying links in the KSBT 3.0 course. The owner of the service is Alexander Alaev, whom I've mentioned more than once. Sanya, thanks for such a cool tool.

It really keeps you from throwing away 90% of your budget when buying links. In general, I began receiving incoming links every day (free ones and paid ones bought on exchanges). Have you heard that Yandex has canceled links? =) Sure, keep believing that.))))

So that's how I got out of Google's manual filter. There is also an automatic one. How does it differ from the manual one? Like this: your blog (site) drops out of the top 50-100 for all positions and traffic sags sharply, but you get no notification in the Google Webmaster Tools panel. Simple!

And while you can in principle get rid of a manual filter quickly using Checktrust alone, getting rid of an automatic one is much harder, because it is lifted not by people who review your reconsideration request and make a decision, but by robots (algorithms).

The robots have certain thresholds for exiting the filter, and until your resource reaches those numbers, you won't get out from under it; you also have to wait for the Google Penguin algorithm to update. The exit scheme for the automatic filter is the same as for the manual one:

If you see, for example, that you have good traffic from Yandex but next to nothing from Google, that's a signal: you are under a filter, either Panda or Penguin. You already know what Panda punishes for, so if everything on the Panda front is "sehr gut", then it's the Penguin side that's not so good.

You need to act. But don't panic and rush to disavow links right away. Maybe you should first look at all your incoming links, analyze them in Checktrust, look at their quality, and assess the whole picture.

That is, see whether bad or good links predominate. If you really see that few quality resources link to you, you need to fix that and start attracting quality links. Note: QUALITY ones!

Well, that's about it. I'll finish here, and in conclusion I have an interesting offer for you:

ANYONE WHO REGISTERS WITH THE CHECKTRUST SERVICE VIA MY AFFILIATE LINK BEFORE AUGUST 15, 2014 AND TOPS UP THEIR ACCOUNT BALANCE BY AT LEAST 1000 RUBLES WILL RECEIVE A VERY NICE PRESENT FROM ME.

AFTER REGISTERING AND TOPPING UP YOUR BALANCE, JUST WRITE TO ME VIA SUPPORT AND RECEIVE YOUR GIFT. IF YOU TOP UP YOUR BALANCE BY 2000 RUBLES, THEN IN ADDITION TO THE GIFT I WILL PERSONALLY REVIEW YOUR BLOG, ANALYZE IT FOR ERRORS AND TELL YOU WHAT NEEDS FIXING.

AND IF, FOR EXAMPLE, YOU ARE HAVING DIFFICULTIES ELIMINATING DUPLICATES ON YOUR BLOG, WHICH I TALKED ABOUT IN THE TWO PREVIOUS ARTICLES, I WILL HELP YOU WITH THAT TOO AND SET EVERYTHING UP!

BUT! Don't think I wrote this article just to promote the Checktrust service and earn commissions. Not at all. It is a genuinely useful service that I myself use almost daily, and I highly recommend it.

I'm making this offer because I'm taking part in a Checktrust competition with an iPhone 5 as the prize, which I'd like to win and give to the winner of my own contest. =) That's how it is.

If you have questions, ask in the comments. Though I think I've explained everything clearly.

Guys, do you know why most bloggers today are in a sad state with traffic from Google and Yandex? Yes, technical mistakes play a role, but one of the most important reasons is a weak incoming link mass, or its complete absence.

I get so many letters in support: "Alexander, what should I do? I've been running this blog for two years and there's still no search traffic. The articles are unique, interesting, useful, I have lots of readers... HELP!"

It's great that you have many readers and your articles are wonderful, but that's not enough. You have to show and prove that they really are that good, and only quality, trusted incoming links can show that. The more of them, the better.

If an authoritative site links to you, that is a direct indicator that your site is interesting. If you've been blogging for 2-3 years and have 100 incoming links, don't expect mega traffic. Look at the projects that reach tens of thousands of visitors.

There, link promotion is planned before the project even starts. There are whole methodologies: analysis, link promotion planning, and so on. That is why some resources grow well while others go nowhere.

One last thing: use link promotion and you'll be happy, and don't listen to any claims that links don't work or have been canceled. Good luck!

P.S. Probably the very last question you'll ask: "So what should I do now? I didn't get a letter from Google, it seems I'm not under a filter, but there's hardly any traffic. What then?"

Then simply evaluate the quality of all links pointing to your blog through the Checktrust service (I covered how in KSBT 3.0), and if the picture is sad - few links, most of them poor quality - take action.

Trust me: start receiving quality links and you will see traffic grow. I demonstrated this clearly right up to February 2012, when Panda got me for the duplicates.

Well, that's it, bye everyone! I look forward to your questions in the comments!

Best regards, Alexander Borisov

Penguin is a Google search engine algorithm whose original role was to combat unnatural (purchased) links. Penguin 1.0 was first released on April 24, 2012.

Subsequent updates followed.

The launch of the Penguin 4.0 update was delayed several times: the release was planned for the end of 2015, but work on improving the algorithm continued until September 2016.

Below is a translation of an article from the official Google blog: https://webmasters.googleblog.com/2016/09/penguin-is-now-part-of-our-core.html

Penguin became part of Google's core algorithm

Google's algorithm uses more than 200 unique signals, or "hints", to quickly find what a user is looking for. These signals include particular words on a site's pages, content update frequency, the site's region, and PageRank. One of these signals is the Penguin algorithm, launched in 2012 and updated on September 23, 2016.

After a period of improvement and testing, Google is now rolling out the Penguin update in all languages. Here are the key changes you will see; they were among webmasters' most popular requests:

  • Penguin now works in real time. Historically, the list of sites affected by Penguin was refreshed periodically. When webmasters improved their sites, many Google algorithms picked up the changes fairly quickly, but algorithms like Penguin needed a separate data refresh. Now Penguin's data is updated in real time, so changes on a site are noticed by Google much faster; as a rule, they take effect as soon as the robot recrawls and reindexes the pages. It also means Google is not going to announce further updates.
  • Penguin has become more granular. It now demotes spam by adjusting rankings based on spam signals, rather than affecting the ranking of the site as a whole. In other words, one or several pages are demoted in the results, not the entire site.

The web has changed significantly over the years, but as Google said in its original post, webmasters should be free to focus on creating compelling and useful sites. It's also worth remembering that updates like Penguin are just one of more than 200 signals Google uses to rank sites.

If you are wondering how to get a site out from under Penguin, or how to remove a Google manual filter, this guide will help you solve those problems and regain your positions and traffic.

Google has dozens of well-known filters that can seriously affect promotion, plus hundreds that few people know about.

Today we will talk about the most basic and common filters. Namely:

  1. Manual Google filter for artificial incoming links
  2. Automatic Google filter for backlinks

Both relate, one way or another, to the algorithm called Google Penguin, which came into force more than a year ago and has already made plenty of noise.

Symptoms of such a filter

  1. a collapse in positions
  2. a sharp drop in site traffic

In practice it will look like this:

Not a pleasant situation, especially when search traffic is your main source of visitors.

Now let’s take a closer look at each type of filter for backlinks.

Manual Google filter for artificial incoming links

It usually starts with a message in the Google Webmasters panel, which looks like this:

The notification contains a message of this nature:

Messages can be different, for example:

To find manual filter messages, do this:

After you receive a notification about artificial incoming links, the consequences usually follow one of two patterns:

A) Positions drop significantly within 2-3 weeks, after which traffic from Google search disappears

B) Positions drop immediately and traffic falls

Reasons for Google Manual Filter

The main signal that triggers such a notification is anchor-text overspam in your backlinks.

The example shows the anchor texts of backlinks to one of the sites hit by the filter. The main reason for the filter is anchor spam.

Another example:

If you look at unique domains, the picture is as follows:

Using backlinks with commercial or other keyword anchors leads to a manual filter, lost rankings, and lost traffic.

So what should you do?

The solution is extremely simple: do not let links with keyword anchors make up a large percentage of your link profile.
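As an illustration, here is a small Python sketch that estimates what share of a link profile uses "money" anchors. The anchor list and stop-word list are invented for the example and would need adapting to your niche:

```python
# Hypothetical anchor texts pulled from a backlink report.
anchors = [
    "buy plastic windows",     # commercial exact-match anchor
    "plastic windows moscow",  # commercial exact-match anchor
    "here",                    # natural anchor
    "example.com",             # natural (branded/URL) anchor
    "this post",               # natural anchor
]

# Illustrative stop-list of "money" keywords; adjust to your niche.
money_words = {"buy", "cheap", "price", "order", "windows"}

commercial = [a for a in anchors if money_words & set(a.lower().split())]
share = 100 * len(commercial) / len(anchors)
print(f"Keyword anchors: {share:.0f}% of the profile")  # 40% here
```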

Step-by-step instructions for removing Google manual filter

  1. Google Notification
    Check whether there is a notification in Google Webmaster Tools. If there is, move on to the next point.
  2. Request for review
    In your first review request, ask which links violate the search guidelines and what needs to be done to have the manual sanctions removed.
  3. We get the answer

    In most cases the response lists the links because of which the site violates the search guidelines. For example:

    From this we can tell which links Google pays attention to and considers spam.

  4. We carry out the indicated actions

    What you do next depends heavily on your link profile.

    Situation 1

    A large number of rented links point to your site, and in most cases they have direct anchors containing keywords.

    In this case, it is necessary:

    1. remove the keyword-anchor links (including those in Google's example)
    2. move on to the next point - a new request for review

    If you really do remove most of these links, the manual filter can be lifted after just one or two review requests.

    Situation 2

    The links cannot simply be removed (for example, you cannot reach the webmasters). In this case it is necessary to:

    1. review all your backlinks (you can use backlink checking services, for example: Ahrefs, MajesticSeo, LinkPad)
    2. make a list of the unwanted links (main criteria: the anchor and the low quality of the site)
    3. add the links to the Disavow Tool - https://www.google.com/webmasters/tools/disavow-links-main?/
    4. wait 1-2 weeks (in practice, the links need to be re-indexed)
    Then move on to the next point and submit a request for review.
  5. Submitting a new request for review

    Your reconsideration request should:

    1. clearly describe what you did
    2. be simple and clear
    3. ask what else needs to be done to remove the filter

    After that, your site needs some time to be re-reviewed. This can take anywhere from a couple of days to 3-4 weeks.

  6. Intermediate answer

    The intermediate response usually looks like this:

    This notification from Google arrives almost immediately after you submit your review request.

  7. We are waiting for a decision

    The decision usually arrives within 1-2 weeks, sometimes longer, sometimes faster.

    It may contain a negative answer, for example:

    If the answer is negative, then:

    1. review the links again
    2. add them to the Google Disavow Tool
    3. or simply remove the links
    4. in any case, take action
    5. submit a new request for review

    But after the steps above, the answer may well be positive, for example:

    In this case, congratulations! The Google manual filter has been removed.

In our example there was no quick exit from Google's manual filter; here is the chronology of events:

Important points when removing the filter yourself:

  1. don't give up
  2. take action (don't just send review requests)
  3. if you do everything described above, the manual Google filter will be removed and your positions and traffic will return

Here is information that will complement this material and help you successfully remove the filter:

  1. USA themes
  2. Case study: how to remove manual Google sanctions using the Disavow Tool and return positions and traffic
  3. Case study: how to remove Google’s manual sanctions and return positions and traffic

There is nothing terrible about a manual filter; the most important thing is not to be lazy and to carry out the series of actions that lead to the desired result.

The result is traffic returning to its previous level, for example:

This example covered a couple of sites whose filters were removed at different times.

Time to filter removal:

  1. the record: 4 days
  2. the longest: 3.5 months

In any case, act promptly and keep in mind the points described above.

Automatic filter for backlinks

With Google's automatic filter everything is much more complicated, because there is no message about it.

Important: automatic sanctions can only be lifted with an official update of the Google Penguin algorithm, which happens on average once every 2-4 months.

No matter what you do, if the sanctions are automatic, nothing will change until the algorithm updates.

We have removed the automatic filter more than once, but it doesn't always happen quickly.

Filter Signs

The signs are the same as with the manual filter, but no notification arrives:

  1. positions first drop by about 30-50 places
  2. then site traffic falls

And then it doesn't come back for a long time.

Main reasons:

  1. Direct (exact-match) link anchors
  2. Low-quality backlinks

Step-by-step guide to removing the automatic filter

In this case, unlike the manual filter, everything takes an order of magnitude longer and is more complicated.


In conclusion about the Google Penguin filter

The algorithm was launched in 2012 and is constantly being improved. Its main task is to combat spam and artificial manipulation of search results.

To avoid falling under the filter, you need to:

  1. don't make direct-match anchors a noticeable share of your anchor list, or avoid them entirely
  2. attract quality links to the site
  3. try to focus on the product and its quality so that you earn natural links (yes, this can be done even in the realities of the RuNet)
  4. maintain momentum - Google likes steady dynamics

Then you will have no problems with filters, and your traffic will grow steadily. Provided, naturally, that you work on:

  1. site content
  2. attracting natural links (not purchased)

Even if you have already stepped on this rake and been hit with Google sanctions, don't despair: there is always a solution.

I hope this step-by-step guide helps you solve your problem and return your positions and traffic from the Google search engine.

Important: This article only covers filters related to the Google Penguin algorithm and backlinks.

If your rankings and traffic have dropped sharply, the reason may not be links at all but, for example, content. That is the domain of the Google Panda algorithm.

Good luck! See you soon on the pages of this blog.



We all know very well, dear friends, how search engines labor over their algorithms, filters, and so on, to give ordinary internet users exactly what they are looking for. At the same time, those algorithms clean the web of spam and low-quality resources that clawed their way into the TOP by hook or by crook. "That won't do," the minds at the search giants decided, and once again the folks at Google acted on that thought.

For the second week now the webmaster corner of the web has been buzzing, most of it indignant: "Google rolled out something or other, and this hellish mixture cut our positions, which in turn cut our traffic." Indeed, while analyzing competition in the niches that interest me, I noticed serious changes: traffic on many competitive resources was cut by 10, 20, 50 or more percent. No need to look far - check some SEO blogs; it's strange to see their traffic at 150-200 users a day.

So, what did Google come up with...

On April 20, 2012, a message from Google's developers appeared online with roughly the following content:

“In the coming days we are launching an important algorithm change targeting Webspam. The changes will reduce the rankings of sites that we believe violate Google's site quality requirements."

On the night of April 24-25, the new Google algorithm - Google Penguin - went live. Google's love of animals caused a lot of buzz: on the Searchengines forum alone, several threads with a huge total page count (more than 500) sprang up discussing the new algorithm. As always, the dissatisfied vastly outnumbered the satisfied, since the satisfied sit quietly and don't give away the wins that Google Penguin would only be too happy to eat.

First, let's review the basic site quality requirements that Google puts forward:

  1. Do not use hidden text or hidden links. Google - and not only Google - has long marched under the banner of putting people first. Anything a person cannot see is interpreted by search engines as an attempt to influence the results, which counts as manipulation and is punished with pessimization or other "damage". I remember hidden text being very fashionable at one time, of course, because it worked: black text on a black background with certain optimization - you'd land on the page and wonder where the thing the search engine showed you actually was.
  2. Do not use cloaking or sneaky redirects. Don't try to serve the search robot one thing and the user another; the content should be the same for everyone. As for redirects, those who like to milk money from redirects or sell mobile traffic are in for some disappointment from Google.
  3. Do not send automated queries to Google.
  4. Do not overload your pages with keywords. Yandex has long championed this principle, and its stance against over-optimization clearly killed the work of keyword-stuffing adherents. If you used to be able to stuff articles with keywords and enjoy Google traffic, now, as you can see, it's not so simple. And don't forget about user factors, the so-called behavioral factors. If they are at a good level, even slight over-optimization is not a problem, because behavioral factors have always been, and most likely will remain, a priority. The catch is that this is harder than it sounds: think about what you need to deliver, and in what volume, for those behavioral factors to be at their best. It means the content must genuinely be high-level and interesting, not a light rewrite of a competitor or niche leader.
  5. Do not create pages, domains or subdomains that substantially duplicate the content of your main site or any other site. Here Google bundled together its views on affiliates, site networks, doorways, copy-paste and low-quality rewrites.
  6. Do not create malicious pages, such as phishing pages or pages containing viruses, trojans or other malware. This point needs no elaboration: fight malware, in a word.
  7. Don't create doorways or other pages designed just for search engines.
  8. If your site works with affiliate programs, make sure it provides value to the web and to users. Give it unique and relevant content that attracts users in its own right.

Google named these 8 principles as the main ones, and specially highlighted 4 more:

  1. Create site pages primarily for users, not search engines. Don't use cloaking or other such schemes; be completely transparent.
  2. Don't try to use tricks to raise your site's search engine ranking.
  3. Don't participate in link-building schemes designed to boost your site's ranking or Google PageRank. In particular, avoid links that look like spam, link dumps, and bad neighbors on your server (pay attention to analyzing neighboring sites if your site sits on ordinary shared hosting).
  4. Don't use unauthorized software to contact Google automatically. The developers specifically single out tools such as WebPosition Gold™.

But all of this was already familiar to anyone who had looked into the question before.

What surprised me in the new Google Penguin's behavior was its strict enforcement of the rule about advertising load on the first page. Remember, it was said that the home page should not be overloaded with advertising? Google Penguin clearly began enforcing that: it cut traffic to sites with several large ad blocks on the main page, even when those blocks were Google's own AdSense. 🙂

Observations on the work of Google Penguin

Now I'd like to share a number of my own observations. I understand they are approximate and represent my personal data (treat the figures as relative), but they have a right to exist. I analyzed 20 sites, both mine and my partners'. The analysis took me 3 days, and I assessed many indicators. As you'd expect, the sites were promoted with different schemes and algorithms and had completely different metrics, which made it possible to draw a number of conclusions.

1. Exact-match anchors. If you previously needed a lot of exact-match anchors to sit pretty in Google, with Google Penguin it is now exactly the other way around. This may be precisely the fight against webspam. Yandex has long favored diluted anchors, and now it's Google's turn.

  • pages where 100% of external links used exact-match anchors: a drawdown of 75-90%, an average drop of about 38 positions;
  • pages where 50% of external links used exact-match anchors: a drawdown of 15-20%, an average drop of about 9 positions;
  • pages where less than 25% of external links used exact-match anchors: growth of 6-10%, an average rise of 30 positions.

From this data I drew a conclusion: dilute your anchors, and dilute them as inventively and deeply as possible.
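To keep the observed buckets handy, here is a tiny Python sketch that maps a page's share of exact-match anchors to the drawdowns listed above. The thresholds simply restate the author's observations, not official Google numbers:

```python
def exact_entry_verdict(exact_share: float) -> str:
    """Map the share of exact-match anchors (0-100) to the observed buckets."""
    if exact_share >= 100:
        return "75-90% drawdown, ~38 positions lost"
    if exact_share >= 50:
        return "15-20% drawdown, ~9 positions lost"
    if exact_share < 25:
        return "6-10% growth, ~30 positions gained"
    return "gray zone between the observed buckets"

print(exact_entry_verdict(50))  # -> "15-20% drawdown, ~9 positions lost"
```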

A striking example is the query "gogetlinks" on this blog. Exact-match occurrences greatly outnumber diluted ones, and here is the result:

2. Buying temporary links. On all the analyzed resources I bought temporary links, either to push queries up or for quick results, through different systems: Sape, Webeffector, Seopult and ROOKEE.

The automatic promotion aggregators Webeffector and ROOKEE gave roughly the same results. There was practically no drawdown - only a small one on Webeffector, and it is insignificant and mostly due to anchor dilution. In places there was even growth. See for yourself; here is a screenshot of the campaign (clickable):

With Sape, the picture is completely different. Every project buying links on Sape sank. All the queries promoted through that exchange flew out of the TOP 100, and gathering statistics on where exactly they landed was so depressing that in the end I didn't.

Analyzing Google Penguin's impact on promotion through Sape, I concluded that Google now treats link placements from this exchange as unnatural.

At that point I started actively removing links. But why limit myself to my own examples when there are more striking ones from our own niche. Take my friend Sergey Radkevich's blog, upgoing.ru. He worked with Sape for several months and was happy with his traffic growth, until Google Penguin arrived. Look:

It's also worth looking at the search traffic sources chart:

As you can see, Google Penguin cut traffic from Google by more than 7 times.

The conclusion here: when working with temporary links, you still need filters and some kind of placement and purchasing algorithm. Automated services, unlike Sape, work to defined schemes, and the results, as you can see, speak for themselves.

Sites promoted with Seopult even tended to gain positions. There I used the Seopult Max algorithm for Yandex, but as I can see, it now works for Google too.

3. Content over-optimization. A decline was noticeable here too, but not as significant as in the previous parameters. Only 10-15% of over-optimized articles lost ground, and within about 10 positions.

My conclusion: slight over-optimization isn't so scary if you hedge your bets, and any shortfall in rankings can be made up by purchasing links.

4. Eternal (permanent) links. Queries promoted with eternal links that looked normal and natural only strengthened their rankings. Some high-frequency, high-competition queries climbed into the TOP 20 without any manipulation, simply thanks to the visible decline of most competitors. Here I conclude once again that my strategy of promoting ONLY with eternal links is correct.

1. Take all of the above into account and work on both your site's content and its promotion.

2. Check Google Webmaster Tools for notifications about spam activity on your resource. To do this, go to http://www.google.com/webmasters/, log in, open your site's statistics and go to the messages section (clickable):

If there are messages, you will need to take measures to fix the problems they describe. This may include removing links... Not all yoghurts are equally healthy... 😉

3. Check the site for malware in Google Webmaster Tools:

The solution is the same as in the previous point: identify the programs and files Google flagged as malicious, find them on the server, and replace or delete them.

If things are really bad and you've given up, fill out the form and sign the petition to Google against Google Penguin. This guarantees you nothing, of course, but at least you'll get a moment of small self-satisfaction and the feeling of "I did everything I could". On the same note, you can use the feedback form to contact the algorithm's developers.

Personally, I lost only a little, since my main bet was on genuinely people-first sites and promotion with eternal links. I shook off the negative effects within 2 days and kept growing at the same pace. As for those who loved doing everything fast and easy - you're mostly left licking your wounds.

As a friend of mine put it: "Now the hack jobs will show. Big changes await those who love it fast, cheap and easy. Those who rake in money promoting mere mortals online while openly hacking it out will meet with fiasco and a flood of client complaints"...

You can post your thoughts about Google Penguin or the results of the changes in the comments.

Happy and quality changes to everyone!


07.11.17

In 2012, Google officially launched its "webspam algorithm", aimed at spam links and link manipulation practices.

The algorithm later became officially known as Google Penguin after a tweet from Matt Cutts, who went on to head Google's webspam team. Although Google officially named the algorithm Penguin, it never commented on where the name came from.

The Panda algorithm got its name from one of the engineers who worked on it. One theory about the Penguin name is that it references the Penguin, a character from DC's Batman comics.

Before Penguin, the sheer number of links played a significant role in how Google's crawlers scored web pages.

This meant that when sites were ranked by those scores, some low-quality sites and pieces of content appeared in undeservedly high positions.

Why was the Penguin algorithm needed?

Google's war on low-quality search results began with the Panda algorithm; Penguin became an extension and addition to that arsenal.

Penguin was Google's response to its increasing practice of manipulating search results (and rankings) through spam links.

The Penguin algorithm processes only links pointing to a site. Google analyzes incoming links and pays no attention to the site's outgoing link mass.

Initial launch and influence

At its first launch in April 2012, the Google Penguin filter affected more than 3% of search results, by Google's own estimate.

Penguin 2.0, the fourth update of the algorithm (counting the original version), was released in May 2013 and affected about 2.3% of all search queries.

Key changes and updates to the Penguin algorithm

There have been several changes and updates to the Penguin algorithm since its launch in 2012.

Penguin 1.1: May 26, 2012

This was not a change to the algorithm itself but the first refresh of its data. Sites initially hit by Penguin that had since gotten rid of low-quality links saw some recovery in rankings, while other sites untouched by the original launch felt its impact for the first time.

Penguin 1.2: October 5, 2012

This was another data refresh. It affected English-language queries as well as international ones.

Penguin 2.0: May 22, 2013

A more technically advanced version of the algorithm that changed the extent of its influence on search results. Penguin 2.0 affected about 2.3% of English-language queries and roughly the same share of queries in other languages.

It was also the first Penguin update to look beyond a site's home page and top-level pages for evidence of spam links.

Penguin 2.1: October 4, 2013

The only update to Penguin 2.0 (version 2.1) was released on October 4 of the same year and affected about 1% of search queries.

Although Google gave no official explanation for the update, the statistics suggest it increased the depth of page analysis and added further checks for spam links.

Penguin 3.0: October 17, 2014

This was another data refresh. It allowed many penalized sites to regain their positions, while other sites that abused spam links but had slipped past previous versions of Penguin finally felt its impact.

Google employee Pierre Far confirmed this and noted that the update would need "a few weeks" to roll out fully, and that it affected less than 1% of English-language search queries.

Penguin 4.0: September 23, 2016

Almost two years after update 3.0, the final change to the algorithm was released, and Penguin became part of Google's core search algorithm.

Now, running alongside the core algorithm, Google Penguin 4 evaluates sites and links in real time. This means changes to your external links are reflected in your site's Google rankings relatively quickly.

The updated Penguin is also no longer aimed strictly at sanctions: it devalues spam links instead. This is the opposite of previous versions, which punished sites for bad links. That said, research shows that algorithmic penalties based on external links are still applied today.

Algorithmic downgrades in Penguin

Shortly after Penguin launched, webmasters who practiced link manipulation began to notice drops in search traffic and rankings.

Not all Penguin demotions affected entire sites. Some were partial, hitting only certain groups of keywords that had been actively flooded with spam links or "over-optimized".

In one documented case, a site sanctioned by Penguin took 17 months to regain its positions in Google's search results.

Penguin's effect also carries across domains, so changing domains and redirecting the old one to the new can create even more problems.

Research shows that using 301 or 302 redirects does not negate the impact of the Penguin algorithm. On the Google webmaster forum, John Mueller confirmed that meta refresh redirects from one domain to another can also lead to sanctions.

Recovering from the effects of the Penguin algorithm

The disavow links tool has always been useful to SEO specialists, and that hasn't changed now that Penguin runs as part of Google's core algorithm.

What to include in a disavow file

The disavow file contains links that Google should ignore, so that low-quality links will not drag the site's rankings down under the Penguin algorithm. But it also means that if you mistakenly include high-quality links in the file, they will no longer help the site rank.

You don't need to include comments in the disavow file unless you want them for your own reference.

Google doesn't read the comments in the file, because it is processed automatically. Still, some people find it convenient to add explanations for later, for example the date a group of links was added, or notes about attempts to contact a site's webmaster to get a link removed.

After you upload your disavow file, Google sends a confirmation. But although Google processes the file immediately, it does not disavow the links at that same moment, so you won't see your positions recover right away.

There is also no way to tell which links have been disavowed and which haven't: Google still includes both in the external links report available in Google Search Console.

If you previously uploaded a disavow file to Google, a new upload replaces it rather than being appended to it. It is therefore important to make sure the new file includes the old links. You can always download a copy of the current file in your Google Search Console account.
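Since an upload replaces the previous file, a safe habit is to merge the downloaded copy with your new additions before uploading. A minimal Python sketch (the file names are placeholders):

```python
def read_entries(path):
    """Read disavow entries, skipping blank lines and # comments."""
    with open(path, encoding="utf-8") as f:
        return [ln.strip() for ln in f if ln.strip() and not ln.startswith("#")]

old = read_entries("disavow_current.txt")    # copy downloaded from Search Console
new = read_entries("disavow_additions.txt")  # freshly found bad links

merged = list(dict.fromkeys(old + new))      # union, preserving order
with open("disavow_upload.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(merged))
```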

Disavowing individual links or entire domains

To get a site out from under Google Penguin, it is recommended to disavow links at the domain level rather than disavowing individual links.

That way, Google's robot only needs to visit one page of the offending site to disavow every link leading from it.

Disavowing at the domain level also means you don't have to worry about whether a link is indexed with or without www.
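A small Python sketch showing how a list of bad URLs can be collapsed into domain-level directives (the URLs are made up):

```python
from urllib.parse import urlparse

# Hypothetical bad backlink URLs selected for disavowal.
bad_urls = [
    "http://spam-site.ru/page1.html",
    "http://www.spam-site.ru/page2.html",
    "https://links-dump.com/catalog",
]

# One domain: directive per referring host covers every link from that site,
# with or without www.
domains = sorted({urlparse(u).netloc.lower().removeprefix("www.") for u in bad_urls})
print("\n".join(f"domain:{d}" for d in domains))
# domain:links-dump.com
# domain:spam-site.ru
```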

Detecting your external links

If you suspect your site has been hit by the Penguin algorithm, audit your external links and disavow the low-quality and spammy ones.

Google Search Console gives site owners a list of external links. Check whether your disavow list includes links already marked "nofollow": a nofollow link has no effect on your site. Keep in mind, though, that the site hosting the link can remove the nofollow attribute at any time without warning.
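If you want to spot-check whether a given referring page links to you with nofollow, a rough sketch using only the Python standard library might look like this. The URLs are made-up placeholders, and a real audit would need error handling and politeness delays:

```python
import urllib.request
from html.parser import HTMLParser

class LinkCheck(HTMLParser):
    """Collect whether each link to my_domain carries rel="nofollow"."""
    def __init__(self, my_domain):
        super().__init__()
        self.my_domain = my_domain
        self.results = []  # True = nofollow, False = dofollow

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        d = dict(attrs)
        if self.my_domain in (d.get("href") or ""):
            self.results.append("nofollow" in (d.get("rel") or ""))

# Placeholder referring page and target domain.
html = urllib.request.urlopen("http://referring-site.ru/page.html").read()
checker = LinkCheck("your-blog.ru")
checker.feed(html.decode("utf-8", "ignore"))
print(checker.results)  # e.g. [False, True]
```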

There are also many third-party tools that show links to your site. But because some webmasters block third-party bots from crawling their sites, such tools may not show all incoming links - and some resources use that kind of blocking precisely to hide low-quality links from detection.

Monitoring external links is also a necessary defense against "negative SEO" attacks, in which a competitor buys spam links pointing to your site.

Many people use negative SEO as an excuse when their site is penalized for low-quality links. But Google claims that, since the Penguin updates, the search engine recognizes such attacks well.

A Search Engine Journal survey in September 2017 found that 38% of SEO specialists had never disavowed external links. Going through external links and thoroughly researching each linking domain is no easy task.

Requests to remove links

Some site owners demand a fee to remove a link. Google recommends not paying: just add the bad link to your disavow file and move on to the next one.

Although removal requests are an effective way to restore positions after link-based sanctions, they are not always mandatory. The Penguin algorithm evaluates the external link portfolio as a whole - the ratio of quality, natural links to spam links.

Some sites even include "conditions" for external links to their resource in their terms of use:

We support only "responsible" placement of external links to our web pages. This means that:

  • If we ask you to remove or change a link to our sites, you will do so promptly.
  • You are responsible for ensuring that your use of links does not damage our reputation or bring you commercial gain.
  • You may not create more than 10 links to our websites or web pages without our permission.
  • The site on which you place a link to our resource must not contain offensive or defamatory material and must not violate anyone's copyright.
  • If someone clicks your link, our site should open in a new page (tab), not inside a frame on your site.

Link quality assessment

Don't assume a link is high quality just because it sits on a .edu domain. Many students sell links from their personal sites registered in the .edu zone, so such links count as spam and should be disavowed. Many .edu sites also carry low-quality links because they have been hacked. The same goes for every top-level domain.

Google representatives have confirmed that a site's domain zone by itself neither helps nor harms its search positions. You have to judge each specific site on its merits, with the Penguin updates in mind.

Beware of links from seemingly high-quality sites

When reviewing your list of external links, don't assume a link is good just because it comes from a particular site - not unless you are 100% sure of its quality. A link from a reputable site like the Huffington Post or the BBC is not automatically high quality in Google's eyes: many such sites sell links, some of them disguised as advertising.

Promo links

Examples of promotional links that Google treats as paid include links placed in exchange for a free product for review or a discount on a product. While such links were acceptable a few years ago, they must now carry the "nofollow" attribute. You can still benefit from them: they raise brand awareness and bring traffic to your site.

It is extremely important to evaluate every external link. Bad links must be dealt with because they drag down the resource's positions under the Penguin algorithm and can lead to manual sanctions. Good links should not be removed, because they help you rank higher in search results.

What if you can't recover from the Penguin algorithm's sanctions?

Sometimes, even after webmasters put serious work into cleaning up incoming links, they see no improvement in traffic or rankings.

There are several reasons for this:

  • The initial traffic growth and ranking improvements seen before the algorithmic penalty were themselves produced by bad incoming links.
  • When the bad external links were removed, no effort was made to attract new, high-quality links.
  • Not all the spam links were disavowed.
  • The problem isn't the external links at all.

When you do recover from a Penguin penalty, don't expect your former rankings to return immediately - or necessarily at all.

Add to this that Google keeps changing the Penguin filter, so factors that helped in the past may carry less weight now.

Myths and misconceptions associated with the Penguin algorithm

Here are a few myths and misconceptions about the Penguin algorithm that have accumulated over the years.

Myth: Penguin is a punishment

One of the biggest myths about the Penguin algorithm is calling it a penalty (what Google terms manual sanctions).

Penguin is strictly algorithmic in nature; it cannot be applied or removed manually by Google specialists.

Although both the algorithm and manual sanctions can cause a big drop in rankings, there is a major difference between them.

Manual sanctions occur when Google's webspam specialist responds to a complaint, conducts an investigation, and determines that a domain should be sanctioned. In this case, you will receive a notification in Google Search Console.

When manual sanctions are applied to a site, you must not only analyze your external links and submit a disavow file with the spam links, but also send a reconsideration request to the Google team.

An algorithmic demotion happens without any involvement from Google staff - the Penguin algorithm does everything itself.

Previously you had to wait for an algorithm refresh or update, but Penguin now works in real time, so positions can recover much faster (provided the clean-up was done properly).

Myth: Google will notify you if the Penguin algorithm affects your site

Unfortunately, this is not true. Google Search Console will not notify you that your site's search positions have deteriorated because of the Penguin algorithm.

This again shows the difference between the algorithm and manual sanctions: you are notified only when a site is manually penalized. The recovery process, however, is similar in both cases.

Myth: Disavowing bad links is the only way to recover from the effects of the Penguin algorithm

While this tactic can neutralize many low-quality links, on its own it is not enough. The Penguin algorithm weighs the proportion of good, quality links against spam links.

So instead of pouring all your energy into removing low-quality links, focus on attracting quality incoming links. That improves the ratio the Penguin algorithm takes into account.

Myth: You can't recover from the Penguin algorithm

You can get a site out from under Google Penguin, but it takes some experience in dealing with Google's ever-changing algorithms.

The best way to shake off Penguin's negative influence is to write off all the site's existing external links and start collecting new ones. The more quality incoming links you gather, the easier it is to free your site from the grip of the Google Penguin algorithm.

This publication is a translation of the article "A Complete Guide to the Google Penguin Algorithm Update", prepared by the friendly project team.
