There are no quick fixes. What does Google Penguin punish for?


If you are wondering how to get a site out from under Penguin, or how to remove a Google manual penalty, this guide will help you solve these problems and regain your rankings and traffic.

Google has dozens of well-known filters that can seriously affect promotion, as well as hundreds that few people know about.

Today we will talk about the most basic and common filters. Namely:

  1. Manual Google filter for artificial incoming links
  2. Automatic Google filter for backlinks

Both relate to an algorithm called Google Penguin, which came into force more than a year ago and has already made plenty of noise.

Symptoms of such a filter

  1. a collapse in rankings
  2. a sharp drop in website traffic

In practice it will look like this:

Not exactly a pleasant situation. Especially when in most cases your main source of attracting visitors is search traffic.

Now let’s take a closer look at each type of filter for backlinks.

Manual Google filter for artificial incoming links

Often it all starts with a message in the Google Webmasters panel. It looks like this:

The notification contains a message of this nature:

Messages can be different, for example:

To find manual filter messages, do this:

After you receive a notification about artificial incoming links, the consequences are usually one of the following:

A) Within 2-3 weeks, positions drop significantly, after which traffic from Google search disappears

B) Positions immediately disappear and traffic drops

Reasons for Google Manual Filter

The main signal that triggers such a notification is anchor-text overspam in your backlink profile.

The example shows the anchor texts of backlinks to one of the sites hit by the filter; the main reason for the penalty is anchor spam.

Another example:

If you look at unique domains, the picture is as follows:

Building backlinks with commercial or other keyword anchors leads to a manual penalty and loss of rankings and traffic.

So what should you do?

The solution is extremely simple: do not let links with keyword anchors make up a large percentage of your link profile.
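To see whether your own profile crosses that line, you can count how often keyword anchors appear. Below is a minimal Python sketch; it assumes you have exported your backlinks to a CSV with an `anchor` column, and both the export format and the keyword list are hypothetical - adjust them to your own tool and niche:

```python
import csv
from collections import Counter

# Hypothetical list of the commercial keywords you promote.
MONEY_WORDS = {"buy", "cheap", "price", "order"}

def anchor_report(path):
    """Print what share of backlink anchors contain commercial keywords.

    Assumes a CSV export with an 'anchor' column; real exports from
    Ahrefs / MajesticSEO / LinkPad differ, so adjust the column name.
    """
    with open(path, newline="", encoding="utf-8") as f:
        anchors = [row["anchor"].strip().lower() for row in csv.DictReader(f)]
    if not anchors:
        print("no backlinks found")
        return
    money = sum(1 for a in anchors if any(w in a for w in MONEY_WORDS))
    print(f"total backlinks: {len(anchors)}")
    print(f"keyword anchors: {money} ({money / len(anchors):.0%})")
    for anchor, count in Counter(anchors).most_common(10):
        print(f"{count:>5}  {anchor!r}")

anchor_report("backlinks.csv")
```

If keyword anchors dominate the top of this report, that is exactly the overspam signal described above.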

Step-by-step instructions for removing Google manual filter

  1. Google Notification
    Check whether there is a notification in Google Webmaster Tools. If there is, move on to the next step.
  2. Request for review
    In the first review request, it is important to ask Google which links violate the search guidelines and what needs to be done to remove the manual sanctions.
  3. We get the answer

    In most cases, the response points out the links that violate the search guidelines. For example:

    From this we can determine which links Google pays attention to and considers spam.

  4. We carry out the indicated actions

    What you do next depends heavily on your link profile.

    Situation 1

    There are a large number of rented links to your site, most of them exact-match anchors containing keywords.

    In this case, it is necessary:

    1. remove the anchor-text links (including those from Google's example)
    2. move on to the next point - new request for review

    If you genuinely remove most of these links, the manual penalty can be lifted after one or two review requests.

    Situation 2

    In this case, it is necessary:

    1. review all backlinks (you can use backlink-checking services such as Ahrefs, MajesticSEO, LinkPad)
    2. make a list of the unwanted links (main criteria: spammy anchor, low quality of the linking site)
    3. add the links to the Disavow Tool - https://www.google.com/webmasters/tools/disavow-links-main?/
    4. wait 1-2 weeks (in practice, this is needed for the links to be re-crawled)
    Next, move on to the next point and submit a request for review.
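    As a rough illustration of steps 2-3 above, here is a small Python sketch that turns a list of flagged links into a disavow file. The input links and the flagging criteria are hypothetical; disavowing at the domain level (`domain:` lines) is generally safer than listing individual URLs, since one entry covers every page of a spammy site:

```python
from urllib.parse import urlparse

# Hypothetical links you flagged as unwanted (spammy anchor text,
# low-quality linking site) after reviewing your backlink exports.
bad_links = [
    "http://cheap-links.example/page1.html",
    "http://cheap-links.example/page2.html",
    "https://spam-directory.example/seo/12345",
]

# Collapse the URLs to unique domains: one domain: entry per spammy site.
domains = sorted({urlparse(link).netloc for link in bad_links})

with open("disavow.txt", "w", encoding="utf-8") as f:
    f.write("# flagged after manual-action review\n")
    for domain in domains:
        f.write(f"domain:{domain}\n")
```

    The resulting disavow.txt is what you upload in the Disavow Tool before submitting the next review request.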
  5. Submitting a new request for review

    Your request for reconsideration must:

    1. clearly describe what you did
    2. be simple and clear
    3. ask what else needs to be done to remove the filter

    After that, Google needs some time to re-review your site. This may take from a couple of days to 3-4 weeks.

  6. Intermediate response

    The intermediate response usually contains the following content:

    This notification from Google comes almost immediately after you submit your review request.

  7. We are waiting for a decision

    The answer usually comes within 1-2 weeks, sometimes longer, and sometimes faster.

    The answer may be negative, for example:

    If the answer is negative, then:

    1. review the links again
    2. add them to the Disavow Tool
    3. or simply remove the links
    4. in any case, take action
    5. submit a new request for review

    After the steps above, though, the answer may be positive. For example:

    In this case, congratulations! Google's manual filter has been removed.

In our example there was no quick way to get the site out from under Google's manual penalty; here is how the chronology of events looked:

Important points when removing the filter yourself:

  1. do not give up
  2. take action (not just send review requests)
  3. if you do everything as described above, the manual penalty will be removed and your rankings and traffic will return

Here is information that will complement this material and help you successfully remove the filter:

  1. Case study: how to remove manual Google sanctions using the Disavow Tool and regain rankings and traffic
  2. Case study: how to remove Google's manual sanctions and regain rankings and traffic

A manual penalty is nothing to fear; the most important thing is not to be lazy and to carry out the series of actions that lead to the desired result.

The result is traffic returning to its previous level, for example:

This example covered a couple of sites whose penalties were removed at different times.

Time needed to remove the penalty:

  1. the record: 4 days
  2. the longest: 3.5 months

But in any case, you need to act promptly, and also take into account the points described above.

Automatic filter for backlinks

With Google's automatic filter everything is much more complicated, because no notification about it is sent.

Important: automatic sanctions can only be lifted with an official update of the Google Penguin algorithm, which happens on average once every 2-4 months.

No matter what you do, until the algorithm is updated, nothing will change if these are automatic sanctions.

We've removed the automatic filter more than once, but things don't always happen quickly.

Filter Signs

Everything is the same as with the manual penalty, except that no notification arrives.

  1. rankings first drop by 30-50 positions
  2. then traffic to the site falls

And it does not come back for a long time.

Main reasons:

  1. Exact-match link anchors
  2. Low quality backlinks

Step-by-step guide to removing the automatic filter

In this case, unlike the manual penalty, everything takes an order of magnitude longer and is more complicated.


In conclusion about the Google Penguin filter

The algorithm was launched in 2012 and is constantly being improved. Its main task is to combat spam and artificial manipulation of search results.

To avoid falling under the filter, you need to:

  1. use exact-match anchors sparingly, or not at all
  2. attract quality links to the site
  3. focus on the product and its quality in order to earn natural links (yes, this can be done even in the realities of the RuNet)
  4. maintain momentum - Google likes steady dynamics

Then you will have no problems with filters and your traffic will grow steadily - provided, naturally, that you work on:

  1. site content
  2. attracting natural links (not purchased)

Even if you have already stepped on this rake and received sanctions from Google, do not despair - there is always a solution.

I hope this step-by-step guide helps you solve the problem and regain your rankings and traffic from Google search.

Important: This article only covers filters related to the Google Penguin algorithm and backlinks.

If your rankings and traffic have dropped sharply, the reason may not be the links at all but, for example, the content. That is the territory of the Google Panda algorithm.

Good luck! See you soon on the pages of this blog.



Although changes to Google's algorithms are one of the hottest topics in SEO, many marketers cannot say with certainty exactly how the Panda, Penguin, and Hummingbird algorithms affected their sites' rankings.

Moz specialists have summarized the most significant changes to Google's algorithms and laid out exactly what each update is responsible for.

Google Panda – Quality Inspector

The Panda algorithm, whose main goal is to improve the quality of search results, was launched on February 23, 2011. With its appearance, thousands of sites lost their rankings, which stirred up the entire Internet. At first, SEOs thought Panda was penalizing sites found to be participating in link schemes. But, as later became known, fighting unnatural links is not within this algorithm's mandate. All it does is assess the quality of a site.

To find out if you are at risk of falling under Panda's filter, answer these questions:

  • Would you trust the information posted on your website?
  • Are there pages on the site with duplicate or very similar content?
  • Would you trust your credit card information to a site like this?
  • Do the articles contain spelling or stylistic errors or typos?
  • Are articles for the site written taking into account the interests of readers or only with the goal of getting into search results for certain queries?
  • Does the site have original content (research, reports, analytics)?
  • Does your site stand out from the competitors that appear alongside it on the search results page?
  • Is your site an authority in its field?
  • Do you pay due attention to editing articles?
  • Do the articles on the site provide complete and informative answers to users' questions?
  • Would you bookmark the site/page or recommend it to your friends?
  • Could you see an article like this in a printed magazine, book, or encyclopedia?
  • Does advertising on the site distract readers' attention?
  • Do you pay attention to detail when creating pages?

Nobody knows for sure what factors Panda takes into account when ranking sites. Therefore, it is best to focus on creating the most interesting and useful website. In particular, you need to pay attention to the following points:

  • "Insufficient" content. In this context, the term “weak” implies that the content on your site is not new or valuable to the reader because it does not adequately cover the topic. And the point is not at all in the number of characters, since sometimes even a couple of sentences can carry a considerable semantic load. Of course, if most of your site's pages contain only a few sentences of text, Google will consider it low quality.
  • Duplicate content. Panda will consider your site low quality if most of its content is copied from other sources or if the site has pages with duplicate or similar content. This is a common problem for online stores that sell hundreds of products differing in only one parameter (for example, color). To avoid this problem, use the canonical tag (see the snippet after this list).
  • Low-quality content. Google loves sites that are constantly updated, so many SEOs recommend publishing new content daily. However, if you publish low-quality content that provides no value to users, such tactics will do more harm than good.
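For the duplicate-content point above, the canonical tag is a single line in the page's `<head>`; the URL here is a placeholder:

```html
<!-- On each product variation page, point search engines to one main version -->
<link rel="canonical" href="https://example.com/products/t-shirt/" />
```

With this in place, the color variations are consolidated under one URL instead of competing with each other as duplicates.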

How to get out from under the Panda filter?

Google updates the Panda algorithm monthly. After each update, search robots review all sites and check them against the established criteria. If you fell under the Panda filter and then fixed the site (replaced thin, low-quality, and non-unique content), your site's rankings will improve after the next update. Note that you will most likely have to wait several months for your rankings to recover.

Google Penguin – Link Hunter

The Google Penguin algorithm was launched on April 24, 2012. Unlike Panda, this algorithm aims to combat unnatural backlinks.

The authority and significance of a site in the eyes of search engines largely depends on which sites link to it. Moreover, one link from an authoritative source can have the same weight as dozens of links from little-known sites. Therefore, in the past, optimizers tried to get the maximum number of external links in every possible way.

Google has learned to recognize various link manipulations. Exactly how Penguin works is known only to its developers. All SEOs know is that this algorithm hunts for low-quality links created manually by webmasters in order to influence a site's rankings. These include:

  • purchased links;
  • exchange links;
  • links from irrelevant sites;
  • links from satellite sites;
  • participation in link schemes;
  • other manipulations.

How to get out from under the Penguin filter?

Penguin is a filter just like Panda: it is regularly updated and re-reviews sites. To get out from under the Penguin filter, you need to get rid of all unnatural links and wait for an update.

If you conscientiously follow Google's guidelines and don't try to gain links through unfair means, you can regain favor with the search engines. However, to regain the top positions, it will not be enough for you to simply remove low-quality links. Instead, you need to earn natural editorial links from trusted sites.

Google Hummingbird is the most “understanding” algorithm

The Hummingbird algorithm is a completely different beast. Google announced its launch on September 26, 2013, but also mentioned that the algorithm had already been in effect for a month. This is why website owners whose rankings fell in early October 2013 mistakenly believe they fell under the Hummingbird filter. If that were really the case, they would have felt this algorithm's effect a month earlier. So what caused the drop in traffic? Most likely another Penguin update, which rolled out on October 4 of the same year.

The Hummingbird algorithm was developed to better understand user requests. Now, when a user enters the query “What places can you eat deliciously in Yekaterinburg,” the search engine understands that by “places” the user means restaurants and cafes.

How to raise yourself in the eyes of Google Hummingbird?

Since Google strives to understand users as best as possible, you should do the same. Create content that will provide the most detailed and useful answers to user queries, instead of focusing on keyword promotion.

Finally

As you can see, all search algorithms have one goal - to force webmasters to create high-quality, interesting and useful content. Remember: Google is constantly working to improve the quality of search results. Create content that will help users find solutions to their problems, and you will be guaranteed to rank first in search results.


After Google Panda punished me for duplicate content in February 2012 and traffic sank through the floor for more than 2 months (I showed you this in the article about ), I really gave up and didn't want to promote the blog at all.

I stopped writing articles, doing link promotion, running contests, and so on with the same enthusiasm. To be honest, I fell into something of a depression. I mean, how is it: you work and work, everything climbs uphill the way it should, and then BAM - traffic sags and you have no idea what's going on.

And precisely during this decline in activity, Google Penguin crept up behind me and punished me on top of everything. For what? For suspecting me of unnatural link promotion. And probably not even for the unnatural incoming links themselves, but for the decline in their number.

Now, in 2014, I have figured out what went wrong (why traffic dropped sharply back in February): it was the DUPLICATES. Unfortunately it was too late - Penguin had already applied the filter. Thank God everything ended well and I successfully got out from under it.

To make everything clear for beginners, I'll start from the very beginning and go in order. Here's what you'll learn from this article:

Google Penguin and Panda filters - what they punish for

Let's start by understanding what Google mainly punishes blogs for when it imposes the so-called Panda and Penguin filters. We'll look at Panda first, since it punished me first.

The Google Panda algorithm penalizes content - that is, mistakes in the content side of your blog. And believe me, we make plenty of them. I have highlighted several points here:

1. The content itself. If your blog consists mostly of non-unique content (rewrites or, even worse, copy-paste), a filter is guaranteed. Blog content must be unique - remember that.

2. Content over-optimization. If all your articles are stuffed with keywords - too much bolding of keywords, underlining, italics, too many keywords in h1, h2, h3, over-spam in the title, description, and keywords meta tags - you can get smacked for that too. Not just can: that's exactly what you'll get.

If you highlight something, do it first and foremost for the user's convenience, not to cheat and over-optimize. Do you know the joke about the SEO specialist? Listen:

An SEO specialist comes to a bar, restaurant, buy alcoholic drinks, clubs, the best bars in Moscow, order a banquet...

This shouldn't happen to you!)))

3. Duplicate content. My favorite topic of the last 2 articles. Google's results should include only the main page, static pages, and posts (articles). That's it. There should be no other pages with the same content in the search results. As practice shows, more than 90% of bloggers suffer from the duplicate disease. Solution and .

4. Too many ads. If you have a lot of advertising on your blog, Google Panda sees this too and lowers your blog in the search results for key queries. Panda especially dislikes advertising at the beginning of articles. And again, 90% of bloggers have Google or Yandex ads at the very beginning of their articles.))

This is very bad. I personally ran an experiment on my blog; I wanted to know how advertising affects rankings. First, I removed the ads at the beginning of articles. Traffic grew a little. Next, I took the articles that get the most visitors from search engines, COMPLETELY removed advertising from them and...

After 1-1.5 weeks, traffic to them almost doubled. How about that! In short, think about it: don't turn your blog into an advertising traffic light; place ads so they are unobtrusive to the visitor.

5. Bounce rate. What is that? In plain language, it's when a visitor comes to your blog for some article and closes the tab after 2-3 seconds - well, or 5, or 10. The higher it is, the worse. If 100 people came to an article and 80 left immediately, that suggests 80% of people found it uninformative.

That is, people did not find what they were looking for in it, so it deserves to be lowered in the search results. What conclusions can you draw? Perhaps the article's title is off, there is not enough content, the design scared the person away, or maybe some music starts playing in the background (that one is truly awful - I would filter such sites immediately).

If you take all these points together, you probably already understand what kind of article Google Panda will love: a cool, interesting, unique article with a catchy, informative title - so the visitor reads the title and thinks, "That's it, this is what I need, I found what I was looking for."

An article with no copy-paste, not stuffed with keywords, not over-optimized, with no paid links and piles of advertising... An article that has no twin brothers and that is read from cover to cover not in 5-10 seconds but over 1-5 minutes, after which people want to read your other articles too. =)

This is what we should strive for, comrades. So much for Google Panda. This little animal came to me in February 2012, looked at the thousands of duplicates that suddenly appeared out of nowhere, and imposed sanctions on the entire blog.

Until this moment, traffic reached 6,200 visitors per day and continued to grow by leaps and bounds. Well, I screwed up.))

Now the duplicates are slowly disappearing and I see an increase in traffic from Google:

There used to be a steady 150-170 visitors a day from Google; now it's already 280-330. The process has begun. I have a feeling that once the duplicates are gone, Google traffic will even overtake Yandex =)

What is a good blog? It's a blog that is constantly developing and gives people high-quality, unique, and useful content, right? Right. But how does Google determine whether it is high quality or not? Interesting or not?

Very simple. By behavioral factors - one. By incoming link mass - two. The first is clear; I won't say much about behavioral factors - I have articles about them on the blog, and there is plenty on the Internet.

Let's analyze the incoming link mass. If a blog is developing and constantly producing useful and interesting content, then other sites and blogs will inevitably link to it, right? That is, there should be constant growth in link mass. Links should keep coming and coming...

From websites and blogs, from social networks, bookmarks, forums, and so on. BUT! The links must be natural and high quality - one - and must keep coming steadily - two. You can read about natural and unnatural links on the Internet. I'll say more about quality later.

Briefly on natural versus unnatural. Natural links are those that other people place without you asking: they liked your post and linked to it. Unnatural links are purchased or deliberately placed somewhere by you, just to have a link pointing at your blog to lift its position in the search engines. Something like that.

If Google Penguin sees links to you from other sites and blogs but most of them are unnatural, it will punish you. If Google Penguin sees natural links to you - great; but if they kept coming and then suddenly stopped, it will punish you too. For what?

For your blog having stopped developing, that's what. It's simple. It's a signal telling Google that either your blog has really faded - visitors loved your content and shared links to it, then stopped - or you were buying links and then stopped doing it.

For example, there used to be 200 incoming links to you, and now there are 100. That means you cheated somewhere - you probably bought temporary links - or the sites and blogs that linked to you simply died.

Google understands all this; it watches the dynamics of link mass growth. If it sees a collapse, sanctions may be imposed on you. Not just may - they will be.

Conclusion: the blog must be developed constantly. Keep writing useful and interesting content, thereby CONSTANTLY growing the incoming link mass.

OK. We understand what Google Penguin punishes for, but how do you find out whether Penguin sanctions have been imposed on your blog? To determine this, you first need to know what types of filters Penguin has.

What are the Google Penguin filters (types)

Penguin has 2 types of filters:

1. Manual filter
2. Automatic filter

How do you find out which filter your blog is under, if any? Let's start with the manual filter. I'll tell you how my blog fell under it and how I got out fairly quickly, even though I was under it for almost 2 years.

In reality the issue could have been resolved quickly, but my unwillingness to dig into it delayed the exit from under the manual filter for a full 2 years. In short, I didn't take it seriously and, figuratively speaking, only pretended that I wanted to get out from under it.

So! On May 4, 2012, this letter arrived in my Google Webmasters control panel (section: Search Traffic - Manual Actions):

That is, Google was telling me it had found unnatural links leading to my blog and was imposing a manual filter on me. Why manual? Because I received a letter. When the automatic filter is applied, no letter arrives.

Well, in short, I looked at the letter, nodded, smiled, and went about my business. Since traffic had already dropped in February because of the duplicates (which I didn't know about then), I honestly didn't give a damn about filters - pandas, penguins, whatever.

I gave up on everything. I just wrote posts occasionally, was busy moving to a new house, then to , etc. In short, my activity declined badly. A WHOLE YEAR passed!)))) And only then did I think: isn't it time to figure out how to remove this filter?)))

And so, only on March 3, 2013, did I send Google my first reconsideration request, asking them to review my blog for unnatural incoming links and remove the filter:

I tell them how good and innocent I am, that I won't do it again, that I changed the blog design, that the pictures are all unique, that I write all the articles myself, la la la... There is a button for this in the same section:

Sent. After 3 days I receive an answer:

That is, they checked my blog and it supposedly still violates the quality guidelines; the links to me are still unnatural. OK. I send the request again, this time more indignant: how come, I didn't violate anything, it's not my fault, they appeared on their own, and so on.

Again I get the answer: no, you did it yourself... Guilty...))) In other words, they won't remove the filter. OK! I send the request again, already swearing... Like, what is this, show me the links - there are already 1500 pointing at me, I'm not a magician and can't tell which are good and which are bad.

I get this response:

That is, they again say the site violates the guidelines, but this time they even showed 3 links as examples. I looked at them and saw this was nonsense. How can you call them unnatural if people placed them on their blogs themselves??? OK, I'll concede one - I ran a contest - but what were the other 2 for?

I write to Google again: comrades, have you gone mad with your algorithms? You are counting perfectly natural links as unnatural. People placed them themselves. Again I get the answer - your site still violates the guidelines - and again they show more example links.

I look at them again and see they are all natural. I didn't buy them; people placed them themselves. And so I sent reconsideration requests 9 times. All useless. I was about to write obscenities to their support, but realized that would lead nowhere good, so I went to the Google forum instead.

I decided to ask questions there. I explained the situation: Google keeps sending letters saying my blog violates the guidelines - how can that be, when all the links are natural? I got a bunch of useless answers and eventually left.

My conclusion was clear: Google had put the filter on me unfairly.

In fact, guys, I believe Google imposed the filter on me not for unnatural links at all, but for the decline in the total number of incoming links. I relaxed after Panda, stopped attracting high-quality incoming links, and so Penguin smacked me with its flipper =)

How I got out from under Google Penguin's manual filter

But was I going to just sit there, cry into a handkerchief, and leave everything as it was? At the beginning of 2014 I decided to pull myself together and get rid of this scourge of a Google filter. I started digging up everything on filter removal: read a pile of articles and case studies, watched videos, attended SEO webinars, went through courses, etc.

As a result, I got a clear understanding of what I needed to do to get the blog out of the filter. I had to do the following:

1. Collect a full list of all incoming links to my blog
2. Identify the bad links in the list and disavow them in the Disavow Tool
3. Remove the links from other sites that Google had pointed out to me
4. Start getting quality natural links

Throughout January and February I did exactly that. On February 22 I sent a reconsideration request, and on March 2 - HURRAY!!! - I received a letter from Google saying the filter had been removed:

After reading that letter, I jumped out of my chair, ran around the room in my shorts shouting "Google, I love you!!!", then ran to the store - still in my shorts - bought a bottle of cognac and got drunk as a beast =))) I'm kidding, of course... But my joy knew no bounds.

So there you go! What did I actually do? First of all, I asked the webmasters to remove the links Google had pointed me to. Almost all of them were removed without any problems - many thanks to them! But there were also links I couldn't remove, right? What to do with those?

We paste this whole thing into a simple txt document:

OK. We have a complete list of all incoming domains for our blog. Now we need to figure out all the bad links from this list and tell Google not to take them into account at all when ranking.

Google has a special tool called the Disavow Tool. There we can upload a list of links that Google should not take into account; after some time and several updates, those links will no longer be counted.

I will show how this is done below, but the question remains: how do you work out from the list which links are good and which are bad? There are 2 options here:

1. review each link manually and judge its quality yourself
2. run the list through a link-quality service such as Checktrust

Well, the first option certainly works, but it is not very effective. Why? Because you can look at a resource and think it seems normal when in fact it is of poor quality. Quality is, first of all, about TRUST and SPAM. More on that later.

That is, you might keep a link thinking it is good when it is actually bad. The first option is also bad because it takes a lot of time. I have 1500 incoming links here. Imagine going through every single one? That's rough =)

The second option is great because the Checktrust service itself goes through all the links you feed it, picks out all the bad, untrusted, spammy ones and gives you a READY FILE that you just upload to the Disavow Tool, and that's it.

I strongly recommend this option, as it is a 100% solution. I sent Google review requests 9 times, and only on the 10th, when I used this service (got the finished file and submitted it to the Disavow Tool), was the filter removed.

To use the Checktrust service you only need to:

1. Register
2. Add the list of all links to check to the BackLinks Checker tool
3. Get the finished file with the bad links
4. Submit it to the Disavow Tool

Let's begin. You'll have no problems with registration. Next, in your account, create a new BackLinks Checker project:

Click "Next", then "Confirm and place the project". That's it. Within 5-10 minutes, often sooner, the service will process all the links and give you a ready-made file to upload to the Disavow Tool to disavow the bad links.

As soon as Checktrust has finished, go to "Home" and click the project name; a txt file with all the bad links will download automatically:

With this file we go to the Disavow Tool and click "Disavow links":

Click "Disavow links" once more and upload the file Checktrust gave us:

Done! We have disavowed the bad links! But that alone wasn't enough for me to get out of the filter. I also needed to restore the dynamics of link mass growth - that is, to get various sites and blogs linking to me again.

In short, I needed to start improving my link profile, which is what I did. What a link profile is and how to analyze and improve it, I explained in my course "How to Become a Thousandaire Blogger 3.0".

Naturally, the most important thing in improving a link profile is the constant growth of new high-quality links. New incoming links need to come in constantly, and the Checktrust service helped me with this too - and still does.

It is actually now the number 1 tool in my blog promotion. I also explained how to use it when buying links in the KSBT 3.0 course. The owner of this service is Alexander Alaev, whom I have mentioned more than once. Sanya, thanks for such a cool thing.

It really does keep you from throwing away 90% of your budget when buying links. In general, I began to receive incoming links every day (free and paid, bought on exchanges). Have you heard that Yandex has canceled links? =) Keep believing that.))))

So that is how I got out of Google's manual filter. There is also an automatic filter. How does it differ from the manual one? Your blog (site) drops out of the top 50-100 for all its keywords and traffic sags sharply, but you get no notification in the Google tools panel. Simple!

And while you can get rid of a manual filter fairly quickly, if only with Checktrust, getting rid of an automatic one is much harder, because it is lifted not by people who look at your reconsideration request and make a decision, but by robots (algorithms).

The robots have certain thresholds for release from the filter, and until your resource reaches those numbers you will not get out; you also have to wait for Google Penguin algorithm updates. The exit scheme for the automatic Penguin filter is the same as for the manual one:

If you see, for example, that you have good traffic from Yandex but next to nothing from Google, that's a signal: you are under a filter - either Panda or Penguin. You already know what Panda punishes for. If you think everything is fine on the Panda side, then things must not be so good with Penguin.

You need to take action. But don't panic right away and run off to disavow links. Maybe you should just look at all your incoming links, analyze them in Checktrust, assess their quality, and evaluate the whole picture.

That is, see whether the bad or the good links predominate. If you really see that few quality resources link to you, you need to fix that and start getting quality links. Note: QUALITY ones!

Well, that's probably all. I will end here and in conclusion I have an interesting proposal for you:

WHOEVER REGISTERS WITH THE CHECKTRUST SERVICE VIA MY AFFILIATE LINK BEFORE AUGUST 15, 2014 AND TOPS UP THEIR ACCOUNT BALANCE BY AT LEAST 1000 RUBLES WILL RECEIVE A VERY GOOD PRESENT FROM ME.

AFTER REGISTERING AND TOPPING UP YOUR BALANCE, JUST WRITE TO ME VIA SUPPORT AND RECEIVE YOUR GIFT. IF YOU TOP UP YOUR BALANCE BY 2000 RUBLES, THEN IN ADDITION TO THE GIFT I WILL PERSONALLY REVIEW YOUR BLOG, ANALYZE IT FOR ERRORS, AND TELL YOU WHAT NEEDS FIXING.

IF, FOR EXAMPLE, YOU ARE HAVING DIFFICULTIES ELIMINATING DUPLICATES ON YOUR BLOG, WHICH I TALKED ABOUT IN THE TWO PREVIOUS ARTICLES, I WILL HELP YOU WITH THAT TOO AND SET EVERYTHING UP!

BUT! Don't think I wrote this article just to promote the Checktrust service and earn commissions. That's not so. It is a genuinely useful service that I myself use almost daily, and I highly recommend it.

And I'm making this offer because I'm taking part in a Checktrust competition with an iPhone 5 as the prize, which I would like to win and give to the winner of the "" contest. =) So that's how it is.

If anyone has any questions, please ask in the comments. But in theory, I think I explained everything clearly.

Guys, do you know why most bloggers today have such a sad situation with traffic from Google and Yandex? Yes, technical mistakes play a role, but one of the most important reasons is a weak incoming link mass or its complete absence.

I get so many letters in support: "Alexander, what should I do? I've been writing my blog for two years now, but there's still no traffic from the search engines. The articles are unique, interesting, useful, there are lots of readers... HELP!"

It’s all great that you have a lot of readers and your articles are great, but that’s not enough. You have to show and prove that they are really cool. Only high-quality, trusted, incoming links can show this. And the more there are, the better.

If an authoritative site links to you, then this is a direct indicator that your site is interesting. If you've been blogging for 2-3 years and have 100 incoming links, then don't expect mega traffic. Look at the projects that reach tens of thousands of visitors.

There, link promotion is planned out even before the project starts. There are whole methodologies, analyses, link promotion plans, and so on. That is why some resources develop well while others are left in the dust.

In general, one last thing: use link promotion and you will be happy, and don't listen to any claims that links don't work or have been canceled, etc. Good luck!

P.S. Probably the very last question. You ask: "What should I do now? I didn't get a letter from Google, so it seems I'm not under a filter; there is some traffic, but not much. What then?"

Then simply evaluate the quality of all the links pointing to your blog via the Checktrust service (I explained how in KSBT 3.0), and if you see that the picture is sad - few links, most of them poor quality - take action.

Trust me! Start getting quality links and you will see traffic grow. I demonstrated this clearly right up until February 2012, when Panda got me for the duplicates.

Well, that's it, bye everyone! I look forward to your questions in the comments!

Best regards, Alexander Borisov


The Google Penguin filter is one of the latest algorithms that the company uses to rank websites in search results.

Today Google takes into account more than two hundred factors when ranking sites. One algorithm is not enough to handle them all; several are needed, each solving its own problems.

The main task of the Penguin filter is to identify and demote sites that use dishonest promotion methods, chief among them the purchase of link mass. The algorithm is constantly being improved, and the Google Penguin filter is now updated almost continuously.

History of the development of the Penguin algorithm

Google Penguin was released to the world in April 2012. Over the next two months, it was updated twice, with the developers adjusting the first version of the filters. The second version of the algorithm appeared almost a year later; the updated Penguin acted more subtly and took into account not only the level of link spam, but also the overall level of the page.

In the fall of 2014, the algorithm was updated again. At that time it worked in such a way that sites that fell under its filters had to wait, after fixing their issues, for the release of the next update to be re-checked. The situation changed in 2016 with the release of Google Penguin 4.0, which operates in real time and is updated continuously. The latest versions of the algorithm act quite gently: the level of the site and the quality of its pages are taken into account, and low-quality links are devalued without the entire site being banned.

What does Google Penguin punish for?

Experts believe that the Penguin algorithm complements the Google Panda algorithm, which is responsible for checking website content. To keep your resource from falling under the Google Penguin filter, you need to work carefully with external links to the site and avoid what experts call link manipulation. The main methods of such manipulation are:

  • “Trading” links, when the site owner publishes links to other people’s sites on his resource for money or other payment.
  • Obviously artificial link exchange, when sites link to each other due to collusion of owners, and not because of the quality of the content.
  • Publishing large quantities of text containing many contrived anchors and keywords.
  • Using services that automatically generate links to the site.
  • The presence of links with exact-match keywords in the anchor.
  • Using cross-links with anchor keywords in the sidebar and footer of the site.
  • Comments on site materials with links to spam resources.
  • An excessive amount of contextual advertising on the site's main page.

For using such dishonest link schemes, the Google Penguin filter will quickly and reliably drop your site many pages down in the search results. Moreover, regaining your position will be very difficult, since Google Penguin site checks are performed only twice a year.

How to find out if Google Penguin has applied sanctions

Unlike the Panda algorithm, which works only in automatic mode, Penguin is also applied through manual moderation. If you notice a sharp drop in traffic, go to Google Webmaster Tools, open the "Manual actions" section, and check whether there is a message from the moderators there.

If there is a letter, all you have to do is correct the shortcomings indicated in it and send a request for a new check.

However, most often the algorithm works automatically. In that case, it's worth going to Moz.com and checking whether there have been any recent Penguin updates. If there have, the diagnosis is most likely correct, and it's time to start "treating" the site. You can also check the correlation using the PenguinTool service on the Barracuda website. True, you will have to give the service access to your Google Analytics account so it can compare the period of your traffic drop with the release date of each update. The result will help you understand whether Penguin's filters caught you or not.
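The same correlation check is easy to sketch by hand. Here is a minimal Python example using the Penguin release dates listed later in this article; the traffic-drop date is a made-up example, and for versions after 4.0 (real-time updates) this kind of date matching no longer applies:

```python
from datetime import date

# Penguin release dates as given in this article's history section.
PENGUIN_UPDATES = [
    date(2012, 4, 24),   # Penguin 1.0
    date(2013, 5, 22),   # Penguin 2.0
    date(2013, 10, 4),   # Penguin 2.1
    date(2014, 10, 17),  # Penguin 3.0
    date(2016, 9, 23),   # Penguin 4.0
]

def updates_near(drop_day, window_days=7):
    """Return Penguin updates within `window_days` of the traffic drop."""
    return [u for u in PENGUIN_UPDATES if abs((drop_day - u).days) <= window_days]

# Hypothetical example: traffic collapsed on October 6, 2013.
print(updates_near(date(2013, 10, 6)))  # -> [datetime.date(2013, 10, 4)]
```

A match within a few days of an update is circumstantial evidence, not proof, but it tells you where to look first.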

What to do if Google Penguin caught you

If you fall under this algorithm's filters, the worst thing you can do is panic and start deleting all your links. That would finish the resource off completely.

A site that is deemed to be of poor quality by the search engine needs a calm and thoughtful reorganization. Google itself proposes to gain link mass again, slowly, naturally and mainly through the creation of unique content.

The first thing to do to get out from under the filter is analyze the resource's link profile. You need to understand which links come from quality sites - useful, interesting, well-visited ones - and which from spam sites. You can do this with the Majestic SEO service. Outgoing links to spam sites must be neutralized with noindex and nofollow, which block "bad" links from indexing and prevent crawlers from following them. To deal with incoming external links, you will need Google's link disavowal service, the Disavow Tool: Google Penguin simply does not count the links included in it.

The second step is changing link anchors, which can be done in two ways. The first is to change a link into a non-anchor link, but only an experienced webmaster can do this. The second is to dilute your profile by building new non-anchor links.

The third step is to expand your base of link donors, that is, make sure that the links come from different sources: from forums, from social networks, from directories, from blogs and media, such as online magazines and news portals. Complete site sanitization and removal from filters usually takes 3-4 months.

To avoid being hit by Google's Penguin filters, you need to attract only high-quality links, maintain constant growth in your link profile, and avoid exact-match anchors in your links. Quality content and natural link building from diverse sources will protect you from search engine sanctions better than any specialist.

07.11.17

In 2012, Google officially launched its "webspam algorithm update", aimed at spam links and link manipulation practices.

This algorithm later became officially known as the Google Penguin algorithm after a tweet by Matt Cutts, who was then the head of Google's webspam team. Although Google officially named the algorithm Penguin, it never commented on where the name came from.

The Panda algorithm got its name from one of the engineers who worked on it. One theory about the origin of the name Penguin is that it references the Penguin, a character from DC Comics' Batman.

Before the introduction of Penguin, link count played a significant role in how Google's crawlers rated web pages.

This meant that when sites were ranked by these scores in search results, some low-quality sites and pieces of content appeared in higher positions.

Why was the Penguin algorithm needed?

Google's war on low-quality search results began with the Panda algorithm, and the Penguin algorithm has become an extension and addition to the arsenal.

Penguin was Google's response to the increasingly widespread practice of manipulating search results (and rankings) with spam links.

The Penguin algorithm processes only links pointing to a site. Google analyzes the links leading to the site and does not consider its outgoing link mass.

Initial launch and influence

When first launched in April 2012, the Google Penguin filter affected more than 3% of search results, according to Google's own estimates.

Penguin 2.0, the fourth update of the algorithm (counting the original version), was released in May 2013 and affected approximately 2.3% of all search queries.

Key changes and updates to the Penguin algorithm

There have been a number of changes and updates to the Penguin algorithm since its launch in 2012.

Penguin 1.1: May 26, 2012

This was not a change to the algorithm but the first refresh of the data within it. Sites initially affected by Penguin that had since gotten rid of low-quality links saw some improvement in their rankings, while other sites that had escaped the original launch felt an impact for the first time.

Penguin 1.2: October 5, 2012

This was another data refresh. It affected English-language queries as well as international queries.

Penguin 2.0: May 22, 2013

A more technically advanced version of the algorithm that changed the degree to which it influenced search results. Penguin 2.0 affected approximately 2.3% of English-language queries and roughly the same share of queries in other languages.

It was also the first Penguin update to look beyond the site's home page and top-level pages for evidence of spam links.

Penguin 2.1: October 4, 2013

The only update to Penguin 2.0 (version 2.1) was released on October 4 of the same year. It affected about 1% of search queries.

Although Google gave no official explanation for the update, the statistics suggest it increased how deeply pages were crawled and added further analysis for spam links.

Penguin 3.0: October 17, 2014

This was another data refresh that allowed many sanctioned sites to regain their positions, while other sites that had abused spam links but evaded previous versions of Penguin felt its impact.

Google employee Pierre Far confirmed this and noted that the update would need "a few weeks" for a full rollout, and that it affected less than 1% of English-language search queries.

Penguin 4.0: September 23, 2016

Almost two years after update 3.0, the final change to the algorithm was released. With it, Penguin became part of Google's core search algorithm.

Now, working alongside the core algorithm, Google Penguin 4 evaluates sites and links in real time. This means you can see the effect of changes to your external links on your site's position in Google search results relatively quickly.

The updated Penguin algorithm is also no longer aimed strictly at imposing sanctions: it devalues spam links instead. This is the opposite of previous versions of Penguin, which punished bad links. Research shows, however, that algorithmic sanctions based on external links are still applied today.

Algorithmic downgrades in Penguin

Shortly after the launch of the Penguin algorithm, webmasters who practiced link manipulation began to notice a decline in search traffic volume and search rankings.

Not all downgrades triggered by the Penguin algorithm affected entire sites. Some were partial, affecting only certain groups of keywords that had been heavily flooded with spam links or "over-optimized".

One site sanctioned by Penguin took 17 months to regain its positions in Google's search results.

Penguin's influence also extends across domains. Therefore, changing the domain and redirecting from the old to the new can lead to even more problems.

Research shows that using 301 or 302 redirects does not negate the impact of the Penguin algorithm. In the Google Webmasters forum, John Mueller confirmed that using a meta refresh redirect from one domain to another can also lead to sanctions.

Recovering from the effects of the Penguin algorithm

The Disavow Links tool has been useful for SEO specialists, and that hasn't changed even now that the Penguin algorithm functions as part of Google's core algorithm.

What to include in a disavowal file

The Disavow file contains links that Google should ignore. Thanks to this, low-quality links will not reduce the site’s ranking as a result of the Penguin algorithm. But this also means that if you mistakenly included high-quality links in your disavow file, then they will no longer help the site rank higher.

You don't need to include comments in the disavow file unless you want them for your own reference.

Google does not read the comments in the file, because it is processed automatically. Some people simply find it easier to add explanations for later reference - for example, the date a group of links was added, or notes about attempts to contact a site's webmaster to remove a link.
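For illustration, a small disavow file might look like this (the domains and URL are placeholders; the `#` lines are the optional comments discussed above):

```
# Contacted the webmaster on 2017-06-01, no reply
domain:spamdomain1.example
domain:spamdomain2.example
# A single spammy page rather than the whole site
http://anotherdomain.example/spam-page.html
```

A `domain:` entry disavows every link from that domain, while a bare URL disavows only links from that one page.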

After you upload your Disavow file, Google will send you a confirmation. But, although Google will process it immediately, it will not disavow the links at the same time. Therefore, you will not be able to immediately restore your site’s position in search results.

There is also no way to determine which links have been disavowed and which have not, since Google will still include both in the external link report available in Google Search Console.

If you previously uploaded a disavow file to Google, the new file replaces it rather than being appended. Therefore, make sure the new file also includes the old links. You can always download a copy of the current file from your Google Search Console account.

Disavow individual links or entire domains

To get a site out from under Google Penguin, it is recommended to disavow links at the domain level rather than disavowing individual links.

That way, Googlebot only needs to visit one page on a site for all links from that site to be disavowed.

Disabling links at the domain level also means you don't have to worry about whether a link is indexed with or without www.

Detecting your external links

If you suspect your site has been negatively impacted by the Penguin algorithm, you should audit your external links and disavow low-quality or spam links.

Google Search Console lets site owners get a list of external links. But pay attention to whether your disavow file includes links marked "nofollow". If a link carries the nofollow attribute, it has no effect on your site. Keep in mind, though, that the site hosting the link can remove the "nofollow" at any time without any warning.
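Because a nofollow can disappear without notice, it is worth re-checking important backlinks periodically. Here is a hedged Python sketch (using the third-party `requests` and `beautifulsoup4` packages) that reports whether links to your domain on a given page currently carry rel="nofollow"; the URLs are placeholders, and a real audit would need error handling and politeness toward the crawled site:

```python
import requests
from bs4 import BeautifulSoup  # pip install requests beautifulsoup4

def links_to_me(page_url, my_domain):
    """Return (href, has_nofollow) for each link to my_domain on page_url."""
    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    results = []
    for a in soup.find_all("a", href=True):
        if my_domain in a["href"]:
            rel = a.get("rel") or []  # bs4 returns rel as a list of tokens
            results.append((a["href"], "nofollow" in rel))
    return results

# Placeholder URLs: run this for each backlink you rely on.
print(links_to_me("http://blog.example/post", "mysite.example"))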

There are also many third-party tools that show links to your site. But because some webmasters block third-party bots from crawling their sites, such tools may not show all incoming links. Some resources can also use such blocking to hide low-quality links from detection.

Monitoring external links is also necessary to protect against "negative SEO" attacks, in which a competitor buys spam links pointing at your site.

Many people use negative SEO as an excuse when Google penalizes their site for low-quality links. But Google claims that after the Penguin updates, the search engine recognizes such attacks well.

A survey conducted by Search Engine Journal in September 2017 found that 38% of SEO specialists have never disavowed external links. Reviewing external links and thoroughly researching the domain of each external link is not an easy task.

Requests to remove links

Some site owners require a certain fee to remove a link. Google recommends not paying for this. Simply include the bad external link in your disavowal file and move on to removing the next link.

Although such removal requests are an effective way to restore a site's position after link-based sanctions, they are not always necessary. The Google Penguin algorithm looks at the external link portfolio as a whole - that is, the ratio of high-quality, natural links to spam links.

Some even include in the agreement for using the site “conditions” for placing external links leading to their resource:

We only support "responsible" placement of external links to our web pages. It means that:

  • If we ask you to remove or change a link to our sites, you will do so promptly.
  • It is your responsibility to ensure that your use of links will not damage our reputation or result in commercial gain.
  • You may not create more than 10 links to our websites or webpages without obtaining permission.
  • The site on which you place a link leading to our resource must not contain offensive or defamatory materials, and must not violate anyone's copyright.
  • If someone clicks the link you posted, our site must open in a new page (tab), not inside a frame on your site.

Link quality assessment

Don't assume that just because an external link is on a site with a .edu domain it is necessarily high quality. Many students sell links from their personal sites registered in the .edu zone, so such links are regarded as spam and should be disavowed. In addition, many .edu sites contain low-quality links because they have been hacked. The same applies to all top-level domains.

Google representatives have confirmed that a site's presence in a particular domain zone neither helps nor harms its position in search results. But you need to make a decision for each specific site, taking the Penguin updates into account.

Beware of links from obviously high-quality sites

When looking through a list of external links, don't assume they are high quality just because they sit on a particular site - unless you are 100% sure of its quality. A link being on a reputable site like the Huffington Post or the BBC does not make it high quality in Google's eyes; many such sites sell links, some of them disguised as advertising.

Promo links

Examples of promotional links that Google regards as paid include links placed in exchange for a free product for review or a discount on a product. While such links were acceptable a few years ago, they must now be marked "nofollow". You can still benefit from them: they help increase brand awareness and drive traffic to your website.
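In HTML, marking such a link looks like this (the URL and anchor text are placeholders):

```html
<!-- A paid/promotional link marked so it passes no ranking value -->
<a href="https://example.com/product/" rel="nofollow">product review</a>
```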

It is extremely important to evaluate every external link. Bad links must be dealt with, because through the Penguin algorithm they hurt the resource's position in search results, or may lead to manual sanctions. Good links should not be removed, because they help you rank higher.

What if you can't recover from the Penguin algorithm's sanctions?

Sometimes, even after webmasters do a lot of work cleaning up inbound links, they still don't see any improvement in traffic volume or rankings.

There are several reasons for this:

  • The initial increase in traffic and search rankings before the algorithmic penalties was undeserved and came from bad inbound links.
  • When the bad external links were removed, no effort was made to earn new high-quality links.
  • Not all spam links were disavowed.
  • The problem is not with external links at all.

When recovering from Penguin penalties, don't expect your search rankings to return to their previous level, or for recovery to happen instantly.

Add to this the fact that Google is constantly changing the Penguin filter, so factors that helped in the past may not have as much impact now.

Myths and misconceptions associated with the Penguin algorithm

Here are a few myths and misconceptions about the Penguin algorithm that have arisen in recent years.

Myth: Penguin is a punishment

One of the biggest myths about the Penguin algorithm is that people call it a penalty (what Google calls manual sanctions).

Penguin is strictly algorithmic in nature. It cannot be applied or removed manually by Google specialists.

Although both the algorithm's action and manual sanctions can lead to a big drop in search results, there is a big difference between them.

Manual sanctions occur when Google's webspam specialist responds to a complaint, conducts an investigation, and determines that a domain should be sanctioned. In this case, you will receive a notification in Google Search Console.

When manual sanctions are applied to a site, you must not only analyze your external links and submit a disavow file with the spam links, but also send a request for the Google team to review your "case".

An algorithmic drop in search rankings happens without any involvement from Google specialists; the Google Penguin algorithm does everything itself.

Previously you had to wait for an algorithm change or update, but now Penguin works in real time, so rankings can recover much faster (if the cleanup was done well).

Myth: Google will notify you if the Penguin algorithm affects your site

Unfortunately, this is not true. Google Search Console will not notify you that your site's search rankings have deteriorated as a result of the Penguin algorithm.

This again shows the difference between the algorithm and manual sanctions: you are notified only when a site is manually penalized. The process of recovering from the algorithm's effects, however, is similar to recovering from manual sanctions.

Myth: Disavowing bad links is the only way to recover from the effects of the Penguin algorithm

While this tactic can remove many low-quality links, on its own it is not enough. The Penguin algorithm looks at the proportion of good, quality links relative to spam links.

So instead of focusing all your energy on removing low-quality links, you should focus on increasing the number of quality inbound links. This will have a positive effect on the ratio that the Penguin algorithm takes into account.

Myth: You won't recover from the effects of the Penguin algorithm.

You can get a site out from under Google Penguin, but it requires some experience in dealing with Google's ever-changing algorithms.

The best way to shake off the Penguin algorithm's negative influence is to forget about all the existing external links to the site and start earning new ones. The more quality inbound links you collect, the easier it will be to free your site from the grip of the Google Penguin algorithm.

This publication is a translation of the article "A Complete Guide to the Google Penguin Algorithm Update", prepared by the friendly project team.







