Panda from Google: what's new? How to identify Panda's action. What does Panda penalize?



Google Panda is a filter program from Google. Its task is to monitor and block sites with low-quality content.


Google considers its main function to be providing users with relevant, interesting, and useful information. That is why it fights firmly against black-hat promotion methods, in which a site reaches the top of search results not by creating high-quality, in-demand content but by manipulating the search engine.

Panda's story

Google Panda algorithm is automatic. The first step towards its creation was the assessment of a large number of sites for compliance with the requirements that Google imposes on content. It was carried out manually by a group of testers and made it possible to formalize the main factors influencing the quality of website content.

Google Panda was first released in February 2011 but became fully operational only in April. During 2011 alone the algorithm was updated 57 times, and another five times in 2012. 2013 brought the last update with its own number, Panda 25, which received minor changes in June and July of the same year. The latest version, Panda 4.0, released in May 2014, shook the positions of news aggregators and affected giants such as eBay.

Today, the algorithm is being improved almost continuously, being updated every month for 10 days. The company does not publish update dates in order to make it as difficult as possible to manipulate sites’ positions in search results.

Google Panda requirements for websites

Google Panda is designed primarily to combat low-quality content, so its main requirements for sites are based on the quality of published information.

  1. The site should mainly contain unique content and should not contain duplicate texts or images.
  2. The use of automatically generated texts is unacceptable.
  3. Search engine robots and users should see the same thing; you cannot use pages that are visible only to search engines and are needed solely for website promotion.
  4. The keywords of each page must correspond to its content; keyword spamming is unacceptable.
  5. Links and advertisements on pages must correspond to the subject matter of the texts and other content of the page.
  6. It is prohibited to use doorways, hidden links or hidden text, the purpose of which is to deceive the search engine.

The Google Panda algorithm will punish you if the site's pages contain content copied from other resources without a link to the source, template articles with an identical structure, or duplicate pages. Sites whose text is a solid wall, without illustrations, videos, or infographics, and with the same meta tags on different pages, can also fall under the Google Panda filter.

This algorithm is considered to pay little attention to links, but it does require that links in articles be relevant to their topic. That is, an article about plastic windows should not link to tea sales, and so on.

In addition to everything described, Google Panda pays a lot of attention to behavioral factors. If your site has a high bounce rate, and users leave after the first page and never return, you will certainly attract the algorithm's attention.

How to escape Google Panda sanctions

How to identify the action of Panda

Before you take steps to escape Google Panda's filters, make sure that you have actually suffered from its clutches. How can you determine whether Google Panda is the culprit of your problem, or whether the reason lies elsewhere?

Pay attention to the connection between an algorithm update and the drop in the site's traffic. The Google Panda update rolls out monthly and lasts 10 days; you can find out its timing on the Moz.com website. If the dates coincide, action must be taken.

The second way to catch Panda is to use the Barracuda service. One of this site's tools, the Panguin Tool, takes your Google Analytics data, overlays the traffic graph onto the dates of algorithm updates, and produces an answer. This method has two disadvantages: it is not suitable for those who only recently installed a Google Analytics counter, and it requires access to your account, which in turn opens access to everything in your Google Analytics account.

The third method is quite simple, but requires time and patience. You will have to check every page of the site. You need to do it like this:

  1. Copy several sentences from the text on a page into Google's search bar. It is enough to take 100-200 characters.
  2. See whether your site appears at the top of the search results.
  3. Enclose the same passage in quotation marks, search again, and see whether the site appears in the results or not.

If the site appears in the search results only in the second case, then Google Panda is the culprit of your troubles. Remember that you will have to check every page.
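The three steps above can be semi-automated by generating both search URLs for each page's snippet. A minimal sketch (the helper name and the 150-character snippet length are my own choices, not anything Google prescribes):

```python
from urllib.parse import quote_plus

def panda_check_urls(page_text: str, length: int = 150) -> dict:
    """Build Google search URLs for the manual Panda check:
    one broad-match query and one exact-match (quoted) query
    for a snippet taken from the page's text."""
    # Normalize whitespace and take roughly the first 150 characters.
    snippet = " ".join(page_text.split())[:length]
    return {
        "broad": "https://www.google.com/search?q=" + quote_plus(snippet),
        "exact": "https://www.google.com/search?q=" + quote_plus(f'"{snippet}"'),
    }

urls = panda_check_urls("Google Panda is a filter program from Google. "
                        "Its task is to monitor and block sites with low-quality content.")
print(urls["broad"])
print(urls["exact"])
```

Open both URLs in a browser and compare where your site appears; scraping the results programmatically is against Google's terms of service, so this sketch only builds the queries.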

How to get out from under the filter

To escape the Panda filter you will have to do the following:

  1. Make a complete revision of the site’s content and replace most of the texts with interesting and useful ones, and rework the rest so that they become unique and relevant.
  2. Remove excess keywords from headings of all levels and meta tags. Change the headings to relevant and attractive ones, make sure they catch the visitor and make them want to read the text.
  3. Clean the site from irrelevant and aggressive advertising, which will significantly improve behavioral factors.
  4. Delete all duplicate pages and broken links, and remove excessive keyword highlighting.
  5. Check links on pages for consistency with the content of the pages and replace or remove irrelevant ones.

These steps will help you get out from under Panda's filters over time. However, note that Google Panda is an automatic algorithm, and no appeal will quickly free you from its sanctions. Fill the site with interesting, unique content, don't get carried away placing advertisements on it, and you won't get into trouble with the smart beast that is Google Panda.

Hello, readers. Today I am publishing another interesting article on a pressing topic. Surely many remember that on February 24 of this year the mighty Google introduced a new algorithm whose purpose is to rank sites according to a new scheme.

It's not as if the new algorithm wiped all sites from the search results, but 12% of all results were revised, and not for the better for site owners. The effect of Google Panda was impossible not to notice: on many sites, traffic from Google dropped by half, and for many by even more.

Perhaps you have not felt the effect of this algorithm, but don't rejoice ahead of time: it is never too late to fall into the clutches of Google Panda. In this article I will talk about the algorithm itself, as well as how it can be avoided.

Answer this question: are you ready for the day when search traffic to your site is exactly half what it was? If not, I can assure you that no one is immune to such a surprise. But first, let's talk about how to identify whether your site has come under attack from Google Panda.

How to determine if a site is under the Google Panda filter?

In general, you can use various statistics systems, but it is best to use Google Analytics: this system was developed within Google itself, and no other reflects Google's statistics so accurately.

Surely, many who have English-language sites that are presented in American search results were able to feel the impact of Panda on their sites. Below I will show how catastrophic the damage caused by Google Panda is. But, besides this, we will look at how to identify the fact that a filter is being applied to your site, and look for ways to solve the problem.

Of course, Google Panda hit American search results first. In Russian Google everything is much simpler, there are no such strict restrictions, but still, I recommend that you read this article to be prepared for the new algorithm.

Well, now to the point. The very first thing we need to do is log into your Google Analytics account. Not everyone working in RuNet has sites focused on the US, so if you log into your GA account and see what's in the screenshot below, you might think that everything is fine and Google Panda has passed you by:

But really, this is only a first impression. In fact, everything is much more complicated, and you need to dig deeper. In your Google Analytics account, go to the "Traffic Sources" tab, then to "Search Engines". Then select unpaid (non-paid) traffic:

Here you can see the breakdown by search engines. The arrow indicates the Google search engine. Click here. This is necessary in order to see traffic statistics exclusively for this search engine.

Next, click on the "Keywords" tab; in the screenshot it is outlined in a green frame. A large menu appears, as can be seen in the screenshot below. From this menu select "Country/Territory" (highlighted in a red frame). We do this to filter the data by a specific country. Click on this element:

Here in the screenshot you can clearly see that, starting from February 24, traffic from Google search dropped significantly, by more than 50%. Tell me, isn't that a reason for distress?

Let's look at the left menu. Here we select Advanced Segments, and then create a new segment. In this segment we specify the following settings:

You can call this segment whatever you like. Here, for example, it is called “G US organic”. Now let's save this segment. Further, all our data will relate only to visitors coming to our site from US organic search, from the Google search engine.

So far this is all just words. Let's try to understand why Google Panda is needed in the first place. Google Panda's main purpose is to clean the search results of low-quality, trash sites.

Matt Cutts said Google Panda will hit:

    Sites with a lot of useless content

    Sites filled with copy-paste

All this is necessary in order to “give way” to the top for high-quality sites that have original and interesting content, sites that are not filled to capacity with advertising.

Of course, trash sites will suffer from the Google Panda filter first of all. But the filter is imperfect, and its creators are well aware that normal sites can also suffer from it, so if problems arise, you can safely contact them through the feedback form.

By the way, in this example a normal site was considered, a "site for people," and not some kind of trash site. The foundation of this site is high-quality articles written by various experts. In addition, a Q&A (question and answer) service is attached to the site, where all participants can find answers to their questions. Yes, Q&A services can theoretically be a target for Google Panda, because the pages of such services often contain little content. But there are cases where Google Panda has also mowed down normal content sites with no forums or other Q&A-like services.

When can you be "covered" by Google Panda?

Of course, neither of Google Panda's developers (Matt Cutts and Amit Singhal) revealed the secret, but from their interview with Wired magazine certain conclusions can be drawn about the nuances of this filter's operation.

Below is a list of potential factors that could put you at risk from Google Panda:

    A large amount of copy-paste. According to observations, worst of all, the entire site, and not just its individual pages, can be pessimized for copy-paste

    Significant prevalence of non-unique content on the site over unique content

    Irrelevance of a page to the search queries for which it is promoted or for which it appears in the results

    Over-optimization of a page

    High bounce rate

    Little time that users spend on the site

    A low percentage of returning visitors

    Lots of low-quality links on the pages: off-topic links and links leading to obvious trash sites

Okay, if your site matches only one of these points, that's fine. But if it matches many, that may be the reason for a Google Panda strike. By the way, you need to learn to distinguish between a penalty and an outright ban. Your site will be in trouble until you raise its quality to a new level. Google Panda is a very serious algorithm, and if your site has several pages of blatant junk, that may be the reason the entire site "sinks," even though all its other pages may have high-quality content.

You can see whether the filter was applied at the site level or at the page level

If the filter was applied throughout the entire site, then you will probably see it with the naked eye, because it will affect all pages of the site. Using our example, you can use the “G US organic” reports and see this...

Go to Content > Top Landing Pages. Without comments, everything is clear:

These are statistics for landing pages. That is, for individual pages. There are 4272 of them. In order to find out whether all the pages were affected by Google Panda, we need to tinker a little with the reports in Google Analytics:

    We need to make a report on the pages that are most important to us

    We need to make a report on groups of pages. You can sort by URL. For example, select only pages that contain the word forum

This is done simply:

Above, I mentioned the view that Google Panda applies its filter to the entire site; in any case, that opinion is floating around the forums. But in fact, when I analyzed several sites using the methods described above, I came to the conclusion that Google Panda does not filter the entire site but acts at the page level, which means that not everything is as scary as it initially seemed.

Despite the fact that changes in Google algorithms are one of the hottest topics in the field of SEO, many marketers cannot say with certainty how exactly the Panda, Penguin and Hummingbird algorithms affected the ranking of their sites.

Moz specialists have summarized the most significant changes to Google's algorithms and literally broken down the information about what each update is responsible for.

Google Panda – Quality Inspector

The Panda algorithm, whose main goal is to improve the quality of search results, was launched on February 23, 2011. With its appearance, thousands of sites lost their positions, which stirred up the entire Internet. At first, SEOs thought that Panda penalized sites caught participating in link schemes. But as it later became known, fighting unnatural links is not within this algorithm's mandate. All it does is assess the quality of a site.

To find out if you are at risk of falling under Panda's filter, answer these questions:

  • Would you trust the information posted on your website?
  • Are there pages on the site with duplicate or very similar content?
  • Would you trust your credit card information to a site like this?
  • Do the articles contain spelling or stylistic errors or typos?
  • Are articles for the site written taking into account the interests of readers or only with the goal of getting into search results for certain queries?
  • Does the site have original content (research, reports, analytics)?
  • Does your site stand out from the competitors that appear alongside it on the search results page?
  • Is your site an authority in its field?
  • Do you pay due attention to editing articles?
  • Do the articles on the site provide complete and informative answers to users' questions?
  • Would you bookmark the site/page or recommend it to your friends?
  • Could you see an article like this in a printed magazine, book, or encyclopedia?
  • Does advertising on the site distract readers' attention?
  • Do you pay attention to detail when creating pages?

Nobody knows for sure what factors Panda takes into account when ranking sites. Therefore, it is best to focus on creating the most interesting and useful website. In particular, you need to pay attention to the following points:

  • "Thin" content. In this context, "thin" means that the content on your site is not new or valuable to the reader because it does not adequately cover the topic. The point is not the number of characters, since sometimes even a couple of sentences can carry considerable meaning. Of course, if most of your site's pages contain only a few sentences of text, Google will consider it low quality.
  • Duplicate content. Panda will consider your site to be of low quality if most of its content is copied from other sources or if the site has pages with duplicate or similar content. This is a common problem with online stores that sell hundreds of products that differ in only one parameter (for example, color). To avoid this problem, use the canonical tag.
  • Low quality content. Google loves sites that are constantly updated, so many SEOs recommend publishing new content daily. However, if you publish low-quality content that does not provide value to users, then such tactics will cause more harm.
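For the canonical-tag fix mentioned above, here is a minimal sketch of deriving a single canonical URL for product pages that differ only by a variant parameter. The parameter names, URLs, and helper names are invented for illustration; adapt them to your own shop's URL scheme:

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Query parameters that only select a product variant (assumption:
# adjust this set to match your own site's URLs).
VARIANT_PARAMS = {"color", "size"}

def canonical_url(url: str) -> str:
    """Strip variant-only query parameters so near-duplicate
    product pages can all point to one canonical version."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in VARIANT_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))

def canonical_tag(url: str) -> str:
    """Render the <link rel="canonical"> tag for a page's <head>."""
    return f'<link rel="canonical" href="{canonical_url(url)}">'

print(canonical_tag("https://shop.example/product?id=42&color=red"))
# <link rel="canonical" href="https://shop.example/product?id=42">
```

Every color variant then declares the same canonical page, which tells the search engine which version to index.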

How to get out from under the Panda filter?

Google updates the Panda algorithm monthly. After each update, search robots review all sites and check them against the established criteria. If you fell under the Panda filter and then made changes to the site (replaced thin, low-quality, and non-unique content), your site's positions will improve after the next update. Note that you will most likely have to wait several months for your positions to recover.

Google Penguin – Link Hunter

The Google Penguin algorithm was launched on April 24, 2012. Unlike Panda, this algorithm aims to combat unnatural backlinks.

The authority and significance of a site in the eyes of search engines largely depends on which sites link to it. Moreover, one link from an authoritative source can have the same weight as dozens of links from little-known sites. Therefore, in the past, optimizers tried to get the maximum number of external links in every possible way.

Google has learned to recognize various manipulations with links. How exactly Penguin works is known only to its developers. All SEOs know is that this algorithm hunts for low-quality links that are manually created by webmasters in order to influence a site's rankings. These include:

  • purchased links;
  • exchange links;
  • links from irrelevant sites;
  • links from satellite sites;
  • participation in link schemes;
  • other manipulations.

How to get out from under the Penguin filter?

Penguin is a filter just like Panda: it is regularly updated and re-reviews sites. To get out from under the Penguin filter, you need to get rid of all unnatural links and wait for an update.

If you conscientiously follow Google's guidelines and don't try to gain links through unfair means, you can regain favor with the search engines. However, to regain the top positions, it will not be enough for you to simply remove low-quality links. Instead, you need to earn natural editorial links from trusted sites.

Google Hummingbird is the most “understanding” algorithm

The Hummingbird algorithm is a completely different beast. Google announced the launch of this update on September 26, 2013, but mentioned that the algorithm had already been in effect for a month. That is why website owners whose rankings fell in early October 2013 mistakenly believe they fell under the Hummingbird filter; if that were really the case, they would have felt the algorithm's effect a month earlier. You may ask what, in that case, caused the drop in traffic. Most likely it was another Penguin update, which came into force on October 4 of the same year.

The Hummingbird algorithm was developed to better understand user requests. Now, when a user enters the query “What places can you eat deliciously in Yekaterinburg,” the search engine understands that by “places” the user means restaurants and cafes.

How to raise yourself in the eyes of Google Hummingbird?

Since Google strives to understand users as best as possible, you should do the same. Create content that will provide the most detailed and useful answers to user queries, instead of focusing on keyword promotion.

Finally

As you can see, all search algorithms have one goal - to force webmasters to create high-quality, interesting and useful content. Remember: Google is constantly working to improve the quality of search results. Create content that will help users find solutions to their problems, and you will be guaranteed to rank first in search results.


In August 2011, Google Panda was introduced for all segments of the World Wide Web. For English-speaking Internet users, the algorithm was launched a little earlier, in April of the same year. The new algorithm was developed by the search giant's employees Amit Singhal and Matt Cutts. It differs radically from previous developments in its focus on the human factor; however, it also takes conventional criteria into account when ranking.

Unique content

Panda takes the uniqueness of text content into account as a percentage of copy-paste, both for a specific page and for the site as a whole. In other words, borrowed content now worsens not only the position of the specific page but of the entire "pirating" resource. In addition, Panda easily tracks templated content: simply put, the algorithm flags pages that are similar in content even when they were created for different key queries.
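No one outside Google knows how Panda actually measures this copy-paste percentage, but a crude stand-in for textual overlap can be sketched with word shingles. The function names and shingle size here are arbitrary choices, not Panda's method:

```python
def shingles(text: str, k: int = 4) -> set:
    """Return the set of k-word shingles of a lowercased text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def copy_paste_share(page: str, source: str, k: int = 4) -> float:
    """Rough share of the page's shingles that also occur in the
    suspected source (0.0 = fully unique, 1.0 = full copy-paste)."""
    s = shingles(page, k)
    return len(s & shingles(source, k)) / len(s) if s else 0.0

a = "google panda is a filter that targets low quality content on the web"
print(copy_paste_share(a, a))  # 1.0, a full duplicate
print(copy_paste_share(a, "totally different words here entirely now"))  # 0.0
```

Running every page against likely sources this way gives a quick per-page and site-wide overlap estimate, in the same spirit as the percentage the article describes.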

Advertising relevance

An advertisement for baby food placed on the page of an online store selling heavy trucks is almost guaranteed to cause the search engine's "displeasure." This attention from Google to the quality of advertising placement is intended to cool the ardor of unscrupulous optimizers and money-making webmasters. The goal is to raise sites that truly correspond to their stated topic.

Link mass

Here we are talking about both outgoing links and backlinks. There is only one requirement: the sites should occupy a single information space. That is, sites exchanging link mass should be as similar in topic as possible. The number of backlinks is, of course, taken into account by Panda, but it is far from decisive for ranking.

Human factor

In this case it would be more correct to use SEO terminology, in which the human factor is called behavioral. The Panda search algorithm, like no other before it, meticulously takes user behavior on the site into account. Obviously, this required a colossal complication of the algorithm, which, incidentally, is why it took so long to launch it for the entire Internet: the creators endlessly refined the search tools.

However, the results are, frankly speaking, impressive. Today, truly useful sites can reach top positions in Google. Thanks to the use of the new algorithm, the Google search engine has truly become the most powerful search engine on the planet, which is confirmed not only by the growth in the number of users of this service, but also by the level of Google’s capitalization, which has exceeded $200 billion. The user can still be deceived by various fashionable features and aggressive advertising, but an investor cannot be fooled by cheap tricks. He feels that Google is a profitable business. And this is not least thanks to Panda.

So, behavioral factors. These include, in principle, a limited set of criteria:

  • bounce rate;
  • time spent by the user on the site;
  • the ratio of returns to the site to the total number of transitions.

Moreover, Panda pays attention, so to speak, to the detail of each indicator. Simply put, it takes into account, for example, not only the time during which the user remained on the site, but also what actions he performed there, for example, whether he followed links, one way or another related to the subject contained in the key query. In a word, the algorithm makes it possible to determine not only the fact of the attractiveness and usefulness of a given resource for the user, but also the degree of this attractiveness, the means that make this site so.
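The three behavioral indicators listed above are easy to compute once you have session-level data. A toy sketch, where the session records and their schema are invented for illustration (real numbers would come from your analytics exports):

```python
from statistics import mean

# Toy session log: (visitor_id, pages_viewed, seconds_on_site).
sessions = [
    ("u1", 1, 8), ("u2", 5, 240), ("u1", 3, 120),
    ("u3", 1, 5), ("u4", 4, 300),
]

# Bounce = a session that viewed only one page.
bounce_rate = sum(1 for _v, pages, _s in sessions if pages == 1) / len(sessions)

# Average time a visitor spends on the site per session.
avg_time = mean(s for _v, _p, s in sessions)

# Share of unique visitors who came back at least once.
visitors = [v for v, _p, _s in sessions]
returning = {v for v in visitors if visitors.count(v) > 1}
return_share = len(returning) / len(set(visitors))

print(f"bounce rate: {bounce_rate:.0%}")          # 40%
print(f"avg time on site: {avg_time:.0f}s")
print(f"returning visitors: {return_share:.0%}")  # 25%
```

Tracking these three numbers over time, rather than as one-off values, is what makes a Panda-related drop visible.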

An optimizer promoting sites in Google can today achieve high positions in the search results even with a relatively small link mass. What this takes is competent internal linking and, in general, making the resource convenient for every user, from the guru to the person who first heard the word "Internet" yesterday. Google's new search algorithm, Panda, has seen to this.

Google Panda is one of Google's most complex and interesting algorithms, which now, three years after the release of the first version, remains a mystery. As we know, Panda's main goal is to identify low-quality content, but you must agree that this is a very broad concept. We have seen doorways, sites with automatically generated content, sites with large numbers of duplicate pages, and sites with non-unique content in texts and meta tags, along with many other less obvious problems, all fall under Panda. But does this mean the algorithm itself remains a coarse filter? Not necessarily.

What do we know about Panda?

Panda is primarily a heuristic algorithm: by scanning information, it tries to find the most correct solution to a problem or answer to a request. The development of the algorithm began with a group of testers evaluating a large number of sites against a number of criteria. Their scores allowed engineers to establish the main factors to take into account and to define a so-called "Panda score." Beyond the mechanism for filtering low-quality content, I think there is another side to Panda, one included in the main ranking algorithm, that can affect a site's position both negatively and positively (don't think of Panda as a binary filter that either fires or doesn't).

Is Google Panda part of the overall algorithm?

Not long ago an important event occurred in the "world of Panda" that made webmasters' lives even harder: Matt Cutts announced that the algorithm would be updated monthly over a 10-day period, and that there would be no more confirmations of updates. Now we are all playing in the dark.

Why this becomes a problem, I think, is easy to understand. For example, when you see in Analytics that a site’s traffic has dropped, and you receive confirmation that the Panda algorithm has been updated at this time, you clearly understand the reason for the drop in traffic and can take meaningful action.

Now a site can come under the Panda algorithm during a 10-day window, but no one knows when it starts and when it ends. In addition, Google's main ranking algorithm is updated about 500 times a year, and that is without counting "Penguin" and other small "joys." It is becoming increasingly difficult to understand the reason for a drop in traffic, and without that it is impossible to take effective action to restore it.

The last confirmed Panda update was on July 18, 2013. But that is far from the last update, since the algorithm has moved to an almost real-time stage. I wrote "almost" because many SEOs believe that Panda now works non-stop as part of the overall algorithm. This is not entirely true, and John Mueller explained it in one of the Google+ Hangouts. From John's words it can be understood that Panda does not work in real time but keeps the same update system; development has simply reached the point where Google can be confident in the algorithm, and testing before launch is no longer mandatory. This is important to understand: a site cannot fall under "Panda" on any arbitrary day of the month, only during the next update (which, remember, now lasts 10 days).

How can you track Panda updates?

Now we come to the main problem: how to determine that traffic fluctuations are caused by Panda? Matt Cutts' answer to a similar question:

But let's try to figure it out ourselves. Some tips:

1. Keep an eye on the dates

First of all, you need to clearly establish the date of the increase or decrease in traffic, which will allow you to associate it with the expected update. It can be either “Panda” or “Penguin”. It is very important to determine which algorithm “hooked” the site - the action plan for restoring lost positions will depend on this.

2. Visit the forums

For example, if Google rolls out a new Penguin update, then on the forums you can find many descriptions of the same situation: positions and traffic have dropped sharply or, conversely, returned to their previous levels. If you see that others' traffic fluctuation dates match yours, then most likely you are in the same boat.

3. Follow SEO blogs

Subscribe to news or read the blogs of those SEO specialists who regularly monitor all algorithm changes. Experienced specialists have good connections and access to a wide range of data (client sites, consulting, audience reports), which allows you to get a more complete picture of what is happening.

4. Conduct an SEO audit

If you have suspicions that a “cute animal” has walked across your site, then you need to conduct an SEO audit first of all in those areas that Panda is targeting, namely a content audit. Here are some tips:

  • check the site for availability of …;
  • see how the site ranks when searching for pieces of its text (broad and exact match);
  • check the texts and meta tags for …;
  • look in Google Webmaster Tools at whether the number of indexed pages has decreased (Google Index -> Index Status);
  • check in Google Webmaster Tools the number of the site's impressions in search (Search Traffic -> Search Queries).
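Parts of this content audit can be scripted. For example, a quick check for pages sharing identical titles and meta descriptions, one of the duplication signals discussed throughout this article. The crawl data structure and function name here are hypothetical; in practice the titles would come from your crawler of choice:

```python
from collections import defaultdict

# Hypothetical crawl result: URL -> (title, meta description).
pages = {
    "/a": ("Plastic windows in Moscow", "Buy plastic windows cheap"),
    "/b": ("Plastic windows in Moscow", "Buy plastic windows cheap"),
    "/c": ("Delivery and payment", "How to order and pay"),
}

def duplicate_meta(pages: dict) -> dict:
    """Group URLs that share the same (title, description) pair,
    a typical duplication signal Panda is said to dislike."""
    groups = defaultdict(list)
    for url, meta in pages.items():
        groups[meta].append(url)
    # Keep only the groups with more than one URL.
    return {meta: urls for meta, urls in groups.items() if len(urls) > 1}

for meta, urls in duplicate_meta(pages).items():
    print(f"duplicate meta on {urls}: {meta[0]!r}")
```

Any group the report prints is a candidate for rewriting its title and description, or for a canonical tag.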

5. Monitor services that track algorithm changes

There are sites that monitor the “weather” in the world of SEO - these are MozCast and Algoroo. Large changes in “temperature” indicate that Google is up to something or has already started something. These services won't tell you what to do, but staying informed is also very important.

Conclusions. The fact that Google has switched "Panda" to semi-automatic mode can be taken as a clear signal to "SEO futurists" that the time is not far off when "Penguin" will suffer a similar fate. More precisely, when that fate will befall us. Just imagine: every month, two updates, neither confirmed by the search giant. Dark times await us, friends.

I hope this article helped you understand what Google Panda is and how to determine its algorithm updates.






