Semantic core for Yandex Direct, with an example of how to fill it. Portrait of the target audience


Pavel Lomakin

In general, there are many ways to collect words for semantics: there are paid automatic services, and there are options for collecting everything yourself. Here we will mainly talk about the latter, since paid services are not suitable for beginners because of their high cost. To be honest, I don’t use them myself, because there is no reason to; I’m a pragmatist. One of their shortcomings, in my view, is that none of these services are accurate and they will not reflect the real picture for your specific site. Even if your business is as similar as possible to a competitor’s, I still recommend compiling the semantic core yourself and from scratch. But this does not mean that such services cannot be used at all; we’ll talk about this in more detail below.

Before you start anything, you need to understand what your advertising is for. For convenience, I automate the selection with one of the keyword-collection programs. And don’t forget the step on which half of your success depends. So what do we do first?

Portrait of the target audience

To make the advertising interesting to potential visitors, you must know whom to adapt it for. Since the site, by and large, is written for a specific group of people, the queries used in the advertisement should be the ones an average representative of that group would type in. That is why an important stage in compiling a semantic core is taking the portrait of your target audience into account. You need to answer the following questions about the average representative of the target audience:

  • gender, age, family status;
  • occupation, education;
  • monthly income;
  • hobbies, interests, free time;
  • the most frequently visited resources on the Internet;
  • character.

The semantic core must be formed with this portrait in mind. It is not for nothing that I put drawing up the portrait in first place: only with it in front of you can you start looking for keys.

Looking for directions for the semantic core

Directions (impression masks) are one- or two-word high-frequency queries that characterize your business. For example, for a business selling monolithic formwork, the characteristic queries would be: formwork, monolithic formwork, formwork systems. Within these directions lie the words you need for advertising. Collect as many of these directions as possible.

The main ways to search for directions:

  1. The sales page with the product description
  2. The right-hand (“echo”) column of the Yandex Wordstat service
  3. Synonym dictionaries
  4. Sites on the first page of search results
  5. Thematic forums
  6. Mind maps
  7. Word multiplication services
  8. Slang spellings and narrowly thematic terms
  9. Paid services for studying competitors

Search on an advertised site

It’s actually simple. We go to our website and look at what is written on it. We write down in our notebook all the main queries that we find.

We carry out the same procedure with competitors’ sites. Go to Yandex search, enter the main queries and go through the competing sites in the results. Perhaps we missed something, so study carefully what is written on them. You can also visit thematic forums to search for directions. By analyzing the sites on the first page of the search results and assessing what their semantic cores contain, you can learn a great deal for yourself.
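As a rough illustration of this manual review, here is a minimal sketch, assuming Python with the requests and beautifulsoup4 packages and a hypothetical page URL: it pulls the visible text of a page and counts the most frequent words, which you can then scan for candidate directions.

```python
import re
from collections import Counter

import requests
from bs4 import BeautifulSoup

# Hypothetical URL; substitute a real page from the search results.
URL = "https://example.com/monolithic-formwork"

html = requests.get(URL, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

# Drop scripts and styles so that only visible text remains.
for tag in soup(["script", "style", "noscript"]):
    tag.decompose()

words = re.findall(r"[a-zа-яё]{4,}", soup.get_text(" ").lower())
stop_words = {"this", "that", "with", "from", "для", "или", "есть", "как"}  # tiny example stop list

for word, count in Counter(w for w in words if w not in stop_words).most_common(20):
    print(f"{count:4d}  {word}")
```

The top of such a list usually contains your candidate directions plus a few junk words that you simply ignore.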

Studying Yandex Wordstat results

Be sure to use a faithful assistant in everything related to keywords: the free Yandex Wordstat service. The link contains a whole series of articles on this service. Enter the queries collected in the first step. We are interested in the right-hand column, the so-called “Wordstat echo”. It shows queries entered by users who were interested in our main query. The most frequent relevant keys from there should be added to the directions for your site’s semantic core.

Dictionaries of synonyms

Continuing the topic of keys that are close in meaning: refer to a dictionary of synonyms. The same thing can be described in different ways; use the most popular variants. This applies not only to nouns but also to verbs, adjectives and other parts of speech. For example, for a query with the verb “to make” there are synonyms: create, execute, manufacture.

Of course, you shouldn’t blindly copy out all the synonyms from the dictionary; after all, we are talking about a practical and rational approach. A little more about synonyms: don’t forget slang expressions. Remember what else the desired item is called. It may not be in the dictionary, but sometimes jargon is used almost more often than the usual name. For example, “clave = keyboard” has taken root so firmly that it is very undesirable to ignore it when building a semantic core. Also keep abbreviations and narrowly specialised terms in mind. For online stores of electronic equipment this field is not plowed at all: everything can be collected right down to model numbers.

Multiplying Keywords

Multiplication is usually used to compile so-called artificial semantics, but it can also simply automate the process. For example, we take the main and additional words from a structured map (described below) and get a couple of dozen two-word queries for parsing. You can do this in Excel using formulas, but I usually use a word multiplication service.

Clear all the fields, put the right words in the first and second fields, set the text length to one character greater than the first generated word, and press the button.

We get about a dozen words. We check that there are no errors and send them for parsing.
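If you prefer a script to Excel formulas or an online multiplier, here is a minimal sketch of the same idea in Python; the word lists are made-up examples, and the real ones should come from your structured map.

```python
from itertools import product

# Example inputs only; take the real words from your structured mind map.
base_words = ["formwork", "monolithic formwork", "formwork systems"]
qualifiers = ["buy", "price", "rent", "installation"]

# Cross every base word with every qualifier to get the combined queries.
queries = [f"{base} {extra}" for base, extra in product(base_words, qualifiers)]

for query in queries:
    print(query)
# 3 bases x 4 qualifiers = 12 queries ready to be sent for parsing.
```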

Paid query search services

There are quite a lot of them. Type “competitor analysis service” into a search engine and you will find more of them than you can count. Of course there are basic ones, but that’s not the point. I recommend using them only after you have completed all the previous steps and it still doesn’t seem like enough to you, although I doubt that will happen. As a rule, in such services you enter a competitor’s domain and see which queries it advertises for; don’t blindly copy those same queries, just look for what you may have missed.

Another argument against such services is that they give a very approximate picture, especially for small sites with little history. The screenshot shows one of the sites I work on, and the indicators (number of keys, traffic and so on) are understated several times over, so a competitor who looks at them will not find anything useful.

How to structure everything?

Create a mind map: a map of queries divided into convenient categories that mirrors how the advertising will be filled with keys. The advantage of this method is that it reduces the risk of missing something. Later, using this map, you will clearly see what can be combined and parsed with what. The mind map is built on the principle of refinement: there is a main topic, for example, flowers. Flowers (what kind?) – indoor, artificial, cut. Indoor flowers (what kind?) – cacti, succulents, and so on. It can be compared to a tree with branches or to a spider’s web. After filling out the map, you will have a visual diagram according to which you fill your advertising with queries.



Save all the collected keys in an Excel file. Remember to remove inappropriate or overly broad queries. Once the semantic core has been compiled, structured and cleared of garbage, a list of common negative keywords has been drawn up, duplicate keywords have been removed and cross-minusing has been done, move on to writing the ads. A separate ad is created for each query. By the way, I have already described that process; it is very useful for beginners.
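A minimal sketch of that cleanup stage, assuming Python with pandas, a hypothetical keys.xlsx file with a “phrase” column, and an example list of negative words: it normalizes the phrases, drops duplicates and removes anything containing a common negative keyword. Cross-minusing itself is usually left to tools such as Direct Commander or Key Collector.

```python
import pandas as pd

NEGATIVE_WORDS = {"free", "download", "diy"}  # example common negative keywords

df = pd.read_excel("keys.xlsx")  # hypothetical file with a "phrase" column

# Normalize: trim, collapse whitespace, lower-case, then drop exact duplicates.
df["phrase"] = (
    df["phrase"].str.strip().str.replace(r"\s+", " ", regex=True).str.lower()
)
df = df.drop_duplicates(subset="phrase")

# Drop phrases that contain any of the common negative keywords.
keep = df["phrase"].str.split().apply(lambda words: not (NEGATIVE_WORDS & set(words)))
df = df[keep]

df.to_excel("keys_clean.xlsx", index=False)
print(f"{len(df)} phrases left after cleanup")
```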

Don’t forget that an advertising campaign should never contain several ads with the same keys, to avoid competition between them and a drop in efficiency. Your main audience, the one that converts well, types complex search queries of three or more words when looking for your product. Think about which query is the more targeted:

  • buy iphone
  • buy iphone 5s pink 16gb

I think you guessed it and named the correct answer. And it is almost always like this: the more words in the query, the hotter the audience behind it, but unfortunately such words have low frequency. That is why it is so important to collect as many of these queries as possible, so that you can then create an individual ad for each one, showing the user as precisely as possible that this is exactly what he is looking for.

Using all the methods described in this article, you can easily collect the maximum semantic core for your advertising and leave behind competitors who do not bother with this kind of brainstorming and technical work on finding directions and parsing. I hope this material was useful to you. If you have any questions, write them in the comments.

About what is important to consider when compiling a semantic core.


How to collect the correct semantic core

If you think that some service or program can build the correct kernel, you will be disappointed. The only service capable of collecting the correct semantics weighs about one and a half kilograms and consumes about 20 watts of power. This is the brain.

Moreover, in this case the brain has a very specific practical use instead of abstract formulas. In this article, I will show rarely discussed steps in the semantics collection process that cannot be automated.

There are two approaches to collecting semantics

Approach one (ideal):

  • You sell fences and their installation in Moscow and the Moscow region.
  • You need leads from contextual advertising.
  • You collect all the semantics (extended phrases) for the query “fences” from anywhere: from WordStat to search tips.
  • You receive a lot of requests - tens of thousands.
  • Then you spend several months clearing them of garbage and end up with two groups: “needed” queries and “negative keywords.”

Pros: in this case, you get 100% coverage - you took all the real requests with traffic for the main request “fences” and selected from there everything you need: from the elementary “buy fences” to the non-obvious “installation of concrete parapets on a fence price”.

Minuses: two months have passed, and you have just finished working with requests.

Approach two (mechanical):

Business schools, trainers and contextual agencies have been thinking for a long time about what to do about this. On the one hand, they cannot really work through the entire array for the request “fences” - it is expensive, labor-intensive, and they cannot teach people this on their own. On the other hand, the money of students and clients also needs to be taken away somehow.

So a solution was invented: take the request “fences”, multiply it by “prices”, “buy” and “installation” - and go ahead. There is no need to parse, clean or assemble anything, the main thing is to multiply the requests in a “multiplier script”. At the same time, few people worried about the problems that arose:

  • Everyone comes up with the same multiplications, plus or minus, so queries like “installation of fences” or “buy fences” instantly “overheat.”
  • Thousands of high-quality queries like “corrugated fences in Dolgoprudny” will not get into the semantic core at all.

The multiplication approach has completely exhausted itself: difficult times are coming, the winners will be only those companies that can solve for themselves the problem of high-quality processing of a really large real semantic core - from selecting bases to cleaning, clustering and creating content for websites.

The purpose of this article is to teach the reader not only to select the correct semantics, but also to maintain a balance between labor costs, kernel size and personal effectiveness.

What is a basis and how to search for queries

First, let's agree on terminology. A basis is a general query. If we return to the example above, you sell any fences, which means that “fences” are your main basis. If you sell only fences made of corrugated sheets, then your main basis will be “fences made of corrugated sheets”.

But if you are alone, there are a lot of requests, and campaigns need to be launched, then you can take “corrugated sheet fences price” or “buy corrugated board fences” as a basis. Functionally, the basis serves not so much as an advertising request, but as a basis for collecting extensions.

For example, the query “fences” gets more than 1.3 million impressions per month in the Russian Federation.

These are not users, not clicks, and not queries. This is the number of impressions of Yandex advertising blocks across all queries that include the word “fences”. It is a measure of coverage applied to a certain large array of queries, united by the occurrence of the word “fences” in them.

Would you like to know how to create a semantic core for Yandex Direct? The core is not compiled inside Yandex Direct itself, but it certainly comes in handy there.

4 stages of collecting the semantic core for Direct

How to create a semantic core?

Yandex Direct requires a semantic core, because the task is advertising and queries serve as its foundation: the advertising campaign is built on it.

You can select queries on Yandex Wordstat.

The core and Direct

In Yandex Direct, the core is used to run an advertising campaign. A semantic core is required for Direct, and it is created using Yandex Wordstat: type a key into Wordstat and you get a list of keywords. That is collecting a core for Direct.

Strive to make the collected core for Direct a large one: Direct loves it when there is a choice. This is necessary for the campaign to go well.

Collecting the core for Direct

The semantic core for Direct should consist of commercial queries. This means that queries like “what is a semantic core” or “what is a sofa” will not do; you need to select more practical queries. For the semantic core, Yandex Direct prefers queries with “buy”: they are more valuable. Also try to pick queries with words such as “order”, and “how to choose” can work too.

The last example is not very commercial, yet it can still suit Direct: such queries are cheaper than the “buy” queries in the semantic core, and articles written for them convert well as informational content. Collecting the core for Direct lays the foundation for a good campaign.

Copywriters will help you write the articles; on exchanges such as Etxt and Advego they write to order for 15 rubles.

Examples of the semantic core

buy basket

order basket

how to choose a basket

Basket Samara

These four queries were obtained after collecting the semantic core for Direct. The core for Direct should be collected in a similar way.

How to collect a semantic core? 4 steps

To build a core for Direct, you need to:

  1. Manually select 5-6 queries.
  2. In Wordstat, turn them into 60.
  3. Remove the unnecessary ones.
  4. Select the most commercial queries for the semantic core (a sketch of this filtering follows after the list).
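As a rough sketch of steps 2–4, here is how such a filter might look in Python; the phrase list and the marker words are illustrative assumptions, not output from a real Wordstat pull.

```python
COMMERCIAL_MARKERS = ("buy", "order", "price", "how to choose")

# Illustrative phrases, as if copied from Wordstat for the base word "basket".
phrases = [
    "buy basket",
    "order basket",
    "how to choose a basket",
    "basket samara",
    "what is a basket",
    "basket photo",
]

# Keep phrases that contain at least one commercial marker word.
commercial = [p for p in phrases if any(marker in p for marker in COMMERCIAL_MARKERS)]
other = [p for p in phrases if p not in commercial]

print("take into the core:", commercial)
print("review or discard: ", other)
```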

Hello everyone!

Once you have created an account, you can proceed to the instructions below:

Great! Key Collector has been successfully configured, which means you can proceed directly to compiling the semantic core.

Compiling a semantic core in Key Collector

Before you start collecting key phrases for Yandex.Direct, I recommend reading it; there you will find a lot of useful information about key phrases (for beginners only). Have you read it? Then it will not be difficult for you to collect the key phrase masks, which are essential for parsing with Key Collector.

  1. Be sure to specify the region in which the keywords are collected:
  2. Click the “Batch collection of words from the left column of Yandex.Wordstat” button:
  3. Enter the key phrase masks and distribute them into groups. This is the result; click “Start Collection”. Grouping is done for the convenience of processing key phrases: queries will not be mixed together in one group, and it will be much easier for you to work through them;
  4. Wait until the collection of key phrases is completed. Once the process is finished, you can collect the exact frequency of the queries and also find out the approximate cost of a click on the ad, the approximate number of ad impressions, the approximate budget and the number of competitors for a specific query. All of this is available through one single button, “Collect Yandex.Direct statistics” (we added it to the quick access panel):
    Check all the boxes as in the screenshot above and click “Get data”;
  5. Wait for the process to complete and view the results. To make this convenient, click the column auto-adjustment button, which leaves visible only the columns that contain data:
    We need the statistics we have just collected in order to analyze the competitive situation for each key phrase and estimate the approximate advertising costs for them;
  6. Next, we will use the coolest and most convenient Key Collector tool, “Group Analysis”. We added it to the quick access panel, so just open it from there:
    Key Collector will group all the key phrases by word, and it will be convenient to process each group of queries. Your task: look through the entire list of groups; find groups of queries containing non-target words, that is, negative keywords, and add those words to the appropriate list; mark these query groups so you can delete them later. You can add a word to the list by clicking the little blue button: a small window will then appear where you need to select a list of negative keywords (list 1(-)) and click the “Add to stop words” button. Work through the entire list this way. Don’t forget to mark the groups with non-target words; the key phrases are marked automatically in the search queries table;
  7. Then delete the marked non-target phrases in the search queries table. This is done with the “Delete phrases” button:
  8. We continue processing the phrases. As you remember, the “Few impressions” status appeared in Yandex Direct at the beginning of 2017 (we have already dealt with it), and to avoid this status, low-frequency queries need to be moved into a separate group. First, apply a filter to the “Base Frequency” column:
    Filter parameters: base frequency less than or equal to 10. I chose these filter parameters based on the display region, Izhevsk:
    Then mark all the filtered phrases (a script sketch of this low-frequency split is given after the list);
  9. Create a subgroup inside the group you are currently working in with the simple keyboard shortcut Ctrl+Shift+T. Then move the filtered phrases from the “Buy iPhone 6” group to the “Few impressions” group. We do this by transferring the phrases to another group:
    Then specify the transfer parameters as in the screenshot below (Run - transfer - marked):
    Remove the filter from the “Base Frequency” column.
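For readers who prefer to do the same split outside Key Collector, here is a minimal pandas sketch under the same assumptions as step 8: a hypothetical export with “phrase” and “base_frequency” columns and the threshold of 10 used above for the Izhevsk region.

```python
import pandas as pd

FEW_IMPRESSIONS_THRESHOLD = 10  # same cutoff as the Key Collector filter above

df = pd.read_excel("buy_iphone_6.xlsx")  # hypothetical export of one query group

# Split the group into "Few impressions" candidates and the main part.
low = df[df["base_frequency"] <= FEW_IMPRESSIONS_THRESHOLD]
main = df[df["base_frequency"] > FEW_IMPRESSIONS_THRESHOLD]

low.to_excel("few_impressions.xlsx", index=False)
main.to_excel("buy_iphone_6_main.xlsx", index=False)
print(f"moved {len(low)} low-frequency phrases out of {len(df)}")
```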

You process the rest of the groups in exactly the same way. The method may, of course, seem tedious at first glance, but with a little practice you can put together a semantic core for Yandex Direct quite quickly, create the campaigns in Excel and then upload them. Processing a semantic core this way takes me about 2 hours, but that depends entirely on the amount of work.

Export key phrases to Excel

All that remains is to export the key phrases to a file so we can work with them in Excel. Key Collector offers two export formats: csv and xlsx. The second option is much preferable, since it is much more convenient and, for me personally, more familiar. You can specify the file format in the same program settings, on the “Export” tab:

You can export the key phrases by clicking the green icon in the quick access panel:

Each group is exported separately, that is, each group is a separate xlsx file. You can, of course, put all the query groups into one file using the “Multi-Groups” tool, but then it becomes extremely inconvenient to work with, especially if there are a lot of groups.

Next you need to export your negative keywords. To do this, go to “Stop Words” and copy the negative keywords to the clipboard so that you can then paste them into Excel:
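If you end up with many per-group xlsx exports plus the copied negative-keyword list, a small script can give you one combined overview and a final sanity check. This is only a sketch: the folder, the negative_words.txt file and the “phrase” column in each export are assumptions, not Key Collector defaults.

```python
from pathlib import Path

import pandas as pd

EXPORT_DIR = Path("exports")  # hypothetical folder with the per-group xlsx files
negatives = set(Path("negative_words.txt").read_text(encoding="utf-8").split())

frames = []
for path in EXPORT_DIR.glob("*.xlsx"):
    group = pd.read_excel(path)
    group["group"] = path.stem  # remember which group each phrase came from
    frames.append(group)

all_phrases = pd.concat(frames, ignore_index=True)

# Sanity check: no phrase left in the core should contain a negative keyword.
contains_negative = all_phrases["phrase"].str.split().apply(
    lambda words: bool(negatives & set(words))
)
bad = all_phrases[contains_negative]
print(f"{len(all_phrases)} phrases in {len(frames)} groups, {len(bad)} still contain negative words")
```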

This is how I work with Key Collector, and now I have shown it to you as well. I sincerely hope this lesson helps you master this wonderful tool and that your semantic core brings you nothing but targeted traffic and lots and lots of sales.

See you soon, friends!
