Technical SEO


Today's topic on the blog is website optimization, and there is a specific example that shows what this optimization is and what a site generally needs in order to be highly visible on the Internet, appear on the first pages of search results, and bring in customers from search engines.

I was contacted by a person who was looking for an optimizer to get his resource into at least the TOP 3. This way of framing the task set off alarm bells: in the past, people looked for promoters to get them into the TOP quickly, and those promoters, as time has shown, “promoted” sites straight into a ban, because prohibited methods were used.

It turned out that the business owner was aware of these issues and understood that the situation on the Internet had changed dramatically. What he needed was an optimizer who would bring him into the TOP. After conducting an express audit, I realized that the site’s creators had not bothered with optimization at all, or that it simply was not part of the development package, which is fundamentally wrong, and you will soon understand why.

Technical website optimization

First, the pages loaded very slowly on a desktop computer, and they should load quickly. I did not test on mobile phones, since that was not part of the task at this stage, although mobile friendliness (both speed and convenience) is a very significant part of technical optimization. It also caught my eye that instead of readable links there were strings of gibberish, whereas URLs should be human-readable. Next, the page hierarchy was broken: directories had been created, yet all pages sat at the second level, bypassing those directories. Correcting this means changing URLs, which in turn means re-indexing pages, and that takes time; in Yandex, for example, it can take a very long time. Finally, there were no meta tags, even though a semantic core had supposedly been compiled.

Website optimization for Yandex

How you work on a site depends on which search engine the web resource should be optimized for. Search engines differ not only in how they work, but also in the audiences that prefer them. Yandex has its own ranking algorithm: its robots collect information on their own schedule, but do not show it immediately. The text search base is updated roughly once a week, and the link factor, and now the behavioral factor, even less often. The quality of the site across all its indicators determines what position it will occupy after the next update.

Yandex’s task is to select, from a multitude of sites, the few that in its opinion answer a query most accurately, and to display them on the first pages. It is loyal to good resources, and webmasters are given a number of tools to speed up indexing. An important detail is that Yandex quickly assigns a region if geotargeting is set to a city or area, which is a big plus for those doing business in a specific territory. Specialists with knowledge, experience, and demonstrated results can raise the quality of a website to the level required for promotion in Yandex.

Optimization for Google

My opinion, and I’m sure many will agree with me, is this: even if Yandex is chosen as the priority search engine, the influence of Google should not be underestimated. Take Alexa, which collects statistics on web resources across the Internet, reports on many indicators, and maintains a global ranking. In the end, everything on the web is interconnected.

Topic and geography are the main criteria to take as a basis when optimizing for Google, but for the optimization to succeed, the site will have to adapt to Google’s requirements. For example, the weight of an individual page matters a great deal. It is believed that even a young site can reach the Google TOP for a specific query aimed at a specific page, and there are ways to increase that page’s CTR. The volume of content plays a role, as does its uniqueness (and not only the kind expressed as a percentage of matching words and phrases, if it is text), along with clear snippets, microdata markup, and so on.

Everyone knows that Google’s robots are omnivorous: they index everything they see, sometimes even ignoring the prohibitions that have been set, and this is where problems with duplicate pages can arise. It is a subtle point that requires understanding how to prevent them. Incidentally, Google was the first to wage a fierce fight against promotion methods such as mass link buying, throwing the guilty overboard and freeing up places in the rankings for, as it turned out, worthy sites.

Usability

A visitor to our virtual office should find it convenient, intuitively clear how to move from one page to another, and pleasing to the eye. First of all, this concerns navigation. The time spent on each page is recorded by robots, and when it is short even for internal transitions (under 15 seconds), the bounce rate rises. It is these small factors that make up a website’s authority. You can upload beautiful pictures, but they will not help if the text is formatted badly: huge or tiny fonts, illogical paragraph indents. Giant heading fonts can be not just a design flaw but a direct threat to the site if the H1-H3 tags are used too many times; there are certain norms, and if they are exceeded the page will be considered over-optimized. On the example site, sizes were adjusted in pixels through the tags, whereas this should be configured in CSS.

External website optimization

You can move on to external optimization only when you would not be ashamed to invite friends from social networks to the site, or to post a press release or something similar with a link to it on a trusted site. My “patient” was clearly not ready, and I could not fix absolutely everything myself, for example, set up caching: that kind of optimization depends directly on programming and other technical knowledge. So I first showed the site to one of my colleagues. His verdict after the inspection was: “The client doesn’t have as much money as it would take to fix this.” He didn’t take the job.

We all want to be in the TOP, right? Competition should be healthy, and search engines have created all the conditions for us to work on our sites and keep improving them. Proper technical optimization of a website, especially a commercial one, will help it climb several positions or even pull it out of oblivion altogether. This is a profitable investment in the business, because today, without a presence on the Internet, a business simply cannot exist.

The main task of technical optimization is to ensure the fullest possible indexing of all pages of the site. If this is not done, other measures to promote the site to the TOP may be in vain.

Of course, technical optimization should be carried out by professionals - website developers. Those who are not well versed in HTML code should not try to change anything - it can only cause harm.

If everything is done correctly, the site loads without delays, errors or failures, does not contain broken links or images, and is quickly indexed. To achieve this result, you need to take into account a number of factors.

Site loading time

Website loading time is the period it takes to fully load all of its elements from the moment the request is received. It consists of:

  • server response time;
  • the time it takes for the first byte sent by the server to arrive on the computer – TTFB (Time to first byte);
  • the time for HTML processing and content downloading;
  • the time before rendering begins, i.e., the moment when the page becomes visible.

The speed of these processes depends on how clean the code is, the “weight” of the files, the correctness of the layout, and proper optimization of CSS and JS.

Google openly admits that site load time is a ranking factor, and the Moz company conducted a study on this subject that produced interesting results.

Yandex, although it does not make official statements, notes in its webmaster help that "when choosing hosting for a website, you should take into account access speed and mean time between failures," and advises using "the hosting that will provide the best access speed to the site and the least time during which the site may be unavailable due to technical problems."

You can check your loading speed using the PageSpeed Insights service.

We must also not forget about behavioral factors. If the page loading speed is low, this has a significant negative impact on browsing depth and bounce rate. Users may lose patience, give up waiting for the page to load, and simply go to competitors. In addition, the longer the delay, the fewer pages are viewed in one session.

Server response

You can check the server response with the appropriate Yandex.Webmaster tool. The list of main service responses can be found in the "Help" section of Yandex.Webmaster.

When checking web pages, look for the server response 200 OK (“good”): it means everything is fine with the page, nothing is prohibited from indexing, and there are no redirects.

Other common server responses:

  • 301 Moved Permanently (“moved permanently”) – a permanent redirect is configured. The page has moved to a new address for good, and the link redirects to another page. Users no longer see the old page and the search robot does not index it, but it does index the page to which the redirect leads.

    A 301 redirect is used when it is important to keep the place the old URL occupied in search results, along with its traffic and link weight. However, try not to abuse redirects: the site’s position in search results may decline, since a 301 redirect passes no more than about 90% of the link juice and increases loading time. Replace all internal links with the new ones and keep the number of redirects to a minimum.

  • 302 Moved Temporarily – a temporary redirect. The requested resource is temporarily located at a different address, and it is implied that the page at the old address will soon be available to users again. Unlike a 301, a 302 redirect does not transfer link juice to the new address.

    It is appropriate to use a 302 redirect during technical work on the site, or to temporarily show visitors materials that may change frequently without altering the old page – for example, promotions in online stores.

  • 404 Not Found (“not found”) – the page does not exist. The main task of such a page is to catch traffic arriving via broken links. A search robot treats it calmly when it reaches it via an external link, but a 404 reached through an internal link provokes a negative reaction.

    To reduce the bounce rate, it is important to optimize this page. Its design should match the overall style of the site and offer the user ways to solve the problem: list links to the main sections and add a site search form.

  • 503 Service Unavailable (“service unavailable”) – the server is temporarily unavailable and cannot process requests for technical reasons (overload, maintenance, etc.).

Installing 301 redirects from all non-main website mirrors to the main one

Decide on the main domain name and specify it in the .htaccess file so that search engines do not perceive copies of the same site as duplicates. This includes gluing together mirrors like “vkontakte.ru” and “vk.com”, as well as the versions with “www” and without “www”.
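As an illustration only, a minimal .htaccess sketch for gluing the "www" mirror to the main domain might look like this (it assumes an Apache server with mod_rewrite enabled; "example.com" is a placeholder, not a recommendation):

    # Enable the rewrite engine (Apache + mod_rewrite assumed)
    RewriteEngine On

    # If the request came to the www mirror...
    RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]

    # ...send a permanent (301) redirect to the main, non-www mirror
    RewriteRule ^(.*)$ https://example.com/$1 [R=301,L]

In practice the rule is written the other way around if the "www" version is chosen as the main mirror.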

Duplicates and clones

If there are clones of your site on the Internet, this has a bad effect on promotion - search engines are likely to impose their own filters. Duplicate pages must be closed from indexing, or better yet, deleted altogether.

Site structure and page nesting level

Use a wide structure instead of a deep one. The closer the page is to the main page, the better it is indexed and promoted.

Page addresses

The best page address is a human-readable URL (what Russian-language SEO calls a "CNC"). It is easier for the user to remember and also helps them understand where they are on the site: for example, /catalog/table-lamps/ says more than /index.php?id=372. Within one resource, a single approach to building such URLs should be used – either translation or transliteration.

Robots.txt file

This file contains the list of pages allowed and prohibited for indexing. If it is configured incorrectly, the site may be indexed poorly, because the search robot leaves a page as soon as it encounters an indexing ban. The sections usually closed from indexing are the administrative part of the site, pages with users’ personal information, site search, checkout, and the shopping cart.

Setting up robots.txt also includes creating and specifying a sitemap.xml file, which helps the search robot better understand the site structure and quickly index as many pages as possible.
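For illustration, a minimal robots.txt sketch for a small online store might look like this (the paths and domain are placeholder assumptions, not recommendations for any particular CMS):

    User-agent: *
    # Close service and personal sections from indexing
    Disallow: /admin/
    Disallow: /cart/
    Disallow: /order/
    Disallow: /search/

    # Point robots to the sitemap
    Sitemap: https://example.com/sitemap.xml

The real list of Disallow rules depends on the engine the site runs on.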

Title, Description and headings

Title, Description, and headings also partly belong to technical optimization. There should be no pages with duplicate meta tags or titles.

An h1 heading containing a keyword must not appear more than once on a page. Each page can have several h2-h6 headings, following the hierarchy: the h2 tag follows h1, and h3-h6 come after h2. The h2 heading is no less important for promotion: since Yandex forms additional snippet titles from h2, it can be used to promote low-frequency queries. Headings should not be used as design elements of the site (contacts, telephone, promotions, news).
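As a rough sketch, the head and heading structure of a single page could be organized like this (the store name, texts and URLs are invented for the example):

    <head>
      <!-- A unique Title and Description for this page only -->
      <title>Table lamps | Svetly online store</title>
      <meta name="description" content="Table lamps with delivery: 120 models, prices, photos and reviews.">
    </head>
    <body>
      <!-- Exactly one h1 per page -->
      <h1>Table lamps</h1>
      <!-- h2 subheadings follow h1 and can target low-frequency queries -->
      <h2>LED table lamps for the office</h2>
      <h2>How to choose a table lamp for a child</h2>
    </body>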

Every mistake in technical optimization made at the initial stages will definitely come to light later. What looks insignificant now will create real problems in the future and hinder your progress to the top of the search results. It is a shame to lose customers over such seeming trifles, so optimization is a process that requires the closest attention.

Let's move on to the issues of technical optimization of the site.

What is technical website optimization?

Technical optimization of a website is a set of measures that eliminate technical flaws in the operation of the site that prevent it from being correctly indexed by search engines and comfortably used by visitors. The first nuance of technical website optimization is that it is usually carried out only once, at the start of promotion.

A common mistake of beginners or unscrupulous optimizers is to exclude this stage from the search engine optimization process. As a result of such a negligent attitude towards the resource, it may not be fully indexed, which significantly limits the possibilities for promotion.

Eliminating technical issues on a site should be left to specialists, because inept changes can not only cause harm but completely break the site’s functionality and performance.

What technical improvements to the site should I pay attention to?

I have already mentioned technical optimization in an earlier publication. Now I would like to offer a grouping that will help identify problems more precisely, eliminate them, or set tasks for technical specialists if you cannot carry out the modifications yourself.

Technical problems of the site that hinder effective promotion can be divided into the following groups:

    Hosting and program code problems. This factor is one of the most critical, because not only how search engine robots index the site but also the loyalty of its visitors depends on the speed of the hosting and the correct operation of the program code. Malfunctions, site unavailability, and other problems caused by poor-quality hosting or a poor content management system can affect promotion negatively. If for whatever reason you still choose cheap hosting, I recommend reading reviews about it and, if possible, ordering a trial period to make sure the hosting is reliable and meets your needs.

    Setting up redirects and error pages. This group includes configuring site mirrors and indicating the main one, as well as gluing together duplicate pages. Make sure that all pages of the site return the correct codes and that links to non-existent pages either lead to a 404 error page or correctly redirect to the page’s new address or to the home page of the site.

    Setting up the website for search robots. This group includes setting indexing permissions and restrictions in the robots.txt file, as well as creating and specifying a sitemap.xml file, which allows search robots to study the structure of your site better and tells them about all the pages on it that you want indexed.

    Adjusting the template, navigation, and structure. This group of technical site optimization covers adjusting the site’s appearance so that it is indexed by search engines as efficiently as possible, displays correctly in different browsers, and is easy to use. This is especially relevant for sites built with Flash, an abundance of JavaScript, and other modern technical bells and whistles that search engine robots may not interpret correctly.

    Setting up page URLs. For effective search engine promotion it is considered good practice for links to the site’s various pages to be permanent and to include keywords (the address of this publication is an example). Such addresses can be configured in a special section of the CMS or with special directives in the .htaccess file. If the link format was initially set up in an inappropriate way and had to be changed, take care to redirect correctly from the old addresses to the new ones. This lets search engines re-index the site faster and correctly, and preserves the weight the pages have already acquired from incoming links.

    Removing duplicate pages and clones. Make sure there is no duplicate content on the site. It is also important that the content is not duplicated on other resources or on so-called affiliate sites. The presence of duplicates and affiliates can negatively affect the chances of promoting the website, because there is a high probability of the site coming under search engine filters.

Website promotion begins with technical optimization. As a rule, this procedure is carried out once, allowing only minor adjustments to be made in the future.

Any shortcomings in the operation of the resource are identified through a technical audit. If pages load slowly, the site is poorly indexed by search robots, and errors appear when links are opened, it is time to optimize the site technically.

Technical website optimization is a set of actions aimed at ensuring the most complete and fastest possible indexing of pages, improving interaction with search engines, and promoting the site effectively. It is serious work that, taken as a whole, forms the basis of a high-quality, long-term SEO campaign.

The work on technical website optimization is divided into several stages.

Code layout and optimization

Stable, smooth operation of the resource attracts visitors, who open its pages not only in different browsers but also on various devices. The site must display correctly on screens of any resolution, including mobile devices.

The more “garbage” there is in the code, the longer the pages take to load, and this worsens user behavioral factors.

Setting up redirects

Redirects (redirections) are created using the .htaccess file. Search engines can treat the address with “www” and the address without it as two completely different domains. So that users who type the site address in different ways still reach the resource, the correct redirect must be configured.

Setting up 404 errors

Non-existent pages can scare off many users, and errors in links reduce how well the site is indexed. Therefore, before promoting a resource, it is important to configure correct handling of 404 errors.

Robots.txt file

Copies of pages (duplicates), both external and internal, can lead to sanctions from search engines. By adjusting the site’s constituent elements, the operation of the whole resource improves, and unnecessary “garbage” and unused status codes are removed.

The robots.txt file contains directives for search engines indicating which sections and pages of the site may be indexed and which may not. As a rule, all duplicate pages are closed from indexing in this file.

Setting up page URLs

It is important that all page addresses on the site have the same design.

Website page addresses often contain both lowercase and capital letters. In that case, errors may occur when a user looks for a page. To prevent this, it is better to use only lowercase letters in addresses.

Correctly setting up redirects from old links to new ones is also particularly important.

Other technical improvements

The technical optimization checklist is quite long. In each individual situation, various shortcomings and problems may arise that require an individual solution (for example, changing the structure or adjusting the design).

A neglectful attitude towards these improvements prevents many sites from reaching top positions in search, that is, it negates all further search engine promotion efforts. Correct and complete technical optimization of a website is the key to successful promotion in search engines.

Technical optimization of the site is a complete list of technical improvements aimed at getting the site into the TOP 10 of the Google and Yandex search engines. It means finding and correcting existing errors that can affect website promotion in search engines. Without competent technical optimization and improvements, the effectiveness of promotion efforts drops to zero. Let’s look at all the necessary actions that need to be taken into account.


Technical optimization and website development

This is a set of measures aimed at getting the important technical aspects of the site right. Its main goals are:

  1. Full indexing in search engines.
  2. Correct ranking of documents.
  3. A simple, effective structure that allows the search bot to quickly reach a new document and understand it.
  4. A competent structure with easy transitions between pages.
  5. Navigation that helps the user find the necessary information and allows the visitor to quickly locate the desired document.

Correcting errors properly will significantly improve the performance of the site. It will contribute to:

  • Fast indexing of site pages;
  • Significant improvement in the positions of most pages;
  • Reducing the bounce rate;
  • Audience growth;
  • Uniform weight distribution between documents;
  • Reducing response time, and many other important aspects that depend on the technical optimization of the site.

Technical website optimization: complete checklist

It is important to understand that you should start with the most fundamental issues and only then move on to the minor errors that do not have a serious impact on the site. Incorrect corrections to the code or other elements of the site can lead to fatal errors, so consider and test all modifications several times before making changes to a live site.

Optimizing hosting and server performance

Hosting has one of the most critical impacts on a website: bad software or too few dedicated resources will lead to serious consequences for the site.

  1. Choose high-quality software or ready-made hosting.
  2. Provide sufficient dedicated resources for the site: enough RAM and HDD/SSD space.
  3. Keep the hosting software versions up to date – PHP, databases, and others.
  4. Absolutely all pages of the site must return the server response code 200 OK.


Installing and configuring the necessary 301 redirects and response codes to avoid duplicate pages

These measures are needed to indicate the main mirror of the site correctly, handle 404 errors properly, and reduce the number of duplicate documents of the site in search.

  1. A 301 redirect from the “www” version (or from the version without “www”) to the main mirror of the site.
  2. A 301 redirect from any other mirrors to the main one (if there are any).
  3. Set up 301 redirects to URLs with a trailing slash “/” or, conversely, without it.
  4. A 301 redirect from “http” to “https” (or vice versa, depending on which version of the site is primary) is required; this redirect helps avoid duplicate content.
  5. Create a 404 page for non-existent URLs. Such a page must return the 404 Not Found response code, match the design of the project, and offer convenient navigation so that visitors can quickly find information on the site instead of leaving it. A minimal .htaccess sketch of points 4 and 5 follows this list.
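Purely as an illustration of points 4 and 5, assuming an Apache server with mod_rewrite, an installed SSL certificate and a custom /404.html page (all of these are placeholders):

    RewriteEngine On

    # Force https for every request (point 4)
    RewriteCond %{HTTPS} off
    RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]

    # Show a custom page for non-existent URLs (point 5); the server still returns 404
    ErrorDocument 404 /404.html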

Optimizing site loading speed – HTML, CSS, JS and graphics

In 2019, site loading speed remains in demand with search engines: the influence of this indicator on website ranking is one of the most significant in search engine promotion.

  1. Optimize the HTML code of the pages: check the layout for unclosed tags. This greatly improves the validity of the document. HTML code should be clean and lightweight.
  2. CSS and JS significantly affect how fast a document loads. They should be minified and combined into separate included files, which speeds up their processing and the loading of visual elements in the browser (see the HTML sketch after this list).
  3. Graphics optimization – images should be processed and compressed before being uploaded to the site. Be careful and check the image quality after compression.
  4. The server response should be within roughly 200 ms.
  5. The total loading time of the site should not exceed roughly 700 ms.
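A small HTML sketch of point 2, with invented file names, showing assets connected as separate combined, minified files:

    <!-- One combined, minified stylesheet instead of many small ones -->
    <link rel="stylesheet" href="/assets/styles.min.css">

    <!-- One combined, minified script, deferred so it does not block HTML parsing -->
    <script src="/assets/app.min.js" defer></script>

    <!-- A pre-compressed image with explicit dimensions to avoid layout shifts -->
    <img src="/images/lamp-800.jpg" width="800" height="600" alt="Table lamp">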

Organizing a high-quality site structure, navigation and correct human-readable URLs

The structure of the site and its navigation are a significant factor: they affect bounce rate, indexing, and the transfer of weight between pages.

  1. The site structure should be simple and no more than 4 clicks deep. It should be easy for the user to navigate through the pages.
  2. Create a main menu to provide access to the main sections.
  3. Create links between pages.
  4. Use tags and pagination.
  5. Add an extra page containing the entire structure of the project. This way the visitor will quickly find important information, and the search bot will quickly index new content.
  6. Set up human-readable URLs – such URLs must fully reflect the meaning of the page title and be understandable to people.
  7. It is recommended to use rel=“canonical” to get rid of most duplicate content on the site. It lets you point search engines to the original document, so that copies of the original page are not taken into account by the search engine (a minimal HTML sketch follows this list).
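A minimal illustration of point 7 (the URL is made up). The tag is placed in the head of every duplicate, for example a filtered or paginated copy of a catalogue page, and points to the original:

    <link rel="canonical" href="https://example.com/catalog/table-lamps/">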

Robots.txt, sitemap.xml and .htaccess files

The instructions written in these files help the search bot index the project’s pages correctly. All three files must be in the root directory.

  1. Robots.txt – lets you specify indexing rules for the various search engines. This file closes off service sections, duplicate pages, site search, and so on.
  2. Sitemap.xml – needed for fast indexing. It should mirror the site structure exactly, giving the search bot a quick way to crawl new or changed pages.
  3. .htaccess – lets you specify rules for the server. Incorrect rules will lead to fatal errors on the server side, so be careful. This file allows you to do the following (a hedged sketch of the compression and caching points appears after this list):
    • specify most of the site’s redirection rules;
    • configure server-side compression of a large number of file formats;
    • set up correct caching of documents in browsers;
    • establish some security rules;
    • and more.
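For illustration, a sketch of the compression and browser-caching rules, assuming the hosting has the Apache mod_deflate and mod_expires modules enabled (the cache lifetimes are arbitrary examples):

    <IfModule mod_deflate.c>
      # Compress text-based formats on the server side
      AddOutputFilterByType DEFLATE text/html text/css application/javascript image/svg+xml
    </IfModule>

    <IfModule mod_expires.c>
      # Tell browsers how long they may cache static files
      ExpiresActive On
      ExpiresByType image/jpeg "access plus 1 month"
      ExpiresByType text/css "access plus 1 week"
      ExpiresByType application/javascript "access plus 1 week"
    </IfModule>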

The following recommendations are of a secondary nature, but they still play an important role in website optimization: these additional improvements will also help the site’s ranking and behavioral factors.

  1. The h1 tag is unique and cannot be repeated on the page.
  2. H1-h6 should not be stylistic or design elements.
  3. Pages should contain only completely unique Title tags and Descriptions that reflect the essence of the page.
  4. Create an attractive Favicon.
  5. External links should open in a new tab (a small HTML sketch covering this point and the favicon appears after this list).
  6. Close the admin login page from indexing with the Disallow directive in the robots.txt file.
    • This also applies to service folders: cgi-bin, wp-includes.
    • Registration and login pages: register=, logout=.
    • Search results: search.
    • RSS feeds, to avoid duplicate content: feed, wp-feed.
  7. Use an additional “Contents” menu in an article for navigation within the article.
  8. The site must have an optimized mobile version.
  9. An AMP version is required for mobile search.
  10. It is also recommended to configure TURBO pages for the Yandex search engine.
  11. The document font must be easy for the user to read.
  12. Avoid excessive highlighting in the text.
  13. Do not place a large number of advertising blocks on the site; search engines, just like users, can react negatively to an abundance of advertising.
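An illustrative HTML fragment for the favicon and external-link points (the file name and URL are placeholders):

    <!-- Favicon, referenced from the <head> of every page -->
    <link rel="icon" href="/favicon.ico">

    <!-- External link opening in a new tab; rel="noopener" is a common safety addition -->
    <a href="https://example.org/" target="_blank" rel="noopener">Partner site</a>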

These were all the most necessary improvements to the site. I hope this article will be useful to you.
