Many webmasters forget that the goal of a search engine is not to rank their sites highly and send them lots of traffic, but to give users high-quality, relevant results. As experience shows, these goals often do not coincide. That is why search engines apply a variety of filtering criteria, under which many resources, sometimes quite high-quality and made for people, not only lose their top positions but are frequently excluded from the results altogether.
Unfortunately, many filters are far from perfect, so perfectly normal, relevant sites made for people, which at first glance use no methods prohibited by the search engine's rules, also fall under them.
The main filter is the ban: complete exclusion of the site from the index. The generalized, theoretical reason is a violation of the Yandex search license, but the real causes can be many, and not all of them are obvious at first glance. The situation is aggravated by the fact that there are no open, clear "rules of the game", and Yandex support (the famous "Platon Shchukin") rarely gives specific advice and certainly never names the reason a ban was imposed.
The most likely and most common cause is search engine spam. Its most frequent form is oversaturating the site's pages with keywords. This can be done intentionally or out of ignorance. There is an old misconception that the higher the density and number of keywords on a page, the better the site's position. This "truth" has been irrelevant for many years, and it is not just irrelevant but outright harmful.
Earlier, a widespread trick was to stuff various meta tags with keywords, including ones unrelated to the site but popular (sex, porn, etc.). As a consequence, the influence of meta tags on search has been reduced to a minimum.
So, the most common internal optimization techniques that Yandex can mistake for search engine spam and ban the site for are:
- Banal oversaturation of the text with keywords and phrases. Often this is intentional: a person writes a text, calculates the density, adds keywords; in the end the density and similar metrics look acceptable in theory, but the text turns out unreadable. There are also cases when the site's author, deeply immersed in the topic he writes about, oversaturates the text with keywords unintentionally. So after writing an article you absolutely must proofread it, preferably aloud. Checking the density with various services is not worth the effort, it is an uninformative indicator, though there are still those who believe that a density of 5% is the key to a place in the top...
- Publishing keyword lists on the site. This was very common a few years ago. Now, fortunately, many understand the futility of this whole undertaking. Usually such lists are placed at the bottom of the page, in a barely visible or even invisible color, which of course only makes things worse. Some "advanced" webmasters place them above the site's header, assuming (again following the popular belief) that the higher the keywords sit on the page, the stronger their effect on promotion, and they get a well-deserved ban. Others use modern layout methods and put the keywords in a div that is either invisible (display: none) or shifted off-screen. But the robot reads the page code, and keywords cannot be hidden from the code, so such methods inevitably lead to a ban (see the markup sketch after this list).
- Publishing lists of search queries. Usually it reads "visitors come to our site with these queries" or "this page can be found in Yandex with these queries". This is a classic form of search engine spam, so do not do it. Some engines make it very easy, tracking referrals from search engines and automatically displaying the list of queries visitors used to reach the site. This feature must be disabled immediately or, at the very least, these blocks must be closed from indexing and the links in them hidden from search crawlers (one way to do this is shown in the sketch after the list).
- Oversaturating the title, h1-h6 headings, alt attributes, meta keywords, meta description and so on with keywords. Quantity does not bring happiness here; on the contrary, overdoing it can lead to a ban.
- The presence on the site of pages created purely to improve positions and containing no useful information for visitors, as well as numerous pages with redirects to other resources.
- The presence of any hidden text at all, including hidden links, created not for ease of use but to manipulate the search results.
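To make the hidden-text and query-list points concrete, here is a minimal HTML sketch; the keywords, class names and URLs are made up purely for illustration. The first two fragments are exactly the kind of markup that is invisible to visitors but perfectly visible to the robot in the page code; the third shows one way to fence off an auto-generated list of referral queries, using the Yandex-specific noindex tag and nofollow on the links, if the engine cannot simply be told to stop displaying it.

<!-- Hidden keyword block: visitors see nothing, the robot sees spam -->
<div style="display: none;">
  plastic windows, buy plastic windows, cheap plastic windows
</div>

<!-- The same trick with an off-screen offset; just as easy to detect in the code -->
<div style="position: absolute; left: -9999px;">
  plastic windows, window installation, window prices
</div>

<!-- An auto-generated "visitors found us with these queries" block,
     closed from indexing and from link crawling as a last resort -->
<noindex>
  <ul class="search-referrals">
    <li><a href="/?q=some-query" rel="nofollow">some query</a></li>
  </ul>
</noindex>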
The sign of a ban is the complete disappearance of the site from the search results. But not every disappearance is a ban: there are situations when a site drops out because of hosting problems, imperfections in Yandex's algorithms, faulty code and so on. If a site suddenly disappears from the results, wait two or three index updates; if it has not returned, write to Yandex support and ask about the reason for the drop. If the answer mentions violations on the site, you need to eliminate those violations and write again. Often these violations are not obvious even to the webmaster, so the correspondence can drag on. If you are lucky, support will hint at the specific cause or even point directly to the violation; if not, you will have to think long and analyze the site carefully, preferably with the help of outside specialists.
Earlier, a ban could be distinguished from a simple drop by trying to add the site through the Yandex addurl form: for a banned site it returned an error. This no longer works; now all sites are accepted without errors.
Another equally widespread and equally, if not more, dangerous Yandex filter is the AGS (the name is sometimes expanded as "AntiGovnoSayt", anti-junk-site). The main symptom: only 1 to 10 (rarely up to 30) pages remain in the index. The standard technical support answer is "Our algorithms have decided not to include the site in the search. Develop the site and it will return to the results." This answer can come even when the site has already been changed and the robot has not even visited the new version, so you have to write to support persistently and at length, explaining the situation in detail and without emotion.
The main reason for this filter (assuming the site is made for people, of course) is duplicated content within the site. Almost all popular engines create pages with identical content: tag pages, archives broken down by day, month and year, section pages with article announcements containing fragments of the articles themselves, and so on. Logically, the duplicates should be excluded from the search and only the real content kept, but apparently it is easier for Yandex to filter out virtually the whole site than to analyze its structure and contents in detail.
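One practical way to deal with this internal duplication, rather than waiting for the filter, is to point the duplicate pages at the main version or close them from indexing. A minimal sketch with a hypothetical URL; Yandex understands both directives, though it treats them as hints rather than guarantees:

<!-- In the head of a tag or archive page that repeats article announcements -->
<link rel="canonical" href="http://example.com/articles/full-article/">

<!-- Or simply keep such pages out of the index altogether -->
<meta name="robots" content="noindex, follow">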
Other reasons:
- A site consisting 100% of non-unique content can fall under the AGS if there is nothing else on it of interest to visitors;
- "Gibberish content": generated texts, poor synonymizing or auto-rewriting, automatic translation and so on. Such sites fall under the filter fairly quickly. Sites with illiterate, error-ridden texts can end up under it too;
- The absence of content on the site or a very small amount of it. Note that a site consisting solely of videos and photos may not get indexed at all;
- A site created purely for selling advertising. If the webmaster does not spend time and effort giving such a site at least a "human" appearance, it will almost certainly fall under the AGS;
- A site consisting solely of paid ("sold") articles. It is often claimed that selling links is death for a site while selling article placements is safe. In reality the opposite is true: bad paid articles "kill" a site much faster than sensible link selling, which on its own is relatively safe.
There are many myths and misconceptions about the AGS. For example, it is supposedly imposed (remember, these are MYTHS):
- if more links are sold from the site than it has pages;
- if the site is submitted to a pile of directories at once;
- if the site has low traffic;
- if the site is rarely updated;
- if the site is built on a standard template;
- if there are many junk sites on the same hosting or IP, etc.
Getting out from under the AGS is extremely difficult; it is often easier to change the domain. You can try to close all the duplicate pages (for example, as in the sketch above), check the site's content for uniqueness and replace any non-unique articles with new, unique ones, fill the site with a lot of new articles, or change the site's structure.
Pessimization is the disabling of link ranking for a particular site. Purchased and natural links stop working, and as a result positions plummet. The drop is usually several hundred positions; fluctuations of a few dozen positions are normally not related to sanctions on the site. All pages remain in the index and can be found in the search, but promotion to the top positions is impossible, because link ranking no longer has any effect.
The main reason is the presence on the site of "link dumps": link directories not intended for people and published solely to manipulate the search results. Fortunately, the era of huge unmoderated link directories on sites has passed, but there was a time when pessimization mowed down a great many sites.
If you decide to register the site in directories, under no circumstances place backlinks or banners to those directories on your site. The benefit from them is exactly zero, but the chance of getting a very tangible pessimization is real.
Pessimization is also quite likely for a large number of paid links (including in paid articles): do not place more than 4-5 of them per page.
Sites are very often pessimized for advertising that is too intrusive and interferes with visitors, for example pop-ups or too many teaser blocks on the pages. Pessimization can also be accompanied by the site's citation index being reset to zero.
It is also possible that instead of pessimization the site will get the "You are the last" filter; it is impossible to draw a clear line between them.
Filter " You're the last " similar to Supplemental results in Google. Site disappears from search , staying indexed . Looking can only be found by unique phrases from the text that are placed on it , and even on request , containing the website address, it may not be the first places . At the same time support Yandex says that ranking happens normally , according to its relevance.
For such a site both link ranking and the effect of static weight (including internal linking) are switched off. Experience shows that the main reason is non-unique content on a site built for earning money. Moreover, the original source whose articles have been copied en masse can also fall under "You are the last". In that case you will have to either replace the articles that have become non-unique, or comb the web and make the webmasters delete the copies.
You can get out of this filter only by completely changing the site: its structure, design and so on.
The affiliate filter. Several sites belonging to the same company and devoted to the same subject can fall under it. Yandex believes that only one site per company should be in the top, and in theory this is correct. Moreover, because of incorrect operation of the filter, all of the company's sites may drop out of the top positions instead of all but one. There is also an affiliate filter in contextual advertising, in Direct, where the ads of affiliated sites are monitored as well.
Like all Yandex filters, the affiliate filter often misfires: some companies occupy the entire top with their sites, while sometimes the sites of completely different companies get "glued" together.
These days it is impossible to catch the sites of one company by design or structure alone: smart webmasters adjusted to these factors long ago. Even on the same CMS you can build several sites with completely different structures, to say nothing of the domain whois data, which many webmasters make different even when the sites are not connected to each other at all.
Sites created solely to redirect traffic to online stores or affiliate programs unambiguously fall under the affiliate filter. Affiliates can be detected, among other things, by the overlap of the sites' semantic cores.
If you need to promote several sites into the top, make them as different from each other as possible:
- different whois data and IP addresses;
- different CMS, designs and site structures;
- different contact details listed on the sites;
- semantic cores that differ as much as possible (apart from the key queries themselves);
- no reciprocal links between these sites.
Unfortunately, there is currently no reliable way to detect this filter, since the algorithm itself is unknown. All of this is only guesswork.
Filter "You spamnyh ." Official information about its existence is not, however, its manifestations were observed and classified. It is not imposed on the whole site or domain, and to a specific page on specific request. Instead, specially optimized for this page request is issued is completely different, much less relevant .
It was presumably introduced to fight "footcloths" (as long, thinly veiled "SEO texts" are sometimes called): texts stuffed with keywords and created solely to promote a page for a query, with minimal usefulness for people. One way to check whether a page has fallen under the "You are spammy" filter is to enter the query with a word changed to a different grammatical form: if the desired page appears in the results (not necessarily first), the page is under the filter. You can also run a Yandex search restricted to your own site; if the page being promoted is not first there for its query, it has most likely been hit by this filter.
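A made-up illustration: suppose the page example.com/windows/ is being promoted for the query "buy plastic windows". If it cannot be found for that exact query but does appear for a modified word form such as "buying plastic windows", that is a typical sign of the filter. The same goes for a search restricted to the site itself using the site: operator ("buy plastic windows site:example.com"): if the promoted page is not first even there, this filter is the most likely explanation.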
Getting out of this filter is easy: rewrite the text, removing the keyword spam, reworking it for people and generally reducing the number of keywords. It may also be necessary to change the promoted page or its address.
Some also single out a separate filter imposed for "overdoing" the anchor list; it is treated by diluting the anchors. Most likely, both internal factors (search spam on the page) and external ones (an over-optimized anchor list) are taken into account when this filter is applied.
The nepot filter consists in the weight of links from the site (or, probably, from certain of its pages) being reset to zero. The reason is a large number of outbound links, sold or exchanged. The adult filter is applied, as the name implies, to sites with erotic or pornographic content. And the erotica does not have to be in the site's own content: it can be in the advertising, especially in teaser blocks. Because of such teasers even the most decent site can fall under the filter. The conclusion: take the advertising placed on the site seriously. That said, there have been cases of this filter being imposed for articles about sex written from a medical or psychological angle...
There are many myths and misconceptions associated with Yandex filters.
For example, it is believed that if someone actively builds up a site's link mass, the site will be banned. That would seem a very convenient way to take down competitors. Fortunately, it is not so: such sites are not banned, the links are simply filtered. There is also a filter for redirects done with JavaScript, unless, of course, the site was built specifically to redirect traffic.
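For clarity, this is the kind of client-side redirect in question; a minimal sketch with a made-up destination. A robot that does not execute the script indexes one page, while the visitor is instantly sent to another, and it is precisely this discrepancy that the filter targets:

<!-- Doorway-style JavaScript redirect: the indexed page and the page
     the visitor actually lands on are different -->
<script>
  window.location.href = "http://partner.example.com/offer";
</script>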
Many people also think that non-unique texts are not indexed, or that such sites are immediately banned. It is a pity they do not open the search results and look at what is actually there: most of the sites have no unique content at all.