Like any search engine, Google has a system of filters for its search results. A large percentage of sites on the Internet are created not for people but to redirect traffic to other sites, to influence search results, and so on. To keep the SERP clean, this garbage has to be carefully filtered out, and the filters must not hurt normal sites. But there are no clear criteria for "normality", so filtering is a difficult problem, and it is still not completely solved. Perfectly "white and fluffy" sites often fall under filters, and determining the reason can be very difficult.
Analyzing information about filters (and you can only analyze the opinions of webmasters, since Google reveals almost nothing about its search technology) makes one thing clear: although many filters have names, they all boil down to the same advice - promote your site honestly and everything will be fine. There is no need to determine with high accuracy which particular filter a site has fallen under: the ways out of them are roughly the same, and knowing the name changes nothing.
1. Sandbox ("sandpit"). The filter most young sites face. It manifests itself in the fact that for the first months of its life a site cannot, in principle, reach the top of the results, no matter how it is promoted. The sandbox is not a sanction; it is a kind of probation period for young sites. The reason for its introduction was most likely the desire to weed out short-lived sites and doorways and keep them out of leading positions in the results. Before the sandbox was introduced, it was enough for some doorways to hold the first places in the search for just a few days, or even hours, to bring substantial profit to the owner. And what if there are a lot of such sites?
While the site is in the sandbox, you need to keep working on it, so that as soon as the filter is removed it immediately takes a good position. Many people simply sit and wait for the domain to mature, but this makes no sense.
It is believed that the sandbox does not exist as such, and that its effect is a consequence of the combined action of other factors, such as the direct influence of domain age or trust. Whether or not that is true does not matter, since the effect is obvious and it can be treated as a separate filter. Sometimes a separate filter for young domains is singled out as well.
The sandbox applies only to English-language sites. You can avoid the sandbox, or shorten your stay in it, in several ways, although none of them gives any guarantee:
- Regularly build up quality (ideally natural) link weight;
- Buy an old domain from the start, one that already has age and trust;
- Regularly update the site with quality unique content and publish it via RSS; do not copy your texts to other sites;
- Close from indexing all auxiliary pages of the site, leaving only what is necessary and useful to visitors (see the sketch after this list);
- Do not over-optimize the text.
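As an illustration of the point about closing auxiliary pages from indexing, here is a minimal Python sketch, assuming a hypothetical site and page list, that uses the standard urllib.robotparser module to check which pages robots.txt leaves open to Googlebot:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical site and pages; replace with your own.
SITE = "https://example.com"
PAGES = ["/", "/articles/seo-basics", "/search?q=test", "/admin/login"]

parser = RobotFileParser()
parser.set_url(SITE + "/robots.txt")
parser.read()  # downloads and parses robots.txt

for path in PAGES:
    allowed = parser.can_fetch("Googlebot", SITE + path)
    status = "indexable" if allowed else "closed from indexing"
    print(f"{path}: {status}")
```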
2. Filter "-30". A filter against those who promote their site too actively and aggressively. It is not intended to throw the site out of the search entirely; rather, it was created to hint to site owners that spam promotion techniques are unacceptable. The reasons for its imposition may be:
- Rapid growth of link weight, especially if the links come from "junk" resources: unmoderated directories, guest books, blog comments, forums and so on. So do not make active use of systems that automatically register a site in such services.
- The presence on the site of "link dumps": large, poorly moderated link directories, or blocks of advertising links on pages of articles published specifically for advertising.
- Promotion with the use of cloaking. Cloaking is serving different content to users and to search engine robots: visitors are shown readable, informative material, while robots get pages optimized to improve search positions.
- The use of JavaScript redirects. You should only use server-side redirects (301, 302); a server-side example is sketched after this section. Certainly not every JavaScript redirect leads to this filter, as many believe.
You can get rid of the filter by abandoning the promotion methods listed above and not using them in the future. Delete the link garbage, clean up the link weight, and the filter will be removed from the site.
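To illustrate the point about redirects, here is a minimal sketch of a server-side 301 redirect using the Flask framework; the routes are hypothetical, and the same result can be achieved with a redirect rule in the web server configuration:

```python
from flask import Flask, redirect

app = Flask(__name__)

# Hypothetical old and new addresses: the old URL answers with an
# HTTP 301, so users and search robots both see the same permanent move.
@app.route("/old-page")
def old_page():
    return redirect("/new-page", code=301)

@app.route("/new-page")
def new_page():
    return "The content now lives here."

if __name__ == "__main__":
    app.run()
```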
3. Google Bombing. Positions stall for a particular keyword. The reason is rapid growth of links with identical anchor text. Whether such a filter is justified at all is quite questionable, especially with regard to non-commercial queries. You can escape it, for example, by diluting the link mass with varied anchors and surrounding text, and in general by strengthening the site with trusted links.
The reason for introducing the filter was actual "Google bombing": mass buying of links (often by many people) with one and the same text pointing to a specific site in order to promote it for some absurd query. Classic examples are first place for the McDonald's site for "the nearest public toilet", or for the site of Dmitry Medvedev for "angry dwarf". This was done as a joke, to harm a competitor, or simply as viral marketing. Google bombing is a consequence of the link ranking algorithm, which, in the presence of a strong link mass, practically ignores the text content of the promoted page. However, the introduction of the filter has not got rid of this phenomenon completely.
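One practical way to spot the risk described above is to look at how skewed the anchor texts of your backlinks are. Here is a minimal Python sketch, assuming a hypothetical backlink list such as one exported from a link-tracking tool, that counts the share of each anchor:

```python
from collections import Counter

# Hypothetical (source URL, anchor text) pairs from a backlink report.
backlinks = [
    ("https://blog1.example/post", "buy cheap widgets"),
    ("https://forum.example/thread", "buy cheap widgets"),
    ("https://news.example/item", "example.com"),
    ("https://blog2.example/review", "buy cheap widgets"),
]

counts = Counter(anchor for _, anchor in backlinks)
total = sum(counts.values())

for anchor, n in counts.most_common():
    share = n / total * 100
    flag = "  <-- suspiciously dominant" if share > 50 else ""
    print(f"{share:5.1f}%  {anchor}{flag}")
```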
4. Google Florida. The classic variant of site pessimization: a sharp drop in positions for all promoted queries and, consequently, a drop in traffic by several times. It is imposed for excessive page optimization: saturating the text with keywords, stuffing them into the title tag, h1-h6 headings and meta tags; by and large, for actions by the site owner aimed at promotion rather than at making the site more convenient for users. The filter does not work all the time but periodically, and is applied mainly to sites promoted for commercial queries. This has resulted, in particular, in the first places for many queries, even the most expensive in terms of promotion, being occupied by informational or educational sites rather than by resources investing big money in promotion.
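A rough way to check for the over-optimization described above is to measure keyword density on a page. A minimal sketch, assuming a hypothetical URL and keyword, that fetches a page, strips the tags and reports how often the keyword appears:

```python
import re
import urllib.request

# Hypothetical page and promoted keyword; replace with your own.
URL = "https://example.com/widgets"
KEYWORD = "widgets"

html = urllib.request.urlopen(URL, timeout=10).read().decode("utf-8", "ignore")
text = re.sub(r"<[^>]+>", " ", html)          # crude tag stripping, enough for a sketch
words = re.findall(r"[\w-]+", text.lower())

hits = words.count(KEYWORD.lower())
density = hits / len(words) * 100 if words else 0
print(f"{hits} occurrences out of {len(words)} words ({density:.1f}% density)")
```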
5. Supplemental Results. The supplemental index ("snot"), shown only after clicking the link "repeat the search with the omitted results included". The main reason for ending up there is non-unique content in every sense of the word, including duplication of information within a single site. Pages identical in structure and, especially, in content fall into the supplemental results. On many foreign sites the percentage of pages in Google's main index is used precisely as a quality estimate: if it is below 30-40, the site is considered non-unique and of low quality. A poorly configured CMS that generates many duplicate pages, and pages carrying no useful information for users, can also lead to this filter. Such pages are, of course, either filtered out entirely or fall into the supplemental index.
To keep pages out of the supplemental results, you should first of all use unique, quality content. Second, you need to avoid duplicating texts within your own site as much as possible and close all duplicates from indexing. This filter depends on the query: for one query a page may be in the main results, for another in the supplemental results.
A page can be pulled out of this filter in the following ways:
- Buy trusted links to it;
- Make its title and meta tags unique (see the sketch after this list);
- Increase the amount of unique content on it;
- Reduce the number of broken links on the site and, as a result, of pages returning a 404 error;
- Configure the site engine correctly.
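As an illustration of the point about unique titles, here is a minimal Python sketch, assuming a hypothetical list of pages from one site, that fetches them and reports which share the same title tag:

```python
import re
import urllib.request
from collections import defaultdict

# Hypothetical pages of one site; replace with your own list or a sitemap parse.
URLS = [
    "https://example.com/",
    "https://example.com/about",
    "https://example.com/articles/1",
]

titles = defaultdict(list)
for url in URLS:
    html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "ignore")
    match = re.search(r"<title[^>]*>(.*?)</title>", html, re.I | re.S)
    title = match.group(1).strip() if match else "(no title)"
    titles[title].append(url)

for title, pages in titles.items():
    if len(pages) > 1:
        print(f"Duplicate title '{title}' on {len(pages)} pages: {pages}")
```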
6. Filter on non-unique content. Perhaps it should not be singled out as a separate phenomenon, but it operates very massively. A page containing no unique content will reach top positions only if it sits on a trusted site with an outstanding link weight, and even then only after a long time. From the set of pages containing the same text, Google selects only a few to show in the main results.
7. Filter for broken links. A site's positions may drop if it contains many broken links and 404 pages. You get rid of the filter by getting rid of these errors.
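Here is a minimal Python sketch, assuming a hypothetical start page, that collects the links found on that one page and reports those that do not answer with HTTP 200:

```python
import re
import urllib.error
import urllib.parse
import urllib.request

# Hypothetical start page; the sketch only checks links found on this one page.
START = "https://example.com/"

html = urllib.request.urlopen(START, timeout=10).read().decode("utf-8", "ignore")
links = {urllib.parse.urljoin(START, href)
         for href in re.findall(r'href="([^"#]+)"', html)
         if not href.startswith(("mailto:", "javascript:"))}

for link in sorted(links):
    try:
        code = urllib.request.urlopen(link, timeout=10).getcode()
    except urllib.error.HTTPError as err:
        code = err.code          # e.g. 404
    except urllib.error.URLError:
        code = None              # connection failed
    if code != 200:
        print(f"{code}  {link}")
```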
8. Filter for slow-loading sites. If Google's robots do not get a quick response from the server, the site's positions decrease, up to complete removal from the index. A few years ago this filter could be dealt with effectively by reducing the size of pages and of the images on them. Now that the Internet has become fast and even a page of a few megabytes loads quickly, the reasons for such delays may lie elsewhere, for example in scripts or a misconfigured server. Therefore you have to dig deeper, including finding out from the host whether access to the site is restricted for search engine robots.
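A minimal sketch, with hypothetical URLs and a hypothetical two-second threshold, that measures how long the server takes to answer each request - usually the first thing to check before digging into scripts or server configuration:

```python
import time
import urllib.request

# Hypothetical pages to time; replace with your own.
URLS = ["https://example.com/", "https://example.com/articles/1"]

for url in URLS:
    start = time.monotonic()
    urllib.request.urlopen(url, timeout=30).read()
    elapsed = time.monotonic() - start
    flag = "  <-- slow" if elapsed > 2.0 else ""
    print(f"{elapsed:5.2f}s  {url}{flag}")
```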
9. Co-citation Linking Filter. The essence of this filter is that you should only link to quality sites, which in turn link to normal resources. Naturally, it is impossible to trace the whole chain of links, so you have to proceed from the assumption that by putting a link from your site to another, you are recommending it to your visitors. And if you recommend bad sites, Google concludes that your own site is also of poor quality and does not care about its users.
The filter may be applied to sites that sell links or publish paid articles automatically, whose owners do not filter the incoming promotional material very thoroughly.
10. Link dump filter. Active participation in automatic link exchanges and the appearance on the site of a large number of pages containing nothing but links can lead to it. Link exchange should be manual, and only with a limited number of quality sites. The main thing on a site must be its content, with link directories playing only a supporting role.
11. Reciprocal links filter. Mass mutual exchange, from Google's point of view, is bad. Therefore more sophisticated exchange schemes have to be used, for example when the first site links to the second, the second to the third, and the third to the first. There are more intricate schemes for bypassing the filter.
12. Filter for rapid content growth. If a site is updated too often and too actively, Google may suspect it of using parsers or content generators and, accordingly, lower or remove its positions in the results. Therefore, when using such generators it is worth putting a limit on the amount of material added in a given period of time.
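A minimal sketch of such a limit, with a hypothetical daily quota and queue: new material goes into a queue, and only a fixed number of items per day is actually published:

```python
from collections import deque
from datetime import date

MAX_PER_DAY = 5          # hypothetical daily quota
queue = deque()          # articles waiting to be published
published_today = 0
today = date.today()

def enqueue(article):
    queue.append(article)

def publish_pending():
    """Publish at most MAX_PER_DAY articles per calendar day."""
    global published_today, today
    if date.today() != today:            # new day: reset the counter
        today, published_today = date.today(), 0
    while queue and published_today < MAX_PER_DAY:
        article = queue.popleft()
        print(f"publishing: {article}")  # stand-in for the real CMS call
        published_today += 1
```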
13. Google Bowling. A filter whose existence is questioned. Its essence is that it tracks sites engaged in unfair competition aimed at driving competitors under Google's filters, for example by mass-buying substandard links to them or sending spam on their behalf.
14. Trust Rank. The degree of Google's trust in a particular site. Strictly speaking, it is not a filter, but a site with a higher Trust Rank will find it easier to achieve high positions. It is affected by practically everything that makes a site interesting to users, including the site's age, which can speak to the seriousness of the webmaster.
One thing must be remembered: a quality site made for people, regularly updated with quality and interesting information, and attracting natural links, is not threatened by most of these filters.