Since my last article, a lot has changed. Old friends (or rather, sworn enemies) like Panda and Penguin have evolved, and new predators have appeared in the Google zoo. Have the search results improved as a result? Definitely not. Problems have multiplied both for webmasters and for the ordinary users the "corporation for good" supposedly cares about. As always, some third party benefits. But that is not the point of this article. Let's analyze which filters currently exist, how to deal with them, and how to live with them. My last article was full of bitterness, venom, and criticism; this one will be more constructive.
The essence of all Google filters (and, in fact, of all search engine filters) comes down to one conclusion that disappoints many webmasters: the criteria keep tightening, both for indexing sites at all and for reaching the top of the results in particular. Perfectionist attempts to improve a site and bring it closer to Google's ideal often lead to the opposite effect, namely filters and sanctions.
The goal looks good at first glance. But only at first glance, and only in theory. In practice, Google's quality criteria for sites are unwritten. Of course, there are numerous recommendations from Google, but experience shows they are far removed from practice. Worse, mindlessly following these "helpful" recommendations can itself lead to sanctions and filters.
So let's take a historical tour of the sanctions and filters that Google imposes under the guise of caring for ordinary search users.
Probably the most common and controversial filter is the sandbox. Its essence is to sift out young sites: they are simply not allowed into the top until they reach certain thresholds (age, number of pages, traffic, etc.). On the one hand, it seems logical to show old, proven sites first. On the other hand, where are young sites supposed to get funds for development if Google does not let them earn quality search traffic? In general, the filter is unpleasant but inevitable. There is nothing terrible about it; it is simply a natural stage in the life of many websites.
The sandbox period is time in which you can safely develop the site: add good unique content, acquire quality links one way or another, do internal optimization, work on behavioral factors. All these actions will pay off once the site leaves the sandbox. The period lasts from months to a year or more.
In the meantime you can get traffic from other search engines: Yandex, Mail.ru, Bing.com, and so on. They have no sandbox or similar filter for young sites; there it is quite possible to reach the first positions for low-frequency queries in the first month of a site's life.
After the sandbox came a series of nameless filters: for over-optimization, for bad links, for too-rapid growth of external links, for slow page load, for duplicated content within a site, and so on. All of this was overshadowed by the new animals, Panda and Penguin. There are also sanctions imposed by hand, but more on those below.
Panda was the first serious tightening of the selection of sites for the top. Formally it is a new ranking algorithm, as Google itself claims. But we know better: algorithm or not, in the end it turned out to be a serious filter against resources that Google's robots consider substandard.
The first thing to pay attention to is content. Obvious, isn't it? Google analyzes content in terms of trust in specific articles and in the site as a whole. What matters is unique content (not the formal percentages produced by plagiarism checkers, but real uniqueness), literacy, quality of information, its accuracy, and so on. Comprehensive analysis lets Google pick out materials that are genuinely useful to people, and accordingly the sites that consist mostly of such materials.
So-called "SEO articles" are eliminated immediately; in truth, hardly anyone still writes texts stuffed with keywords to the limit. With copy-paste the situation is more complicated. Google has stated that non-unique text such as laws or standards (that is, text that by definition cannot be made unique) will not incur sanctions. It is understood, of course, that besides this the site will also contain unique content that comments on and complements what is copied. It is also necessary to avoid duplicating materials within the site and to close from indexing all the technical content that various engines can generate. Special pages (tags, site search, user profiles, etc.) should be closed as well.
The first sign of Panda may be a drop in traffic associated with most of the site's pages falling out of the index. And it is not necessarily the substandard materials that drop out; the selection criterion remains a mystery. New pages stop being indexed as well; all of this is perfectly visible in the Google Analytics charts. A site may disappear from the results even for the lowest-frequency queries, including searches for exact phrases from its own articles. If this happens to your site, congratulations: you have Panda.
But it is treatable, and from the above it is clear how to get rid of the filter. First, tidy up the content. There are unpleasant situations when your unique, hand-written content gets copied; in that case it is better to rewrite it so that it becomes unique again. It is also desirable to add content to the site regularly. Second, close service pages from indexing. Boldly close everything users do not need in the SERP; ideally, they should land only on relevant content. Ready-made robots.txt files exist for all popular CMSes; you can use them as a basis for your own.
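Before deploying a robots.txt, it is worth checking that it actually blocks what you intend. A minimal sketch using Python's standard robots.txt parser; the paths and domain here are hypothetical examples, not rules for any particular CMS:

```python
# Sketch: verify that service pages are blocked and content pages are
# allowed by a robots.txt. All paths and the domain are made up.
from urllib.robotparser import RobotFileParser

rules = """
User-agent: *
Disallow: /search/
Disallow: /tag/
Disallow: /user/
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Service pages (site search, tags, profiles) should be closed to crawlers...
for path in ("/search/?q=tv", "/tag/panda", "/user/123"):
    assert not parser.can_fetch("Googlebot", "https://example.com" + path)

# ...while regular content stays open.
assert parser.can_fetch("Googlebot", "https://example.com/articles/panda-filter")
print("robots.txt rules behave as intended")
```

Running the same kind of check after every robots.txt change is cheap insurance against accidentally closing your articles from the index.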
The site should be interactive. Install social network buttons (do not forget Google+), keep comments and discussions going, encourage visitors to stay on the site longer and be active. Behavioral factors (there is a separate article about them on our site) are gaining more and more weight in ranking, and not only in Google, so do not ignore them.
Advertising must be just advertising, not the main content. It should be useful to visitors (yes, that happens!), occupy a minimum of space, and be on-topic. Google AdSense with properly configured targeting is close to ideal, isn't it? A few carefully chosen sold links will not hurt either. But popups, teasers, fake news in the style of "Pugacheva lost her leg," and other junk advertising go straight into the furnace; they are a recipe for filters, and not only in Google.
Panda is not a one-time update of the search algorithm. It regularly "attacks" the results, filtering more and more sites. In total, according to Matt Cutts, there have been 25 versions of Panda. The last one was published in March 2013, and since then updates have become continuous. With that we can conclude the discussion of this unpleasant filter and move on to a no less destructive algorithm: Penguin.
Penguin appeared in the spring of 2012. Unlike Panda, its target is external ranking factors, namely links: the notorious SEO links that for many years were almost the only way to advance, provided you had at least some budget. In commercial niches, budgets determined the top. All attempts to deal with this had been useless, but then Penguin came and settled everything. As always with Google: decisively, on a large scale, and crookedly.
Penguin hit, among others, sites that were perfectly optimized by Panda's standards: good content, internal linking, behavioral factors, and so on. The reason was simple: external links that Google suddenly decided were bad.
Keywords in link anchors... what could be more logical, it would seem? But no: it turned out that sites where most inbound links contain keywords fell under the filter. They were caught for manipulating the results; natural links, the reasoning goes, are either anchorless or carry neutral text, but certainly not key phrases.
The takeaway is simple: to avoid Penguin, you must make your anchor list as natural as possible. Key phrases of the form "TV buy cheap moscow" belong to the distant past. The anchor list should be diluted, for example with anchorless links (containing a bare URL or domain) and with key phrases without exact-match occurrences. Keywords must be readable and follow the rules of the language: not "plastic windows Uryupinsk installation" but "Installation of plastic windows in Uryupinsk." A percentage of exact matches can perhaps remain, but the more of them there are, the higher the chance of falling under the filter.
The second, quite obvious criterion is a match between the topics of the donor and the acceptor. Of course, it is a difficult question how to determine topical proximity: whether to judge the topic of the site as a whole or of a particular page, and how narrowly to slice topics. For example, is an acceptor page "Sale of wooden doors" thematic for a donor page "Sale of metal doors"? There is no need to dig that deep. In my opinion, a match of the general topic is enough.
Paradoxically, you can and should also buy links with the nofollow attribute. Yes, Google supposedly does not pass weight through them, but it does take them into account in ranking and finds them perfectly natural.
Rapid growth of inbound links is also harmful. It used to be the opposite: you bought a batch of links and expected growth, and growth came. Now you buy a bunch of links and fall under the filter. For being unnatural! Links need to be bought gradually, with an eye on quality and topic. In any case, they are not cheap, so you could not buy many at once anyway.
Most annoying of all is that, unlike Panda, it is almost impossible to climb out from under Penguin. Most often you have to not only eliminate all the shortcomings of the site's link mass, but also move it to a new domain, putting a 301 redirect from the old one.
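A domain move of this kind is usually implemented with a permanent redirect at the web-server level. A minimal sketch for nginx, assuming hypothetical old-site.example and new-site.example domains:

```
# Redirect every request on the old domain to the same path on the new
# one with a 301 (permanent) status, preserving the request URI.
server {
    listen 80;
    server_name old-site.example www.old-site.example;
    return 301 https://new-site.example$request_uri;
}
```

The same effect can be achieved with a `Redirect permanent` directive in Apache; what matters is that the status code is 301, not 302, so search engines treat the move as final.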
Outbound links are also analyzed by Penguin. The criteria are the same: if a link is not written for people, if it does not fit naturally into the context of the page, if it does not match the topic, then there is a good chance of getting the filter.
Google has a handy (in its own opinion) tool: link disavowal (Disavow Links). It lets you specify which links to your site should not be taken into account in ranking. This is logical, because anyone can place such links and, in theory, anyone could push your site under the filter by buying thousands of poor-quality links to it. You should therefore regularly analyze your link profile, export the list of links, and send everything low-quality (or doubtful) to the service.
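The disavow file itself is plain text with one entry per line: a full URL to disavow a single link, or a `domain:` directive to disavow everything from a host. A small example with hypothetical spam domains:

```
# Disavow file, uploaded via Google's Disavow Links tool.
# Lines starting with "#" are comments.

# Disavow one specific low-quality page that links to us:
http://spam-directory.example/catalog/page123.html

# Disavow every link from entire domains:
domain:cheap-links.example
domain:auto-blog-network.example
```

Prefer `domain:` entries for obviously junk sources; disavowing page by page quickly becomes unmanageable.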
In the spring of 2013 the second wave of Penguin rolled out. Its main consequence was much more serious filtering of links bought on link exchanges. Many sites whose positions were held up by those exchanges plummeted in the results. It hit webmasters (both buyers and sellers of links), the exchanges themselves, and also automatic promotion systems and various aggregators.
The latest version of Penguin appeared in October 2013. It analyzes all pages without exception and filters them much more harshly than previous versions. For example, sanctions may now be imposed for aggressively placing articles on third-party guest blogs. Most such articles are written perfunctorily and are not of particularly high quality; their main goal is simply to get a link. Now this has to be approached responsibly: you really need to write interesting articles that people will read.
Google also takes a negative view of links in comments and forum posts if they are placed specifically for promotion rather than as a recommendation to other readers. How it will distinguish between the two is a big question, but the fact remains: links should be natural. Mass spamming of dofollow blogs not only fails to bring positive results, it leads to a harsh filter. Many webmasters who had been following the guidelines have noticed their sites' positions grow, which is logical: they grew at the expense of those who did not heed the recommendations.
The EMD filter (exact match domains) is directed against sites whose domains consist solely of keywords. Such sites are often MFA (made-for-AdSense), but not exclusively. In the Runet there is a widespread myth that keywords in the domain significantly affect its position in search. Many rushed to register domains like kupit-dom-v-moskve.ru and fell under the filter. Compared to the rest, this filter can perhaps be called quite reasonable.
The same applies to human-readable URLs. "Beautiful URLs" are a good thing in themselves. But when a page's URL tries to carry the entire title and stretches to hundreds of characters, that is overkill: inconvenient for the person, and a crude attempt to influence the search algorithms. Clearly, such pages can be filtered.
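A sane middle ground is to generate a short slug from the title instead of dumping the whole headline into the URL. A sketch; the 60-character cap is an illustrative assumption, not a documented Google limit:

```python
# Sketch: build a short, human-readable URL slug from a page title,
# cutting overlong slugs at a word boundary.
import re

def slugify(title: str, max_len: int = 60) -> str:
    slug = title.lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug).strip("-")  # words, dash-separated
    if len(slug) > max_len:
        slug = slug[:max_len].rsplit("-", 1)[0]         # drop the cut-off word
    return slug

title = ("Installation of plastic windows in Uryupinsk: prices, terms, "
         "warranty, reviews, photos and everything else you could think of")
print(slugify(title))
```

The result keeps the meaningful head of the title and stays readable, instead of a 120-character keyword trail.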
The site, of course, is evaluated as a whole. If it is good for people (and behavioral factors show this), if the keywords in the domain are quite logical (carsales, hostelbookers, and the like), and if AdSense earnings are not the sole purpose of its existence, then no sanctions will be imposed on it.
The last significant change in Google's algorithm is Hummingbird, and this one looks more like an improvement of the search algorithm than harsh filtering. The emphasis is on bringing search closer to normal human language. Hummingbird does not just analyze the words in a query and output relevant pages; it determines the context of the request.
Hummingbird is a little bird, but the algorithm covers the vast majority of queries, around 90%. In particular this applies to verbose queries with qualifiers and details. Here both morphology and semantics play a role. For non-commercial queries, informational sites will mostly be shown; for commercial ones, the sites most relevant to the meaning of the phrase the user entered. If a person is looking for a product, they will be shown products; if they are looking for a service, especially with a geographic reference, they will be shown services near their location.
People used to phrase queries in a simple form, such as "buy a TV" or "thailand tourism"; now more and more queries are wordy. People are trying to communicate with the search engine in their own language, without trying to adjust to the robot. It is for these people that the new algorithm was made.
Of course, this cannot help but affect many sites. It is an obvious blow to resources containing many articles optimized for long-winded low-frequency queries. Naturalness of the text is the most important thing. Yes, keywords can and should be used, but only if they fit into a clear context. The same applies to the title tag and h1-h6 headings so beloved by many newcomers. Be natural!
The downside of such personalization of the results, paradoxically, may be a drop in traffic even for the top leaders. Snippets will carry a brief description and a phone number, so many people will call immediately without visiting the site. This is a plus for commercial sites selling a specific product, and a plus for companies that have streamlined their work with clients: a professional has to sit on that phone and immediately suggest the right options. Remember, these callers have not seen the site and do not know the details.
Here LSI copywriting comes to the fore. Its essence is that the text of an article includes not only the main key phrase but also the terms that accompany it. For example, an article about buying air conditioners should, in addition to the primary key "buy an air conditioner," use words like "temperature," "climate," "home appliances," and so on. Keywords must be written in naturally, but thanks to a good selection of context words the article becomes more interesting, and visitors linger on it, which is, among other things, a positive behavioral factor. There should be less filler; even if you want to stretch the text, stretch it with words close in topic to the main keyword. Use synonyms actively!
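This idea can be turned into a crude self-check: does a draft actually cover the vocabulary around the main keyword? A sketch in the spirit of LSI copywriting; the related-term list and the 50% threshold are illustrative assumptions, not a real LSI model:

```python
# Sketch: measure how many topic-related terms a draft article covers.
MAIN_KEY = "buy an air conditioner"
RELATED = {"temperature", "climate", "home appliances", "cooling", "installation"}

draft_text = """
If you want to buy an air conditioner, start from the climate of your
region and the temperature you find comfortable. Like other home
appliances, a conditioner also needs professional installation.
"""
# Normalize case and whitespace so multi-word terms match across line breaks.
draft = " ".join(draft_text.lower().split())

found = {term for term in RELATED if term in draft}
coverage = len(found) / len(RELATED)
print(f"related terms found: {sorted(found)}")
print(f"coverage: {coverage:.0%}")
if coverage < 0.5:  # illustrative threshold
    print("draft is thin on supporting vocabulary; add on-topic words")
```

Real LSI tooling uses co-occurrence statistics over large corpora, but even a checklist like this keeps a writer honest about topical vocabulary.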
This hits hardest the hack writers who churned out articles on exchanges for 10 cents per 1000 characters. For that money the customer does not care about quality, as long as the text is literate and on topic. But by saving on content you can lose a lot of money through falling site positions. SEO copywriting is gone, and this is good both for webmasters and for good writers, who were, in fact, writing quality texts all along.
It is still too early to judge the effectiveness of this algorithm, since in the Russian segment it either does not act at all or acts only partially. But its time will come, so I will repeat again and again: write texts for people.
In addition to automatic filters, Google can also impose sanctions by hand. With billions of sites it would seem impossible to keep track of them all, but the fact remains: there is such a section in Webmaster Tools. It all starts with a message along the lines of "We have detected that some of the links to your site violate Google's quality guidelines." The webmaster panel even indicates a specific reason, for example low-quality inbound links, and Google may even provide concrete examples of such links. Manual sanctions may be imposed on the whole site or on individual pages.
The main advantage of manual sanctions over automatic "algorithmic" ones is the possibility of dialogue with a human: the opportunity to get a clear indication of the problems and, as a consequence, a realistic chance to correct them.
After removing all the offending links (discussed above), you must file a reconsideration request. There may be several iterations; if you are lucky, the filter is eventually lifted and the traffic comes back. This can take several months.
Sanctions may also be imposed for link spam on your own site. This can happen because of poorly moderated comments or forums, or because of a badly tuned antispam system. Alternatively, the site may have been hacked and stuffed with a doorway full of spam links. Of course this is a direct cause for a filter, but it is also easy to remove.
Another reason may be content with "little or no added value." Ideally such pages should be removed entirely, or at least closed from indexing if they are needed on the site.
Recently a new variant of sanctions has appeared: image mismatch. It is imposed for non-thematic pictures in an article, ones that would be irrelevant in a search for keywords related to the page text.
There is also more and more talk about filtering porn content. It is hard to say whether this is good or bad, but there is a scenario in which completely innocuous sites fall under this filter (as happens in Yandex), for example because of words on this topic appearing in the comments, or because of porn poured in through a security hole.
As for the future, search will become more and more human-oriented, personalized, and region-dependent. Google will keep moving away from traditional keyword search and relevance judgments; more attention will be paid to context and user behavior. The Knowledge Graph will be used extensively to improve semantic search.
The conclusions are simple: traditional SEO, which ruled the Internet for the last fifteen years, can be gradually forgotten. On the one hand, the technology has become more complicated: you need to take into account semantics and context, analyze the site's structure and the behavior of almost every visitor... On the other hand, it has become much simpler: make websites that are interesting to people. In that case, with high probability, no filters will frighten you. Keep up with the latest trends in promotion, analyze different points of view, try to predict Google's next move. That is exactly how the leading sites act, and that is why they are the leaders.