New filters
Search engine filters are an inevitable thing. Only thanks to them (ideally, and in theory, of course...) do people using search receive relevant results: they are shown the sites that best match the queries they enter. Alas, that is only the theory; in practice most filters are far from perfect. Their influence has led to a situation where the first positions are taken not by the most relevant pages but by those that managed, one way or another, to get around the filters. Not that filters are unnecessary, of course. The results have to be filtered, if only to sift out junk sites and substandard resources and those that game the search engines through outright deception. Yet despite the constant development of search technology, the effectiveness of the filters is not improving. Their evolution, and the appearance of new ones, only breeds more and more discontent among webmasters, who no longer understand the rules of the search engines: rules that are not written down anywhere explicitly and that become less and less clear. The misunderstanding is aggravated by the selective action of the filters. Sites with numerous violations of the search guidelines often sit in the tops, while completely "clean" resources fall under sanctions. Nor are filters applied immediately: some time may pass before the robot finds violations on a site and punishes it, and the same goes for the lifting of sanctions. Search engine support services complicate the situation further, confusing webmasters with vague answers.
Panda
Panda " came " in Google in February 2011 . Its introduction, more precisely, the results shocked many webmasters - and no wonder , because traffic has fallen several times, and on a completely "white" sites for people that have a long and successful working and is in first position .
As always, Google published no specific information about the Panda filter, its features, the reasons it is applied, or the ways of getting out from under it. After a while, Google representatives all but mocked webmasters by publishing a list of factors that lead to the imposition of the filter. And what do we see there? The same old things:
- non-unique content, or rather a large amount of it;
- a large amount of advertising;
- a high bounce rate;
- little time spent on the site by visitors;
- bad outgoing links;
- bad incoming links.
A universal answer to the question "what did I get sanctioned for", regardless of the type of sanction, the site, or the search engine. The practical benefit of this answer is exactly zero, since normal sites avoid all of these things anyway, and it does not save them from sanctions.
In any case, one conclusion can be drawn:
Google has tightened its quality criteria for sites in the top.
The criteria themselves seem unchanged, but control over compliance has increased. All these principles look obvious enough, yet following every one of them on a single site is almost impossible... "Your problem," says Google. Who is right? Google, of course, because Google controls the traffic and, as its main consequence, the profit that websites bring their creators.
Still, something has to be done. The first thing is to pay attention to the uniqueness of the content, not just across the Internet in general but even within a single site. Here are the important points; some of them are illogical, but what can you do... 1. Get rid of all duplicate content. It can take the form of short news announcements duplicating full articles, tag pages, date and section archives, and so on: in short, anywhere the same text can appear twice. All of this should be closed from indexing in robots.txt (see the sketch after point 2).
It is worth remembering, by the way, that Google (like many other search engines) is unable to determine the original source of content and does not particularly strive to. If someone copies your text, it is by no means certain that his site will be the one to fall under sanctions. Alas, you will have to rewrite your own texts so that they become unique again. Yes, by that logic you may end up helping whoever copied your old texts, since the copies then become unique too. An awkward point that many do not think about. And some do think about it, and copy deliberately.
2. You should also close from indexing all pages that contain no text content or that are not meant for visitors: service pages, site search results, user profiles, lists of search queries, and so on. The same applies to pages with non-unique content copied from other sites and to generated, unreadable text.
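To make points 1 and 2 concrete, here is a minimal sketch of robots.txt rules that close typical duplicate and service sections from indexing, together with a quick check using Python's standard urllib.robotparser. The section paths and the site.ru address are examples only; the real list depends on your site's engine.

    from urllib import robotparser

    # Hypothetical robots.txt rules closing duplicate and service sections
    # from indexing; the paths are illustrative and depend on the site engine.
    ROBOTS_RULES = [
        "User-agent: *",
        "Disallow: /tag/",
        "Disallow: /archive/",
        "Disallow: /search/",
        "Disallow: /user/",
        "Disallow: /print/",
    ]

    rp = robotparser.RobotFileParser()
    rp.parse(ROBOTS_RULES)

    # Verify which URLs a well-behaved robot would be allowed to fetch.
    for url in ("http://site.ru/tag/panda/",
                "http://site.ru/search/?q=filters",
                "http://site.ru/articles/unique-text/"):
        status = "indexable" if rp.can_fetch("*", url) else "closed"
        print(status, url)

This is only a sanity check of the rules themselves; which sections actually need to be closed has to be decided for each site individually.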
3. Even a few "bad" pages can adversely affect the fate of the whole site and the positions of its other pages in the SERP. Watch the site's content closely. This is particularly true of sites whose content is created by users: forums, actively commented blogs, and so on. Careful moderation is the key to success. If you cannot manage it yourself, hire moderators or invite active members of the community to do it. Spam, discussions "on the edge" (violations of the law, pornography, drugs, etc.), and illiterate or frankly pointless messages and comments should be deleted immediately. There have been cases where a site's positions dropped sharply because of a single such "violation".
4. The "social" activity of site visitors is also taken into account, meaning the sharing of links to the site's materials on social networks. To encourage it, place sharing buttons on your pages (VKontakte, Facebook, Google+, Twitter, and so on).
5. A page should contain a lot of content, not just a couple of paragraphs. Yes, not every site can provide a large amount of content, and not every type of resource would benefit its visitors by doing so, but the requirement is a fact. Content quality also plays a big role: it is claimed that pages with texts containing many errors (spelling, punctuation, etc.) will be ranked lower in the search.
Long texts should be well structured, which is logical and necessary even without Panda. Nobody will read long "sheets" of text. You need to hold the visitor's attention with text markup: headings, images (evenly distributed and placed in the right spots), various information inserts, and so on.
6. Advertising should complement a site's content, not replace it. It should benefit visitors rather than annoy them and drive them to leave the site as quickly as possible. The optimal advertising is thematic, trusted banners from direct advertisers and, of course, contextual advertising. That said, the relevance of the contextual ads AdSense actually shows often raises questions.
7. The same notorious behavioral factors apparently also play a role, and they are assessed using Google Analytics statistics. The obvious conclusion, do not install those statistics at all, suggests itself, but is it the right one? Perhaps sites without Analytics will be demoted, or their behavioral factors will be assessed indirectly and therefore less accurately. This data, incidentally, can also be collected through the Google Toolbar and Google Chrome itself; a sort of Trojan horse, it turns out...
8. A website should be interactive: its content should make visitors want to discuss it, both on the site itself and on third-party resources such as forums, social networks, and personal blogs. However, interactivity has a downside. I think everyone has seen blogs where the post itself takes a couple of paragraphs while 90% of the page is filled with comments, and those comments run as one continuous text with no threading, so if there are dozens of them it is unrealistic to follow any discussion. The comments become just a low-quality stream of content of no benefit to visitors. Well, yes, people can have their say, so what? There are plenty of places to have your say. Whether a large number of positive comments should count as a quality signal is a big question, especially since it is a factor that is trivially easy to falsify.
The most annoying thing is that the filter also hit sites that had worked for years without a single complaint from the search engines. Yes, over that time they had naturally accumulated thousands of pages with no real content, with dubious content, or simply with service materials. This could be a consequence of shortcomings in the engine or of the owners' neglect. None of it ever bothered visitors, if only because such pages simply do not reach the top of the search results; most people never see them, and those who do close them immediately. On many of these sites the design and usability have not changed in ten years, and they still fully meet users' requirements. These sites, again, satisfied their visitors' needs completely, had high-quality and regularly updated text content, and had long been a kind of standard among content projects.
So, one asks, why should they change everything? For whose sake? After all, they were built and run for their visitors, and the visitors are satisfied. Adapt to the search engines? Spare me. But this way precious traffic is lost, and the search engines lose the very people they were created for: their users. I think many have already noticed that information which used to be found on the first page of results, even in the top five positions, can now turn up on the third or fourth page. Inconvenient, and not logical.
Panda is not a new ranking algorithm. It is a filter that runs over the tops from time to time and cleans out the sites it finds undesirable, in its humble opinion. Its analogue is the AGS filter in Yandex, and the analogy only gets stronger when you try to make sense of the frequently illogical results of both.
We can say that Panda is a document classifier: it distributes documents into various categories and groups and thereby "signals" the search algorithms to raise or lower their positions, as well as the positions of other documents on the same site. And those categories can be anything, ranging from banal non-unique content to seemingly harmless errors in the page code that do not even affect how it is displayed.
And where will Panda and its subsequent "reincarnations" eventually lead? To the owners of large sites getting fed up with them and turning to "one-shot projects", whose essence is the rapid collection of traffic and of the money that comes with it. Such sites will quickly rise into the tops, live there for months (well, a couple of years at most) and then quietly sink into oblivion. And others will appear in their place.
It has also led to original sources of information being shown lower in the search than the sites that copied that information from them. Of course, in many situations this is justified, but for the most part good news sites, and news agencies especially, present their material in far better shape than those who merely parse RSS feeds...
If you read the criteria above carefully, you might conclude that the ideal site is an MFA (made-for-AdSense) site. Indeed, a typical modern website created solely to earn from contextual advertising complies with every principle of "the ideal site for people". But is it really ideal? Alas, no. Its goal is not to satisfy visitors' needs but to earn directly from ad clicks; the visitors' needs get satisfied on the sites they reach through that very advertising. A doorway, in essence, only a more domesticated one.
According to statements by Google managers, Panda uses so-called "signals" that indicate the quality of a document, and visitor reaction has recently been added to the list of those signals. They then point out that what matters to visitors is quality content.
Soon after Panda was introduced came the official announcement that the filter takes into account active user feedback collected by a plug-in for Google's Chrome browser. Virtually all of the sites (84%) that received negative feedback soon fell under Panda's "wheel". Despite this, Google maintains that users blocking a site does not in itself lower its positions; it is merely a signal for the filtering algorithm. But the numbers speak for themselves...
Another unpleasant thing is that, again according to Google representatives, Panda is a 100% automatic algorithm, and no request to review the site will help bring back its former positions. The algorithms decide everything. Fix the violations on your site, they say, and it will regain its lost ground on its own. This is reminiscent of the replies from Yandex support in the person of Platon Shchukin: "Our algorithms have decided...", sent even in cases where the robots had never even visited the site.
As you can guess, Panda hurt not only the filtered sites but also those to which they sent traffic. Down the chain, completely "harmless" resources lost visitors simply because the sites linking to them had fallen under Panda. The same goes for lost PageRank, although by now its role has been reduced to a symbolic one.
Many also noticed that when Panda was introduced, their traffic from other search engines, such as Bing or Yahoo!, fell as well. There may be several reasons. The most obvious is that those search engines use Google's results to some extent, or the same filtering techniques, directly or indirectly. A less obvious but more logical reason is that the owners of sites that fell under the filter naturally try to get out from under it by significantly reworking their sites; those changes may conflict with the requirements of Bing or Yahoo!, and so the sites' positions in those systems drop too. Either way, the fact remains: Panda deprived many sites of traffic.
There are many recommendations for evaluating the articles posted on a site, either from the standpoint of avoiding Panda or in order to get out from under it. There is no point in writing out long lists; they all boil down to one thing:
Read the article and think: is it worth anything? Would you recommend it to your friends as an introduction to the topic? Is it original? Does it read well?
In March 2012, webmasters were "pleasantly" surprised by the introduction of a new version of the filter, 3.3. According to Google representatives, more than forty changes were made to the algorithm. The main focus was on changing the system of link analysis and evaluation; however, no details followed, apart from general phrases about the topical relevance of pages. One can conclude that the fight against "unnatural", purchased links has been stepped up. That fight was declared long ago but has never really been waged actively, for many reasons. The main one is that purchased links are in fact what lets search engines produce relevant results. Another is the possibility of manipulating this factor to lower the positions of competing sites: after all, nothing stops anyone from buying thousands of links pointing at someone else's site, and the search engine will not detect it.
Penguin
"Zoo" Google added in April 2012, a new algorithm for filtering issue - Penguin. The main feature of the Penguin, unlike last algorithm, Panda is a large focus on the promotion of external factors, namely the links. He drew attention to the quality, quantity, rate of change of the reference mass, its naturalness, etc. Summarizing, we can say that Penguin aims to combat web spam in all its manifestations.
In general, web spam can be defined as any action taken to improve a page's position without improving the quality of the page or of the site as a whole.
The description of the algorithm mentions an increased weight for natural links. However, nothing is said about what counts as a natural link or how natural links are distinguished from purchased ones. As a consequence, a new concept has appeared among webmasters: "imitation of naturalness". It carries an internal logical contradiction, because how can you simulate what is natural? There are many recipes: - buying links with anchor text containing the address of the site;
- with anchor text containing the name of the site;
- with anchor text built around the words "here", "over here", "online", "can be read", and so on.
Such links are called "anchorless": their text contains no conventional search query. Whether they make sense or are just a waste of time and money is a big question.
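Since "naturalness" in practice comes down to the variety of anchors, one rough self-check is to classify the anchors in your own backlink report into the types listed above and look at the share taken by exact-match keywords. Below is a minimal Python sketch under obvious assumptions: the site address, site name, generic-word list, and the sample anchors are all made up for illustration.

    from collections import Counter

    SITE_URL = "site.ru"      # hypothetical site address
    SITE_NAME = "mysite"      # hypothetical site name
    GENERIC_WORDS = {"here", "over here", "online", "can be read", "source"}

    def anchor_type(anchor: str) -> str:
        """Roughly classify a backlink anchor into the categories described above."""
        a = anchor.strip().lower()
        if SITE_URL in a:
            return "url"      # anchor contains the site address
        if SITE_NAME in a:
            return "brand"    # anchor contains the site name
        if a in GENERIC_WORDS:
            return "generic"  # "here", "online" and the like
        return "keyword"      # everything else is treated as a search query

    # Made-up sample of anchors from a backlink report.
    anchors = ["buy plastic windows", "here", "http://site.ru", "mysite blog",
               "plastic windows cheap", "online", "site.ru", "buy windows moscow"]

    counts = Counter(anchor_type(a) for a in anchors)
    for kind, n in counts.most_common():
        print(f"{kind:8s} {n}/{len(anchors)} = {n / len(anchors):.0%}")

No one knows the threshold beyond which Penguin considers a profile unnatural, so numbers like these are only a way to notice an obvious skew toward keyword anchors.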
If we summarize the reasons for which Penguin punishes sites, we can highlight the following: - Penguin does not like an abundance of keywords in the anchors of purchased (or even, suddenly, natural) links. Dilute them, use the text around the link... or, finally, buy links with anchors that are not directly related to the page they point to. The quality of the donor pages, how heavily they are spammed with keywords, and the quality of their content may also be taken into account.
- No spam! One of the few positive aspects of Google's new filters is the active fight against spam. Automated runs with XRumer and similar software will, one hopes, finally become a thing of the past. The question of the negative impact of such runs on competitors' sites remains, of course... but the fact stands: promotion by aggressively spamming your links will no longer be as effective as it used to be.
- A high rate of growth of purchased links is bad. This factor has long been discussed, but webmasters have not reached a consensus. For example, if rapid link growth is bad, then there is a way, albeit an expensive one, to push competitors out of the results. And what does "high rate" even mean: how many links per day, or per month? Most likely it is a relative notion that takes into account not only the speed itself but also the quality of those links, as well as the age and "trust" of the recipient site (a sketch of such a velocity calculation follows this list).
- Link placement also plays a role. The notorious "link in the footer" can work against you.
- The same applies to internal links. If most of them contain keywords, Penguin will come calling. Try to make them as harmonious and natural as possible. There is a certain logic to this even apart from the filters: text oversaturated with barely readable link anchors is hard on readers too.
- Purchased articles can also be of poor quality. For example, submissions to article directories often use heavily duplicated articles which, naturally, are "written" in unreadable language. This is another negative factor for Penguin.
- Creators of popular MFA sites have long registered domains containing several keywords or whole phrases. Such sites are also likely to fall under the filters. The same probably goes for long URLs stuffed with keywords.
- Negative effects are also quite probable after using programs that access Google automatically: parsing and analyzing the results, checking site positions, and so on.
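As for the "high rate of growth" mentioned in the list above, nobody outside Google knows the threshold, so the most a webmaster can do is watch the dynamics of his own link mass. A minimal sketch of such a calculation follows; the dated link records are made up and would in practice come from a backlink report export.

    from collections import Counter
    from datetime import date

    # Made-up backlink records: (date the link was first seen, source domain).
    backlinks = [
        (date(2012, 3, 2), "blog-one.ru"),
        (date(2012, 3, 15), "forum.example.com"),
        (date(2012, 4, 1), "news-site.ru"),
        (date(2012, 4, 3), "catalog.example.org"),
        (date(2012, 4, 5), "another-blog.ru"),
        (date(2012, 4, 28), "some-site.com"),
    ]

    # Count new links per calendar month to see how sharply the link mass grows.
    per_month = Counter(d.strftime("%Y-%m") for d, _ in backlinks)
    for month in sorted(per_month):
        print(month, per_month[month], "new links")

A sudden spike after months of flat growth is exactly the kind of pattern the filter is presumed to react to, especially on a young site with little trust; whether it actually does, only Google knows.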
Despite all this, do not rush to remove all of your links, or even most of them. It is worth removing only the frankly bad ones, and even then a temporary drop in positions may follow. If you have not received a notice about "the use of low-quality links" in the webmaster panel, the decisive factors are most likely internal ones, related to the quality of the site itself.
As already mentioned, internal links can be one such factor, so it is worth going over your pages for poor internal linking. And the requirements on the uniqueness and quality of content remain in force as well, since nobody has cancelled Panda.
Some experts, Michael Shakin among them, note that one of the factors that helps escape Penguin is visits to the site made directly from the browser address bar, when the address is typed in manually or pasted. Consequently, a positive effect can come from "inactive" mentions, links in the form http://site.ru without an <a> tag, on popular sites where people will actually notice them. Naturally, such a mention should be placed within interesting, relevant content.