Abstract: If a website has only one or two shortcomings, the search engine may forgive you. But if the site has a whole variety of problems, the search engine will give you a warning, and if you have trotted out all eighteen kinds of martial arts, death is inescapable.
If our website has only one or two shortcomings, the search engine may forgive us. But if the site has a whole variety of problems, the search engine will give us a warning, and if we have trotted out all eighteen kinds of martial arts, death is inescapable.
When our site has been K'd is exactly the moment we most need to calm down, because only by carefully analyzing the reason the site was K'd can we get it to recover as soon as possible. Below is a summary of self-diagnosis methods for a site that has been K'd.
Downranking caused by black-hat operations
1. Brushing traffic with software
Have you recently used traffic-brushing software such as Traffic Treasure, Flow Elves, Traffic King, and the like? Whether you brush traffic for your own site or for a competitor's, the outcome depends entirely on how it is handled.
In fact, brushed traffic is much the same in nature as purchased links: once the purchased links expire and are removed, your rankings will certainly be affected.
Once the brushing stops, any rankings that were propped up by brushed traffic are certain to take a nosedive. If you keep brushing instead, then sooner or later some moment of carelessness will make various metrics unstable and attract the search engine's attention. You may dodge the first of the month, but you will not dodge the fifteenth; when that time comes, it is no longer just a matter of rankings disappearing, the site will be facing a downranking or an outright K.
2. Mass-posting backlinks with software
Backlinks are a good way for us to gain traffic and exchange traffic, but we should keep our own backlink building within reasonable bounds.
Mass-posting meaningless backlinks through software or other means is not advisable, whether judged by quality or by how long the links survive.
Mass-posted backlinks cannot be tailored to where they appear: a large number of links placed in different environments, all expressing exactly the same content, is simply testing the search engine's IQ.
As for survival, every site has its own methods for suppressing spam links. Once they are discovered, whether the response is a warning, blocking, or an outright ban, the nutritional value of such backlinks is minimal and their survival rate is bound to be very low.
Likewise, when many identical links appear across different sites within the same period, they are very easy for a search engine to recognize. The usual way search engines handle this kind of aggressive link building is to simply ignore the links, while at the same time giving the site they point to an unfriendly appraisal.
3. Buying black links in bulk
If a single website hangs a black link pointing to you, the search engine may not punish you, because it cannot tell who the culprit is. But if you buy or place black links in bulk, so that many unrelated sites are all carrying your black links, you are effectively telling the search engine that the real cheater is you. If someone then reports you as well, it is completely over.
4. All kinds of cheating methods
Cheating is what search engines hate most: techniques such as doorway pages, links in the same color as the background, hidden links, shrunken fonts, and so on.
For this kind of cheating the search engine may currently take a conservative approach and simply ignore it. But if it is combined with other cheating methods, or someone reports you and a manual review follows, then once these tricks are discovered there is only one ending: K'd to death.
5. The site has been attacked
The causes above all lie with ourselves. The attacks discussed here, by contrast, come from outside: has the site recently been attacked, had links injected, or been hung with a trojan? With regular inspection we can prevent this.
So what are the concrete measures? Upgrade the firewall, of course, but also check and analyze your data periodically to see whether the site has been attacked. At present both Baidu and Google can easily detect that a site has been compromised; as soon as our site starts showing random pop-up pages, injected trojans, and similar symptoms, the search engine will downrank us. This is, of course, also done for the benefit of users.
As webmasters, we must do a good job in this respect.
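As a rough, do-it-yourself supplement to the firewall, you can periodically fetch your own pages and look for markers that often appear after a site has been hung with a trojan, such as script or iframe tags pointing at unfamiliar domains. The sketch below is only an illustration; the page list and the trusted-domain set are placeholder assumptions to replace with your own.

```python
import re
import urllib.request

# Pages of our own site to spot-check (placeholder URLs).
PAGES = [
    "http://www.example.com/",
    "http://www.example.com/about.html",
]

# Domains we expect script/iframe sources to come from; anything else is worth a look.
TRUSTED_DOMAINS = {"www.example.com", "example.com"}

SRC_PATTERN = re.compile(r'<(?:script|iframe)[^>]*\bsrc=["\']([^"\']+)["\']', re.IGNORECASE)

def suspicious_sources(url):
    """Return script/iframe sources on the page whose domain is not in our trusted set."""
    html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", errors="ignore")
    hits = []
    for src in SRC_PATTERN.findall(html):
        host = re.match(r"https?://([^/]+)", src)
        if host and host.group(1).lower() not in TRUSTED_DOMAINS:
            hits.append(src)
    return hits

for page in PAGES:
    for src in suspicious_sources(page):
        print(f"check manually: {page} loads {src}")
```

This only flags things for a human to look at; a clean result does not prove the site is safe, but an unfamiliar script source is a strong hint that something has been injected.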
Website construction
6. The stability of the web server
The stability of the web server is critical both for getting the site indexed and for its rankings.
If a region has frequent earthquakes, will anyone move there? Those who do come will soon run away. When the server is unstable, the site is often unreachable or painfully slow to open; users will not like that, will leave without hesitation, and the experience will be very poor. The same goes for search engines: a spider that keeps getting blocked when it comes to crawl your pages will, after enough failed attempts, decide that your site is unstable and will not give it much weight.
If your hosting space is not stable, then when posting backlinks you need to think carefully about whether you really want to speed up how often spiders crawl your site.
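To get an objective sense of whether the space is stable, a scheduled job that records response times over time is usually enough. A minimal sketch follows; the URL and the three-second threshold are only illustrative values, not recommendations from any search engine.

```python
import time
import urllib.request

URL = "http://www.example.com/"   # placeholder: your own home page
SLOW_SECONDS = 3.0                # placeholder threshold for "too slow"

def check_once(url):
    """Fetch the page once and report (status code, elapsed seconds, error message)."""
    start = time.time()
    try:
        with urllib.request.urlopen(url, timeout=15) as resp:
            status = resp.status
    except Exception as exc:
        return None, time.time() - start, str(exc)
    return status, time.time() - start, None

status, elapsed, error = check_once(URL)
if error is not None:
    print(f"site unreachable: {error}")
elif status != 200 or elapsed > SLOW_SECONDS:
    print(f"possible instability: status={status}, {elapsed:.2f}s")
else:
    print(f"ok: {elapsed:.2f}s")
```

Run from cron every few minutes and kept in a log, this gives you your own record of outages rather than relying on user complaints.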
7. Downranking caused by a site redesign
Problems with the site's program or database, or a large amount of content deleted by mistake through careless operation, leave the site full of dead links, which in turn produces a large number of 404 pages.
A redesign that makes sweeping changes to the site's structure, content, or paths, or frequent changes to the site title, keyword tags, and description tags, will all draw the search engine's attention.
A redesign usually causes a great deal of turbulence. When a redesign cannot be avoided, we need to take care of many small details to avoid clashing with the search engine.
It is best to do the redesign locally, test it, and upload it only once no further changes are needed; after uploading, it is worth checking for leftover 404 pages, as sketched below.
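A simple way to do that check is to walk through the URLs that existed before the redesign (taken from the old sitemap or the server logs) and see which ones now return 404, so they can be redirected or removed. A minimal sketch; the URL list is a placeholder.

```python
import urllib.error
import urllib.request

# URLs that existed before the redesign; in practice pull these from the old sitemap or logs.
OLD_URLS = [
    "http://www.example.com/old-category/page1.html",
    "http://www.example.com/old-category/page2.html",
]

def status_of(url):
    """Return the HTTP status code for a URL, or None if it could not be reached at all."""
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as exc:
        return exc.code          # 404, 500, and other error statuses
    except urllib.error.URLError:
        return None              # DNS failure, timeout, connection refused

dead = [u for u in OLD_URLS if status_of(u) == 404]
for url in dead:
    print("dead link, needs a 301 redirect or removal:", url)
```

Anything this prints is a page the search engine will also stumble over, so it is cheaper to find them yourself first.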
8. Unhealthy website content
The website carries bad information such as pornography, gambling, reactionary material, and other illegal content.
When editing and adding content to our website, we should fully consider whether the content is healthy, whether it complies with current legal provisions, and whether it contains sensitive keywords. Because the state regulates Internet content strictly, search engines are unwilling to display sites that contain unhealthy content, and such content will naturally be "harmonized" away. We should therefore audit the content of our website strictly.
9. Content made up of mass-scraped articles
Many webmasters neither want to write their own articles nor want to hire writers, so when it comes time to keep updating the site's content, they reach for content-scraping tools.
A search engine exists to provide users with useful information; a site whose entire content is scraped from others and which has no value of its own will not be allowed to appear in front of users.
After 6.28, most webmasters came to appreciate the danger of mass-scraped content, as well as the importance of original content.
10. Whether the website has a filing record
Under normal circumstances, if you choose domestic hosting, the site must be filed for the record. A site that has not been filed is usually not downranked by the search engine, but it may be taken down by the filing authorities, and being blocked for that reason is rather miserable.
11. A site on the same IP has been blocked
If you are worried about this problem, you can choose a dedicated host.
But because a dedicated host is expensive, most people choose to share hosting space with others, and that carries a certain risk.
It is like an office in which many people come down with the flu at the same time: everyone else becomes a suspected case. By the same reasoning, if several sites in one hosting space are blocked, your site will also become an object of suspicion, and in serious cases it will be affected.
For friends who run station groups, it is worth adding that putting multiple sites in the same space also makes the chance of them being killed off in a chain reaction very high.
SEO operation mistakes
12. Station groups and link wheels
A station group means building a large number of similar sites so that your sites occupy more positions in the search engine. This is a common type of SEO cheating and a practice search engines firmly oppose, as we can see from the way search engines have downranked the blogs of the major portals.
Station groups share certain common traits: IPs, templates, content, keywords, domain registration details, how tightly the sites link to one another, and their backlink environment. These commonalities are the main means by which a search engine identifies a station group.
If one site in the group is found to be cheating and is downranked, the other linked sites will be implicated.
13. Over-optimizing the site
Many webmasters start optimizing their site's content before the site has even gone live, and pile on every kind of SEO optimization immediately after launch.
For example: stuffing in large numbers of keywords, repeating keywords in the meta tags, excessive keyword density on the page, pointing every backlink at the home page, anchor text that is far too uniform, padding the site with content, scraping articles in bulk, and so on. These are all signs of over-optimization.
Baidu is now fairly cold toward SEO. What Baidu advocates is that "on the premise of ensuring the user experience, doing a moderate amount of spider-friendly optimization helps the site get indexed."
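For the keyword-density point above, it is easy to measure roughly where a page stands instead of guessing. The sketch below strips tags with a crude regex and counts keyword occurrences against the total word count; the URL, the keyword, and the upper bound used in the check are placeholder assumptions, not a rule published by Baidu.

```python
import re
import urllib.request

URL = "http://www.example.com/some-page.html"   # placeholder page
KEYWORD = "seo"                                 # placeholder keyword (single word)

html = urllib.request.urlopen(URL, timeout=10).read().decode("utf-8", errors="ignore")
text = re.sub(r"<[^>]+>", " ", html).lower()    # crude tag stripping, fine for a rough check

# Note: Chinese text would need a word segmenter instead of this simple regex.
words = re.findall(r"[a-z0-9]+", text)
hits = words.count(KEYWORD.lower())
density = hits / len(words) * 100 if words else 0.0

print(f"'{KEYWORD}' appears {hits} times in {len(words)} words: {density:.1f}%")
if density > 8.0:                               # illustrative upper bound only
    print("density looks high enough to read as keyword stuffing")
```

The exact number matters less than the trend: if a page's density keeps creeping up every time you edit it, that is the over-optimization habit this section warns about.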
14. The website's theme is not clear
When the theme is unclear, users cannot tell what exactly your website does and cannot find answers to the questions they care about. Whether it is the bounce rate, the visit duration, or other such metrics, the numbers will all work against the site.
To a search engine, a site whose content is a mess, has a bit of everything, with no correlation between pieces of content and content unrelated to the titles, is the equivalent of a garbage site.
15. Inner pages competing with the home page
First, inner pages and the home page often end up competing for the same keyword; this happens all the time. The way to avoid it is to optimize each page for different keywords, which is exactly where a long-tail keyword record sheet shows its value.
Second, inner pages may be too similar to the home page, which usually happens on tag pages. Planning tags sensibly and using them correctly can prevent this kind of situation.
16. Large amounts of anchor text pointing to the home page
The anchor text meant here is only the anchor text inside the site's own pages. Many people believe that, to concentrate weight on the home page, every anchor text should point there regardless of relevance; in fact this is not advisable.
If we make even unrelated anchor text all point to the home page, it may have some effect in the short term, but once the volume grows and enough time passes, the site will be downranked.
Sensible internal linking is useful for the site's rankings and for lifting the weight of the home page, but that does not mean all anchor text should point there. We can have articles point to other articles, like a well-woven spider web, so that the search engine can crawl the site more smoothly and index it better.
17. Friendly links becoming a burden
This needs little explanation; we all have our own understanding of it. When my blog was K'd, some people removed their friendly links to me, some were still willing to keep my link up, and at the same time I myself very reluctantly removed one or two friendly links. Truly reluctantly.
A friendly link is a bond between sites and, like friendship, a double-edged sword. In good times the two sides encourage and support each other; in bad times they hurt and abandon each other. What creates this harsh reality is the search engine.
That said, if your site is doing things well, the search engine will not come down hard on you just because you carry one or two links to K'd sites. Usually it is only when a number of adverse factors together draw the search engine's attention that friendly links get counted among them.
18. Poor user experience
Many of us keep applying all kinds of optimization to our websites in order to improve rankings, but another factor we should be considering is the user experience.
Search engines now treat user experience as an important factor, because the search engine itself exists to improve the experience of its users.
If your site has no content of value to users, leading to a very high bounce rate, a short average time on site, and low PV, the search engine will judge it a worthless site and will not give it weight.
One more point: if the robots.txt file ends up blocking search engine access because of a mistaken edit or some other mishap, the consequences are easy to imagine. We can check the robots.txt file regularly to prevent such accidents.
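Checking robots.txt regularly can be a small script rather than a manual chore. A minimal sketch using Python's standard urllib.robotparser; the site URL, the sample paths, and the spider names are placeholders for your own values.

```python
import urllib.robotparser

SITE = "http://www.example.com"
SAMPLE_PATHS = ["/", "/news/", "/products/item1.html"]   # placeholder pages that must stay crawlable
SPIDERS = ["Baiduspider", "Googlebot"]

rp = urllib.robotparser.RobotFileParser()
rp.set_url(SITE + "/robots.txt")
rp.read()                                                 # fetches and parses the live robots.txt

for spider in SPIDERS:
    for path in SAMPLE_PATHS:
        if not rp.can_fetch(spider, SITE + path):
            print(f"warning: robots.txt blocks {spider} from {path}")
```

If this ever starts printing warnings for pages you want indexed, you have caught the mistaken edit before the search engine did.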
Having a site K'd is an experience, an experience of growing up. We understand the risks of running a site, and we need to be prepared for them. I hope the 18 points above can be of some help to friends whose sites have been K'd.