This article records how my website went from being K'd by Baidu to recovering its indexing and reaching a weight of 4. It touches on website logs (404, 304, 200 0 64), robots.txt, page signal-to-noise ratio, related articles, statistical analysis, and server selection. As a webmaster my writing skills are limited, but I hope this article helps other webmasters who run into the same problems. Don't forget to leave me a comment!
I. Website introduction
84 Agricultural Network is an old site from 2008, built around the time of the Wenchuan earthquake. Its development had actually always been good: at one point it was selected into the top 10 comprehensive sites among China's leading agricultural websites, though the listing was later canceled because I did not go to Beijing for the meeting. I built the site as a student, and after graduation it was hard to keep up with, so for ease of management I turned to content scraping (I believe many friends who started sites as students have had this problem). The site originally ran on SupeSite and Discuz; to keep content updated in real time, I set up two directory sites with Zhimeng (DedeCMS) and used a well-known DedeCMS collection plug-in for automatic scraping. The site had around 100,000 pages indexed, PR6, and overall revenue of more than 4,000 yuan per month. The income was not high, but it was enough to get by, and things carried on that way until the day Baidu K'd the site.
On June 28, a day I believe many friends still remember painfully, Baidu carried out a large-scale K-station sweep and algorithm adjustment. To be honest, my own feeling was that if a site like mine escaped the K, the adjustment would have been meaningless, and sure enough it was K'd. Google dropped the site as well, and its PR went straight down to 0.
Indexed-page comparison before and after the Baidu K
II. Website restoration
After the site was K'd, I actually set it aside for about half a month. Then I deleted all the data on the site, redesigned the template, reworked the content columns, repositioned the site (although I still think the positioning has problems), and rebuilt it from there. That is where the recovery started. The main approach was to improve the site's details and to read the IIS logs on the server to fix the various problems they revealed. The following describes how I handled the logs. Of course, I also used some high-weight external links to guide the spider in.
1. Website 404 handling
After the site's data was deleted, the number of 404s each day reached roughly 100,000. To deal with them, I wrote rules in robots.txt to block the deleted URLs.
After writing the robots.txt, you can use Google Webmaster Tools to check that it is valid, to avoid mistakes that leave the wrong pages blocked. Then update the robots file through Baidu Webmaster Tools, where it takes effect fairly quickly. In about ten days the 404s dropped to a little over 20,000, and now there are basically none.
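To work out which deleted directories are producing the flood of 404s, the IIS logs themselves are enough. Below is a minimal Python sketch of that kind of check, not my exact procedure: it assumes the standard IIS W3C log format and a made-up log file name, and simply prints Disallow rules for the top-level directories that return 404 most often, which you can then paste into robots.txt and verify in the webmaster tools.

```python
# Sketch: pull 404 URLs out of an IIS W3C log and suggest robots.txt rules.
# The log file name is a placeholder; column names come from the "#Fields:" header.
from collections import Counter

LOG_FILE = "ex130601.log"  # hypothetical IIS log file

not_found = Counter()      # 404 hits per URL
fields = []

with open(LOG_FILE, encoding="utf-8", errors="ignore") as f:
    for line in f:
        if line.startswith("#Fields:"):
            fields = line.split()[1:]            # column names for the rows below
            continue
        if line.startswith("#") or not fields:
            continue
        cols = line.split()
        if len(cols) != len(fields):
            continue
        row = dict(zip(fields, cols))
        if row.get("sc-status") == "404":
            not_found[row.get("cs-uri-stem", "")] += 1

# Roll the 404 URLs up to their top-level directory and emit Disallow rules.
dirs = Counter()
for url, hits in not_found.items():
    if url.count("/") >= 2:                      # only URLs inside a subdirectory
        dirs["/" + url.strip("/").split("/")[0] + "/"] += hits

print("User-agent: *")
for directory, _ in dirs.most_common():
    print(f"Disallow: {directory}")
```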
2. Website 304 handling
Honestly, when I started seeing 304s I was quite nervous. What people say online is that 304 means the page has not been updated, or that it is a sign of being downgraded. In my case, when the homepage was served as a static file the spider got 304, and when it was served dynamically it got 200 0 64 (I come back to that problem later). I kept turning it over: my site is updated, so why 304? And if the site had really been downgraded there shouldn't be 200s at all, yet other pages did get 200. So I made the homepage static and regenerated it every minute, and Baidu started returning 200 for it. It seems that when Baidu visits a page, it reads the page's modification time.
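The mechanism behind the 304 is easy to reproduce yourself: a spider sends a conditional request with If-Modified-Since, and if the file's modification time has not moved, the server answers 304 with no body. The snippet below only illustrates that mechanism (it is not Baidu's crawler); the URL is a placeholder and it uses the Python requests library.

```python
# Illustration of the conditional GET that produces 304 responses.
# Placeholder URL; requires the "requests" package.
import requests

URL = "http://www.example.com/"                  # stand-in for your homepage

first = requests.get(URL)
last_modified = first.headers.get("Last-Modified")
print("first visit:", first.status_code, "Last-Modified:", last_modified)

if last_modified:
    # Revisit the way a spider does: "only send the page if it changed since then".
    second = requests.get(URL, headers={"If-Modified-Since": last_modified})
    print("conditional revisit:", second.status_code)   # 304 if the file time is unchanged
```

Regenerating the static homepage every minute keeps changing the file's modification time, which is why the conditional check stops short-circuiting to 304 and the spider gets a fresh 200.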
3. Website 200 0 64 handling
There have always been many theories about 200 0 64: the operating system, a downgrade, network problems. I like to test things myself, so I ran some tests on this. A downgrade is not something I can do anything about, so I compared operating systems and networks. 200 0 64 showed up on both Linux and Windows 2003 (I did not test other systems), so I concluded it was a network problem. When I saw that Baidu recommended a domestic CDN, I tried it for testing. The result was still 200 0 64, though 200 0 0 appeared relatively more often, so I figured it came down to network stability and switched to an American ECS server from a provider called Birch Network. It has been stable and fast, and as a result every response became a normal 200 0 0. I will leave server selection for the last part of this article, but I want to add a few words here: I feel this 200 0 64 code must have a certain relationship with network stability.
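For reference, sc-win32-status 64 in an IIS log means the client connection was gone before the response finished being delivered, which is part of why I lean toward the network explanation. The sketch below (placeholder log name, standard W3C fields) simply tallies how Baiduspider requests split between 200 0 0 and 200 0 64 so you can compare servers or CDNs.

```python
# Sketch: tally status / substatus / win32-status combinations for Baiduspider
# requests in an IIS W3C log. The log file name is a placeholder.
from collections import Counter

LOG_FILE = "ex130715.log"                        # hypothetical IIS log

combos = Counter()
fields = []
with open(LOG_FILE, encoding="utf-8", errors="ignore") as f:
    for line in f:
        if line.startswith("#Fields:"):
            fields = line.split()[1:]
            continue
        if line.startswith("#") or not fields:
            continue
        row = dict(zip(fields, line.split()))
        if "Baiduspider" not in row.get("cs(User-Agent)", ""):
            continue
        combo = (row.get("sc-status"), row.get("sc-substatus"), row.get("sc-win32-status"))
        combos[combo] += 1

for (status, sub, win32), count in combos.most_common():
    print(f"{status} {sub} {win32}: {count} requests")
```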
4. Building high-quality external links
Which external links actually count as high quality? I went back and forth over videos, Baidu Wenku, anchor text, and plain URLs. In the end I did a little of each, mainly Baidu Encyclopedia (Baike) and Baidu Library (Wenku), plus a few video sites, doing only a handful a day and never more than five, purely to attract the spider and nothing else. Later on, when promoting the site, the Encyclopedia links really did prove overwhelming in effect!
III. Website promotion
The site's recovery certainly involved more than the aspects above; it took a great deal of effort and was exhausting, but fortunately indexing came back in both Google and Baidu. Google also gave the site a PR of 4 and sitelinks, which brought some weight with it. The real focus was Baidu: there was no weight at all when indexing was restored, so I put some effort into the site title, internal links, and relevance.
1. Webpage signal-to-noise ratio
I had never paid much attention to this issue. Later I used the Golden Flower keyword tool, which really is a good tool (please don't accuse me of advertising here). Besides pulling keywords, its signal-to-noise tool shows not only the redundant code on a page but also some issues with keyword layout. For example, my site lists various prices for various regions; if I put that information on the homepage, the signal-to-noise ratio is only about 10%, but if I leave it off, it is above 98%. That is because there are too many keywords. Work it out for yourselves! (The higher the signal-to-noise ratio, the better.)
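I don't know exactly how the tool computes its number, but the basic idea of a page's signal-to-noise ratio is visible text versus total page markup. A rough approximation in Python is below; the URL is a placeholder and it assumes requests and beautifulsoup4 are installed.

```python
# Rough approximation of a page's text-to-code ("signal-to-noise") ratio.
# Placeholder URL; requires requests and beautifulsoup4.
import requests
from bs4 import BeautifulSoup

URL = "http://www.example.com/"                  # page to measure

html = requests.get(URL).text
soup = BeautifulSoup(html, "html.parser")

# Drop what a reader never sees, keep the visible text.
for tag in soup(["script", "style", "noscript"]):
    tag.decompose()
text = soup.get_text(separator=" ", strip=True)

ratio = len(text) / max(len(html), 1) * 100
print(f"visible text {len(text)} chars / page {len(html)} chars = {ratio:.1f}%")
```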
2. Internal link handling
Internal links are actually very important. The idea is that when content the user may not understand, or important content, appears in an article, it links through to the relevant internal page. I did this for basically all of the site's main keywords, channel keywords, topic keywords, and the necessary long-tail keywords; it helps raise the weight of the linked pages. A rough sketch of this kind of automatic linking is shown below.
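As an illustration of the idea only: the keyword-to-URL map and the simple first-occurrence rule below are my own invention, not any particular CMS plug-in, and a real implementation would also need to avoid replacing text that already sits inside a tag.

```python
# Toy sketch of internal linking: link the first occurrence of each keyword
# in an article body to its landing page. The keyword map is invented.
import re

INTERNAL_LINKS = {                               # hypothetical keyword -> URL map
    "pig prices": "/pig-prices/",
    "corn prices": "/corn-prices/",
}

def add_internal_links(body: str) -> str:
    """Wrap the first occurrence of each mapped keyword in an <a> tag."""
    for keyword, url in INTERNAL_LINKS.items():
        # count=1 keeps it to a single link per keyword and avoids keyword stuffing.
        body = re.sub(re.escape(keyword), f'<a href="{url}">{keyword}</a>', body, count=1)
    return body

print(add_internal_links("Today's report covers pig prices and corn prices by region."))
```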
3. Handling related links
Related links are also about increasing stickiness. After finishing an article, the reader sees some related articles or images at the end, which makes it easy to reach the content they want. Not much more needs to be said about this.
4. Adding interactive elements
This is something Baidu advocates, so I followed along and added Baidu's social comments, like buttons, Baidu Share, and so on. To tell the truth, they are very useful. Take social comments, for example: when someone comments on an article, the comment can be synchronized to QQ Zone, Weibo, and other places, increasing exposure, and every one of them links back to my site, so it also counts as an external link.
5. Improving PV through statistical analysis
I often read the site statistics to see how visitors come in and how they leave, and I found the site's stickiness was actually very poor: many visitors finish one article and close the site. Analyzing the search keywords, most of them follow the pattern date + region + keyword, so I concluded that the region is what everyone really cares about. I therefore built topic pages organized by region. The effect was good: internal-page rankings improved a lot and the weight naturally went up. Here is a chart of the regional pages' effect, and a small sketch of the keyword analysis follows below.
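The keyword analysis itself is simple to automate. Here is a sketch under the assumption that you have exported the incoming search keywords from your statistics tool to a plain text file, one keyword per line; the file name and the short region list are placeholders.

```python
# Sketch: count how many incoming search keywords mention a region,
# from a hypothetical one-keyword-per-line export of the stats tool.
from collections import Counter

KEYWORD_FILE = "search_keywords.txt"                  # placeholder export file
REGIONS = ["Shandong", "Henan", "Sichuan", "Hebei"]   # sample regions, not a full list

region_hits = Counter()
total = 0
with open(KEYWORD_FILE, encoding="utf-8") as f:
    for keyword in (line.strip() for line in f):
        if not keyword:
            continue
        total += 1
        for region in REGIONS:
            if region in keyword:
                region_hits[region] += 1

print(f"{sum(region_hits.values())} of {total} keywords mention a region")
for region, count in region_hits.most_common():
    print(region, count)
```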
The point here is to look at your own articles and content according to your own situation, and add these kinds of entry points to improve stickiness and PV. If you run advertising alliances, the results will be good. Finally, here is the current state of the website.
Site indexing
Aizhan weight
IV. Server space selection
I believe choosing server space is something every webmaster has to go through. You have to pick a space according to your own needs, and finding one that is both fast and stable is really not easy. I have gone through the move from a domestic server to a foreign one, and I would like to share my views. Domestic space is good overall, and the experience is generally high quality, but there are a few maddening problems:
The first is the ICP filing (beian) problem. A personal filing number is easily revoked, and plenty of webmasters have run into this. It happened to me twice with this agricultural site: the filing number was revoked twice, and each time the host suspended my space. Once I almost couldn't even get my data back. It is tedious and frightening.
The second is that the checks for banned keywords in China are very annoying. Let me give you two titles: "The Ministry of Agriculture will regulate the domestic fresh milk trading market in the near future" and "Plum blossom cutting and plug seedling propagation". For an agricultural site these are perfectly normal titles, but the domestic filters flagged them as containing banned keywords and asked me to correct them. I was speechless!
As for foreign space, I have used both Hong Kong and U.S. hosts. I ran on Hong Kong for more than half a year, but because Hong Kong's outbound bandwidth is generally small, the routes were often unstable. In the end I chose the United States; the servers I am using now are all in California. Access over China Telecom and China Unicom is good, and most importantly it is stable.
Many people say US hosts are slow, but that has not been my experience. For speed, I usually go by the site-speed figures in Baidu's statistics tool rather than the ping value. I think the ping value is a bit misleading: I have used servers in Anhui with a ping under 10 ms, even 1 ms, yet the actual page speed was not that fast. A sketch of measuring speed the way I prefer is below.
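To make the point concrete, here is a minimal sketch of the kind of measurement I trust: time a few full HTTP downloads instead of pinging. The URL is a placeholder and the requests library is assumed to be installed.

```python
# Time full page downloads instead of relying on ping. Placeholder URL.
import time
import requests

URL = "http://www.example.com/"                  # page whose real speed you care about

samples = []
for _ in range(5):
    start = time.perf_counter()
    response = requests.get(URL, timeout=10)
    samples.append(time.perf_counter() - start)

average_ms = sum(samples) / len(samples) * 1000
print(f"status {response.status_code}, average download {average_ms:.0f} ms over {len(samples)} requests")
```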
So choosing a good server is the foundation of steady website development. If the server is not stable, all the SEO work is wasted. In my view, stability comes first and speed second, as long as users can access the site smoothly.