1. Why is hosting space so important?
A search engine crawler (Baidu's is called Baiduspider, Google's is called Googlebot) is a program that crawls the pages of a website. It runs automatically, collects and downloads the site's pages, and records every link on each page, both internal links and external links. After tallying the links it crawls them in turn, saving the results as plain-text files on its own servers.
Inclusion happens in two steps: first, the crawler collects links, starting from the page whose URL you submitted to the search engine; second, it crawls those links and downloads the pages. The results are stored on three kinds of servers: the cache server (snapshots), the inclusion server (indexed pages), and the index-list server (rankings). These are not the same machine, which is why our snapshot dates can differ. Because the data sits on different servers, it can fall out of sync. For example, a site: query on the domain may show no home page while searching the domain directly still returns one; that is simply unsynchronized data.
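The collect-links step described above can be sketched in miniature. This is only an illustration, not any search engine's real code: it parses one page's HTML and sorts every link into internal and external, the same split the statistics step records. The page content and base URL are invented for the demo:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse


class LinkCollector(HTMLParser):
    """Collect every href on a page, split into internal and external links."""

    def __init__(self, base_url):
        super().__init__()
        self.base = base_url
        self.internal, self.external = [], []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href")
        if not href:
            return
        absolute = urljoin(self.base, href)  # resolve relative links
        if urlparse(absolute).netloc == urlparse(self.base).netloc:
            self.internal.append(absolute)   # same host: internal link
        else:
            self.external.append(absolute)   # different host: external link


# Hypothetical page for the demo; a real crawler would download this HTML.
page = '<a href="/about.html">About</a> <a href="http://other.example/x">Out</a>'
collector = LinkCollector("http://www.example.com/")
collector.feed(page)
```

A real crawler would then queue `collector.internal` for the second step, downloading each page and repeating the process.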
Why is the stability of the hosting so important? Because the search engine crawler simulates a user's browsing habits when it fetches the site's content. If the server is unstable or slow to respond, the crawler loses interest in the site, crawls it with missing data, or fails to fetch the content at all. So let this be a reminder to every SEOer: server instability has a direct negative impact on SEO optimization.
2. How do we guard against these problems?
1. Back up the site's data frequently (both the page files and the database), and download the complete packaged site to your local machine. If the site is attacked, you can then restore the data directly. Also change the FTP password, the server password, and the hosting control-panel password, and temporarily revoke write permission on the site folder. The more complex the FTP password, the better!
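A minimal sketch of the backup step above, assuming you have shell access to a copy of the web root. The directory paths here are throwaway placeholders created just for the demo, and the MySQL dump is shown only as a hedged comment, since it depends on your database setup:

```python
import os
import tempfile
import time
import zipfile


def backup_site(site_dir, backup_dir):
    """Zip the whole web root (pages, templates, uploads) under a timestamped name."""
    os.makedirs(backup_dir, exist_ok=True)
    name = time.strftime("site-%Y%m%d-%H%M%S.zip")
    path = os.path.join(backup_dir, name)
    with zipfile.ZipFile(path, "w", zipfile.ZIP_DEFLATED) as zf:
        for root, _dirs, files in os.walk(site_dir):
            for f in files:
                full = os.path.join(root, f)
                # store paths relative to the web root inside the archive
                zf.write(full, os.path.relpath(full, site_dir))
    return path


# Demo on a throwaway directory that stands in for the real web root:
demo_root = tempfile.mkdtemp()
with open(os.path.join(demo_root, "index.html"), "w") as f:
    f.write("<html>home page</html>")
archive = backup_site(demo_root, tempfile.mkdtemp())

# The database half of the backup is separate; with MySQL you might shell out
# to mysqldump (command and credentials depend on your own setup):
#   subprocess.run(["mysqldump", "-u", user, f"-p{password}", db_name],
#                  stdout=open(dump_path, "w"), check=True)
```

Download the resulting archive to your local machine so a restore is possible even if the server itself is compromised.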
2. A page that takes more than 6 seconds to open is quite unfavorable for SEO. If the cause is too many images or too much Flash, compress each image to no more than 50 KB and avoid Flash wherever possible; also enable the server's transfer compression. Another cause is external calls, weather-forecast widgets especially: as long as the site being called is slow to open, your own site will be very slow too. Online-chat widgets and visitor-statistics scripts likewise add requests and slow the page down. Remember: the more calling code, the slower the page opens! If none of the above applies, the hosting or server itself is most likely slow; talk to the hosting provider or the data center, and if the problem cannot be resolved, replace it decisively. When switching hosting or servers, remember these points:
First, back up the data (page files and databases) before transferring.
Second, test the speed of the new hosting or server before transferring.
Third, debug on a second-level domain first, or on a temporary third-level domain provided by the new host.
Fourth, switch the domain's DNS resolution at the time of day when user traffic is lowest.
Fifth, after the DNS change, keep the original hosting stable and online for at least 24 hours; do not shut it down or clear its data. A DNS change takes anywhere from 5 minutes to 24 hours to take effect globally, many returning visitors still have the old IP cached, each region's DNS resolvers refresh at different times, and the spiders cache the old IP as well.
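The second point, testing the new server's speed, and the transfer-compression advice above can be checked together with a small script. The helper names below are invented for illustration; the 6-second threshold comes from the text, and since a real fetch needs network access, the demo only runs the pure `summarize` helper on sample numbers:

```python
import time
import urllib.request


def summarize(elapsed_seconds, content_encoding, threshold=6.0):
    """Judge one response: over the 6-second mark? gzip transfer compression on?"""
    return {
        "seconds": round(elapsed_seconds, 2),
        "gzip": content_encoding == "gzip",
        "too_slow": elapsed_seconds > threshold,
    }


def audit(url, timeout=15):
    """Time a real fetch and inspect the Content-Encoding header (needs network)."""
    req = urllib.request.Request(url, headers={"Accept-Encoding": "gzip"})
    start = time.monotonic()
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        resp.read()  # include the body transfer in the measurement
        return summarize(time.monotonic() - start,
                         resp.headers.get("Content-Encoding", ""))


# Demo with made-up numbers: a slow, uncompressed response.
report = summarize(7.31, "")
```

Run `audit("http://your-new-host.example/")` a few times at different hours before committing to the transfer; a single measurement can be misleading.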
3. How do we choose reasonable hosting?
First, it must support pseudo-static URLs (URL rewriting). Most site source packages today are dynamic programs served through pseudo-static URLs, so this support is essential.
Second, it should provide access to the IIS logs. To understand the crawler's movements on the site you must read the IIS logs, and ideally the host generates one log file per hour.
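Reading the crawler's movements out of an IIS log can look roughly like this. The log excerpt is fabricated and simplified; real IIS logs in the W3C extended format declare their own field order on the `#Fields` line, which this sketch honors rather than hard-coding column positions:

```python
# A simplified, made-up IIS (W3C extended format) log excerpt. Real logs
# encode spaces inside fields such as the user agent with '+'.
SAMPLE_LOG = """\
#Software: Microsoft Internet Information Services
#Fields: date time cs-uri-stem cs(User-Agent) sc-status
2013-05-01 08:00:01 /index.html Baiduspider+(+http://www.baidu.com/search/spider.htm) 200
2013-05-01 08:00:05 /about.html Mozilla/5.0+(Windows+NT+6.1) 200
2013-05-01 08:00:09 /missing.html Googlebot/2.1+(+http://www.google.com/bot.html) 404
"""


def crawler_hits(log_text, bots=("Baiduspider", "Googlebot")):
    """Return (time, url, status) for every request made by a known crawler."""
    fields, hits = None, []
    for line in log_text.splitlines():
        if line.startswith("#Fields:"):
            fields = line.split()[1:]  # field layout is declared by IIS itself
        elif line and not line.startswith("#") and fields:
            row = dict(zip(fields, line.split()))
            if any(bot in row.get("cs(User-Agent)", "") for bot in bots):
                hits.append((row["time"], row["cs-uri-stem"], row["sc-status"]))
    return hits


hits = crawler_hits(SAMPLE_LOG)
```

With hourly log files, running this over each file shows when the spiders visit and which URLs they fetch, including any 404s they hit.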
Third, it should support PHP + MySQL; PHP + MySQL is what most webmasters' site source code uses.
Fourth, the hosting control panel should support online decompression. Without online compression and decompression, uploading files and making backups costs a great deal of time.
Fifth, it should support 301 redirects and custom 404 error-page binding. A 301 redirect lets us concentrate or transfer the site's weight, and a 404 page is a friendly gesture toward both users and crawlers.
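Both checks, whether an old URL answers with a 301 and whether missing pages return a 404, can be probed with a short script. The handler below stops urllib from silently following redirects so the 301 status itself is visible; `status_of` needs network access, so the demo only exercises the `verdict` helper, whose name and wording are invented for illustration:

```python
import urllib.error
import urllib.request


class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Refuse to follow redirects so the redirect status code is observable."""

    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None  # unhandled: urllib raises HTTPError carrying the code


def status_of(url, timeout=10):
    """Fetch a URL without following redirects; return its HTTP status (needs network)."""
    opener = urllib.request.build_opener(NoRedirect)
    try:
        return opener.open(url, timeout=timeout).status
    except urllib.error.HTTPError as err:
        return err.code


def verdict(status):
    """Interpret a status code for the two checks this section recommends."""
    if status == 301:
        return "permanent redirect: weight can be concentrated or transferred"
    if status in (302, 307):
        return "temporary redirect: use 301 instead for weight transfer"
    if status == 404:
        return "not found: make sure a custom 404 page is bound"
    return "no redirect handling needed"


# e.g. verdict(status_of("http://example.com/old-page"))  # needs network
```

A 302 caught by this check is worth fixing: only a 301 tells the search engine the move is permanent and the weight should follow.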
Sixth, it is best that the number of concurrent IIS connections is not limited; hosting with an IIS concurrency cap is paralyzed the moment it faces a flood of request threads.
Seventh, the host's technical support should be able to resolve problems within about 12 hours.