In SEO there is a saying: details decide success or failure. Throughout optimization work we keep stressing attention to detail, because a few overlooked details can push an optimization effort straight into a bottleneck. Many webmasters who take over an established site run into thorny problems such as rankings that stall, rankings that slip slightly, or indexed page counts that rise and fall, and the causes are often hard to pin down. So when we take over a site that is mid-way through optimization, or one that has already run into trouble, we need to examine every aspect of it in detail and uncover the existing problems, so that our optimization work becomes proactive again. Today we discuss the diagnostic details of optimizing an established (old) site.
When we take over a site, the first task is a systematic analysis. A systematic analysis does not mean looking at the site only from a macro perspective; it means examining the site in every aspect and every detail. When we take over a site, the following points need to be analyzed.
Inspection of robots.txt
The main role of robots.txt is to control how on-site pages and paths are handled; it can help us block certain pages, dead links, and so on. We look at robots.txt from three angles.
1. Check whether a robots.txt exists. A robots.txt file is something every optimized site must have; a missing robots.txt says the site's SEO has been handled poorly. The check is simple: visit domain/robots.txt directly. The file normally sits in the site root, so if it loads, it exists; if it does not, create one right away.
2. Check whether the robots.txt is written correctly. Most webmasters understand the basic robots.txt syntax; the point to stress here is the ordering of Allow and Disallow: do not put Allow first, put Disallow first, so the prohibitions come before the permissions (see the sample robots.txt after this list).
3. Check the pages and paths that robots.txt blocks. Webmasters should learn to recognize common URL patterns; open each blocked page and path and verify whether it really is an invalid page that deserves to be blocked, or whether the rule is superfluous. At the same time, run a site: query on the domain to review what has been indexed and see whether there are still pages that ought to be blocked.
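As an illustration of points 1 to 3, here is a minimal, hypothetical robots.txt; the domain and paths are placeholders, not taken from any real site. Note that the Disallow rules come before the Allow rule, as recommended above:

    # Hypothetical robots.txt, placed in the site root (http://example.com/robots.txt)
    User-agent: *
    Disallow: /search/            # block low-value internal search result pages
    Disallow: /*?sessionid=       # block URLs carrying session parameters (duplicate content)
    Allow: /search/help.html      # re-allow one useful page inside the blocked directory
    Sitemap: http://example.com/sitemap.xml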
Diagnosis and analysis of URL paths
1. Checking dynamic path parameters. Many sites use dynamic URLs, and every dynamic URL carries parameters. If a dynamic URL carries three or more parameters, indexing of that page becomes much harder. How do you count the parameters? It is simple: count the equals signs in the URL; however many equals signs there are, that is how many parameters it carries (a short sketch follows this list).
2. Quality of pseudo-static paths. Pseudo-static (URL-rewritten) paths are also common, and a badly done pseudo-static path can likewise hurt indexing. During the inspection, if a pseudo-static URL still contains a question mark, other parameter symbols, or Chinese characters, the rewrite has failed. Some pseudo-static schemes, such as the WordPress one, sometimes leave an /index.php/ segment after the domain name; such pseudo-static URLs are also poor, and this deserves attention.
3. Examining whether the paths are reasonable. The point to stress here is the choice between Chinese and English paths. Many webmasters like Chinese paths because they look intuitive, but Chinese paths are far harder to get indexed than plain-letter paths, so webmasters should use them with caution.
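As a rough sketch of the parameter check from point 1 (the URLs below are hypothetical placeholders, and counting the "=" signs by hand works just as well), one could do:

    # Rough sketch: count query-string parameters in a URL.
    # The example URLs are hypothetical placeholders.
    from urllib.parse import urlparse, parse_qs

    def count_parameters(url):
        query = urlparse(url).query        # the part after the "?"
        return len(parse_qs(query))        # number of distinct parameters

    for url in ("http://example.com/list.php?cat=3",
                "http://example.com/item.php?cat=3&id=88&page=2&sort=desc"):
        n = count_parameters(url)
        flag = " <- three or more, indexing may suffer" if n >= 3 else ""
        print(url, "->", n, "parameter(s)" + flag)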
Inspection of common optimization elements
Common optimization elements include the canonical tag, the nofollow attribute, H tags, alt attributes, and so on.
1. The difference between nofollow and robots.txt: nofollow is mainly used to stop weight from being passed to pages outside the site, while robots.txt is mainly used to block pages inside the site. One handles the outside, the other handles the inside, a clear division of labor.
2. Canonical tags are often used by us, especially on forums. This tag generally suits list pages and paginated pages with a high degree of similarity: if you have several pages, or several highly similar list pages, you need to tell the search engine which page should take part in the keyword ranking. The specific markup is as follows: <link href="http://www.***.net/zhishi/" rel="canonical" />
3. The alt attribute is familiar to everyone: every image on the site should carry it so the search engine knows what the picture shows. This too is important and should not be ignored.
4. H tags mainly suit content pages with multiple titles. If a content page has, under the main headline, a subtitle and several smaller headings, it is worth adding H tags so the page has a clear hierarchy; the home page generally does not need them (a small HTML example follows this list).
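To make the alt and H points concrete, here is a small hypothetical HTML fragment; the file name, alt text, and headings are invented for illustration only:

    <!-- alt text tells the search engine what the picture shows -->
    <img src="/images/sample-product.jpg" alt="Sample product photographed from the front">

    <!-- H tags mark the main title and sub-titles of a content page -->
    <h1>Product maintenance guide</h1>
    <h2>Daily inspection</h2>
    <h2>Monthly servicing</h2>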
Analysis of page similarity and optimization flaws
1. Site similarity is mainly judged by the boilerplate text: look at what share of a page's whole content the boilerplate occupies. If the proportion of boilerplate is too high, it directly raises the similarity between pages, and the boilerplate needs to be trimmed (a rough way to measure this is sketched after this list).
2. Optimization flaws mainly mean diagnosing the site's loading speed and problems such as over-long titles and over-long paths. For page speed I recommend a tool, the PageSpeed extension for Firefox: it does a good job of finding what slows your site down and suggests fixes. Over-long titles and over-long paths must also be analyzed carefully and not overlooked; all of these cost points in optimization.
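As a rough way to put a number on the similarity discussed in point 1, one could compare the text of two pages with a sketch like the following; the URLs are placeholders, and a real audit would first strip HTML tags and navigation for a fairer figure:

    # Rough sketch: estimate how similar two pages of the same site are.
    # The URLs are placeholders; strip HTML/navigation first for a fairer figure.
    import difflib
    from urllib.request import urlopen

    def page_text(url):
        return urlopen(url).read().decode("utf-8", errors="ignore")

    a = page_text("http://example.com/list.php?page=1")
    b = page_text("http://example.com/list.php?page=2")
    ratio = difflib.SequenceMatcher(None, a, b).ratio()   # 0.0 = unrelated, 1.0 = identical
    print("similarity: %.0f%%" % (ratio * 100))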
Inspection of the site's external links
Mainly ask whether links have ever been bought: in detail, when the buying started, where the links were bought, and whether any are still being bought now. Check the site's Baidu related domain to see whether its content has been posted in bulk (link spamming), and use Google's link: query to see how the site's high-weight pages have been handled.
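A few example queries of the kind mentioned above, with example.com standing in for the actual domain (note that Google's link: operator has since been largely retired, so treat its results as indicative only):

    site:example.com        what the search engine has indexed for the domain
    domain:example.com      Baidu "related domain" query, a rough view of the link footprint
    link:example.com        old Google backlink query (now largely deprecated)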
Analysis of the causes of a ranking penalty
If the site we are taking over has been penalized (downranked), we need to ask about and analyze the following questions.
1. Has the site title been changed? Have the program or the paths been replaced, and if the paths were replaced, was corresponding handling done, such as blocking the old paths in robots.txt? Has the template been changed?
2. Review its IIS logs to determine whether a server issue or some other problem caused the penalty (a small log-scanning sketch follows this list).
3. Analyze the stability of the external links. Three main external-link factors lead to a penalty: first, large-scale loss of links, mainly because accounts were deleted or purchased links were unstable; second, a large volume of spammy external links pointing to the site; third, penalized sites among the reciprocal (friend) links.
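For the log check in point 2, a rough sketch along these lines can tally the status codes served to Baiduspider in an IIS W3C-format log; the file name is a hypothetical placeholder, and the column position is read from the log's own #Fields header rather than assumed:

    # Rough sketch: tally HTTP status codes served to Baiduspider in an IIS W3C-format log.
    # The log file name is a hypothetical placeholder.
    from collections import Counter

    status_index = None
    codes = Counter()
    with open("u_ex120101.log", encoding="utf-8", errors="ignore") as log:
        for line in log:
            if line.startswith("#Fields:"):
                # the header lists the columns, e.g. "#Fields: date time ... sc-status ..."
                status_index = line.split()[1:].index("sc-status")
                continue
            if line.startswith("#") or status_index is None:
                continue
            fields = line.split()
            if any("Baiduspider" in f for f in fields):
                codes[fields[status_index]] += 1

    print(codes)   # many 500s point to server trouble, many 404s to dead links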
After analyzing the whole site in this way, we can work out what its problems are, draw up solutions for each of them, and then move on to the next stage of optimization.
This article was originally published by http://www.51diaoche.net; reprints via A5 are welcome.