Using pre-site and on-site factors to uncover a website's hidden dangers

Source: Internet
Author: User


The SEO industry has grown rapidly along with the Internet and is now increasingly saturated. Yet when webmasters run into problems with the factors affecting their own sites, they either pay someone to diagnose them or go to forums and blogs asking some expert to take a look. The theme of this post on the Chun SEO blog is to help everyone learn to spot a site's hidden dangers from its pre-site and on-site factors, so that more SEOers can work independently, dig out problems themselves, and solve their sites' hidden dangers on their own. Below I systematically analyze these two groups of factors in depth; off-site (external) factors I will cover in depth in a separate follow-up topic.

I. Pre-site factors

What are pre-site factors? They can be divided into two points.

1. Domain name diagnosis

In the initial preparation for building a website, we must secure a domain name and a space (hosting) provider (if you run your own server, you can ignore the latter). These have the biggest impact on the site. For a corporate website or a personal webmaster, if the domain name is newly registered, its influence is negligible. But if you choose a dropped (previously registered) domain, you must first find out what industry it previously served: if it was used for gray-market or sensitive industries, you can infer the negative effect that will have later on.

2. Space (hosting) diagnosis

For space diagnosis, first check the server's response speed. At the same time, examine the other sites hosted on the same IP address (a banned site sharing your IP is a risk), and check in a targeted way the factors that determine the server's stability.
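As a minimal sketch of the response-speed check, the following Python function (the function name and injectable `opener` parameter are my own conventions, not from the article) times one HTTP fetch:

```python
import time
from urllib.request import urlopen

def response_time(url, opener=urlopen):
    """Return the seconds taken to fetch `url` once.

    `opener` is injectable so the check can be exercised without a network.
    """
    start = time.perf_counter()
    with opener(url) as resp:
        resp.read()  # pull the whole body, not just the headers
    return time.perf_counter() - start
```

In practice you would run this several times at different hours and compare the numbers against the other sites on the same IP.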

II. On-site factors

What are on-site factors? They can be divided into nine points.

1. Basic site structure (frame and style) (related problem: site-wide keyword rankings)

Inspect your own site's structure and frame style: is it the kind of simple, clear layout the Baidu spider likes, and is it acceptable for the user experience?

2. Tag optimization factors (the title tag, keywords and description meta tags, alt attributes, B and H tags, nofollow) (related problem: site-wide keyword rankings)

For the title tag, pair keywords sensibly and choose 2-3 target keywords.

For the keywords meta tag, choose appropriate terms and keep it to no more than 100 bytes.

For the description meta tag, match it to the title to form a good association, avoid stacking keywords excessively, and keep it to no more than 200 bytes.
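Putting the three tags together, a head section that respects the limits above might look like this (the site name and keywords below are placeholders, not from the article):

```html
<head>
  <!-- title: 2-3 target keywords, sensibly paired -->
  <title>SEO Diagnosis - Site Optimization - Example Blog</title>
  <!-- keywords: keep under 100 bytes -->
  <meta name="keywords" content="seo diagnosis, site optimization, seo blog" />
  <!-- description: echoes the title, no keyword stuffing, under 200 bytes -->
  <meta name="description" content="Example Blog explains how to diagnose a site's SEO problems and fix them." />
</head>
```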

For image alt attributes, write them as follows: <p><img src="http://www.***.com/bozhu.gif" alt="keyword" border="0" /></p>

As for the B tag and H tags, you should understand that the B tag mainly bolds keyword text, while the H tags mainly control the displayed size (and heading level) of keyword text; both make for a better presentation of the user experience.

For nofollow, I previously wrote an article, "On the chain process should pay attention to Nofollow label" (A5 original: http://bbs.admin5.com/thread-7695077-1-1.html), which focused on valuing nofollow in link building. This time I treat nofollow systematically as an on-site optimization tool. On-site, nofollow is mainly for link content you do not want spiders to follow, for example when a site adds an image advertisement or a text-link advertisement; adding nofollow there achieves our optimization goal. I will not repeat how to write nofollow here; see the article above for the details.
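As a quick illustration of the advertisement case just described (the URL is a placeholder in the article's own *** style):

```html
<!-- a paid link the spider should not follow -->
<a href="http://www.***.com/ad-landing-page" rel="nofollow">Advertisement</a>
```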

3. Robots.txt factor (related problem: ranking demotion)

Robots.txt is mainly used to block the site's irrelevant files so that search engines do not see them. For example, among a website's files there are often junk files or generated directories; we list these in the robots.txt file.
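A minimal robots.txt along these lines (the directory names are hypothetical examples, not from the article):

```
User-agent: *
Disallow: /tmp/
Disallow: /admin/
```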

4. URL factors (related problems: ranking demotion, site ban, weight impact)

For checking URLs, I wrote the article "How to make the whole station page to achieve the standardization of the URL" (A5 original: http://bbs.admin5.com/thread-7709926-1-1.html). Here I analyze the importance of URLs for optimization in more depth: you must consider controlling URL length, garbled characters inside page URLs, and so on.
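One common way to standardize URLs, beyond what the article spells out, is to declare a canonical address on pages reachable at several URLs (the URL is a placeholder):

```html
<!-- tells search engines which URL variant is the standard one -->
<link rel="canonical" href="http://www.***.com/page.html" />
```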

5. Static page factor (related problems: indexing, ranking demotion)

As for static pages, we all know Baidu has always favored them, and it also indexes them more readily. Making the site's pages static is mainly about pleasing Baidu so that the site is favored.

6. Keyword density factor (related problems: ranking demotion, site ban)

The density of keywords distributed on the home page should stay between 2% and 8%. Do not overdo keyword density; otherwise it is treated as keyword stuffing, which risks demotion or even a ban.
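The 2%-8% figure can be checked with a rough character-based sketch like the one below (the function is my own illustration; real density tools usually tokenize into words rather than counting characters):

```python
def keyword_density(text, keyword):
    """Rough keyword density as a percentage: characters occupied by the
    keyword's occurrences divided by the total character count."""
    text = text.lower()
    keyword = keyword.lower()
    if not text or not keyword:
        return 0.0
    occurrences = text.count(keyword)
    return 100.0 * occurrences * len(keyword) / len(text)
```

Run it over the visible text of the home page and compare the result with the 2%-8% band.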

7. Internal link, dead link, and friendly link factors (related problems: indexing, ranking demotion, site ban, weight impact)

Internal links: the links between the pages of a website. Do not point too many internal links at the home page; keep them natural. For example, when the content of an article relates to another topic, add a hypertext link there to build the internal link structure.

Dead links: site links that no longer work and point to no content.

Friendly links: exchanged links need to be inspected regularly.

8. Sitemap factor (related problem: indexing)

A sitemap is a friendly gesture toward search engines: when the spider crawls the site, it gives the spider a clear sense of the site's layout, which helps extend its stay and is very good for indexing.
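A minimal XML sitemap entry looks like this (the URL and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.***.com/</loc>
    <lastmod>2013-01-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
</urlset>
```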

9. Page depth factor (related problems: indexing, weight impact)

Spiders generally crawl only three levels deep, so a site's page depth should treat three levels as the standard: home page, category page, content page. From my own tests, spiders generally will not crawl a fourth-level page, let alone index it.

All of the factors above are the problems you need to face when diagnosing your own site. SEOers can start from them to evaluate in detail which of these problems their own site has. In the next topic, the Chun SEO blog will analyze the off-site factors of site diagnosis in depth.

Original address: http://www.ajunseo.com/post/26.html Chun SEO Blog
