How to Control a Website's Base Experience Score

Source: Internet
Author: User


We know that a search engine's overall score for a website breaks down into a base experience score and a user-value experience score; added together, they form the site's final score, which the search engine uses to judge how good the site is. So if we want keyword rankings and traffic, we must start with the base score. Within the same industry, if our base score is poor while our competitors' is good, we will find it very hard to rank for keywords. Today I will share how to improve this base score.

First, Baidu Statistics SEO Suggestions (Search Engine Friendliness)

I suggest every webmaster sign up for Baidu Statistics and the Baidu Webmaster Platform, because many of their features are very useful. Baidu Statistics includes an SEO Suggestions function that scores your site out of 100; the higher the better. The score is based mainly on URL paths and page content, and the tool lists the areas where your site falls short of the search engine's requirements and loses points.

Why does it look at URLs and page content? Because both directly affect spider crawling. Path factors are very important while spiders crawl: crawling different URLs produces a large number of paths, and the search engine groups these factors under "search engine friendliness". Even a well-built site will struggle to rank if its friendliness score is below 60%.

Baidu's Suggestions

1. URL length: Baidu recommends that a URL be no longer than 255 bytes. If a path exceeds this limit, spiders have trouble following it and may discard it, which means the page is unlikely to be indexed.
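As a quick illustration, here is a minimal sketch (in Python; the example URLs are made up) of checking that a URL stays within the 255-byte limit mentioned above:

```python
def url_within_limit(url: str, max_bytes: int = 255) -> bool:
    """Check that a URL's UTF-8 encoded length stays within Baidu's
    recommended 255-byte limit; longer URLs risk being discarded by
    the spider."""
    return len(url.encode("utf-8")) <= max_bytes

# A normal path passes; an extremely long query string fails.
print(url_within_limit("http://www.example.com/news/2014/seo-tips.html"))  # True
print(url_within_limit("http://www.example.com/?q=" + "a" * 300))          # False
```

Run this over your sitemap or crawl log to flag any paths that exceed the limit.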

2. Static page parameters: Using dynamic parameters on static pages causes the spider to crawl the same content many times. Each equals sign in a path marks one parameter, and the more parameters a path carries, the harder it is for the spider to crawl and identify. Up to three parameters is generally normal, and paths should use standard parameters; otherwise the search engine deducts points.
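The three-parameter rule of thumb can be checked the same way. This sketch uses Python's standard urllib.parse; the URLs and the limit of three are taken from the advice above:

```python
from urllib.parse import urlparse, parse_qs

def count_params(url: str) -> int:
    """Count the query-string parameters (the name=value pairs) in a URL."""
    return len(parse_qs(urlparse(url).query))

def params_ok(url: str, limit: int = 3) -> bool:
    """The article suggests keeping dynamic parameters to three or fewer."""
    return count_params(url) <= limit

print(count_params("http://www.example.com/page?a=1&b=2"))            # 2
print(params_ok("http://www.example.com/page?a=1&b=2&c=3&d=4"))       # False
```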

3. Meta information completeness: This refers to the information in the page header that search engines read, including the title, description, and keywords. Most websites already pay attention to these.

4. Image alt information: Every image on the site should carry an alt attribute that tells the search engine what the image shows.
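Points 3 and 4 can be spot-checked with a small script. This is an illustrative sketch using Python's built-in html.parser; the sample page and the class name are made up:

```python
from html.parser import HTMLParser

class SEOCheck(HTMLParser):
    """Flag a missing meta description/keywords and <img> tags
    without alt text. A rough sketch, not a full SEO audit."""
    def __init__(self):
        super().__init__()
        self.has_description = False
        self.has_keywords = False
        self.imgs_missing_alt = 0

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta":
            name = (a.get("name") or "").lower()
            if name == "description" and a.get("content"):
                self.has_description = True
            elif name == "keywords" and a.get("content"):
                self.has_keywords = True
        elif tag == "img" and not a.get("alt"):
            self.imgs_missing_alt += 1

page = """<html><head><title>Demo</title>
<meta name="description" content="Sample page"></head>
<body><img src="logo.png"><img src="banner.png" alt="banner"></body></html>"""
checker = SEOCheck()
checker.feed(page)
# Description present, keywords missing, one image without alt text.
print(checker.has_description, checker.has_keywords, checker.imgs_missing_alt)
```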

5. Frame information: Frames make it difficult for the Baidu spider to crawl, so Baidu recommends avoiding them. To check, simply search your page source for frame tags.

6. Flash and video text information: Many sites build their carousel in Flash but forget to add text for it. As with image alt descriptions, we need to tell the search engine what the Flash contains, so add text information beside the Flash so the crawler knows what it is.

Do not put important content inside Flash or frames, because the search engine cannot identify or crawl it.
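Following the advice in points 5 and 6, you can search the page source for frame and Flash tags. A rough sketch (the regex only looks for a few common tag names and is not a full HTML parse):

```python
import re

def find_risky_tags(html: str):
    """Return the frame/Flash-related tags found in a page's source:
    <frameset>, <frame>, <iframe>, and the <embed>/<object> tags
    typically used for Flash, which spiders have trouble reading."""
    pattern = r"<(frameset|frame|iframe|embed|object)\b"
    return sorted({m.group(1).lower() for m in re.finditer(pattern, html, re.I)})

print(find_risky_tags('<frameset><frame src="a.html"></frameset>'))  # ['frame', 'frameset']
print(find_risky_tags("<p>plain page</p>"))                          # []
```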

Search engine friendliness thus mainly covers two aspects: making it easy for the spider to crawl the site's paths, and making it easy for the search engine to crawl the text content inside the pages.

Second, Baidu Statistics Website Speed Diagnosis

Baidu Statistics also offers a Website Speed Diagnosis tool that tests how fast the site opens; this is another very important part of the base score. Generally speaking, a score above 80 is healthy.

The tool will also tell you where the site needs improvement so pages open faster.
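If you want a rough number before running Baidu's tool, you can time a single page fetch yourself. This is only a crude baseline (one request from one location, no page rendering), nothing like a full speed diagnosis:

```python
import time
import urllib.request

def fetch_time(url: str, timeout: float = 10.0) -> float:
    """Time one GET request from start to fully-read response body.
    Pass whatever page you want to test; a real diagnosis measures
    far more (DNS, rendering, assets), so treat this as a baseline."""
    start = time.monotonic()
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        resp.read()
    return time.monotonic() - start
```

For example, `fetch_time("http://www.example.com/")` returns the elapsed seconds; compare runs over time to spot slowdowns.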

Third, Baidu Webmaster Tools

Baidu Webmaster Tools includes security monitoring, crawl-anomaly reports, external-link analysis, and other functions we should use to check the site regularly, spotting anomalies in time and fixing them quickly. For example, when the external-link analysis shows spam backlinks, we should use the tool's link-rejection feature to disavow them. That way the base score the search engine computes for us will not drop too low.

Summary: A site's base score is something we cannot ignore; these fundamental factors should all be handled well while the site is being built. My own site (www.51shiyanji.net), for instance, was built according to the search engine's standards, so its base score is assured. If we find basic problems later on, we should correct them promptly.

