In website optimization, natural search results depend to a large extent on the content of the site. Even after putting a great deal of work into optimizing a site, we still need to check its content regularly to make sure it stays in a virtuous circle and shows no signs of deterioration.
Measuring how search-friendly our content is helps us identify and fix problems. The checks cover content tags, keyword prominence, keyword density, links (both internal links and backlinks), and how pages are included in the search index. There is too much at stake to wait for problems to surface before solving them; prevention is better, because we can never recover the conversions lost to an error after the fact. Continuous content testing avoids many problems and improves content that is already good. Below, Hefei SEO walks through the main evaluation metrics for site content.
I. Evaluation of Content Tags
A content tag analysis report can reveal many of the problems hiding in a page's HTML. Most of these can be detected with tools, but they need to be checked continuously. The report covers the following:
HTML tags: If a page contains serious HTML coding errors, they are likely to impair a spider's ability to process the page, and the direct consequence is that the page is not indexed. Relatively fragile, sensitive spiders such as Baidu's are especially affected; for them, badly broken HTML can have serious consequences.
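A basic version of this check can be scripted with Python's standard-library `html.parser`. The sketch below is a minimal, assumed approach (a real validator such as the W3C checker is far more thorough): it tracks open tags on a stack and reports any that are never closed.

```python
from html.parser import HTMLParser

# Void elements never take a closing tag in HTML5.
VOID = {"area", "base", "br", "col", "embed", "hr", "img",
        "input", "link", "meta", "source", "track", "wbr"}

class TagBalanceChecker(HTMLParser):
    """Collects tags that are opened but never closed."""
    def __init__(self):
        super().__init__()
        self.stack = []

    def handle_starttag(self, tag, attrs):
        if tag not in VOID:
            self.stack.append(tag)

    def handle_endtag(self, tag):
        # Remove the most recent matching open tag, if any.
        for i in range(len(self.stack) - 1, -1, -1):
            if self.stack[i] == tag:
                del self.stack[i]
                break

    def unclosed(self):
        return list(self.stack)

checker = TagBalanceChecker()
checker.feed("<html><body><div><p>hello</div></body></html>")
print(checker.unclosed())  # -> ['p']  (the <p> was never closed)
```

Running this over every page template catches the grossest mismatched-tag errors before a spider ever sees them.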
Title and description: Sometimes a CMS (such as Zblog) produces broken description and keyword tags, and sometimes carelessness or unfamiliarity with SEO means titles are never written at all. This is especially common on corporate sites. When analyzing client sites, Gaoliangju often finds, to his frustration, that every page of a corporate site carries the same title, usually just the company name, and that the descriptions are equally uniform and give visitors no reason to click. Giving each page a unique title and description is not only good for SEO; it also gives visitors the best possible experience in the search results.
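Duplicate titles are easy to detect automatically. As a minimal sketch (the page sources and `find_duplicate_titles` helper here are hypothetical, assuming you already have each page's HTML in hand), this extracts the `<title>` and meta description from each page and reports titles shared by two or more pages:

```python
from collections import Counter
from html.parser import HTMLParser

class MetaExtractor(HTMLParser):
    """Pulls the <title> text and meta description from one page."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        d = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and d.get("name", "").lower() == "description":
            self.description = d.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def find_duplicate_titles(pages):
    """pages: dict of url -> html source; returns titles used on 2+ pages."""
    seen = Counter()
    for html in pages.values():
        p = MetaExtractor()
        p.feed(html)
        seen[p.title.strip()] += 1
    return {t for t, n in seen.items() if n > 1}
```

Any title in the returned set is one that should be rewritten so each page is unique.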
Redirects: If a page uses a "meta refresh" redirect, spiders cannot crawl through it, and a site full of such jumps will not be crawled well either. Too many redirecting links is one of the common reasons Taobao affiliate sites get banned.
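A quick scan for meta-refresh redirects can be done over raw page source. This is a crude, assumed sketch using a regular expression (it only catches the tag written literally; a DOM-based parser is more robust):

```python
import re

# Matches a meta tag carrying http-equiv="refresh" in any casing/quoting.
META_REFRESH = re.compile(
    r'<meta[^>]+http-equiv=["\']?refresh["\']?', re.IGNORECASE)

def has_meta_refresh(html: str) -> bool:
    """True if the page source appears to contain a meta-refresh redirect."""
    return bool(META_REFRESH.search(html))
```

Pages flagged by this check should be converted to proper server-side 301 redirects where a redirect is genuinely needed.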
Page size: If a page is too bloated, it is also hard for spiders to crawl. Paginate where appropriate: it slims down each page, and it increases page views at the same time.
II. Keyword Prominence and Density Assessment
The key for each page is to improve the prominence and density of its keywords: make sure the page design places keywords in the title and other prominent positions, and spreads them across the page so they read naturally. In general, keywords should appear as early as possible on the page and again at the end of the article.
Keep keyword density in mind while writing. The current mainstream view is that density should not exceed 5%, though in my repeated tests a density of around 10% has worked well. Once writing this way becomes a habit, you no longer need to consciously count density at all; the keywords fall into place naturally.
There are many keyword density tools online; choose whichever suits you.
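The calculation behind those tools is simple enough to script yourself. As a minimal sketch under the usual definition (keyword words as a share of all words, case-insensitive; the `keyword_density` helper is my own illustration, not any particular tool's formula):

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Share of the words in `text` accounted for by `keyword`.
    Multi-word keywords count every word of each occurrence."""
    words = re.findall(r"\w+", text.lower())
    if not words:
        return 0.0
    kw = keyword.lower().split()
    n = len(kw)
    # Slide a window of n words over the text and count exact matches.
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == kw)
    return hits * n / len(words)

text = "SEO tips: good SEO starts with content, and SEO checks keep it healthy."
print(round(keyword_density(text, "seo"), 2))  # 3 of 13 words -> 0.23
```

Multiply the result by 100 to compare against the 5%-10% guidance discussed above.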
III. Evaluation of Links
When evaluating links, both internal links and backlinks are critical to search engine rankings. We need to find link problems early and avoid the negative effects they cause.
Every month we should run a dead-link check across the site's pages, because a dead link interrupts a spider's crawl abruptly and makes the site an easy target for search engine penalties. Use a professional dead-link detection tool for this.
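The core of such a tool has two parts: collecting the links on each page, and probing each one over HTTP. A minimal sketch with the standard library follows (the `check_link` probe needs network access and a real site to be useful; treat it as an assumed outline rather than a finished crawler):

```python
from html.parser import HTMLParser
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

class LinkCollector(HTMLParser):
    """Collects every href found on <a> tags in a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

def check_link(url: str, timeout: float = 5.0) -> bool:
    """True if the URL answers with a non-error status (requires network)."""
    try:
        req = Request(url, method="HEAD")
        with urlopen(req, timeout=timeout) as resp:
            return resp.status < 400
    except (HTTPError, URLError, OSError):
        return False
```

In practice you would feed each page's source to a `LinkCollector`, resolve relative hrefs against the page URL, and report every link for which `check_link` returns False.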
Do not limit the checks to the site's internal links. When publishing external links, record where the main ones are placed, and monitor the high-quality ones continuously, so that a sudden loss of several high-quality links does not catch you off guard with a search engine demotion. Sometimes rankings drop suddenly and no cause can be found; the loss of high-weight links is one possible reason.
IV. Assessment of Web Page Indexing
No matter how wonderful our content is or how good the web design, a page that is not indexed by search engines is meaningless, because no one will ever see it.
When evaluating a site's indexing, we use a technical term: the search engine inclusion rate. It is the ratio of the number of pages the search engine has actually indexed to the number of pages the site actually has.
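The arithmetic is trivial but worth pinning down, since the ratio is what gets tracked week over week. A one-line helper (my own naming, not a standard metric implementation):

```python
def inclusion_rate(total_pages: int, indexed_pages: int) -> float:
    """Fraction of the site's pages that the search engine has indexed."""
    if total_pages == 0:
        return 0.0
    return indexed_pages / total_pages

# e.g. a 1000-page site with 800 pages indexed:
print(inclusion_rate(1000, 800))  # -> 0.8
```

Logging this value each week makes a sudden drop, like the one described below, immediately visible.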
Checking this rate at least once a week is very important. On one hand, it tracks the progress of our work: whether the spiders are crawling smoothly and whether the page paths are well structured for indexing. On the other hand, frequent testing lets us catch problems in time. For example, if the number of pages indexed by Baidu suddenly drops sharply, there is no doubt something has gone wrong; analyze the problem carefully and resolve it as soon as possible. Our goal is clear: get the search engine to index as many of our pages as possible.
For example, you may find in the logs that Baidu's spider has crawled certain pages, yet those pages never appear in the search results; or you may find the number of indexed pages declining day after day, or dropping sharply; or pages appear in the index one day only to be deleted from it a few days later.
All of these situations deserve careful and prompt attention. If pages are deleted from the index, the reason may be that they are too repetitive, in which case you should add more original content. If you find that spiders have crawled pages that were never indexed, consider whether the site structure or the content is at fault, and correct it. The number of indexed pages directly reflects, in a sense, the effect of our search engine optimization. Some will object that having many indexed pages does not mean every page will rank, so what is the point? It matters in two ways. First, a visitor who lands on a single page and leaves produces a high bounce rate, which signals that the site's stickiness is low; on a healthy site, visitors click through to many other pages, and for that you need a large number of pages. Second, a large number of indexed pages supports a large number of internal links, which strengthens the weight of your keywords.
In short, evaluating content is an indispensable part of daily SEO work, and it directly determines the efficiency and results of optimization.
This article is provided by Hefei SEO @ Gaoliangju, website http://www.airghost.cn/. Reprints are welcome, as is getting in touch with me.