How a Webmaster Can Do a Good Job of Website Optimization



Although SEO is no longer unfamiliar in China, and has even begun to form an industry, the field still lacks a scientific, systematic method of analysis. The reason probably lies in the peculiarity of the search engine optimization business: search engines guard their algorithms closely and publish only vague guidelines, so it is hard to know why anything ranks the way it does. As a result, many SEOers are playing a game whose exact rules they never learn, and that is the root of the chaos in this industry.

I have repeatedly stressed the importance of Google's website quality guidelines, because they are the only correct rules a search engine has actually told site owners. If you cannot even master those, I don't know where you would find more authoritative guidance. In practice, however, even if reading the guidelines means you know more about the search engine's rules than most people, knowing them alone is not enough; a scientific, systematic method of analysis will take you much further.

After so many years of development, SEO should no longer rely on gut-feeling analysis. That kind of analysis usually takes the form of "I think the search engine will...": for example, "I think the search engine can't be so stupid that it mishandles this," or "I think the search engine will treat this as one of the ranking factors." If you rely on gut feeling to do SEO, your SEO traffic curve will be just as erratic as your guesses. Nor should you lean on unfounded conjecture and hearsay: guessing what the search engine will do with no theoretical basis, or blindly believing whatever a search engine employee or some authority happens to say.

Since search engines do not tell us their exact algorithms, how can we build a scientific, systematic method of analysis? The answer: start from a theory you know for certain to be correct, and slowly evolve it in practice.

In my earlier article "How Web Page Loading Speed Affects SEO Results," the analysis started from one piece of exact knowledge and arrived at another exact factor affecting SEO traffic. The theory we can be certain is correct: a search engine crawler must first crawl a page before that page has any chance of being indexed. Combined with the data analysis in that article, the conclusion follows: web page loading speed greatly affects SEO traffic.
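To make the data side of this concrete, here is a minimal sketch of the kind of log analysis involved. It is my own illustration, not the method from that article: it scans a web server access log for Googlebot requests and reports how often each URL is crawled and how slowly it responds. The log path and the log format (combined format with the response time appended at the end) are assumptions.

```python
import re
from collections import defaultdict

# Assumed log format: combined log with response time appended, e.g.
# 1.2.3.4 - - [10/Oct/2012:13:55:36 +0800] "GET /page HTTP/1.1" 200 2326 "-" "Googlebot/2.1" 0.532
LOG_LINE = re.compile(
    r'"GET (?P<url>\S+) HTTP/[\d.]+" (?P<status>\d{3}) \S+ '
    r'".*?" "(?P<agent>[^"]*)" (?P<secs>[\d.]+)$'
)

def crawl_stats(log_path):
    """Collect crawl counts and average response time for Googlebot hits."""
    hits = defaultdict(list)
    with open(log_path) as f:
        for line in f:
            m = LOG_LINE.search(line)
            if m and "Googlebot" in m.group("agent"):
                hits[m.group("url")].append(float(m.group("secs")))
    # Most-crawled URLs first: slow pages here eat into the crawler's time budget.
    for url, times in sorted(hits.items(), key=lambda kv: -len(kv[1])):
        print(f"{url}\tcrawled {len(times)} times\tavg {sum(times)/len(times):.3f}s")

crawl_stats("access.log")  # hypothetical log file name
```

If the slowest pages are also the least-crawled ones, that is exactly the crawl-before-index bottleneck the theory predicts.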

Next, analyze what can affect page loading speed. The network environment, the server hardware, and the CMS itself all affect it, and optimizing any one of them can improve loading speed. From this we can immediately conclude: the network environment affects SEO traffic, the server hardware affects SEO traffic, and the speed of the CMS itself affects SEO traffic.

Next, what can be done to optimize the CMS itself? Enabling gzip compression, merging CSS and JS files, reducing DNS lookups, enabling caching, and so on. These items look familiar because Google Webmaster Tools has already told you about them under "Site Performance." But from the analysis above, we know that the optimizations mentioned in "Site Performance" all concern the CMS itself; they say nothing about the network environment or the server hardware, even though you can be certain those two factors also affect SEO traffic. So if an article appears one day on Google Blackboard or on Google's official blog (which is blocked in mainland China) teaching you how to choose a good hosting provider, don't be surprised, because you already know why. Google has always told you how to optimize certain factors in exactly this way; standing in its own position, it will not explain to you in detail why you should do so.
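As an illustration of how you might check these CMS-level items on your own site (my own sketch, not Google's tool), the following script fetches a page with Python's standard library and reports whether gzip is enabled, whether caching headers are set, and how many CSS and JS files the page references; the URL is a placeholder.

```python
import gzip
import re
import urllib.request

def check_page(url):
    """Report a few CMS-level optimizations for one page: compression, caching, asset count."""
    req = urllib.request.Request(url, headers={"Accept-Encoding": "gzip"})
    with urllib.request.urlopen(req) as resp:
        headers = resp.headers
        body = resp.read()
    if headers.get("Content-Encoding") == "gzip":
        body = gzip.decompress(body)
    html = body.decode("utf-8", errors="replace")

    print("gzip enabled:  ", headers.get("Content-Encoding") == "gzip")
    print("caching header:", headers.get("Cache-Control") or headers.get("Expires") or "none")
    # Each extra CSS/JS file is another request, and possibly another DNS lookup.
    css = re.findall(r'<link[^>]+\.css', html, re.I)
    js = re.findall(r'<script[^>]+\.js', html, re.I)
    print(f"CSS files: {len(css)}, JS files: {len(js)} (merging them cuts requests)")

check_page("http://www.example.com/")  # hypothetical URL
```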

Through data analysis, you can also learn which factors have a greater influence and which have less.
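One simple way to compare influence is to correlate each factor with SEO traffic over time. The sketch below assumes you have already collected daily measurements yourself; the numbers in it are placeholders for illustration, not real data.

```python
import statistics

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical daily measurements, purely for illustration:
load_time = [2.1, 2.3, 1.8, 3.0, 2.7, 1.5, 1.9]          # seconds
pages_crawled = [900, 850, 1100, 600, 700, 1300, 1050]    # per day
seo_traffic = [4200, 4000, 5100, 3000, 3400, 5900, 4800]  # visits per day

print("load time vs traffic:    ", round(pearson(load_time, seo_traffic), 2))
print("pages crawled vs traffic:", round(pearson(pages_crawled, seo_traffic), 2))
```

A factor whose correlation is consistently stronger across weeks of data deserves optimization effort first; a gut feeling cannot give you that ordering.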

Many common-sense factors can evolve step by step in this way, and the analysis process is very scientific: whether for yourself or for others, you can explain the principles clearly. In this process of evolution, you will find your control over SEO traffic growing. Each step means you understand the search engine a little better, your SEO knowledge structure improves a little, and your ability to control SEO traffic gets a little stronger. At the same time, you will find your conflicts with web designers and engineers becoming fewer, because SEO done well does not put the SEOer's interests at odds with those of the designers and engineers.

Knowledge structure, controllability of SEO traffic, and departmental relationships.

Once you have been through many such analyses, your original SEO knowledge structure will certainly be overturned. Many earlier SEO methods were mostly gut-feeling analysis: they did not explain why something should be done, had no data behind them, often not even theory, and so they missed the point. As I said in "Word Segmentation and the Index Library," the things you thought were details are actually the focus, and the things you thought were the focus can actually be ignored.

So, in day-to-day SEO work, what abilities support you in carrying out such an analysis process?

I wonder whether anyone still remembers the four abilities I mentioned in "How to Learn SEO." Here is how they figure in this analysis process:

1. Understanding search engine technology and principles: this lets you understand search engines at a fundamental level, establish many correct theories, and find many clues worth analyzing.

2. Understanding website production technology: this lets you know which factors on a site can affect which aspects of the search engine, and which methods can solve a given problem.

3. Data analysis ability: this lets you understand how the known factors affect SEO traffic, and lets you dig out more factors. A scientific, systematic SEO analysis process is inseparable from data support from beginning to end.

4. Knowing the search engine you want to rank in: no matter how hard you try, some problems cannot be understood through data or theory. Every search engine, like a person, has a certain temperament, and you can get answers by getting to know it. Knowing the search engine also gives you more material to analyze.

Finally: this SEO analysis method, built up from common sense into a scientific system, gives you more control over SEO traffic than understanding a few of the search engine's algorithms ever could.

Many people may dispute this point. For example, a while ago a friend told me that the founder of a certain foreign-trade website came from Google, so they would surely be able to do SEO well. I said that was impossible, and only someone who has built a search engine will understand why. For instance, Alibaba's B2B site is also a search engine, and I know its ranking rules; but if you gave me a merchant's site and asked me to pull traffic from Alibaba, without a scientific, systematic method I would certainly do a poor job. A search engine's algorithm is not simple arithmetic; it is not the case that handling this factor plus that factor adds up to good traffic. The engine's designers know the weights of individual factors and the approximate results they may produce, but the exact results are beyond even their control. Otherwise Baidu's staff would not be searching thousands of words every day to check the accuracy of the results. Part of Google's success came from Yahoo adopting its search technology, which let Google accumulate a huge amount of data on which to practice and improve its algorithms.

Moreover, inside a search engine company, only a very small number of people know the weights of the various factors. Most of the engineers building the engine are each responsible for one specific task, optimizing and solving one specific problem: the crawler engineer works on crawling efficiency, the engineer in charge of de-duplication on reducing duplicate content in the index, and so on. Even most of the engine's own engineers do not see the whole picture, let alone a company far away in another country. Otherwise, with so many engineers having left Baidu and Google, the algorithms would have leaked long ago.

If you use open-source software to build a small search engine yourself, you will understand this problem. Even if every algorithm in the engine was deployed by you personally, you cannot anticipate the results it will produce. And building a search engine is one thing; pulling traffic from a search engine is another. Otherwise Google would not have discovered only after the fact that page loading speed affects SEO traffic.
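If you want to feel this for yourself, a toy version takes only a screenful of code. The sketch below (my own illustration, far from a real engine) builds a tiny inverted index with TF-IDF-style scoring. Every rule in it is yours, and yet once you have more than a handful of documents, predicting which one ranks first for a given query already becomes hard.

```python
import math
from collections import Counter, defaultdict

class TinySearchEngine:
    """A toy inverted index: you control every rule, yet rankings still surprise you."""

    def __init__(self):
        self.index = defaultdict(dict)   # term -> {doc_id: term frequency}
        self.doc_count = 0

    def add(self, doc_id, text):
        self.doc_count += 1
        for term, tf in Counter(text.lower().split()).items():
            self.index[term][doc_id] = tf

    def search(self, query):
        scores = defaultdict(float)
        for term in query.lower().split():
            postings = self.index.get(term, {})
            if not postings:
                continue
            idf = math.log(self.doc_count / len(postings))  # rarer terms weigh more
            for doc_id, tf in postings.items():
                scores[doc_id] += tf * idf
        return sorted(scores.items(), key=lambda kv: -kv[1])

engine = TinySearchEngine()
engine.add("a", "seo traffic analysis and page speed")
engine.add("b", "page speed and server hardware")
engine.add("c", "seo seo seo tricks")
print(engine.search("seo page speed"))
```

Even in this toy, keyword stuffing (document "c") can tie with a genuinely relevant page, which is why real engines pile on factor after factor, and why nobody, inside or outside, can compute the final outcome in their head.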

Article source: http://www.suptb.cn. Please credit the source when reprinting. Thank you!
