A checklist of SEO work to do before a new site goes online

Source: Internet
Author: User
Tags: nets

Launching a new website looks simple, but making it broadly match the appetite of search engines takes a lot of work. So what optimization should be done before a site goes live? Below I list 13 things to prepare before a new site goes online.
1. Prepare the domain name and hosting space

Choosing a domain name and hosting space is obviously necessary. It may seem not worth mentioning, but I still want to stress it, because a good domain name brings great value to a website. When choosing a domain, check whether it is easy to remember, whether it has been used before, whether its DNS resolution features are complete, and whether you have full ownership of it. The choice of hosting space is just as important. The key items to remember are whether the space is stable, whether access is fast, whether the control panel is fully featured, whether the host provides scheduled backups, and whether the host's service is good.

One more point about hosting: if you do not plan to complete an ICP filing and choose overseas hosting, pick a well-known host. If it is a small company and it "runs away," our site basically goes down the drain. Friends who choose domestic hosting must complete the ICP filing, and should also look at the host's qualifications. There is no doubt that filings made through large, relatively formal registrars such as HiChina and Xinnet have a naturally high pass rate. So pay special attention when choosing a host.
2. Be clear about keyword positioning

This is very important: only once the site's target keywords are determined will all the later promotion work go smoothly; otherwise you are promoting blindly. Many friends suggest that a new site should avoid hot keywords, on the grounds that hot keywords are hard to rank for and inevitably difficult for a new site; instead, choose easier keywords and then make a series of small adjustments step by step as the site's weight grows. In my view, though, if you are going to build a website you should have a definite direction in mind: which keywords I want to optimize, and what kind of result I want to achieve. There is no need to start with less competitive words; work to your own plan, because that way you can watch the site's gradual change from no weight to real weight, and that process still takes work. At the same time, if our target keywords are clear from the very beginning, this may well leave a good impression on search engines and win more trust for the site; after all, single-mindedness in running a site is always a good thing.

On keyword positioning, a further word on how to filter target keywords. My suggestion is to rely on third-party tools to select them. A few simple recommendations: Baidu Index, the keyword recommendations in the Baidu Promotion backend, and the Google keyword tool. At the same time, pay attention to the keywords shown in search-engine drop-down boxes and related searches, and to keyword search volume.
3. Keep the site's code lean and practical

Code optimization is a necessity in website optimization, but many people neglect it. Before the site officially launches, examine its code carefully; for example, if pages open too slowly, too much redundant code may be dragging them down. Here are a few simple code-optimization checks for reference: whether the site uses div+CSS for layout, whether there are too many JS effects, whether Flash files are too large, whether images are compressed, whether CSS is loaded via external calls, and whether the head section carries too much content.

If the code is streamlined yet still implements every function, the effect of the streamlining is obvious. You may not see anything in the short term, but you will soon find that search engines like crawling your content more, and the user experience improves along with it.
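As a minimal sketch of the checks above (all file names here are hypothetical), a lean page head calls styles and scripts externally instead of writing them inline, and keeps the head itself short:

```html
<!-- Lean page head: external CSS and deferred external JS,
     minimal content in <head>. File paths are examples only. -->
<head>
  <meta charset="utf-8">
  <title>Page title</title>
  <link rel="stylesheet" href="/css/style.css">
  <script src="/js/main.js" defer></script>
</head>
```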
4. Design the site structure properly

To a certain extent, a well-designed site structure is very helpful for search-engine spiders to crawl. My most important suggestion here is to plan the overall site as a tree structure, because search engines like tree-structured sites. The reason they like it also comes down to user experience: sites with a good user experience today generally lay out their structure as a tree.

Structural design also requires attention to whether the anchor text linking the various pages is appropriate. A website is made up of many web pages, and together those pages form a complete site. Setting the internal anchor text appropriately plays a pivotal role in promoting the site's keyword rankings, so this cannot be ignored either.
5. Fully prepare original content

A website is not finished once the program is uploaded to the hosting space; what matters most is keeping the site active, and the most basic part of that is updating its content. Why must the content be original? It is highlighted here because originality plays an important role in site optimization; simply put, search engines especially dislike purely copied-and-pasted content.

So before going online, prepare plenty of original content to cater to search engines, pass the sandbox period as soon as possible, and slowly earn a certain amount of trust. Note that the content needs to be updated regularly: do not publish ten articles today and only one tomorrow. As the saying goes, update the site regularly and in steady amounts.
6. Standardize the site's URLs and determine the preferred domain

The site's URLs must be standardized: the same page must not be reachable at multiple URLs, because once the URLs are not unified, the weight of that page is diluted. A simple example: in some open-source forum programs, a post generally has more than one address, including a dynamic address, a pseudo-static address, and an "archive" address. We then have to decide which address to keep. In general the "archive" address exists for SEO, so I suggest keeping it and, as far as possible, blocking search engines from crawling the dynamic address.

Determining a preferred domain for the site's main domain likewise helps concentrate the home page's weight. Take the common Discuz forum as an example: the home page is generally reachable as the bare domain, domain/index.php, and domain/forum.php, so we must modify some code so that visits to the site resolve to the preferred domain.
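The normalization step can be sketched as a small function. This is an illustration only, assuming a hypothetical preferred domain `www.example.com` and the duplicate Discuz-style home-page paths mentioned above; a real site would apply this mapping via a 301 redirect:

```python
from urllib.parse import urlsplit, urlunsplit

# Hypothetical preferred domain; replace with your own site.
PREFERRED_HOST = "www.example.com"
# Duplicate entry pages that should all resolve to the root URL
# (e.g. Discuz exposes /index.php and /forum.php for the home page).
DUPLICATE_HOME_PATHS = {"/index.php", "/forum.php"}

def canonical_url(url: str) -> str:
    """Map any variant of the home page onto the preferred domain."""
    parts = urlsplit(url)
    # Collapse the bare domain onto the preferred (www) host.
    host = PREFERRED_HOST if parts.netloc in ("example.com", PREFERRED_HOST) else parts.netloc
    # Collapse duplicate home-page scripts onto "/".
    path = "/" if parts.path in DUPLICATE_HOME_PATHS or parts.path == "" else parts.path
    return urlunsplit((parts.scheme, host, path, parts.query, parts.fragment))
```

Every variant of the home page then resolves to one address, so its weight is no longer split across several URLs.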
7. Build breadcrumb navigation

Breadcrumb navigation is something to implement during site optimization. Good breadcrumbs not only make search engines feel that our site has a clear hierarchy, so they crawl more pages, but also keep users from getting "lost" while browsing the site, which is very effective for the user experience.
8. Set up the robots.txt file

As the Baidu Encyclopedia entry on robots.txt explains, it is a convention: the first file a search engine looks at when visiting our site. In it we can set which files may be crawled by search engines and which may not be indexed.

The robots.txt settings are very important. If you add this file to the site's root directory, be cautious, because a wrong robots.txt can easily cause search engines not to index your site. If you build with an open-source program and are not familiar with the file, you can simply keep the official default robots.txt, which also saves some trouble. Of course, if you understand its settings well, configure it for your own site's situation; a well-set robots.txt can do a great deal for the site.
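For reference, a minimal robots.txt might look like this (the paths and domain here are hypothetical examples, not a recommendation for any particular program):

```
# Applies to all crawlers
User-agent: *
# Keep backend and temporary directories out of the index
Disallow: /admin/
Disallow: /tmp/
# Point crawlers at the site map (see section 10)
Sitemap: http://www.example.com/sitemap.xml
```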
9. Remove the site's dead links

Dead links are very scary. On one hand they are unfriendly to search engines and easily leave a bad impression, leading to lost weight, dropped rankings, a lower PR, and so on. For users browsing the site they are even worse: a dead link is bound to leave a very bad feeling and drag down the user experience. So before launch, the site must be tested to see whether any dead links exist.
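There are many tools for this, but the idea can be sketched in a few lines: collect the links on a page, then see which ones fail to respond. This is an illustrative sketch, not a full crawler; relative URLs are skipped for brevity, and the checking step needs network access:

```python
import urllib.request
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect href targets from <a> tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def find_dead_links(html):
    """Return the absolute links on a page that fail to respond."""
    collector = LinkCollector()
    collector.feed(html)
    dead = []
    for link in collector.links:
        if not link.startswith("http"):
            continue  # skip relative URLs in this sketch
        try:
            urllib.request.urlopen(link, timeout=10)
        except Exception:
            dead.append(link)
    return dead
```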
10. Create a site map

A site map, plainly put, is a page containing all the page addresses on our site. Site maps generally come in two forms. One is presented to users, making it easy to find what they want from the map, precisely the things that are otherwise hard to locate on the site. The other is presented to search engines, so that they can crawl the site's content. A site map therefore matters a great deal to a site run for the long term.
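The search-engine form is usually an XML file following the sitemaps.org format. A minimal generator might look like this (the URL in the test is a made-up example):

```python
from xml.sax.saxutils import escape

def make_sitemap(urls):
    """Build a minimal XML sitemap from a list of page URLs."""
    entries = "\n".join(
        "  <url><loc>{}</loc></url>".format(escape(u)) for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + entries + "\n</urlset>"
    )
```

Writing the result to sitemap.xml in the site root and referencing it from robots.txt lets crawlers find every listed page in one pass.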
11. Add a 404 page

Even friends who do not build websites will often have met the 404 page: whenever we visit an invalid page, a 404 page pops up to say the page does not exist. For users, the main role of a 404 page is to improve the experience. But when setting one up, we must check whether its status code really returns 404: if it directly returns 200 or 302, the search engine may easily penalize our site. So pay special attention to this when making the site's 404 page.
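The check itself is simple: request a made-up address on your site and look at the status code the server actually answers with. A sketch (the fetching function needs network access, so the test only exercises the status logic):

```python
import urllib.request
import urllib.error

def status_of(url):
    """Return the HTTP status code a URL actually answers with."""
    try:
        return urllib.request.urlopen(url, timeout=10).getcode()
    except urllib.error.HTTPError as exc:
        return exc.code  # error responses carry their code here

def is_correct_404(code):
    """A missing page should answer 404; 200 or 302 is a 'soft 404'."""
    return code == 404
```

Usage: `is_correct_404(status_of("http://www.example.com/no-such-page"))` should be True for a properly configured error page.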
12. Use the nofollow attribute sensibly

Adding nofollow to a hyperlink tells the search engine not to follow that link, so that the page's weight is not passed through it. Such settings appear most often on links to pages like "Company Profile" and "About Us," which helps concentrate the weight on the pages that matter.
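In markup it is just a rel attribute on the link; the footer links below are hypothetical examples:

```html
<!-- Low-value footer pages: nofollow keeps link weight from flowing here. -->
<a href="/about.html" rel="nofollow">About Us</a>
<a href="/contact.html" rel="nofollow">Contact</a>
```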
13. Establish a good mindset for running the site

Why put "a good mindset" last? Because mindset really is the decisive factor in success or failure. Having chosen to run a site, we have to deal with search engines, and since they carry so many uncertain factors, we must keep a good mindset, calmly face every change in the search-engine algorithms, stay neither anxious nor impatient, and proceed from reality. I believe our site will then stand out.
    

