Once you have bought a good domain name and space on a dual-line server, you can set up your own website or blog. Before building the site, it is best to draw up a detailed plan for it and map out its future development, so that you do not end up reworking the site's framework and shuffling its structure later. Frequently changing a site's structure is very unfriendly to search engines. It is just like dealing with people: if you keep telling others a different story, they will slowly stop trusting you and drift away. The same goes for search engines. Do not change your site's title today because of a bad mood or a flash of inspiration tomorrow; if the spider crawls your site every time and keeps fetching the same articles under shifting titles, it will decide you are playing games with it, trust you less and less, and your site's weight will sink lower and lower, until one day it cannot stand it any more and truly cuts off your "food supply". It is said that Google does not ban ("K") sites lightly, and once it does, you may have to wait a long while with no guarantee of recovery; Baidu bans sites regularly, more often than Google, but after a ban, a wait of a few months can sometimes get it lifted.
These past few days I have been reading about what affects search engine optimization, and I felt a small urge to change my domain name and space. Then I thought about it: I have held out for two months, and Baidu and Google have finally indexed nearly 200 pages. Suddenly changing space and domain name would hurt the site's weight with both engines. Besides, I have no PR value yet; if I move, Google and Baidu may stay away for a while and stop indexing my articles, and it is hard enough to win a search engine's favor, so I am somewhat worried, somewhat afraid. Search engines generally prefer original content, which I have, but with no track record and no PR yet, I dare not take the risk. A secret for Nan: I have already registered lxdong.com myself, haha. Thanks for the reminder!
Anyway, every time I write something I ramble for a couple of paragraphs before getting to the point; maybe I am just used to it. First of all, a website should be built to conform to the W3C's HTML standards. Only then can your pages be crawled easily by search engine spiders and robots, letting them pick up as much of your content as possible. If your site's frame structure contains many errors, a spider or robot may be forced to give up, turn around, and leave through one of your site's friendly links. Only a site that follows the W3C's rules can earn the highest weight and PR from the search engines. We usually write the standard declaration at the very top of the page, and when a spider or robot arrives at your site and sees that code, it knows your site claims to follow the W3C standard. Of course it is not as simple as pasting in that one line; adding it does not mean the HTML you write actually conforms, just as Beipiao Lingdong is not really Mao Zedong. Someone once joked to me: "The first time I saw your site title, I thought it was something of Mao Zedong's," which left me quite depressed. You can verify your site with http://validator.w3.org to see whether the code meets the W3C standards.
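The declaration referred to above is the doctype at the top of the page. As a sketch, an XHTML page might begin like this (XHTML 1.0 Transitional is chosen here only as an example; use the DTD your pages actually follow):

```html
<!-- the doctype tells validators and spiders which W3C standard the page claims -->
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN"
  "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
  <head><title>...</title></head>
  <body>...</body>
</html>
```

Declaring it is only a claim; the validator at http://validator.w3.org checks whether the markup actually conforms.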
Second, in your pages, use CSS as much as possible to control the div boxes that appear; I think of a div as a small box. Using CSS keeps code from being repeated, and a change to the whole site's structure can be done in the CSS alone, provided you set things up that way in advance. Express the same layout with the most concise code you can: designing visually in Dreamweaver, for instance, produces a lot of useless code, while defining your own CSS to the extent of your ability saves much of it. The pairing of CSS and div spells the layoff of the table; sweeping out that insignificant code adds a great deal of browsing speed, and if your site is slow because of unstable space or messy, bloated code, that too is very unfriendly to search engines. Used together sensibly, CSS and div keep the page visually consistent and are much easier to control than tables, which must be adjusted cell by cell; you save the time spent hunting through code and avoid the layout going astray in different areas. With a table, one corner you failed to notice can deform the whole layout; I have run into that a few times, and when you are impatient you simply cannot find the problem. With CSS, change one defined div rule and you see the same change everywhere, with no need to dig through the traps one by one.
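A minimal sketch of the idea (the class name .post and the style values are made up for illustration): one CSS rule controls every box at once.

```html
<style type="text/css">
  /* one rule controls every article box on the site */
  .post { width: 600px; margin: 0 auto; border: 1px solid #ccc; }
</style>
<div class="post">First article...</div>
<div class="post">Second article...</div>
```

Change the .post rule once and both boxes follow; with a table layout, each cell would have to be adjusted by hand.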
Then there are the scripts scattered through the pages; below I will just call them JS, since .js is the script suffix. Try to merge similar scripts into the same JS file: it saves repeated calls, a single JS file is easier on the browser, and it is easier to manage, sparing you the distress of hunting east and west for one JS file after another. Most importantly, with your JS files tidied up, search engines can crawl more of your site's content faster.
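As a sketch of the merging step, here is a small hypothetical helper (the function name and comment markers are my own, not from the original post) that concatenates several script files into one bundle, so the page loads a single script instead of many:

```python
import pathlib

def merge_js(sources, bundle_path):
    # Hypothetical helper: concatenate several .js files into one bundle.
    parts = []
    for src in sources:
        code = pathlib.Path(src).read_text(encoding="utf-8")
        # mark where each original file began, to keep the bundle maintainable
        parts.append("/* --- " + str(src) + " --- */\n" + code)
    pathlib.Path(bundle_path).write_text("\n".join(parts), encoding="utf-8")
```

The page then needs only one script tag pointing at the bundle instead of several separate ones.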
In addition, making pages static is another very important task. Some people ask: how do you tell static pages from dynamic ones? Very simple: look at the suffix of the page's link. Generally .html, .htm, and .xhtml are static (e.g. www.lxdong.com/post/emarketing.html), while the rest, such as .asp, or any link containing "=" or "?", are dynamic (e.g. www.lxdong.com/guestbook.asp). Around 2003, several major search engines refused to index dynamic pages, because a dynamic URL can lead to an endless loop or a trap that disrupts the engine's normal work; in the last two years this has basically been brought under control, and the history of dynamic pages embedding virus links is over, but by contrast static pages are still indexed much better than dynamic ones. Whether because the engines' crawling of dynamic pages is still immature or because dynamic pages remain risky, choosing static pages is absolutely a good move. Static pages are easy to manage and maintain and need no sophisticated database calls; simple manual edits achieve the effect you want. This of course applies to small sites and blogs; systems such as Z-Blog and Dede also support making pages static. Today someone asked me on QQ how to set up static pages in Z-Blog. It is actually very simple: in Z-Blog's site settings there are static page settings and file rebuild settings; just enable the custom static directory feature and the first two options for generating static category and archive pages. If you want to give different kinds of articles different destination directories, putting each kind in its own category, some manual changes are needed, which is a bit of trouble, so I will not go into the details here.
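The rule of thumb above can be sketched as a tiny checker (the function name and suffix list are mine, chosen to mirror the article's rule, not an exhaustive classification):

```python
from urllib.parse import urlparse

# suffixes the article treats as static pages (illustrative, not exhaustive)
STATIC_SUFFIXES = (".html", ".htm", ".xhtml")

def is_static(url):
    # The article's rule: "?" or "=" in the link marks a dynamic page;
    # otherwise a static page ends in .html/.htm/.xhtml.
    if "?" in url or "=" in url:
        return False
    # tolerate links written without a scheme, as in the article's examples
    path = urlparse(url if "://" in url else "http://" + url).path
    return path.lower().endswith(STATIC_SUFFIXES)
```

So is_static("www.lxdong.com/post/emarketing.html") is true, while the guestbook.asp link counts as dynamic.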
The size of a page also has a real influence on how search engines handle it. If a site is all pictures, it is bound to open slowly and the first file will be very large, which search engines resent most. A saying circulated on the net in 2003 that pages over 100 KB were generally indexed incompletely. That figure is very much tied to the network conditions of 2003: ADSL was brand new, the internet cafe business was rising fast, and many people were still on dial-up, so you can imagine the speeds. When uploading pictures, process them in Photoshop where you can and choose "Save for Web" when saving the file; it trims a lot of size. When adding Flash to a page, do as with Photoshop: choose to publish as HTML and paste the generated code straight into the page. If you worry that some browsers do not support a given form of Flash player, you can prepare more than one version of the Flash for the browser to choose from; that speeds up viewing for visitors. One note: be careful with so-called "page weight loss" software; trimming page content that way may leave some features undisplayed.
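The old 100 KB rule of thumb is easy to check mechanically; here is a trivial sketch (the function names and the constant are mine, echoing the figure quoted above):

```python
import os

PAGE_SIZE_LIMIT_KB = 100  # the informal 100 KB ceiling from the 2003 rule of thumb

def page_weight_kb(path):
    # size of a page file in kilobytes
    return os.path.getsize(path) / 1024.0

def too_heavy(path, limit_kb=PAGE_SIZE_LIMIT_KB):
    # True when the page exceeds the rule-of-thumb limit
    return page_weight_kb(path) > limit_kb
```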
Finally, a word on the robots.txt file, which the Z-Blog program does not create by default; I almost forgot to introduce it. robots.txt is a plain text file whose purpose is to tell search engines, through this one file, which parts of your site may be crawled and which you do not want crawled, so the spider can crawl more effectively. As for how to set it up, read on.
The names of the three big search engines' spiders: Googlebot (Google's spider), Baiduspider (Baidu's spider), and Slurp (Yahoo's spider). Spider names are case-sensitive, and mind the spaces. The general format:
User-agent: (the name of the search engine spider or robot the rule applies to)
Disallow: /(the relative path of the file you do not want accessed)
For example:
User-agent: Googlebot
Disallow: /upload/post/emarketing.html
Come on, you must have something you do not want people to see. Try to get your robots.txt file in order before the search engines next visit your site, then upload it to the root directory of your space.
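You can check what a rule like the Googlebot example above actually blocks with Python's standard-library robots.txt parser (the URLs are the blog's own examples):

```python
from urllib import robotparser

# the sample rule from the example above
ROBOTS_TXT_LINES = [
    "User-agent: Googlebot",
    "Disallow: /upload/post/emarketing.html",
]

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT_LINES)

# Googlebot is shut out of the listed file but free to crawl the rest
print(rp.can_fetch("Googlebot", "http://www.lxdong.com/upload/post/emarketing.html"))  # False
print(rp.can_fetch("Googlebot", "http://www.lxdong.com/post/emarketing.html"))  # True
```

Well-behaved spiders fetch /robots.txt from the site root and apply exactly this logic before crawling.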
I am sleepy today, so I will sleep first. Whatever you do not understand, leave a message below or add my QQ: 3287924, and I will answer you when I wake up tomorrow.
For more, please follow the SEO perfect-theory explorer, Beipiao Lingdong's blog: www.lxdong.com