The Road to Internet Entrepreneurship Success (IV): Improve Functions to Make the Website Easier to Use

Source: Internet
Author: User


Operation is the key to a website's continued growth. Only with detailed, distinctive planning and execution of the site's content, combined with solid day-to-day operation, can a website stay ahead of the competition.

I. Quickly enrich content for a new website

II. Practical anti-copying measures to avoid duplicate content

III. Integrate content by producing a site map

IV. Transfer data quickly with forum-migration software

V. Prevent website content from being plagiarized

VI. Say no to domain-name restrictions

For details on the topics above, see the earlier installment, "The Road to Internet Entrepreneurship Success (IV): Prevent Web Content from Being Plagiarized".

VII. Let the website style change with the time of day

Many websites now offer style switching, but usually only by manually clicking to change the template. For personal or niche sites, automatically adjusting the style by time of day, for example showing a crisp, bright template during the day and a dark-toned style at night, gives visitors a refreshing impression. This makes your site easy to remember and also highlights its "personality". The implementation methods are introduced below.

1. Style-switching code for ASP websites

For sites built with ASP, the following code changes the site style with the time of day. Add it between the <head> and </head> tags of the ASP page:

<link rel="stylesheet" type="text/css" href="<% If Hour(Now) < 12 Then Response.Write "morning.css" ElseIf Hour(Now) < 17 Then Response.Write "day.css" Else Response.Write "night.css" End If %>" />

Here morning.css is the style sheet shown in the morning, day.css the one shown in the afternoon, and night.css the one shown at night; you only need to prepare these CSS files separately.

2. Style switching for PHP websites

If your site is built with PHP, use the following PHP code instead. Add it to the header section of the site's PHP pages:

<link rel="stylesheet" type="text/css" href="<?php $hour = date('H'); if ($hour < 12) echo 'morning.css'; elseif ($hour < 17) echo 'day.css'; else echo 'night.css'; ?>" />

Tips:

When adding this code, remember to remove the page's default style-sheet link, otherwise the layout may break.

3. JavaScript code for static pages

For sites that generate static HTML pages, the same style-switching effect can be achieved with JavaScript. The method is very simple: just add the following code to the page.

<script type="text/javascript">
<!--
function getCss() {
  var dateToday = new Date();
  var theHour = dateToday.getHours();
  var display;
  if (theHour < 12) display = "morning.css";
  else if (theHour < 17) display = "day.css";
  else display = "night.css";
  document.write('<link rel="stylesheet" href="' + display + '" \/>');
}
getCss();
//-->
</script>

Tips:

Some visitors have JavaScript disabled in their browsers, so don't forget to set a default CSS style to ensure those users can still view the site normally.
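As a sketch of that tip (the element id and the swapping pattern here are illustrative suggestions, not from the original article): keep a default stylesheet in plain HTML and let the script only swap its href, so visitors without JavaScript still get a styled page. The hour thresholds match the examples above.

```javascript
// Choose a stylesheet name for a given hour (0-23),
// using the same thresholds as the ASP/PHP examples above.
function pickStylesheet(hour) {
  if (hour < 12) return "morning.css"; // morning
  if (hour < 17) return "day.css";     // afternoon
  return "night.css";                  // evening
}

// In the page, keep a default link in the HTML, e.g.
//   <link id="theme" rel="stylesheet" href="day.css">
// then swap it once the script runs:
//   document.getElementById("theme").href =
//       pickStylesheet(new Date().getHours());
```

Because the default `<link>` stays in the markup, the page degrades gracefully when scripts are blocked.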

With a basic framework and content in place, the site is essentially established, but the maintenance that follows is long-term work. The "New Site Maintenance Trilogy" series of articles focuses on the common maintenance problems a novice webmaster is most likely to meet, including perfecting a new site's basic functions, producing a site map, and tips for handling content copying.

VIII. Improve functions to make the website easier to use

1. Learn about browsing data

How many people visit the site each day, where do they come from, what operating systems do they use, and which pages do they browse? The simplest and most useful way to answer these questions is to add a statistics program to the site.

Vdoing statistics is a client program that combines instant messaging with web analytics. Unlike other statistics services, it is a new kind of traffic-statistics and data-analysis tool: all data is delivered through a software client with a variety of built-in statistics modules, providing more detailed site statistics, and a mobile-phone client has already been launched.

Figure 11

Vdoing statistics builds on basic website data with demographic and user-model data. There are already many similar statistics services overseas, such as Chatstat and Woopra, which likewise combine traffic statistics with real-time communication. But they are not Chinese software and their servers are located abroad, so domestic websites using these foreign statistics systems run into speed and other problems.

To use the statistics program, first download and install the software and register an account. After logging in, add the site you want to track, and you will receive code like the following:

<script type="text/javascript" charset="utf-8" src="http://s.vdoing.com/u/18/9500.js"></script>

<noscript><a href="http://www.vdoing.com" title="vdoing statsx no.9500"><img src="http://simg.vdoing.com/m/9500/x01.gif?noscript" border="0"></a></noscript>

Place the above code at the bottom of the site's pages and the program will automatically start collecting data; readers can see the effect at the bottom of it.endto.com.

2. Set up site search so finding content is no longer difficult

Generally speaking, a website program's built-in search takes up considerable resources, and its accuracy cannot be guaranteed. Using a major search engine to provide site search not only saves resources but also makes searching faster.

We can place a search engine's site-search code on the website; searching is fast and places no load on the site's own resources. The only drawback is that content must first be indexed by the search engine before it can be found, but for most small and medium-sized websites this is already good enough.

Baidu has comparatively more users, and many Chinese netizens are used to searching with it. Below is Baidu's search code and an illustration of the effect. When using it, replace the domain www.shudoo.com with your own site's domain.

<script language="JavaScript">
function g(formname) {
  var url = "http://www.baidu.com/baidu";
  // ct=2097152 restricts the search to the site named in the "si" field
  if (formname.s[1].checked) { formname.ct.value = "2097152"; }
  else { formname.ct.value = "0"; }
  formname.action = url;
  return true;
}
</script>

<form name="f1" onsubmit="return g(this)">
<table bgcolor="#FFFFFF" style="font-size:9pt;"><tr><td valign="top"><img src="http://img.baidu.com/img/logo-137px.gif" border="0" alt="Baidu"></td><td><input name="word" size="30" maxlength="100"><input type="submit" value="Baidu Search"><br><input name="tn" type="hidden" value="bds"><input name="cl" type="hidden" value="3"><input name="ct" type="hidden">
<input name="si" type="hidden" value="www.shudoo.com"><input name="s" type="radio"> Internet <input name="s" type="radio" checked> www.shudoo.com</td></tr></table></form>

Figure 12

After adding site search, users can easily find the information they need. Our demo site includes a working demonstration of Baidu site-search results.

IX. Avoid search-engine "leaks"

Search engines use specialized programs, called robots (spiders), to index content. But many sites have pages that should not be exposed, such as a program's administration interface, the database directory, or other important content. robots.txt is a text file placed in the website's root directory in which we can declare the parts of the site that search engines should not access.

1. Learn more about Robots.txt

robots.txt is an ASCII-encoded text file stored in the root directory of the website. It tells search-engine crawlers which site content may be retrieved and which must not be included. Before indexing a site's content, a search engine first checks whether robots.txt exists in the root directory; the file can be created with Notepad and uploaded to the site root.

Figure 13

We can also use the robots meta tag in the head of each page to indicate whether search engines may crawl that page. It is placed in the header area of the HTML code, in the format <meta name="robots" content="index,follow">, where the content value can also be combinations such as noindex or nofollow.

robots.txt is a convention shared by search engines: Google, Yahoo, and Microsoft abroad, and Baidu and other engines at home, all comply with the robots.txt writing standard.

Tips:

Because some server systems are case-sensitive, the robots.txt file name must be all lowercase, and it should be placed in the site's root directory.

2. Write rules properly to hide site content

Besides documents related to site security, indexing the site's program scripts, style sheets, and similar files does not increase the number of pages included; it only consumes server bandwidth. So robots.txt should be set so that search spiders do not index these files.

The robots.txt file must follow the corresponding specification. Baidu has a dedicated description page, http://www.baidu.com/search/robots.html, with examples of all kinds of robots.txt usage.

Typically, the robots.txt file consists of the following two tags:

User-agent: the name of the search-engine robot the rule applies to; Google's is Googlebot and Baidu's is Baiduspider.

Disallow: the directories or files that should not be indexed.
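Putting the two tags together, a minimal robots.txt might look like this (the directory names are hypothetical examples, not from the article):

```
# Applies to all crawlers (Googlebot, Baiduspider, ...)
User-agent: *
# Keep spiders out of the admin interface and database directory
Disallow: /admin/
Disallow: /data/
```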

Tips:

Be very careful when writing the robots.txt file; a mistake that blocks all of your web pages would defeat the purpose.

3. Exclude duplicate pages by the same method

Most websites today are built with dynamic programs and generate static pages for visitors to browse, yet the dynamic pages can still be indexed by search engines. This creates duplicate pages and lowers the weight search engines give the site.

At this point we can use robots.txt settings to keep search engines away from the dynamic pages, ensuring they are not treated as duplicate content.
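For example (the script paths are hypothetical, and the * and $ wildcards are extensions supported by major engines such as Google and Baidu, not part of the basic robots.txt standard), dynamic pages can be excluded like this:

```
User-agent: *
# Block dynamic scripts and query-string URLs once static copies exist
Disallow: /*.asp$
Disallow: /*?
```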

Coming next: Chapter V, Website Management and Maintenance

RELATED links:

The Road to Internet Entrepreneurship Success (III): From Theory to Practice, Revising the Website Business

"Internet Entrepreneurship success: Web site planning, construction, promotion of profitable combat strategy" is a comprehensive explanation of the whole process of website construction of a book, is also the author of five years experience in the establishment of a summary of the station.

The book follows the entire process of building a website, explaining in turn site planning, site creation, content management, website maintenance, network marketing, website SEO, website profitability, and other practical operations. It uses examples to introduce successful online-business cases and common misconceptions in website operation, so readers can learn the most direct webmaster experience.

A5 Webmaster Network will serialize this book. If you want to read it first, you can buy it from the Network of Excellence online bookstore; it is also sold in Xinhua bookstores across the country.

Author: Tao Qiufung (endto). Publisher: Computer Newspaper Electronic Audio-Visual Publishing House.
