Improve the ranking of websites in search engines


In the ocean of the Internet, interconnection is what matters most: a website that no other site links to is an "information island". The proverb "even good wine fears a deep alley" may sound like an excuse for spam advertising, but it holds true here: unless your website exists only for your own use, you need to promote it actively. When promoting through search engines, pay attention to the following aspects:

Winning by quantity: getting into the category directories of large websites is not the only form of website promotion; any reverse link from another website is useful. The classic way to promote a site is to join the category directories of the big portals, such as Yahoo! and Baidu. There is a common misunderstanding here: directory listings are no longer essential, because search engines are no longer mere indexes of website directories but comprehensive indexes of the web. Reverse links from any other website are therefore valuable, even those in news reports, forums, and mail-list archives. When you post to the mailing lists of large sites, be sure to include your website's address in your signature.
Bloggers probably understand the phrase "links are everything" more deeply than anyone. Because blog posts link to one another so heavily, the most frequently cited blog pages rank higher in search engines than the pages of some large commercial websites.

Winning by quality: being linked from pages with a high PageRank raises your own PageRank faster. Quantity is only one of the key factors; a link from a high-PageRank page raises the target's PageRank more. Take my personal website www.why100000.com as an example: I never submitted it to any category directory; I only contributed some articles to ZDNet China. Thanks to the source links on those article pages, the PageRank of the corresponding pages and of the site as a whole improved markedly after a while. Who links to you sometimes matters more than how many do. By the same principle, a listing in the first two levels of a large authoritative directory such as Yahoo!'s or Google's is very valuable.

Note: do not use link farms to improve your site's ranking. Google punishes sites that actively link to link farms to boost themselves, and their pages are dropped from the index. If a link farm links to your page, however, you need not worry: such passive inbound links are not punished.

Do not be stingy with links to other websites: a page that has plenty of inbound links but lacks outbound links will also be judged low-value by search engines, because outbound links help an engine determine which information is most valuable to users. In other words, a website that only receives reverse links and exports none also suffers in the search results. In practice a site with no outbound links at all is rare, unless you deliberately make it so; normally you will link to other sites to lead visitors to information you consider important or more valuable. Also, before promoting your website, you may first want to find out how visible it currently is in the search engines. The principle is simple.

Website promotion is only a means; the end is to highlight your content so that users who need it can find your website as quickly as possible. PageRank alone does not put a portal like Yahoo! at the top of every result list, because a search engine ranks results by combining each page's PageRank with how well the page matches the search keywords. Hence the second key point: how to highlight keywords.

How to highlight keywords: theme-oriented keyword matching

Title design is only one aspect of raising keyword density. When a modern search engine matches keywords, it does not look only at the current page's own summary: to a large extent, what counts is not just what your page says about itself, but how others describe your site when they link to it. For example, search for "100,000 whys": the results include http://why100000.com, a page that is not even in Chinese. It matches the Chinese query because many links to the site use anchor text like <a href="http://why100000.com">100,000 whys website</a>, so keywords that never appear on the page itself become part of the page's abstract.
Holding the whole website to a consistent theme is therefore very important: the more relevant a link's title is to the theme of the page it points to, the more that page benefits.

Do not leave the title blank: an empty <title></title> wastes the most valuable spot on the page. Traditionally, HTML pages carried hidden metadata describing the page's main content, with keywords declared like this:
<head><meta name="keywords" content="why100000.com, sogo99.com"></head>
This manual keyword mechanism was later widely abused: to raise the chance of being hit in search engines, huge numbers of pages were stuffed with popular keywords ("music MP3 download" and the like) that had nothing to do with their actual content. The new generation of search engines therefore no longer trusts the manual meta keywords declaration in the page header; instead, the page title carries much more weight when keywords are matched. A keyword hit in the title scores higher than the same hit in the body, so the corresponding result ranks higher.
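As a minimal illustration (the wording is made up for this sketch), a head section that leads with the key nouns in the title, rather than relying on meta keywords, might look like this:

<head>
<title>100,000 whys: computer learning network</title>
<!-- largely ignored by modern engines, but harmless if kept accurate -->
<meta name="keywords" content="computer learning, tutorials, FAQ">
</head>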

Title length and content: keep the title short, generally within 40 characters, and let the keywords dominate it; search engines generally ignore whatever exceeds their limit, so place the key words at the front. Remove unnecessary adjectives; after all, people search with nouns. As for wording, try to use terms that people actually query (but do not overdo it: if more than half of the words in the title never appear in the body, the page may be excluded from the index as misleading). It is therefore worth checking, in your web logs, which search engine keywords actually bring users to your site.

If the site has many pages, give each one a distinct title, so that as much of your content as possible stays within the search engine's index; engines use page-content similarity to discard near-duplicate pages from the index, and identical titles make pages look alike. Besides <title></title>, you can also use the ...

Other website design tips. Use static web pages wherever possible: few search engines index dynamic pages as well as Google does, and even Google does not index all dynamic content; in general, Google prefers fresh, static content. So, both for serving efficiency and for the convenience of search engine indexing, it pays to have your content publishing system render the site as static pages. For example:
http://www.why100000.com/_ftp/
enters a search engine's index more easily than
http://www.why100000.com/_ftp/man.php?mode=man&parameter=intro&section=3
Moreover, a keyword hit in the URL itself sometimes counts for more than one in the title. And the more of your pages Google can reach and index, the better. You can use a script like the following to tally how Google or Baidu is crawling your site:
#!/bin/sh
# Record which URLs Googlebot and Baiduspider requested, by filtering the
# Apache access log on each crawler's user agent ($7 is the request path).
yesterday=`date -d yesterday +%Y%m%d`
log_path=/home/apache/logs
grep -i googlebot $log_path/access_log | awk '{print $7}' | sort -u > spider/$yesterday.googlebot.txt
grep -i baiduspider $log_path/access_log | awk '{print $7}' | sort -u > spider/$yesterday.baiduspider.txt
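If you schedule the script nightly, for example from cron (the script path below is hypothetical), you accumulate one crawl snapshot per day and can watch the indexed set grow or shrink:

# run five minutes past midnight, once the day's log is complete
5 0 * * * /home/apache/bin/spider_stats.sh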

Keep the website's directory structure flat, because each additional level of directory depth costs a level of PageRank. If the homepage rates 3 and its subdirectory pages rate 2, pages nested deeper may fall out of the rated range entirely.

Separate presentation from content for a "green" web page: move the JavaScript and CSS out of the page into separate files. This improves code reuse (and makes the files cacheable), and because the share of real content in the page grows, the proportion of relevant keywords on the page rises with it. In short, follow the W3C standards and use valid XHTML and XML as the display format; this also keeps content usable over a longer period.
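A sketch of what that looks like in the page header (the file names are made up); every page references the same external style sheet and script instead of embedding them:

<link rel="stylesheet" type="text/css" href="/css/site.css" />
<script type="text/javascript" src="/js/site.js"></script>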

Give all pages a quick entrance: a site map lets web crawlers quickly traverse everything the website wants published. A homepage reachable only through Flash or an image map effectively turns search engines away. Besides a user-friendly UI, being spider-friendly is also very important.

Keep the website healthy: use a bad-link checking tool regularly to find any dead links on the site.
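One rough way to run such a check yourself is with wget (a sketch assuming GNU wget; the host name is a placeholder): crawl the site without saving anything, then search the log for broken links:

wget --spider -r -o /tmp/linkcheck.log http://www.example.com/
grep -B 2 '404 Not Found' /tmp/linkcheck.log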

Maintain the stability and longevity of page content and links: in search engine indexes, a page's history is itself an important factor, and pages that stay linked for a long time stand a better chance. To keep your pages citable by other sites over the long term, keep the old page alive whenever a link on your site changes, and make it turn visitors toward the new location, preserving content continuity (a redirect sketch follows at the end of this tip). It is very hard to "cultivate" a site and its content to a high rank in a search engine, and nobody wants users who finally find the content to hit a "404 page does not exist". Regular analysis of the web server's error log is therefore also part of the administrator's duties.

File type factors: Google can index PDF, Word, PowerPoint, Excel, and PostScript documents. Because such documents are usually better organized than ordinary HTML, and their academic value is generally higher, they start out with a higher PageRank than plain HTML documents. For important material such as technical whitepapers, FAQs, and installation guides, we therefore recommend publishing in a richer format such as PDF or PS, so the document can reach the top of the search results.
One often notices that a single news item on a portal site outranks the homepages of other sites. Once a site's overall PageRank rises, even its unimportant content is carried into the search engine's preferred list along with the genuinely high-PageRank content. This is not entirely reasonable; it is why mailing-list archives on many large sites frequently outrank the homepages of other sites.
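A minimal sketch of such a link turn with Apache's mod_alias (the paths are hypothetical): the old URL stays alive and forwards both visitors and crawlers to the new location with a permanent redirect:

Redirect permanent /docs/old-page.html http://www.example.com/docs/new-page.html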

The importance of website access statistics, log analysis, and mining: website design should not merely cater passively to search engine indexing; more importantly, the traffic search engines bring can feed deeper analysis of user behavior. Keyword statistics drawn from search engine referrals are by now almost a standard feature of web log analysis tools, and commercial log tools should push further in this direction. Web log statistics matter enough that Red Hat 8 ships the log analysis tool Webalizer as one of its standard server applications.

Take Apache/Webalizer as an example. The procedure is as follows:
Record the access source:
In the Apache configuration file, set the log format to combined. Logs in this format carry extended information, including a field that records where each request came from: the HTTP Referer. When a user finds your page among a search engine's results and clicks through, the Referer recorded in the log is the URL of that result page, and this URL contains the keywords the user queried.
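For reference, the standard combined definition in httpd.conf looks like this (the log location may differ on your system); the %{Referer}i field is the one that carries the search URL:

LogFormat "%h %l %u %t \"%r\" %>s %b \"%{Referer}i\" \"%{User-Agent}i\"" combined
CustomLog /home/apache/logs/access_log combined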
Search engine statistics in Webalizer: extracting keywords from the Referer.
By default, Webalizer already knows the query formats of popular search engines such as Yahoo! and Google. Here I have added the search parameter settings for some domestic portal sites:
SearchEngine yahoo.com p=
SearchEngine altavista.com q=
SearchEngine google.com q=
SearchEngine sina.com.cn word=
SearchEngine baidu.com word=
SearchEngine sohu.com word=
SearchEngine 163.com q=

With these settings in place, Webalizer extracts the keyword from every search engine Referer URL it counts: for all links arriving from google.com, for example, the values of the q parameter are tallied as keywords. The summary statistics show how many times users reached you through each keyword, which tells you which keywords your users care about most. Going further, Webalizer can be configured to dump its statistics as CSV-style logs, so historical data can be imported into a database for deeper mining.

Log-based user analysis used to be limited to simple breakdowns by access time and IP source; keyword statistics from search engines clearly produce richer and more intuitive results, so the latent commercial value of search engine services almost goes without saying. Perhaps this is why traditional search engine sites such as Yahoo! and AltaVista are turning their attention back to the search market after the portal era. Consider Google's annual keyword statistics: who knows better than a search engine what Internet users are interested in?

Note that Google sends queries in UTF-8 encoding (for IE on Windows 2000, for instance), so you may need to view the statistics in UTF-8 mode for the characters to display correctly. My own statistics show that Google has become the most common search engine among IT developers and other experienced users, and that Baidu referrals already exceed those from traditional portals such as Sohu and Sina. The advantage traditional portal websites hold in search is therefore fragile, and the technology trend points toward more service models built on deeper data mining of Internet media.
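For a quick look without waiting for Webalizer (a sketch assuming the combined log format and GNU grep; the extracted keywords are still URL-encoded), you can pull the Google query terms straight out of the Referer field and rank them by frequency:

grep -Eo 'google\.[^"]*[?&]q=[^&"]*' /home/apache/logs/access_log | sed 's/.*[?&]q=//' | sort | uniq -c | sort -rn | head -20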

Compiled by: Zhang Qing (QQ: 9365822)
http://www.why100000.com ("100,000 whys" computer learning network)
http://sogo99.com (Sogo99 web portal)
