Website Optimization Implementation Plan

Below is an optimization plan I previously wrote for the China Cotton Trading Network (http://www.socotton.com) and the China Rubber and Plastics Exhibition Network (http://www.99plas.com), shared here for your reference.

I. Implementation plan

1. The technical aspects of the website

Title and Meta tags

Title:

1. Keep the title short and concise, highly summarized, and containing keywords, rather than only the site name. Do not use too many keywords: no more than 3 phrases.

2. Search engines give the most weight to the first 7 words, so place the keywords as early as possible, and keep the total length under 30 characters.

Meta Keywords attribute

The key to writing keywords is that each word can be matched in the page content, which helps ranking. However, search engines have recently reduced the weight given to the keywords attribute, so it is less important than it used to be.

Meta Description attribute

1. The description should contain the keywords and be related to the page text. This part is written for people to read, so write it in detail and make it interesting enough to attract users to click.

2. Also follow the principle of brevity: no more than 120 characters, including spaces.

3. Use the description to supplement what the title and keywords do not fully express.
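
As an illustration, a minimal sketch of a head section that follows the rules above; the title, keyword and description values are placeholders, not taken from the actual sites:

    <head>
      <title>Cotton Prices - Cotton Trading</title>
      <meta name="keywords" content="cotton prices, cotton trading, cotton market" />
      <meta name="description" content="Daily cotton prices, market analysis and trading services from the China Cotton Trading Network." />
    </head>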

Access structure

Second-level domain (subdomain) access structure

Use second-level domains (subdomains) to expand the site's sections. Because a subdomain is treated as an independent website, the directory depth is counted from the subdomain itself.
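
For example (price.socotton.com is a hypothetical subdomain used only for illustration), a subdomain keeps a channel's pages one directory level shallower than a subdirectory on the main domain would:

    http://price.socotton.com/list.htm        (subdomain: treated as its own site)
    http://www.socotton.com/price/list.htm    (subdirectory: one level deeper under the main site)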

Directory and file naming

Keywords in directory and file names

Keywords can be used in directory names and file names. If the keyword is a phrase, it needs to be split with a separator. Hyphens "-" and underscores "_" are commonly used, and a space in a URL appears as the encoding "%20". A file name may therefore take one of the following three forms:

Made-in-china.htm

Made_in_china.htm

Made%20in%20china.htm

Once the words are joined together the keywords lose their meaning to search engines, so if a directory or file name contains a group of keywords, separate them with hyphens "-" rather than underscores "_".

URLs should be as short as possible.

GB2312 Chinese Path

Frame structure

Use the "Noframes" tab in your code to optimize, and consider the noframe tag as a link to a frame page in a normal text area and a descriptive text with a keyword (Title,meta) that also appears when the keyword text is opened with an if ER frame. IFrame can be embedded in any part of the Web page, but also can be arbitrarily defined for the search engine, the text in the IFrame is visible, but also can be traced to what the user sees is different, the search engine will be IFRAME content as a separate page off.

Image optimization

1. Compress image file size as much as possible while maintaining image quality.

2. Add an alt attribute describing the image (see the sketch after this list).

3. Add descriptive text containing a keyword above or below the image.

4. Use links that point to the image.
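
A short sketch combining the points above; the file names, alt text and caption are assumed for illustration:

    <a href="cotton-price-chart.htm">
      <img src="cotton-price-chart.gif" alt="Cotton price trend chart" width="400" height="300" />
    </a>
    <p>Cotton price trend for the current season</p>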

FLASH optimization

1. Make an auxiliary HTML version:

Keep the original Flash version, but also design an HTML version with a similar appearance, so that search engines can discover the site through the HTML pages.

2. Embed the Flash in an HTML file:

Alternatively, adjust the page structure: instead of designing the entire page as a single Flash file, embed the Flash content in an HTML file, so the visual effect is not weakened for users while the page remains readable.
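
A rough sketch of embedding a Flash movie in an HTML page while keeping crawlable text around it (the file name and dimensions are assumptions):

    <object type="application/x-shockwave-flash" data="banner.swf" width="600" height="150">
      <param name="movie" value="banner.swf" />
      <!-- alternate content for browsers and crawlers without Flash -->
      <p>Cotton trading news and <a href="prices.htm">daily cotton prices</a>.</p>
    </object>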

Table use

1. If a page contains a large amount of text, besides splitting the text across multiple pages, you can also consider placing it in several separate tables. This is not only easier to manage, but also lets each table load in sequence as the page loads, so visitors can read the content that has already downloaded while the rest loads, instead of waiting a long time for everything to appear at once.

2. Nesting tables inside a table is also bad for page loading, because the browser loads the large outer table before loading the embedded smaller tables, so nested tables ultimately slow down the overall page load.

3. Follow the XHTML standard as far as possible and use div instead of table for layout. Concretely, use the float and position properties in CSS for positioning.
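
A minimal sketch of a two-column div layout positioned with the CSS float property, replacing a layout table (the id names and widths are assumptions):

    <style type="text/css">
      #menu    { float: left;  width: 200px; }
      #content { float: right; width: 560px; }
    </style>
    <div id="menu">... navigation ...</div>
    <div id="content">... article text ...</div>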

Reducing page weight

CSS Styles

1. Pages should use CSS (Cascading Style Sheets) to define font styles in a unified way.

2. Store all CSS separately in an external file, for example in a directory named css.
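
For example, every page can reference one shared external stylesheet (the path is an assumption):

    <link rel="stylesheet" type="text/css" href="/css/style.css" />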

JavaScript:

1. Simplify the function and variable names in JS files.

2. Turn the common parts of pages into scripts stored in external JS files. This reduces file size, speeds up downloads, and is easier to manage. However, do not move key content that should be optimized, such as the navigation, into JS, or search engines will not be able to read it.
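
For example, the shared code can be referenced from an external file (the file name is an assumption), while the navigation itself stays in plain HTML so spiders can follow it:

    <script type="text/javascript" src="/js/common.js"></script>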

Use the base tag:

1. The base tag is a global control tag: it defines the base URL against which all relative links on the page are resolved.
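
A minimal sketch; with this tag in the head, every relative link on the page is resolved against the given URL:

    <base href="http://www.socotton.com/" />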

Remove spaces and line breaks

1. If you want to trim the page more aggressively, the final step is to remove whitespace and line breaks, which can also reduce the file size considerably.

However, after the whitespace is removed the code loses its indentation and becomes difficult to read.

Optimizing pages at different levels of the site

Home page optimization:

1. Keyword selection

The home page keywords should be the core keywords. These words are usually harder to optimize and require many external links. When exchanging links, use the site name or a core keyword as the link text.

2. Links on the home page

Most of the home page consists of links. Put the newest links near the top of the page; this makes it easier for search engines to pick up updates and increases crawl frequency.

The first page a spider visits is the home page. If the home page is updated constantly, spiders will come more frequently and the pages it links to will be indexed more quickly.

Section page optimization:

1. Keyword selection

Compared with the home page, a section page can target several keywords.

2. Internal links

A section page must link to the home page, and the home page should establish internal links back to it. Section pages should also link to each other.

A section page links to a large number of content pages; keep the number of links on the page under 100.

Inner page optimization:

1. Keywords

The title and meta tags can be set to the same content without paying special attention to the keywords. Within the page, make more use of h1, img and emphasis tags (such as strong) to optimize the keywords.
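
A short sketch of an inner content page using these tags (the text is illustrative only):

    <h1>Cotton price analysis for March</h1>
    <p>This week the <strong>cotton price</strong> rose slightly in the main markets ...</p>
    <img src="march-cotton.gif" alt="March cotton price chart" />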

2. Content

Pages should be content-oriented; if a page exceeds roughly 5 KB, it can be split into multiple pages. Content should preferably be original. The more pages are indexed, the more traffic the site gets from search engines.

3. Links

Inner pages usually have few external links, unless the content is very good and is reproduced elsewhere. Inner pages should link to their section page and to the home page; a large number of such internal links helps the keyword rankings of the section pages and the home page. Inner pages should also link to other pages with related content, which increases page relevance, makes browsing easier for users, and increases the site's page views.

Robots.txt

robots.txt file format

The "robots.txt" file contains one or more records that are separated by a blank line (CR,CR/NL, or NL as

Terminator) can be annotated using # in this file, using the same method as in Unix. The records in this file usually start with one or more lines of user, followed by a number of disallow rows, details are as follows:

1. User-agent:

The value of this field names the search engine robot. If there is more than one User-agent record in "robots.txt", the rules apply to multiple robots, and the file must contain at least one User-agent record. If the value is set to *, the rules apply to any robot, and there can be only one "User-agent: *" record in the file.

2. Disallow:

The value of this field describes a URL that should not be visited. It can be a complete path or a partial one; any URL that begins with the value of a Disallow field will not be accessed by the robot.
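
A minimal robots.txt sketch; the directory names are assumed for illustration:

    # Block all robots from the admin and temp directories
    User-agent: *
    Disallow: /admin/
    Disallow: /temp/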

"Allow" extension

Googlebot recognizes an extension of the robots.txt standard called "Allow". Other search engine robots may not recognize this extension, so check the search engines you care about. An "Allow" line works exactly like a "Disallow" line: simply list the directories or pages you want to allow.

Matching character sequences with *:

You can use an asterisk (*) to match a sequence of characters.

Using $ to match the end of a URL

You can use the $ character to specify a match to the end character of the URL.

Sitemap:

The new way to support the Sitemap is to include a sitemap file link directly in the robots.txt file.
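
A sketch combining the extensions above: Googlebot-specific rules, wildcard and $ matching, and a sitemap reference. The directory, file and parameter names are assumptions:

    User-agent: Googlebot
    # Allow one file inside a directory that is otherwise blocked
    Allow: /folder1/myfile.html
    Disallow: /folder1/
    # Block any URL containing a session id parameter
    Disallow: /*?sessionid=
    # Block all URLs that end in .asp
    Disallow: /*.asp$

    Sitemap: http://www.socotton.com/sitemap.xml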

Robots.txt Risks and Solutions:

1. Everything that has benefits also has drawbacks, and robots.txt brings a risk: it reveals the site's directory structure, including the location of private data, to attackers. Although this is not a serious problem when the web server's security is configured properly, it does lower the difficulty of a malicious attack.

2. If it is set up incorrectly, it can cause the search engine to remove all of the site's indexed pages.

Web page Similarity

The impact of Web page similarity on SEO:

Google's limit on page similarity is about 60%; pages that exceed this threshold may not be indexed, or may be ranked lower.

404 page

Correctly define 404 error pages

1. For existing content that can no longer be reached because its path has changed, the 404 error in IIS should be pointed at a dynamic page that uses a 301 redirect to jump to the new address, so that the server returns a 301 status code.

2. When a broken link is accessed, the 404 page is invoked, but depending on how it is set in IIS, different status codes may be returned:

1. If the 404 error points to an .htm file, the page returns a 404 status code, which is correct.

2. If the 404 error points to a URL such as /error.asp and the page itself does not set a status code, only returning a hint in HTML, the page will return a 200 status code. The harm is that when many pages cannot be found but return the same 200 status code as normal pages, the search engine believes the links exist and indexes the content of the error page; with many such links, a large number of duplicate pages appear and the site's ranking drops. Handling: after the content is output, add the statement Response.Status = "404 Not Found" so that the page returns a 404 status code (see the sketch after this list).

3. Avoid returning a 302 status code when the 404 page is invoked; search engines can easily treat this as redirect cheating.

4. To check the result, use HttpWatch to view the returned status code.
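
A rough classic ASP sketch of the points above: a page for moved content that returns a 301 redirect, and an error.asp that shows a friendly message but still returns a 404 status code (the file names and target URL are assumptions):

    <%
    ' moved.asp - content whose path has changed: send a permanent redirect
    Response.Status = "301 Moved Permanently"
    Response.AddHeader "Location", "http://www.socotton.com/new-path.htm"
    Response.End
    %>

    <%
    ' error.asp - custom 404 page: show a hint but return the 404 status code
    Response.Status = "404 Not Found"
    %>
    <html><body>Sorry, the page you requested was not found.</body></html>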

Inbound links

High-quality inbound links:

1. Links from search engine directories and from sites already included in those directories.

2. Links from websites related or complementary to your topic.

3. Links from sites with a PR value of at least 4.

4. Links from sites with few outbound links.

5. Links from high-quality sites with a lot of original content.

Outbound links

Internal links

You can set up several subdomain sub-sites, link the sub-sites to each other, and then link them back to the main site, forming a group of sub-sites surrounding the main site. This is quite favorable for improving rankings.

Keyword link text and contextual semantics

1. Search engines pay more attention to keywords that appear in link text. Therefore, whether it is an outbound link, an inbound link, or an internal link, it is best to make the link text contain keywords.

2. Besides the link text itself, the contextual semantics of the text around the link is also very important. Link analysis systems judge the degree of association between a site and the objects it links to by examining the context around the link, and use this to assess the quality of the page's content.
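
For example (a hypothetical link), the anchor text carries the keyword and the surrounding sentence reinforces it:

    <p>Today's <a href="http://www.socotton.com/price/">cotton price quotes</a> cover the major domestic cotton markets.</p>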

The importance of links

1. The position of the link matters: a link placed within the main content of a page carries high weight, while a link in a position such as the footer carries low weight.

2. The diversity of the link text matters: if all of the external links to your site neatly use the same anchor text, Google will suspect the links were created by hand, and the site may be penalized.

3. Google treats the title attribute of a link as related descriptive text, but it is less visible to users, and its relevance is not as strong as anchor text.

4. Links beyond a certain number may not all be crawled by Google. Google's official example is 100; beyond that number Google's spider gets "aesthetic fatigue".

5. Links work better when the linking site's IP address differs from the target's. If the IP is exactly the same, the two sites are probably "close relatives" on the same server, and Google will discount the links.

6. If keywords similar to the link's anchor text also appear on the linking page, that is very good and improves relevance.

7. Links that appear on sites related to your topic are good and improve relevance.

8. Link stability matters: if your links are very unstable, with 10,000 appearing today and only a few dozen tomorrow, Google will notice and is likely to treat you as a spam-link sender.

9. Links from authoritative sites, such as .edu and .gov domains, increase weight.

10. Reciprocal links have their weight greatly reduced.

2. Website content and information

3. Links inside and outside the website

4. Promotion through search engines

5. External information Promotion

6. Server Optimization settings
