About SEO optimization solutions (online materials)

Source: Internet
Author: User

1. Title tag (page title)

General format: Article name - Category name - Site name
If that is too long, use: Article name - Site name

Search engines only consider a limited number of words in the title tag, for example the first 10 to 15 words. A long, keyword-stuffed title therefore only makes your site look more like spam.
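As a sketch in HTML (the article, category, and site names are invented for illustration):

    <title>Title Tag Optimization - SEO Basics - Example Site</title>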

2. Meta tags

The title tag should not exceed 10 to 12 words; the description tag should not exceed 30 to 35 words; the keywords tag should list only the important keywords that are genuinely relevant to your site.
Do not repeat or stuff keywords in any of these three tags, and do not use identical meta tags on every page.
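A minimal sketch of the description and keywords tags in HTML (the content values are invented for illustration):

    <meta name="description" content="A short, page-specific summary of what this article covers, written for users and search engines.">
    <meta name="keywords" content="title tag, meta description, SEO">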

3. Keywords in the URL

Keywords in the URL are helpful, at least for English pages. They may matter little to the search engines' relevance algorithms, but they help users a great deal: when the page appears in a search result listing, the file name alone tells them what the page is about.
Keywords can appear in the domain name, a directory name, or the file name, but do not stuff the URL with keywords.
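For illustration (the domain and paths are hypothetical), compare:

    Descriptive:      https://www.example.com/seo/title-tag-optimization.html
    Keyword-stuffed:  https://www.example.com/seo-seo/seo-title-tag-seo-tips-seo.html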

4. Article length is good for SEO

An article should be at least 200 words, for two reasons:
1) If a page has only a few dozen words, a search engine cannot easily determine the topic of the text, and therefore cannot tell which keywords it is most relevant to.
2) If the body text is too short, it may contain fewer words than the navigation, menus, copyright notice, and other boilerplate, which makes the page look like duplicate content. Having many pages with very little text also increases duplicate content within the site itself, because the pages become too similar to one another and the parts that differ are too short.
Long articles also tend to attract links. I have read many long articles, some running to dozens of pages; although scrolling through them is inconvenient, such articles attract links easily.

5. robots.txt

robots.txt is the first file a search engine looks at when it visits a website. It tells spider programs which files on the server may be crawled.
Syntax: the simplest robots.txt file uses two rules:
User-Agent: the crawler to which the following rules apply
Disallow: the URL path to block
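A minimal sketch of such a file (the blocked directory is just a placeholder):

    User-Agent: *
    Disallow: /private/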

1. Whenever a user tries to access a URL that does not exist, the server records a 404 (file not found) error in its log; the same happens when a spider requests a missing robots.txt file, so it is worth providing one even if it is empty.
2. Website administrators must keep spider programs away from certain directories on the server to protect server performance. For example, the blog service on the hichina network stores its programs in a cgi-bin directory, so adding "Disallow: /cgi-bin" to the robots.txt file is a good idea: it keeps spiders from indexing all of the program files and saves server resources. Files that generally do not need to be crawled include back-end administration files, program scripts, attachments, database files, encoding files, style sheets, template files, navigation images, and background images.
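A sketch of a robots.txt along these lines (the directory names are illustrative placeholders, not a prescription):

    User-Agent: *
    Disallow: /cgi-bin
    Disallow: /admin/
    Disallow: /templates/
    Disallow: /images/nav/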
User-Agent:
The value of this field names the search engine robot to which the record applies. If the robots.txt file contains several User-Agent records, then several robots are restricted by the file; at least one User-Agent record is required. If the value is set to *, the record applies to every robot, and only one "User-Agent: *" record may appear in the file. If you add a record "User-Agent: somebot" followed by several Disallow and Allow lines, the robot named "somebot" is bound only by the Disallow and Allow lines that follow that "User-Agent: somebot" line.
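A sketch with one record for a specific robot and one for everything else ("somebot" is just a placeholder name):

    # Rules that bind only the robot named "somebot"
    User-Agent: somebot
    Disallow: /drafts/

    # Rules that bind every other robot
    User-Agent: *
    Disallow: /cgi-bin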
Disallow:
The value of this field describes a set of URLs that should not be accessed. It can be a complete path or a non-empty path prefix; any URL that begins with the Disallow value will not be visited by the robot. For example, "Disallow: /help" blocks robot access to /help.html, /helpabc.html, and /help/index.html, whereas "Disallow: /help/" still allows access to /help.html and /helpabc.html but blocks /help/index.html. An empty "Disallow:" means every URL on the site may be accessed. The "/robots.txt" file must contain at least one Disallow record; if "/robots.txt" does not exist or is empty, the site is open to all search engine robots.
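To restate the example compactly:

    Disallow: /help     blocks /help.html, /helpabc.html, and /help/index.html
    Disallow: /help/    blocks /help/index.html only; /help.html and /helpabc.html remain accessible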
Allow:
The value of this field describes a set of URLs that may be accessed. Like Disallow, the value can be a complete path or a path prefix; any URL that begins with the Allow value may be visited by the robot. For example, "Allow: /hibaidu" permits robot access to /hibaidu.htm, /hibaiducom.html, and /hibaidu/com.html. All URLs on a site are allowed by default, so Allow is normally used together with Disallow to permit access to some pages while blocking all other URLs.
Note that the order of Disallow and Allow lines is significant: the robot decides whether a URL may be accessed based on the first Allow or Disallow line that matches it.
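A sketch of the usual Allow-with-Disallow pattern (the paths are placeholders); because the first matching line wins, the more specific Allow must come before the broader Disallow:

    User-Agent: *
    Allow: /help/faq/
    Disallow: /help/

Here pages under /help/faq/ stay crawlable while the rest of /help/ is blocked; if the two lines were swapped, the Disallow would match first and /help/faq/ would be blocked as well.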
Use "*" and "$ ":
Baiduspider supports Fuzzy Matching of URLs using wildcards "*" and "$.
"$" Matches the row Terminator.
"*" Matches zero or multiple arbitrary characters.

6. Create a sitemap (site map)

An XML sitemap lets search engines quickly learn about updates to your site. Both Google and Yahoo use XML sitemaps to speed up indexing.
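A minimal sketch of a sitemap.xml (the URL and date are invented):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example.com/seo/title-tag-optimization.html</loc>
        <lastmod>2010-01-01</lastmod>
      </url>
    </urlset>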
