Analysis of Several Common Errors in Search Engine Optimization

Source: Internet
Author: User


1st: Relative addresses causing numerous dead links

Every site owner knows that regularly checking the site for dead links is routine maintenance work. However, the most common cause of dead links is avoidable: it comes down to how links are written, and choosing the right link style can prevent dead links at the source. This may seem simple, but it matters.

A relative address is written the way this example shows: <a href="index.asp">2guys collection station</a>. Site designers like relative addresses because a site can be moved easily from a local design preview to the web server. But if the site keeps growing and category directories or subdirectories need to be added, then once the directory behind a relative address changes, the site will return 404 "Cannot find your page" errors.

Another option is to use absolute addresses, for example <a href="http://www.2guys.cn/index.asp">2guys station</a>. Many sites use absolute addresses throughout, but they make testing pages locally more complicated, so many web designers prefer relative addresses.

A relatively flexible solution is the root-relative link, also called a "server-relative" (server-relative) or "domain-relative" (domain-relative) link. It is a hybrid of the relative address and the absolute address: a root-relative link starts with a forward slash, which tells the server to resolve the path starting from the site root. For example: <a href="/articles/f20070726.html">f20070726</a>.

This more flexible approach is the one web designers tend to prefer, because they can test and upload pages without the errors caused by unresolved domain names.
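To summarize, the three link styles discussed above look like this (the paths reuse the article's examples; the comments are explanatory, not part of the original markup):

```html
<!-- Relative: resolved against the current page's directory;
     breaks when the page moves to a different directory -->
<a href="index.asp">2guys collection station</a>

<!-- Absolute: always resolves to the live site, but ties local
     test copies to the production domain -->
<a href="http://www.2guys.cn/index.asp">2guys station</a>

<!-- Root-relative: the leading slash resolves from the site root,
     independent of both the page's directory and the domain -->
<a href="/articles/f20070726.html">f20070726</a>
```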

There are now many convenient ways to test your site for dead links: there are online checking services as well as client software. A URL for checking invalid links is attached at the end of the article.
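As a minimal sketch of what such a checker does (the target URL below is a placeholder taken from the article's examples), the following script extracts every link from a page and probes each one for a response:

```python
import urllib.request
import urllib.error
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html):
    """Return all href values found in the given HTML string."""
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links

def check_link(url, timeout=10):
    """Return the HTTP status code, or None if the link is dead."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status
    except (urllib.error.URLError, OSError):
        return None

if __name__ == "__main__":
    page = urllib.request.urlopen("http://www.2guys.cn/").read()
    for link in extract_links(page.decode("utf-8", "replace")):
        if link.startswith("http"):
            print(link, check_link(link))
```

A real checker would also resolve relative links against the page URL and follow redirects, but this shows the core loop: parse, then probe.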

2nd: Dynamic web pages that are not made static or optimized, preventing search engine inclusion

Google's webmaster guidelines have long told us that JavaScript, cookies, session IDs, frames, and Flash can make search engine spiders or bots treat every page as the same block of code, which hinders their crawling and ultimately affects inclusion. For the same reason, images and Flash content in particular must be optimized.

In addition, there is the "spider trap" problem. Many dynamic sites use addresses containing "?" or other messy symbols. Technically, search engines can crawl pages reached through links containing "?", but they generally choose not to, in order to avoid a class of script errors called "spider traps" that send a crawler into an infinite crawling loop it cannot exit, wasting its time. Therefore, to get pages included more reliably, you must make dynamic web pages static.
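One common way to present dynamic pages under static-looking URLs is a rewrite rule on the web server. A minimal Apache mod_rewrite sketch (the script name, parameter, and URL pattern are hypothetical, chosen to match the article's earlier example path) might look like:

```apache
# .htaccess -- serve /articles/f20070726.html from a dynamic script
# so links carry no "?" for spiders to avoid
RewriteEngine On
RewriteRule ^articles/f([0-9]+)\.html$ /article.php?id=$1 [L]
```

The visitor and the spider both see the clean `.html` address; only the server knows the content is generated dynamically.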

Then download a search engine spider simulator and use it to "crawl" your site. If the crawl works normally, your site can be included normally.
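A very rough way to approximate such a simulator yourself is to fetch a page while identifying as a search engine bot; the user-agent string below is the one Googlebot commonly publishes, and the URL is a placeholder:

```python
import urllib.request

SPIDER_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
             "+http://www.google.com/bot.html)")

def make_spider_request(url):
    """Build a request that identifies itself as a search engine spider."""
    return urllib.request.Request(url, headers={"User-Agent": SPIDER_UA})

def crawl(url, timeout=10):
    """Fetch the page the way a spider would; return (status, body)."""
    req = make_spider_request(url)
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        return resp.status, resp.read()

if __name__ == "__main__":
    status, body = crawl("http://www.2guys.cn/")
    print(status, len(body), "bytes")
```

If the body you get back this way is empty or missing your content, a real spider is likely seeing the same thing.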

3rd: SEO personnel who use cheating techniques on the website

If an enterprise really wants to do SEO, be careful to avoid companies that "guarantee" reaching a certain ranking within a certain time, or that quote suspiciously low, very "virtual" prices. Search engine optimization knowledge is not confidential; if a company uses secrecy as an excuse not to explain to the customer what specific methods it will use, treat that situation with caution.

Such companies generally use software to mass-post spam links to forums and message boards, stuff keywords into the site, hide text links, and so on. These methods can show results in a short period of time, but they will undoubtedly drag your site to hell. In light cases the search engine lowers the site's weight, that is, its rankings drop; in severe cases the entire site is deleted from the search engine, content from that IP address may no longer be included, and other sites on the same IP block may even be implicated.

4th: The website's domain name problem

Will search engines treat "www.2Guys.cn" and "2guys.cn" as two different sites? The answer is yes. What happens if you ignore this problem? If you do not redirect one domain to the other, search engines will treat the pages under the two domains as two separate sites. In other words, the search engine sees your site as having duplicate pages, in serious cases a fully duplicated site, and the consequences can be a ranking penalty or even removal of pages from the search engine.

The most search-engine-friendly solution is to create a 301 permanent redirect that permanently points one domain to the other. Google's webmaster tools do let you submit a preferred domain for your URLs, but other search engines offer no such service.

There is plenty of material online about setting up 301 redirects on Apache servers and in ASP, PHP, and so on, which you can search for. Here, let's look at how to set up the DNS alias that goes with the 301 redirect.
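For reference, a minimal Apache version of the 301 itself (assuming mod_rewrite is enabled, and redirecting the www alias to the bare domain to match the DNS setup below; swap the two hostnames if you prefer www as the primary) looks like:

```apache
# .htaccess -- 301 permanent redirect www.2guys.cn/* to 2guys.cn/*
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.2guys\.cn$ [NC]
RewriteRule ^(.*)$ http://2guys.cn/$1 [R=301,L]
```

The `R=301` flag is what makes the redirect permanent; without it, Apache issues a temporary 302, which does not consolidate the two domains for search engines.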

The records look like this:

2guys.cn        A       AAA.BBB.CCC.DDD (the direct IP address)
www.2Guys.cn    CNAME   2guys.cn

This tells everyone (not just browsers and search engine bots) that 2guys.cn is the primary name and www.2Guys.cn is an alias of 2guys.cn.

In addition, when you change domain names, a 301 permanent redirect that is not set up correctly will have a great impact, so pay attention here and do the domain redirection properly.


5th: The server hosting the website is unstable

If you are using a low-cost server, that likely means the site is unstable. Most ISPs claim their servers work properly 99.99% of the time, and you may find that they exaggerate. You can use network monitoring software to watch whether your site is online, when it goes down, and when pages open unusually slowly.

You can also use monitoring software to test the hosting providers' own corporate websites and choose the best-performing company.

An unstable site, one that frequently goes down or is slow to respond, not only hurts the user experience but also has a negative impact on search engine rankings.
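A single probe of the kind such monitoring tools run can be sketched as follows; the `opener` parameter is an assumption of this sketch (it lets the probe be exercised without a network), and the URL is a placeholder:

```python
import time
import urllib.request

def check_site(url, timeout=10, opener=urllib.request.urlopen):
    """Probe a site once; return (status_or_None, seconds_elapsed).

    A None status means the site was unreachable within the timeout.
    """
    start = time.monotonic()
    try:
        with opener(url, timeout=timeout) as resp:
            status = resp.status
    except Exception:
        status = None
    return status, time.monotonic() - start

if __name__ == "__main__":
    status, elapsed = check_site("http://www.2guys.cn/")
    print("up" if status == 200 else "down", f"{elapsed:.2f}s")
```

A real monitor would run this on a schedule and log the results, so that downtime and slow responses can be compared against the provider's 99.99% claim.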

6th: The content management system is not friendly to search engines

Content management systems make updating web content simpler, but the pages such systems generate are often unfriendly to search engines and do not support separate titles, meta descriptions, and so on. A CMS may well give all pages one unified title rather than varying it with each article's title. If you want good SEO results, the content management system you use needs to let you set separate titles, meta descriptions, keywords, and so on.
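What a search-engine-friendly CMS should emit per page looks roughly like the following; all values here are illustrative, not taken from any real site:

```html
<!-- Each article page gets its own head, not one shared site-wide -->
<head>
  <title>Analysis of Several Common SEO Errors - 2guys</title>
  <meta name="description"
        content="Six common mistakes that hurt search engine inclusion and ranking, from dead links to unstable hosting.">
  <meta name="keywords" content="SEO, dead links, 301 redirect, spider traps">
</head>
```

If your CMS can only output one fixed `<title>` for every page, every article competes for the same search terms and none of them can rank on its own content.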


