Follow these guidelines to help Google find, index, and rank your site. Even if you choose not to adopt these recommendations, we strongly encourage you to pay close attention to the quality guidelines, which outline some of the illicit practices that may lead to a site being removed entirely from the Google index or otherwise flagged as spam, whether algorithmically or through manual action. If a site has been flagged as spam, it may no longer appear in search results on google.com or on any of Google's partner sites.
- Design and Content Guidelines
- Technical Guidelines
- Quality Guidelines
After your site is complete:
- Visit http://www.google.com/submityourcontent/ to submit your site to Google.
- Submit a sitemap through Google Webmaster Tools. Google uses your sitemap to learn the structure of your site and to improve its coverage of your pages (a minimal sitemap sketch follows this list).
- Make sure all the sites that should know about your pages are aware that your site is online.
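As a point of reference, a minimal sitemap following the sitemaps.org protocol might look like the sketch below; the example.com URLs and dates are placeholders for your own pages.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Minimal sitemap sketch; URLs and dates are placeholders -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2012-01-15</lastmod>
    <changefreq>weekly</changefreq>
  </url>
  <url>
    <loc>http://www.example.com/about.html</loc>
    <lastmod>2012-01-10</lastmod>
  </url>
</urlset>
```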
Design and Content Guidelines
- Make a site with a clear hierarchy and clear text links. Every page should be reachable from at least one static text link.
- Offer a site map to your users with links that point to the important parts of your site. If the site map has a very large number of links, break it into multiple pages.
- Keep the links on a given page to a reasonable number.
- Create a useful, information-rich site, and write pages that clearly and accurately describe your content.
- Think about the words users would type to find your pages, and make sure that your site actually includes those words.
- Try to use text instead of images to display important names, content, or links. The Google crawler does not recognize text contained in images. If you must use images for textual content, use the "ALT" attribute to include a few words of descriptive text.
- Make sure that your <title> elements and ALT attributes are descriptive and accurate (see the HTML sketch after this list).
- Check for broken links and correct HTML.
- If you decide to use dynamic pages (that is, URLs containing a "?" character), be aware that not every search engine spider crawls dynamic pages as well as static pages. Keeping the parameters short and the number of them few helps.
- See our recommended best practices for images, video, and rich snippets.
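As a small illustration of descriptive titles, ALT text, and static text links from the list above, consider the sketch below; the page title, image file, and link target are hypothetical.

```html
<!DOCTYPE html>
<html>
<head>
  <!-- A descriptive, accurate title rather than "Untitled" or a string of keywords -->
  <title>Fresh Flower Delivery in Boston - Example Florist</title>
</head>
<body>
  <!-- ALT text describes the image for crawlers and screen readers -->
  <img src="red-roses.jpg" alt="Bouquet of a dozen red roses">
  <!-- A plain static text link that any crawler can follow -->
  <a href="delivery-areas.html">See our delivery areas</a>
</body>
</html>
```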
Technical Guidelines
- Use a text browser, such as Lynx, to examine your site, because most search engine spiders see your site much as Lynx would. If fancy features such as JavaScript, cookies, session IDs, frames, DHTML, or Flash keep you from seeing all of your site in a text browser, then search engine spiders may have trouble crawling it.
- Allow search bots to crawl your site without session IDs or parameters that track their path through the site. These techniques are useful for tracking individual user behavior, but the access pattern of bots is entirely different. Using them may result in incomplete indexing of your site, because bots may not be able to eliminate URLs that look different but actually point to the same page.
- Make sure your web server supports the If-Modified-Since HTTP header. This feature lets your web server tell Google whether your content has changed since it last crawled your site, and supporting it saves you bandwidth and overhead (a sketch of the exchange appears at the end of this section).
- Make use of the robots.txt file on your web server. This file tells crawlers which directories can or cannot be crawled. Make sure it is current for your site so that you don't accidentally block the Googlebot crawler. Visit http://code.google.com/web/controlcrawlindex/docs/faq.html to learn how to instruct bots when they visit your site. You can test your robots.txt file with the robots.txt analysis tool available in Google Webmaster Tools to make sure you are using it correctly (a robots.txt sketch also appears at the end of this section).
- Make reasonable efforts to ensure that advertisements do not affect search engine rankings. For example, Google's AdSense ads and DoubleClick links are blocked from being crawled by a robots.txt file.
- If your company buys a content management system, make sure that the system creates pages and links that search engines can crawl.
- Use robots.txt to prevent the crawling of search results pages or other auto-generated pages that don't add much value for users coming from search engines.
- Test the site to make sure it is displayed correctly in different browsers.
- Monitor your site's performance and optimize load times. Google's goal is to provide users with the most relevant results and a great user experience. Fast sites increase user satisfaction and improve the overall quality of the web (especially for users with slow Internet connections), and we hope that as webmasters improve their sites, the overall speed of the web will improve as well.
Google strongly recommends that all webmasters regularly monitor site performance using Page Speed, YSlow, WebPagetest, or other tools. For more information, tools, and resources, see Let's Make the Web Faster. In addition, the Site Performance tool in Webmaster Tools shows you the speed of your site as experienced by users around the world.
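To illustrate the If-Modified-Since exchange mentioned above: the crawler includes the date it last fetched a page, and a server that supports the header can reply with a body-less 304 Not Modified instead of resending the whole page. The host name and dates below are hypothetical.

```http
GET /index.html HTTP/1.1
Host: www.example.com
If-Modified-Since: Sat, 29 Oct 2011 19:43:31 GMT

HTTP/1.1 304 Not Modified
Date: Tue, 15 Nov 2011 08:12:31 GMT
```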
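Tying together the robots.txt points above, a minimal sketch might look like this. The directory names are hypothetical; the idea is to keep crawlers out of auto-generated pages, such as internal search results, while leaving the rest of the site crawlable.

```text
# Hypothetical robots.txt; adjust directory names to your site
User-agent: *
# Block internal search results and other auto-generated pages
Disallow: /search/
# Block infrastructure directories with no search value
Disallow: /cgi-bin/

# The sitemap location can also be declared here
Sitemap: http://www.example.com/sitemap.xml
```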
Quality Guidelines
These quality guidelines cover the most common forms of deceptive or manipulative behavior, but Google may also act against other misleading practices not listed here. It is not safe to assume that just because a specific deceptive technique isn't listed on this page, Google approves of it. As a webmaster, rather than spending a lot of time looking for loopholes to exploit, you are far better off upholding the spirit of the basic principles, which will provide a better experience for users and, in turn, better rankings.
If you believe that another website is abusing Google's quality guidelines, please submit a spam report to us. Google prefers to develop scalable, automated solutions to these problems, so we try to minimize manual handling of violations. We may not take manual action in response to every report; spam reports are prioritized based on their impact on users, and in some cases a report may lead to the complete removal of a spammy site from Google's search results. Not all manual actions result in removal, however, and even when we do act on a reported site, the effects of the action may not be obvious.
Quality Guidelines - Basic Principles
- Design your web pages primarily for users, not for search engines.
- Don't deceive your users.
- Avoid tricks intended to improve search engine rankings. A good rule of thumb is whether you'd feel comfortable explaining what you've done to a competing website or to a Google employee. Another useful test is to ask yourself: "Does this help my users? Would I do this if search engines didn't exist?"
- Think about what makes your website unique, valuable, or engaging to visitors. Make your website stand out from others in your field.
Quality Guidelines - Specific Guidelines
Avoid the following techniques:
- Participating in link schemes
- Participating in affiliate programs without adding sufficient value
- Loading pages with irrelevant keywords
- Creating pages with malicious behavior, such as phishing or installing viruses, trojans, or other harmful software
- Abusing rich snippets markup
- Sending automated queries to Google
It is advisable to engage in the following good practices:
- Monitoring your site for hacking and removing hacked content as soon as it appears
- Preventing user-generated spam from appearing on your site, and removing it when it does
If your site violates one or more of these guidelines, Google may take manual action against it. Once you have resolved the problem, you can submit a reconsideration request for your site.