Tips for front-end SEO

Source: Internet
Author: User

A few days ago I took the online course "Applying SEO in Web Page Development" on imooc. It was well worth the time, so today I have written up these notes as a summary of what I learned.

I. How search engines work

When we type a keyword into the search box and click Search, we get back a list of results. Behind that simple interaction, the search engine does a great deal of work.

A search engine site such as Baidu maintains an enormous database that stores a huge number of keywords, each of which maps to many URLs. These URLs are collected bit by bit from across the vast Internet by programs called "search engine spiders" or "web crawlers". These industrious "spiders" crawl the Internet every day, following one link to the next, downloading content, analyzing and distilling it, and extracting keywords. If a "spider" decides a keyword is not yet in the database and would be useful to users, it stores it; conversely, if it judges the content to be spam or a duplicate, it discards it and keeps crawling, looking for fresh, useful information to save for user searches. When a user searches, the engine can then retrieve the URLs associated with the keyword and display them to the visitor.

Since one keyword maps to many URLs, there is a ranking problem: the URLs that best match the keyword are listed first. While a "spider" crawls a page's content and extracts keywords, there is another problem: whether the "spider" can understand the content at all. If the site's content is Flash or JavaScript, the spider cannot understand it and gets confused, no matter how well chosen the keywords are. Conversely, if the site's content is written in a language the spider understands, it can be indexed properly — and that language is SEO.

II. An introduction to SEO

Full name: Search Engine Optimization (SEO). SEO was born as soon as search engines appeared.

Why it exists: SEO is the practice of optimizing a site to increase how many of its pages appear in a search engine's natural (organic) results, and how highly they rank. In short, we hope that Baidu and the other search engines will index many pages of our carefully crafted website, and that those pages will rank near the top when people search.

Categories: white hat SEO and black hat SEO. White hat SEO improves and standardizes website design, making the site friendlier to both search engines and users; the site earns reasonable traffic from search engines, and this is what search engines encourage and support. Black hat SEO exploits and amplifies flaws in search engine policies to capture more user traffic; such tactics mostly deceive the search engine, and search engine companies generally neither support nor encourage them. This article is about white hat SEO. So what can white hat SEO do?

1. Set the site's title, keywords, and description carefully so that they reflect the site's positioning and search engines understand what the site is about;

2. Optimize the site's content: make the content correspond to the keywords, and increase keyword density;

3. Set up the site's robots.txt file sensibly;

4. Generate a search-engine-friendly site map;

5. Add external links and promote the site on other websites.
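As a sketch of items 3 and 4, a minimal robots.txt (the domain and path here are purely illustrative) tells crawlers which directories to skip and where the site map lives:

```text
User-agent: *
Disallow: /admin/
Sitemap: https://www.example.com/sitemap.xml
```

The file must sit at the site root; the Sitemap line is how the "spider" discovers the site map from item 4.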

III. Front-end SEO

Through thoughtful site structure and layout and optimized page code, the front-end pages can be understood not only by users in a browser but also by the "spider".

(1) Site structure and layout optimization: keep it as simple as possible and get straight to the point; a flat structure is best.

Generally speaking, the fewer levels a site's structure has, the more easily a "spider" can crawl it and the more likely it is to be indexed. On most small and medium sites, once the directory structure goes deeper than three levels, "spiders" become reluctant to crawl further down ("what if I get lost in the dark?"). And according to related surveys, if a visitor has not found the information they need after three clicks, they are likely to leave. So a three-level directory structure is also what the user experience demands. To achieve this we need to:

1. Control the number of links on a page

The home page carries the most weight on a site. If it has too few links, there is no "bridge" and the "spider" cannot crawl on to the inner pages, which directly reduces how many pages get indexed. But the home page cannot have too many links either: once there are too many links with no real substance, the user experience suffers, the home page's weight is diluted, and the results are poor.

Therefore, for a small or medium-sized business website, it is recommended to keep home page links under 100. Links can include page navigation, footer navigation, anchor-text links, and so on; note that links should be built around a good user experience and around guiding users to the information they need.

2. Flatten the directory hierarchy so that, as far as possible, a "spider" can reach any inner page of the site within three jumps. A flat directory structure looks like: "Plants" -> "Fruit" -> "Apple", "Orange", "Banana" — bananas can be reached within three levels.

3. Navigation optimization

Navigation should use text wherever possible. It can be combined with image navigation, but the image markup must be optimized: the <img> tag needs "alt" and "title" attributes to tell the search engine where the navigation leads, so that even if the image fails to display, the user still sees the prompt text.
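A minimal sketch of an optimized image navigation item (the link target and file name are hypothetical):

```html
<a href="/fruit/" title="Browse the fruit catalog">
  <!-- alt shows if the image fails to load; title hints on hover -->
  <img src="/images/nav-fruit.png" alt="Fruit" title="Fruit">
</a>
```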

In addition, breadcrumb navigation should be added to every page. The benefits: for the user experience, it lets visitors see where the current page sits within the whole site, helps them quickly grasp how the site is organized, gives them a better sense of place, and provides links back to each parent page for easy navigation. For the "spider", it makes the site structure clear and adds many internal links, which eases crawling and reduces the bounce rate.
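A breadcrumb trail for the flat "Plants" example above might be marked up like this (the URLs are made up):

```html
<nav class="breadcrumb">
  <a href="/">Home</a> &gt;
  <a href="/plants/">Plants</a> &gt;
  <a href="/plants/fruit/">Fruit</a> &gt;
  <span>Apple</span><!-- current page: plain text, not a link -->
</nav>
```

Each ancestor level is a real link, so both the visitor and the "spider" can climb back up the hierarchy from any page.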

4. The site's structural layout — details that must not be neglected

Page header: the logo and main navigation, plus user information.

Page body: the main text on the left, including breadcrumb navigation and the article itself; popular and related articles on the right. Benefits: they retain visitors and keep them on the site, and for "spiders" these articles are related links that strengthen the page's relevance and boost its weight.

Page footer: copyright information and links.
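The three-part layout above maps naturally onto semantic HTML5 elements — a minimal skeleton, with placeholders for the content each region holds:

```html
<body>
  <header>
    <h1>Site logo</h1>
    <nav><!-- main navigation --></nav>
  </header>
  <main>
    <article><!-- breadcrumb navigation + body text --></article>
    <aside><!-- popular and related articles --></aside>
  </main>
  <footer><!-- copyright information and links --></footer>
</body>
```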

A special note on pagination: the recommended form is "First 1 2 3 4 5 6 7 8 9 [drop-down box]", so that the "spider" can jump straight to any page by its number, and the drop-down box lets the user jump directly to a chosen page. The form "First page / Next / Last" is not recommended: when there are many pages, the "spider" has to crawl down step by step by step to reach them all, which is exhausting, and it will easily give up.
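A sketch of the recommended pagination markup (the `/list/` URL scheme is hypothetical):

```html
<nav class="pagination">
  <!-- direct links let the spider reach any page in one jump -->
  <a href="/list/1">First</a>
  <a href="/list/1">1</a>
  <a href="/list/2">2</a>
  <a href="/list/3">3</a>
  <!-- drop-down box for users to jump to a chosen page -->
  <select onchange="location.href = '/list/' + this.value">
    <option value="1">Page 1</option>
    <option value="2">Page 2</option>
    <option value="3">Page 3</option>
  </select>
</nav>
```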

5. Control the page size, reduce HTTP requests, and improve the site's loading speed.

A page should ideally stay under 100 KB; if it is too large, it loads slowly. A slow page makes for a poor user experience and will not retain visitors, and once a request times out, the "spider" will leave as well.

(2) Page code optimization

1. The <title> tag: emphasize only the key points. Put the important keywords toward the front and do not repeat them, and try to give every page's <title> distinct content.

2. The <meta keywords> tag: list just a few of the page's important keywords; remember not to stuff it.

3. The <meta description> tag: the page description. It should concisely summarize the page's content; remember not to make it too long or stuff it with keywords, and make it different on every page.
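Putting items 1–3 together, a page's head might look like this (the site name, keywords, and description are made up for illustration):

```html
<head>
  <meta charset="utf-8">
  <!-- key phrase first, unique per page -->
  <title>Fresh Fruit Delivery | Example Grocer</title>
  <!-- a few important keywords, no stuffing -->
  <meta name="keywords" content="fruit delivery, fresh apples, online grocer">
  <!-- one concise summary, different on every page -->
  <meta name="description" content="Example Grocer delivers fresh, seasonal fruit to your door within 24 hours.">
</head>
```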

4. Tags in the <body>: make the code as semantic as possible — use the appropriate tag in the appropriate place, and do the right thing with the right tag, so that both human readers and the "spider" can grasp the structure at a glance. For example, <h1>–<h6> are for headings, the <nav> tag is for the page's main navigation, and so on.

5. The <a> tag: in-page links should have a "title" attribute explaining them, for both visitors and the "spider". For external links — links to other sites — add the rel="nofollow" attribute to tell the "spider" not to crawl them, because once the "spider" crawls off to an external link, it will not come back.
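The two cases side by side (the targets are hypothetical):

```html
<!-- Internal link: "title" describes the destination -->
<a href="/plants/fruit/" title="Browse our fruit catalog">Fruit</a>

<!-- External link: rel="nofollow" tells the spider not to follow it -->
<a href="https://www.example.com/" rel="nofollow">Partner site</a>
```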

6. Use a heading tag for the body's main title — the "spider" regards <h1> as the most important, and if you do not like its default style you can restyle it with CSS. Try to use <h1> for the main body title and <h2> for subheadings, and do not use h heading tags casually anywhere else.

7. The <br> tag: use it only for line breaks within plain text, for example:

<p>
  First line of text<br>
  Second line of text<br>
  Third line of text
</p>

8. Tables should use the <caption> tag for the table title.
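For example (the figures are invented):

```html
<table>
  <caption>Fruit prices</caption><!-- the table's title, visible to users and the spider -->
  <tr><th>Fruit</th><th>Price</th></tr>
  <tr><td>Apple</td><td>$1.20</td></tr>
  <tr><td>Banana</td><td>$0.80</td></tr>
</table>
```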

9. <img> tags should be described with an "alt" attribute.

10. The <strong> and <em> tags: use them when you need emphasis. The <strong> tag receives a high degree of attention from search engines; it can highlight keywords and mark out important content. The <em> tag's emphasis is second only to <strong>.

The <b> and <i> tags: these are only used for display effects and have no effect on SEO.
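The contrast in one snippet:

```html
<!-- Semantic emphasis the search engine pays attention to -->
<p>Our <strong>fresh fruit</strong> is delivered <em>within 24 hours</em>.</p>

<!-- Purely visual bold/italic; carries no SEO weight -->
<p>Terms apply; see the <b>FAQ</b> for <i>details</i>.</p>
```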

11. Do not indent text with the special entity &nbsp;; set indentation with CSS instead. Likewise, do not produce the copyright symbol with the special entity &copy;; you can type the copyright symbol © directly with an input method (in a Chinese IME, spell "banquan" and pick © from candidate number 5).

12. Use CSS layout cleverly to put the HTML code for the important content first: content near the top of the code is considered the most important and is read by the "spider" first, so its keywords get crawled with priority.

13. Do not output important content with JS, because the "spider" does not understand it.

14. Use iframe frames as sparingly as possible, because "spiders" generally do not read their contents.

15. Use display:none with caution: for text content you do not want to show, set a z-index or position it outside the browser's visible area instead, because search engines filter out the contents of display:none.
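A sketch of the off-screen technique the point above suggests (the class name is made up):

```html
<style>
  /* Avoid display:none — search engines filter out its contents.
     Instead, move the text off-screen so it stays in the indexed markup. */
  .visually-hidden {
    position: absolute;
    left: -9999px;
  }
</style>
<span class="visually-hidden">Descriptive text kept for the spider</span>
```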

16. Keep your code streamlined

17. If JS code performs DOM operations, place it as far down as possible — after the HTML content, just before the closing body tag.
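So a page's skeleton ends up like this (the script path is hypothetical):

```html
<body>
  <main><!-- page content comes first, so the spider reads it first --></main>

  <!-- DOM-manipulating scripts go last, just before the body closes -->
  <script src="/js/app.js"></script>
</body>
```

This also means the DOM elements the script manipulates already exist by the time it runs.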
