Front-end SEO


White Hat SEO

White hat SEO refers, as the name suggests, to optimization done out in the open: legitimate methods that comply with the published guidelines of the mainstream search engines. It is the opposite of black hat SEO and is widely regarded as the best SEO practice in the industry, because it avoids all operational risk and any conflict with search engine policy; it also represents the highest professional standard for SEO practitioners.

The significance of white hat SEO is its focus on long-term benefit. It may take three to five years, or even longer, to pay off, but if you avoid cheating and keep at it, then barring accidents your site should eventually earn good traffic and rankings, and with that traffic comes profit; your dependence on search engines also decreases over time. For long-term development, the white hat approach is the one to recommend.

From the black hat point of view, their choice also makes sense: most black hat SEO uses programs to generate millions of pages and lets the spiders loose on them. Even if the site is penalized within a month or two, it may already have earned tens of thousands, so for them the investment pays off, but the site will soon be banned and they have to spend time building another one. Why not use white hat methods instead and build a genuinely useful website? A white hat site does not have to fear being blocked by search engines, you can proudly say "this site is mine," and ten or twenty years later it may still be bringing you profit.

Every SEO practitioner should master these 30 white hat optimization methods:

1. Build the site to Web 2.0 standards, preferably passing validation.
2. Generate pseudo-static pages, or better yet fully static pages; static pages are indexed more easily by search engines.
3. Focus on original content.
4. Keep page-to-page similarity below 70%.
5. Keep pages tidy and well organized, with the text divided into reasonable sections.
6. Update the site regularly, daily or on a fixed schedule.
7. Expand page content around the main keywords and do not stray from the site's overall theme.
8. Use English keywords in URLs.
9. Include keywords in the page title.
10. Include keywords (1-3) in the page keywords tag.
11. Include keywords in the description tag.
12. Distribute keywords naturally through the content.
13. Ideally, keywords also appear in comments within the content.
14. Include keywords in the first and last parts of the content.
15. Add keywords to the H1 and H2 tags.
16. Link anchor text contains keywords.
17. Image file names contain keywords.
18. Add keywords to image alt attributes.
19. Keep page keyword density at about 6-8%.
20. Mark keywords in bold or italics.
21. Find high-PR sites for inbound links.
22. Inbound links should come from pages with related content.
23. Inbound-link anchor text contains keywords.
24. The anchor text sits within the page content, not in isolation.
25. Relevant keywords appear around the anchor text.
26. The content of external pages linking to the site should be as relevant to the keywords as possible.
27. Inbound links should grow steadily; avoid sudden large increases or decreases over a short time.
28. A page carrying inbound links should not contain more than 100 links.
29. Inbound links preferably come from different IP addresses.
30. Diversify anchor text (for example: Enterprise Win, Enterprise Win marketing, Enterprise Win network marketing planning).

SEO optimization 

1. Static pages. Render information pages, channel pages, and the site's homepage as static pages; this helps search engines index them faster and more completely.

2. Page title keyword optimization. The page title should include the title of the content, the site name, and the relevant keywords.

3. Meta tag optimization. Once an important search engine optimization method, meta tags are no longer a decisive ranking factor, but they still should not be ignored. They mainly include the meta description and meta keywords. Keep keyword density moderate, usually 2%-8%: the keywords should appear several times on the page, within reason, while avoiding keyword stuffing.
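A minimal sketch of how these tags might look in a page's head (the text values are made up for illustration):

<meta name="keywords" content="front-end SEO, white hat SEO, page optimization">
<meta name="description" content="An overview of front-end SEO techniques covering page structure, keywords, and performance.">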

4. Create a Sitemap for Google. Google's Sitemap extends the original robots.txt idea: it records the whole site's structure in XML format for Google to read, so the search engine can cover the site's content more comprehensively. It can be generated with the Sitemap tool Google provides (a technician is required): Https://www.google.com/webmaster ... emap-generator.html. A more comprehensive sitemap can also be produced by technical staff.
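A minimal sketch of a sitemap file following the standard sitemap XML protocol (the URL and values are made up for illustration):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <url>
        <loc>http://www.example.com/</loc>
        <lastmod>2015-01-01</lastmod>
        <changefreq>daily</changefreq>
        <priority>1.0</priority>
    </url>
</urlset>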

5. Image keyword optimization. Do not ignore the alternative text (alt) of images; besides SEO, it also gives visitors an explanatory fallback when the image cannot be displayed.

6. Avoid deeply nested tables. If tables are nested too deeply, search engines usually read only about three levels of nesting, so deeper content may be missed.

7. Rebuild the site with Web standards. Make the site's code conform to the HTML 4.0 or XHTML 1.0 specification as far as possible. Rebuilding with XHTML+CSS reduces table-based layout and redundant code, improves the pages' extensibility and compatibility, and lets more browsers support them.

8. Flat site structure. The directory and content structure should preferably be no more than three levels deep; if more levels are needed, use subdomains to adjust and simplify the hierarchy. In addition, the canonical approach to directory naming is to use English words rather than pinyin.

9. Reasonable page size. A reasonable page size improves display speed and makes the page friendlier to search engine spiders. It is also recommended to move JS and CSS into linked external files wherever possible.

10. External file policy. Put JavaScript and CSS in external .js and .css files respectively. This keeps the important page content near the top of the document and reduces file size, helping search engines crawl the important content quickly and accurately. Also use font and formatting tags sparingly; CSS definitions are recommended instead.

11. External links. Exchange links with sites whose topics are related to yours, preferably sites with high PR values. If a site provides outbound links related to its subject, search engines treat it as having rich, relevant content, which also helps ranking; for example, various investment sites linking with investment and financing sites. Avoid large-scale link exchanges that ignore quality; for search engines, fewer but better links are preferable.

12. Site map. The site's own site map is an important factor in getting the site indexed more completely. Make a text-based site map that lists all the site's sections and sub-sections. The three key elements of a site map, text, links, and keywords, all help search engines crawl the main pages. Dynamically generated catalog sites in particular need a site map.

13. Image hotspots. Apart from AltaVista and Google, which explicitly support image-hotspot (image map) links, other engines do not support them for now; when a spider encounters this structure, it cannot identify the links. So try not to use image map links.

14. Flash. Because Flash contains no text information, it should be used for feature demos and advertising, and sparingly for site sections and pages.

15. JS scripts. In browsers that do not support JavaScript, the <noscript> tag plays an important role, and it also helps search engine spiders.
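A minimal sketch of a <noscript> fallback (the file name and fallback text are made up for illustration):

<script src="app.js"></script>
<noscript>
    <p>This page works best with JavaScript enabled. You can also <a href="/sitemap.html">browse the site map</a>.</p>
</noscript>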

16. Frames. The <frame> tag is ignored by search engines, so avoid it as far as possible. If you must use frames, use the <noframes> tag correctly: inside the <noframes>...</noframes> area, include a link to the frame page or descriptive text containing keywords, and make the same keyword text also appear outside the frame area.
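A minimal sketch, assuming a frameset page (the file names and text are made up for illustration):

<frameset cols="200,*">
    <frame src="menu.html">
    <frame src="content.html">
    <noframes>
        <p>Front-end SEO guide: <a href="content.html">read the full content here</a>.</p>
    </noframes>
</frameset>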

17. Internal links between information pages, such as "related articles" and "recommended articles" blocks, help improve the site's rankings and PR value.

--------------------------Website Structure Layout optimization

A flat structure is recommended. Keep the site's directory depth as small as possible; for small and medium-sized sites, no more than three levels. Pay attention to the following aspects when flattening the structure:

Control the number of links on the homepage (the homepage carries the highest weight). The homepage must have effective links, because the crawler reaches the inner pages through the homepage's links; without valid links, the number of pages indexed is directly affected. At the same time, the homepage should not carry too many links: do not pile inner-page links onto the homepage, since an excess of links hurts the user experience. For a small or medium-sized site, keep the homepage to about 100 links or fewer; they can be page navigation links or anchor links.

Flatten the directory hierarchy. Try to let a spider reach any internal page within three jumps. For example, a site can use a three-level structure of home page, column pages, and content pages, a flat structure rather than a deep one. A site about plants, say, could use the three levels home page → plant-category column page → individual plant content page.

Navigation SEO optimization. Navigation can be divided into main navigation and secondary navigation. Navigation should use text; if you use image navigation for the sake of the user experience, set the necessary alt and title attributes on the images. Secondly, use breadcrumb navigation in the design: it lets users understand their current location and how the site's content is organized, and it is convenient for crawlers.

The role of breadcrumb navigation:

    1. Lets users know their current location and where the current page sits within the whole site.
    2. Reflects the site's architectural hierarchy, helping users quickly learn and understand how the site's content is organized, which gives them a good sense of place.
    3. Provides quick access to higher levels, making navigation easier for the user.
    4. Google has integrated breadcrumb navigation into its search results, so optimizing the name of each breadcrumb level and using keywords there helps SEO; breadcrumb trails also improve the user experience.
    5. User friendliness: breadcrumbs mainly give users a secondary way to navigate a site. On a large multi-level site, providing a breadcrumb path on every page lets users locate the parent directory more easily and guides them through the site.
    6. Fewer clicks to go back: users can return to a higher-level page without using the browser's "Back" button or the site's main navigation.
    7. Little screen space: because breadcrumbs are usually laid out horizontally and styled simply, the trail takes little space on the page, with little or no negative impact on content density.
    8. Lower bounce rate: breadcrumbs are a great way to entice first-time visitors to keep browsing after landing on a page. A user who finds a page through Google and sees a breadcrumb trail may click up to a parent page to browse a topic of interest, which in turn reduces the site's overall bounce rate.
    9. Helps spiders (such as Baidu's) crawl the site: the spider can simply follow the chain of links, which is very convenient.
    10. Breadcrumbs strengthen the site's internal link structure, greatly increasing internal links and improving the user experience.

In addition, keep the page size under 100 KB.
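A minimal markup sketch of a breadcrumb trail (the page names and URLs are made up for illustration):

<div class="breadcrumb">
    <a href="/">Home</a> &gt;
    <a href="/plants/">Plants</a> &gt;
    <span>Succulents</span>
</div>
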
Code SEO Optimization
    • The <title> tag holds the page title, and it should be different on every page.
    • <meta keywords>: list a few important keywords.
    • <meta description>: a high-level summary of the page content.
    • The <br /> tag should only be used for line breaks within a block of text, for example inside a <p> element, rather than as bare <br /> tags used to separate content.
    • Semantic HTML. For example, to build a navigation bar we could use div and span tags:
<div class="nav">
    <span>Course</span>
    <span>|</span>
    <span>Q&A</span>
    <span>|</span>
    <span>Community</span>
</div>

But the code above is not semantic (div and span are the least semantic tags in HTML). The recommended practice is to use ul and li tags, and then achieve the same appearance with appropriate CSS:

<ul class="nav">
    <li>Course</li>
    <li>Q&A</li>
    <li>Community</li>
</ul>

.nav li {
    float: left;
    list-style: none;
    display: block;
    margin: 0 5px;
    border-right: 1px solid #000;
}
    • Add a description to <a> tags (the title attribute); for links that point to external websites, use the rel="nofollow" attribute to tell the crawler not to follow them.
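A one-line sketch (the URL and link text are placeholders):

<a href="http://www.example.com/" title="Example site" rel="nofollow">Example site</a>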
    • Use <h1> for the main text heading and <h2> for subtitles, and use CSS for any explicit styling. Remember: do not use <h1> tags in unimportant places.
    • Optimization of the <table> tag: use the <caption> tag to give the table a title. For example:
<table>
    <caption>Table title</caption>
    <tr>
        <th>Quarter</th>
        <th>Sales</th>
    </tr>
    <tr>
        <td>1</td>
        <td>33665.25</td>
    </tr>
    <tr>
        <td>2</td>
        <td>21215.99</td>
    </tr>
</table>
    • To emphasize important content on the page, use the <strong> tag and avoid the <b> tag (which is unfriendly to search engines). The weight of <em> is second only to <strong>. Use <b> and <i> only when you merely want the bold or italic visual effect.
    • <img> tags should use the alt attribute to describe the image.

Tips
    1. Put the important HTML code first; place less important parts such as ads at the end of the document, then use CSS to float the ad div into position.
    2. Do not output important content with JS.
    3. Use iframe frames as little as possible.
    4. For elements you do not want to display for the moment, position them off-screen or layer them with z-index rather than using display:none;, because spiders filter out the content of elements with display:none (a sketch follows this list).
    5. Try to streamline your code.
    6. External links: add rel="nofollow" so that spiders do not crawl off to other sites without coming back.
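A minimal CSS sketch of Tip 4 (the class names are made up for illustration):

/* Spiders may ignore content hidden with display:none. */
.hidden-bad { display: none; }

/* Keeping the element rendered but moved off-screen (or layered away with z-index)
   leaves its content visible to spiders. */
.hidden-ok {
    position: absolute;
    left: -9999px;
    z-index: -1;
}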

-----------------------------------

1. Reduce HTTP requests

  Basic principle:

Communication between the browser (client) and the server takes a lot of time, and the problem is particularly noticeable when network conditions are poor.

The process of a normal HTTP request: you type "Www.xxxxxx.com" in the browser and press Enter; the browser connects to the server that the URL points to and sends it a request; the server receives the request and returns the corresponding response; the browser receives the server's response and interprets and executes the data.

When the requested page contains many images, CSS and JS files, or even audio, the browser repeatedly establishes and releases connections to the server. This wastes resources, and every HTTP request places a performance burden on both the server and the browser.

At the same connection speed, downloading one 100 KB image is faster than downloading two 50 KB images. So reduce the number of HTTP requests.

  Workaround:

Merge images (CSS sprites), merge CSS and JS files, and use techniques such as lazy loading to optimize images.
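A minimal sketch of one modern way to lazy-load images, using IntersectionObserver (older pages did the same thing with scroll handlers; the data-src attribute and the lazy class are conventions made up here for illustration):

// Images start empty and carry the real URL in data-src:
// <img data-src="photo.jpg" class="lazy" alt="...">
var observer = new IntersectionObserver(function (entries) {
    entries.forEach(function (entry) {
        if (entry.isIntersecting) {                              // the image has scrolled into view
            entry.target.src = entry.target.getAttribute('data-src');
            observer.unobserve(entry.target);                    // load each image only once
        }
    });
});
document.querySelectorAll('img.lazy').forEach(function (img) {
    observer.observe(img);
});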

2. Understand repaint and reflow correctly

Note: repaint and reflow mean redrawing and re-layout; please allow me to show off my limited stock of English words. 囧

  Basic principle:

A repaint occurs when an element's appearance changes without changing its layout (width and height), for example when visibility, outline, or background color changes.

A reflow occurs when a DOM change affects an element's geometry (width and height). The browser recalculates the element's geometry and invalidates the affected part of the render tree, and it also re-examines the visibility of other nodes in the DOM tree, which is why reflow is expensive. Triggers include resizing the window, changing the text size, content changes, and changes to style properties. If reflows happen too frequently, CPU usage climbs steadily, so front-end developers need to understand repaint and reflow.

  Ways to reduce performance impact:

As mentioned above, changing a node's style by setting individual style properties causes a reflow for each change, so it is better to switch styles by changing a class. For animated elements, set the position property to fixed or absolute so that they do not affect the layout of other elements; if the functional requirements do not allow fixed or absolute positioning, weigh animation smoothness against speed.
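A minimal sketch of the class-switching approach (the element id and class name are made up for illustration):

// Anti-pattern: each style write below can trigger a separate reflow.
var box = document.getElementById('box');
box.style.width = '200px';
box.style.height = '120px';
box.style.margin = '10px';

// Better: define the final state in one CSS class and switch it once.
// .box-expanded { width: 200px; height: 120px; margin: 10px; }
box.className += ' box-expanded';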

In short, since reflow is sometimes unavoidable, the goal is to limit the scope of its impact.

3. Reduce DOM manipulation

Basic principle:

The cost of DOM manipulation is high, which is often a performance bottleneck in Web applications.

DOM access is inherently slow. High Performance JavaScript uses this analogy: "Think of the DOM as an island and JavaScript (ECMAScript) as another island, connected by a toll bridge." Every visit to the DOM pays a bridge toll, and the more visits you make, the more you pay. It is therefore generally recommended to cross the bridge as few times as possible.

Workaround:

Modifying and accessing DOM elements causes repaints and reflows, and performing DOM operations inside a loop is even worse. So use JavaScript variables sensibly to hold content, be mindful of the overhead of loops over large numbers of DOM elements, and write to the DOM once at the end of the loop.

Reduce queries and modifications of DOM elements; when you do query an element, assign it to a local variable.
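A minimal sketch under these rules (the element id and the data are made up for illustration):

// Anti-pattern: touching the DOM on every loop iteration, e.g.
//     document.getElementById('list').innerHTML += '<li>' + items[i] + '</li>';

// Better: cache the element in a local variable and write once after the loop.
var items = ['Course', 'Q&A', 'Community'];
var list = document.getElementById('list');
var html = '';
for (var i = 0; i < items.length; i++) {
    html += '<li>' + items[i] + '</li>';
}
list.innerHTML = html;   // a single DOM write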

  Note: in IE, the :hover pseudo-class slows down the response.

4. Using JSON format for data exchange

Basic principle:

JSON is a lightweight, completely language-independent text format that is ideal for data interchange. At the same time, JSON is a native JavaScript format, which means that working with JSON data in JavaScript does not require any special APIs or toolkits.

JSON serialization typically produces smaller payloads than XML serialization, which is why well-known sites such as Facebook use JSON as their data-exchange format.

Working with JSON in JS:

In JSON, there are two kinds of structures: objects and arrays.

1. An object starts with "{" and ends with "}". Each name is followed by a ":", and name/value pairs are separated by "," (commas). Names are enclosed in quotation marks; string values must also be quoted, while numeric values need not be. For example:

var obj = {"name": "Darren", "age": 28, "location": "Beijing"};

2. An array is an ordered collection of values. An array starts with "[" and ends with "]", with values separated by "," (commas). For example:

var jsonlist = [{"name": "Darren", "age": 28, "location": "Beijing"}, {"name": "Weidong.nie", "age": 26, "location": "Hunan"}];

Working with these object and array literals is convenient and efficient. If you know the JSON structure in advance, passing data as JSON is wonderful, and you can write very practical, readable code. If you are a pure front-end developer, you will love JSON.
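A minimal sketch of converting between JavaScript values and JSON text with the built-in JSON object (reusing the jsonlist variable above):

var payload = JSON.stringify(jsonlist);   // object/array -> JSON string, e.g. for a request body
var restored = JSON.parse(payload);       // JSON string -> JavaScript objects
console.log(restored[0].name);            // "Darren"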

5. Efficient use of HTML tags and CSS styles

  Basic principle:

HTML is a language for describing web pages with tags. As a qualified front-end developer, you need to know what its common tags mean (for SEO) and what their attributes do (for presentation).

CSS stands for Cascading Style Sheets. If you think of a page as a person, HTML is the skeleton and CSS is the clothing; a person's taste can be seen at a glance from their clothes.

A professional front-end developer is also a good refactorer, because pages often contain all kinds of unreasonable nesting and repeatedly defined CSS styles. I am not asking you to refactor every page, but I do hope that when you run into such cases you fix them. HTML like this:

<table><tr><td>
<table><tr><td>
...
</td></tr></table>
</td></tr></table>

Or a CSS like this:

body .box .border ul li p strong span { color: #000; }

All of this is a very bad way to use HTML and CSS.

Correct understanding:

HTML is a markup language; before you can pick a reasonable HTML tag you must understand its categories, such as flow elements, metadata elements, and phrasing elements. The foundation is knowing block-level vs. inline elements, the box model, and basic SEO.

CSS is used to render the page, and rendering has its own efficiency concerns. CSS selectors are matched from right to left; sorted by matching cost from lowest to highest, they are:

ID selector: #box
Class selector: .box
Tag selector: div
Pseudo-class and pseudo-element: a:hover

When something triggers a reflow on the page, inefficient selectors still add extra overhead, so avoid them.
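A minimal sketch of replacing the deep descendant selector shown earlier with a direct class selector (the .highlight class name is made up for illustration):

/* Expensive: a long descendant chain, matched right to left for every span. */
body .box .border ul li p strong span { color: #000; }

/* Cheaper: give the target element its own class and match it directly. */
.highlight { color: #000; }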

6. Use CDN acceleration (content delivery network)

Basic principle:

CDN stands for Content Delivery Network, that is, a content distribution network.

"The basic idea is to avoid, as much as possible, the bottlenecks and weak links on the Internet that may affect the speed and stability of data transmission, so that content is delivered faster and more stably. By placing node servers throughout the network and forming a layer of intelligent virtual network on top of the existing Internet, the CDN system can redirect a user's request in real time to the nearest service node, based on network traffic, each node's connections and load, and the distance to and response time of the user." (Baidu Encyclopedia)

The passage above is a bit hard to digest, so let me retell it as a story (the story's source is unclear, ^_^):

Everyone knows about ancient warfare: because transportation was undeveloped in ancient times, counter-attacks against foreign invasions often could not be launched in time, and by the time the court had conscripted soldiers and marched them to the border, the invaders had long since vanished, which greatly frustrated the ancient emperors. Later the emperors got smart: they stationed large numbers of troops at the border in advance, farming in peacetime and fighting in wartime, and this strategy proved very effective.

Deficiencies:

Poor real-time freshness is the CDN's fatal weakness. As demand for CDNs grows, this flaw keeps being improved so that pages on the origin server stay in sync with the copies on replica or cache servers. The remedies are to push new content from the origin server to the caches as soon as the content changes, or, when access to the content increases, to replicate the origin's content to the cache servers as close to real time as possible.

7. Put CSS and JS in external files, with CSS in the head and JS at the tail

Basic principle:

Note: this is a very basic rule that must be followed, but I include it for the completeness of the article, hey.

The benefits of introducing external files are obvious, and it is necessary to do so when the project is slightly more complex.

Easy to maintain, easy to expand, easy to manage and reuse.

The right way:

JavaScript is the overlord of the browser. Why? Because while the browser is executing JavaScript it can do nothing else: at every <script>, the page waits for the script to be parsed and executed (whether the JavaScript is inline or external) and only continues rendering after execution finishes. This is JavaScript's blocking behavior.

Because of this blocking behavior, it is recommended to put JavaScript just before the </body> tag; this effectively limits the impact of the blocking and lets the page's HTML structure render sooner.

The HTML specification clearly states that CSS should be included in the <head> of the page.
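A minimal skeleton of this placement (style.css and app.js are made-up file names):

<!DOCTYPE html>
<html>
<head>
    <!-- Stylesheets go in the head so the page is styled as it renders. -->
    <link rel="stylesheet" href="style.css">
</head>
<body>
    <p>Page content...</p>
    <!-- Scripts go just before </body> so they do not block rendering. -->
    <script src="app.js"></script>
</body>
</html>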

8. Streamline CSS and JS files  

Basic principle:

One very important rule has not been mentioned yet: compressing CSS and JavaScript directly reduces the size of the downloaded files. My personal choice is the YUI Compressor, whose features are: removing comments, removing extra whitespace, micro-optimizations, and identifier substitution.

YUI Compressor is a Java program. If you are familiar with Java you can quickly get started with yuicompressor.jar; if you are not, that is fine too, you can still use YUI Compressor. Its usage is described below.

YUI Compressor's configuration and use:

Configure the usage environment first:

1. First make sure the JDK is installed on your computer.

2. Configure the necessary environment variables (the details are omitted here; if you do not know how to set them, search for it).

3. At the CMD prompt, type javac to test whether the installation succeeded.

To use it, open CMD and change to the directory that contains yuicompressor.jar. Taking my own yuicompressor-2.4.2.jar as an example:

1. Compress JS

java -jar yuicompressor-2.4.2.jar api.js > api.min.js

2. Compress CSS

java -jar yuicompressor-2.4.2.jar style.css > style.min.css

  

Of course, there are other, clumsier ways to use it; interested readers can explore them on their own.

9. Compress images and use image sprite technology

Basic principle:

Note: image compression and image sprites are really two separate techniques, but since both are about optimizing images I cover them together.

Nowadays, because of the division of labor, professional front-end engineers rarely slice images themselves, but you should still know a little about image compression. The usual methods are:

1. Reduce the image resolution;

2. Change the image format;

3. Reduce the image's save quality.

Image sprite technology, on the other hand, is directly related to our work. Whether an image is referenced from CSS or from the HTML structure, it produces an HTTP request, and the first rule of front-end optimization is to reduce the number of requests; the most direct and effective way is the CSS sprite. A sprite puts many small images into one large image and uses CSS to display only the part that is needed.

I will not go into the operational details of sprites here; there is plenty of related material online.
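A minimal CSS sprite sketch, assuming a made-up sprite sheet icons.png that contains several 16x16 icons side by side:

/* One background image shared by all icons: a single HTTP request. */
.icon {
    display: inline-block;
    width: 16px;
    height: 16px;
    background-image: url(icons.png);
}
/* Each icon only shifts the background position within the same image. */
.icon-home   { background-position: 0 0; }
.icon-search { background-position: -16px 0; }

Markup such as <span class="icon icon-home"></span> then shows a single icon while the whole sheet is downloaded once.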

10. Take care to control cookie size and contamination

Basic principles and methods of use:

For basic and advanced knowledge of cookies, see an article I wrote, "JavaScript Operation Cookie."

Because cookies are stored in a local file and the browser reads and sends the matching cookies with every request, it is recommended to remove unnecessary cookies and keep cookies as small as possible to reduce the impact on response time;

When cookies need to work across subdomains, take care to set the cookie at the appropriate domain level so that subdomains are not affected unintentionally;

Cookies have a life cycle, so set a reasonable expiration time; an expiry that neither keeps cookies around too long nor removes them prematurely improves the user's experience.
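A minimal sketch of setting a cookie with an expiry and a domain (the cookie name, value, and domain are made up for illustration):

var expires = new Date();
expires.setDate(expires.getDate() + 7);        // expire in 7 days
document.cookie = 'theme=dark' +
    '; expires=' + expires.toUTCString() +
    '; path=/' +
    '; domain=.example.com';                   // shared across subdomains of example.com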
