SEO Course Notes (2)

Source: Internet
Author: User

Title:
Web page optimization starts with the title. In search results, the first line of each listing is the page title. Likewise, when a page is opened in a browser, its title appears in the browser's title bar. The title is therefore the core of a page. Note the following when writing a title:


1. Keep the title short, concise, and descriptive. It should contain keywords rather than only the company name, but not too many of them: no more than three phrases. The title of an enterprise website is usually company name + keyword.



2. The first few words carry the most weight with search engines, so place the keyword first.

3. Organize the title into short sentences or phrases that follow normal syntax and reading habits, and avoid a meaningless list of phrases.
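The three title guidelines above can be sketched as a small checker. This is a hypothetical helper, not part of any standard tool; the function name and the phrase-splitting heuristic are assumptions for illustration.

```python
def check_title(title: str, keywords: list[str], max_phrases: int = 3) -> list[str]:
    """Return warnings for a proposed <title> against the guidelines above."""
    warnings = []
    # crude phrase count: split on common separators
    phrases = [p for p in title.replace(",", "|").replace("-", "|").split("|") if p.strip()]
    if len(phrases) > max_phrases:
        warnings.append(f"too many phrases: {len(phrases)} > {max_phrases}")
    lowered = title.lower()
    hits = [k for k in keywords if k.lower() in lowered]
    if not hits:
        warnings.append("no keyword found in title")
    elif not lowered.startswith(hits[0].lower()):
        # the first few words matter most, so the keyword should lead
        warnings.append("keyword is not at the start of the title")
    return warnings
```

A "keyword + company name" title with the keyword first passes cleanly; a title with the company name alone is flagged.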

Keywords:
Keywords tell the search engine that the content of this website centers on these words. The key to writing keywords is that each word can be matched in the page content, which helps ranking. For keyword-writing techniques in the meta tag, refer to the previous article, "Keyword Strategy".

Description:
The description uses short sentences to tell search engines and visitors the main content of the page. In the results of a search on the website's core keywords, the description is usually displayed as several lines of text after the title. The description is generally considered the most important element after the title and keywords. Note the following when writing the description:

1. Keywords should appear in the description and be related to the body content.

2. Follow the same brevity principle: the description, including spaces, should not exceed 200 characters.


3. Add descriptive detail not fully stated in the title and keywords.
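The length and keyword rules above lend themselves to the same kind of check. This `check_description` helper is an illustrative sketch, not a standard API.

```python
def check_description(description: str, keywords: list[str], max_len: int = 200) -> list[str]:
    """Warn when a meta description breaks the guidelines above."""
    warnings = []
    if len(description) > max_len:  # spaces count toward the limit
        warnings.append(f"too long: {len(description)} > {max_len} characters")
    lowered = description.lower()
    missing = [k for k in keywords if k.lower() not in lowered]
    if missing:
        warnings.append("keywords missing from description: " + ", ".join(missing))
    return warnings
```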

Other Meta Tags:
To restrict how search engines crawl your content, you can use the robots meta tag:
<meta name="robots" content="all | none | index | noindex | follow | nofollow">
all: the page will be indexed and its links followed;
none: the page will not be indexed and its links will not be followed;
index: the page will be indexed;
follow: links on the page will be followed;
noindex: the page will not be indexed, but links on the page may still be followed;
nofollow: the page may be indexed, but links on the page will not be followed.
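The value combinations above resolve to a pair of (index, follow) flags. The `parse_robots` helper below is a hypothetical sketch of how a crawler might interpret the tag's content attribute.

```python
def parse_robots(content: str) -> tuple[bool, bool]:
    """Resolve a robots meta 'content' value to (index, follow) flags."""
    index, follow = True, True  # crawler defaults when nothing is specified
    for token in (t.strip().lower() for t in content.split(",")):
        if token == "all":
            index, follow = True, True
        elif token == "none":
            index, follow = False, False
        elif token == "index":
            index = True
        elif token == "noindex":
            index = False
        elif token == "follow":
            follow = True
        elif token == "nofollow":
            follow = False
    return index, follow
```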

For the use of the robots file, e-book expert Ping Wensheng wrote a special article, "robots.txt and the robots meta tag", that explains it in detail. Other commonly used meta tags include:

<meta name="author" content="..."> the page's author
<meta name="classification" content="..."> the directory category of the website
<meta name="copyright" content="..."> copyright notice
<meta name="generator" content="..."> the software used to build the site



It is best to write the title and meta tags separately for the home page and for important internal pages, reflecting the distinct content of each topic.
Dynamic web page optimization
A dynamic website is one whose content is updated and maintained through software with a database back end, that is, a content management system (CMS). Programs such as ASP, PHP, ColdFusion, and CGI generate its pages dynamically. Dynamic pages do not actually exist as files on the server; most of their content comes from the database connected to the website, and a page is generated only when a user request supplies a value for the variable part of the URL. Dynamic pages use extensions such as .asp, .php, .cfm, or .cgi rather than the static .html or .htm, and their URLs contain symbols such as "?", "=", "%", "&", and "$". Besides adding interactivity, dynamic technology makes a website easy to maintain and update, which is why many large and medium-sized websites use it.


However, most search engine spiders cannot interpret the "?" symbol, which means dynamic pages are hard for search engines to retrieve and far less likely to be found by users. Therefore, before building a website, get the approach right: use static pages wherever static presentation suffices, and certainly for important pages. At the same time, use rewriting techniques to convert dynamic pages into static-looking ones so that the URL contains no "?", "=", or similar symbols. You can also make other changes to the website to indirectly increase the search engine visibility of dynamic pages. In short, stick to the principle of combining dynamic and static, letting the static carry the dynamic.



Solutions for different technologies:
Dynamic pages built with different programs have corresponding solutions. The author, Karen, has compiled some of them below:


1. CGI/Perl
If you use CGI or Perl on your website, you can use a script to strip all the characters before the environment variable and assign the remaining characters of the URL to a variable, then use that variable in the URL. For pages with built-in SSI (server-side include) content, the main search engines can provide indexing support: pages with the .shtml extension are treated as SSI files, equivalent to ordinary .html files. However, if these pages use the cgi-bin path in their URLs, they may still not be indexed by the search engine.


2. ASP
ASP (Active Server Pages), Microsoft's dynamic web page technology, is used on Microsoft web servers; pages developed with ASP generally use the .asp extension. As long as you avoid the "?" symbol in the URL, most search engines support dynamic pages developed with ASP.

3. ColdFusion
If you use ColdFusion, you need to reconfigure it on your server so that it replaces the "?" symbol with "/" and passes the rewritten value to the URL. The browser then sees a static-looking URL. When the search engine crawls the converted file, it encounters no "?", so it can go on to index the whole dynamic page, keeping your dynamic pages readable to search engines.

4. Apache server
Apache is one of the most popular HTTP server programs. It has a rewrite module, mod_rewrite, which provides URL rewriting: it can convert URLs containing environment variables into URL forms that search engines support. This rewrite function suits page content that rarely needs updating after publication, such as news.
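The kind of URL mapping mod_rewrite performs can be sketched in Python. The URL scheme here (`/news.php?id=123` shown as `/news/123.html`, as a rule like `RewriteRule ^news/(\d+)\.html$ news.php?id=$1` would expose it) and the function name are illustrative assumptions.

```python
import re

def to_static(url: str) -> str:
    """Rewrite a dynamic URL into a static-looking one (sketch)."""
    match = re.fullmatch(r"/(\w+)\.(?:php|asp|cfm|cgi)\?id=(\d+)", url)
    if not match:
        return url  # leave anything else untouched
    page, item_id = match.groups()
    return f"/{page}/{item_id}.html"
```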

Create a static entry:
Guided by the same principle of combining dynamic and static, you can also modify the website to increase the search engine visibility of dynamic pages as much as possible. For example, link dynamic pages from a static home page or site map so that they appear under a static entry point. You can also create a dedicated static gateway/entry page for a dynamic page, link it to the dynamic page, and submit the static gateway page to the search engine.

Create static pages for important content that is relatively fixed, such as a keyword-rich site introduction, user help, and a site map containing links to important pages. Keep the home page static as far as possible, and present all important dynamic content there as text links. Although this increases maintenance work, it is worthwhile from an SEO perspective.

You can also create a static mirror site for your important dynamic content.

Paid search engine inclusion:
Of course, the most direct way to improve search engine visibility for a dynamic, database-driven CMS website is paid inclusion: submit the dynamic pages directly to the search engine directory, or buy keyword ads, to ensure the website is indexed.


Improvements in search engine support for dynamic websites
Search engines have been improving their support for dynamic pages. So far Google, HotBot, Baidu, and others have begun to crawl dynamic pages, including URLs that contain "?". However, to avoid spider traps (script errors that send a crawler into an infinite loop it cannot exit), these engines crawl only the dynamic pages linked from static pages; dynamic pages linked only from other dynamic pages are not crawled. That is, links inside dynamic pages are not followed deeply.

Note the following when using dynamic URLs:



· Do not include session IDs in URLs, and do not use "id" as a parameter name (especially for Google).
For example, the URL of Dangdang's introduction page for the book "Network Marketing Basics and Practice" carries the parameter product_id=493698; had the parameter simply been named id, the search engine might have treated it as a session ID and skipped the page.

· The fewer parameters, the better; try not to exceed two.

· Try not to use parameters in URLs at all, so as to increase the depth and number of dynamic pages crawled.
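The three cautions above can be checked mechanically with the standard library's urllib.parse. The session-like parameter name list below is an assumption for illustration.

```python
from urllib.parse import urlparse, parse_qs

SESSION_LIKE = {"id", "sid", "sessionid", "phpsessid"}  # assumed name list

def audit_dynamic_url(url: str, max_params: int = 2) -> list[str]:
    """Flag the dynamic-URL problems described above."""
    params = parse_qs(urlparse(url).query)
    problems = []
    if len(params) > max_params:
        problems.append(f"{len(params)} parameters (> {max_params})")
    bad = SESSION_LIKE & {name.lower() for name in params}
    if bad:
        problems.append("session-like parameter name(s): " + ", ".join(sorted(bad)))
    return problems
```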

Information Google provides for webmasters:
Baidu webmaster FAQ:

Part 5: The key to search engine optimization: link strategy
Links are the soul of a website. Users reach its full content through hyperlinks, and search engine spiders likewise follow the site's links layer by layer to crawl its information. For search engines, especially Google, the key factor in a website's ranking is how many high-quality external links point to it. These external, or reverse, links are also called inbound links (or backlinks). Outbound links from the site to other websites, and links between pages within the site, also affect ranking to a greater or lesser degree.

Search engine category directories
A category directory is a search engine's manually edited classification directory. Submitting to category directories is a basic and important task after a website is built. The importance of being listed in the major directories lies less in whether visitors find your website through the directory link than in the important, high-quality external links the site gains from these directories, which play a large role in improving its ranking.


Currently, the most important classification directories for Chinese websites are the Open Directory Project (ODP), Yahoo!, and the portal search engine directories of Sohu, Netease, and Sina. You can choose between free and paid submission, but in either case the website must be submitted manually.



Free submission:
The most famous and important target is the world's largest open directory, the Open Directory Project (ODP):
ODP's purpose is to build the most comprehensive and authoritative directory on the Internet, a high-quality resource library recognized by the public. To this end, volunteer editors worldwide select websites with high-quality content and approve them into the category directory.


Because important search engines such as Google use the ODP database, it forms the basis of Google's monthly deep crawl. Submitting a page to ODP is therefore a top priority for every website. Listing in the ODP directory is free, but you must pass strict manual review and wait a long time; in the end the site may still be rejected, and you may have to submit it repeatedly.



Because a dmoz listing plays an important role in website ranking, more and more websites, including rule-breaking ones, are submitted to dmoz, and favoritism and uneven quality have appeared among the volunteer editors. Many factors make getting listed in dmoz very difficult, and no website is guaranteed inclusion. The only way to improve the odds is to comply carefully and fully with the dmoz submission terms. Note:



1. Ensure the website content is original, not reprinted, mirrored, or copied.
If your website contains only some affiliate product/service information and links, or large amounts of content copied from other websites, dmoz may reject it. dmoz removes mirrored, copied, or otherwise non-original sites from its directories even after they have been listed. Therefore, add original product or service information to your website.


2. Do not use false, deceptive, or exaggerated means
Directory editors ban cheating aimed at search robots just as the engines do (a later section details what counts as search engine cheating). Any false or exaggerated text in the website description will be rejected, and illegal content is not included.

3. Ensure the website looks polished
If a website contains many misspellings, dead links, or "under construction" pages, or downloads slowly, that will count against it in editing. Websites hosted on free hosts also have very little chance of being listed in dmoz.


4. Ensure the website contains specific contact information
One dmoz editor has said that if he cannot find an actual contact address or phone number on a website, he considers the site not credible. If your website offers only an email address as contact information, that weighs heavily against a successful listing.



5. Submit the website to the correct directory.
Choosing the appropriate category and subdirectory is the core of website submission, and many failures are caused by picking the wrong directory. Before submitting, browse the whole directory tree, and ideally find out which directory your competitors' sites are in. Once confirmed, click "Submit webpage" in the upper-right corner of the directory page.



6. Record the submission date, directory name, and editor's email address
After submitting your website to ODP, note the submission date and directory. If the directory shows editor information, also record the editor's name and email. This information is useful when you need to ask about the status of a submission or submit the site again.



7. Do not submit the website multiple times
Editors process websites in order of submission date, and many sites in a directory wait in the queue for approval. From submission to final listing can take anywhere from two weeks to half a year, so wait patiently for the result. If the website is very large, with many distinct content branches, you can try submitting the different content pages to the corresponding dmoz directories.


Once dmoz lists your website, it will soon be indexed by large search engines and portals such as Google, Lycos, Netscape, AOL, HotBot, and DirectHit.


Other important free directories include the Yahoo directory. Yahoo was the first site to build a classification directory, and being listed in it is still an important part of website promotion today: Google counts a link from the Yahoo directory as an important plus in a site's ranking.

Recommended tool:
Check whether the website is listed in multiple important directories: www.123promotion.co.uk/directory/index.php

Paid submission:
The English-language Yahoo applies a paid-submission policy to commercial websites, and the domestic portal search engine directories do the same. Paid submission comes in two business models, ordinary listing and fixed ranking, generally billed yearly. The website is listed in the directory immediately after payment, with no waiting and no interference from other factors. The portals' crawlers also pay more attention to their own paid directory data.

In general, paid submission is necessary for commercial websites, and for websites built with many techniques that are unfriendly to search engines.

About automatic submission software (submission tools):
Ever since B2B trading platforms and search engine marketing appeared, there has been software that automatically submits supply-and-demand information or websites to these intermediary platforms and search engines, and it has been widely sold. Search engine auto-submission software claims to submit a customer's website to n search engine directories worldwide in one pass, and some products even promise large ranking improvements. In fact, many marketing practitioners question the actual effect of such software:

1. The search engines that truly bring traffic to websites are the few major ones; the search volume of the other small engines is tiny. Even if a site is indexed by these small engines, the traffic they bring cannot be counted on.

2. Today's mainstream search engine directories mostly use paid submission or strict manual review. These directories strongly dislike websites submitted by automatic software, and some explicitly refuse automatic submissions.

3. Of the thousands of search engines bundled with the software, the software itself reports a submission success rate of only around 6%, and verifying each submission one by one is impractical.


4. If a website has inbound links from other websites, mainstream search engine robots will crawl it even without any submission.

5. From the perspective of adding external inbound links, automatically submitting an English website to English search engines is better than nothing, but do not expect much from it.



In general, automatic submission software, which flourished in the era of free promotion, has less and less value now that mainstream commercial promotion platforms have adopted payment policies.
High-quality inbound links
When determining a website's ranking, a search engine analyzes not only the content and structure of its pages but also its links. An important ranking factor is obtaining as many high-quality external links as possible, also called inbound links. Even without submitting the site to directories, placing links to your website on other important websites gets it crawled quickly and earns ranking points.


The rationale for counting inbound links toward ranking is that the search engine assumes that if your website is valuable, other websites will mention it; the more it is mentioned, the greater its value. Hence the importance of link popularity in search engine optimization.

Precisely because of this, people try to "create" external links for their websites, producing many spam links and link sites. When adjusting their algorithms, search engines therefore came to value only high-quality external links, and spam-like practices often backfire. The right understanding of link popularity today is this: hundreds of links from low-quality or irrelevant sites are not worth one link from a high-quality site with highly relevant or complementary content.



Link quality analysis:

Links from the following websites can be referred to as high-quality links:
◆ Links in the search engine directory and links to websites already in the directory


◆ Websites related to or complementary to your subject
◆ Websites with a PR value of no less than 4
◆ Important websites with high traffic, high visibility, and frequent updates (such as search engine news sources)
◆ Websites with few export links
◆ Pages that rank within the first three result pages for your keywords
◆ Websites with high content quality



Compared with high-quality links, the following are called spam links; they do not help, and may even hurt, a website's ranking:


◆ Website links mass-posted in guestbooks, comment sections, or BBS forums
◆ Websites that have added too many export links (dozens or even hundreds of "Links" Have one of your websites)
◆ Joining link farms, bulk link-exchange programs, or cross-link schemes that automatically swap links with large numbers of member sites; search engines regard this as typical link spam, and it is very likely to be punished or implicated. Google will permanently remove websites that use link schemes.



Two other kinds of links are mistakenly believed to increase link popularity:



◆ Pay-per-click search engine ad links, such as Baidu's bid ranking and Google's right-hand keyword ads.
◆ Multi-layer membership Alliance (Affiliate Program) links.


These links do not point directly to your website but to the program owner's site, so that clicks can be tracked and billed; hence they do not increase your website's link popularity.


Of course, if you host such an affiliate program yourself, your server tracks member sites and redirects visitors back to your site, which does help increase your link popularity.

How to obtain high-quality import links:

1. Submit a website to the search engine directory
See the previous article

2. Exchange links with other websites
Commonly called reciprocal links or link exchanges. The foundation of a reciprocal link is that your website's content is of high quality; otherwise link requests rarely succeed.

Exchange link objects include:


◆ Websites already listed in search engine category directories
All the websites in your industry's directories in the main search engines are ideal link partners.


◆ Websites that link to your competitors
To find these websites, enter "link:" followed by the competitor's domain name in the search engine. While gaining links this way, you may also reach your competitors' target customers.

◆ Objects in the business chain
Competitors are the most closely related to your subject, but a link exchange with them is unlikely. Consider instead exchanging links with upstream and downstream partners in your business chain, including distributors, agents, suppliers, and similar websites.

◆ Easily discovered websites
Such as websites that run search engine ads, websites being vigorously promoted elsewhere, and websites with good natural rankings.



Having found such websites, analyze their links to see whether they are high-quality link partners; traffic (Alexa ranking), PR value, and the number of outbound links are the basic tests. It is best to provide the partner with ready-made HTML source code for the link, so that the exchange partner only needs to copy the code and embed it in their own page. Note that if your website can be reached through several different URLs, provide one unified URL during link exchange, typically the one best suited to most users.
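A minimal helper for generating the ready-made link HTML mentioned above might look like this; the function name and output format are illustrative assumptions.

```python
def link_snippet(url: str, anchor_text: str, description: str = "") -> str:
    """Build the HTML a link-exchange partner can paste into their page."""
    html = f'<a href="{url}">{anchor_text}</a>'
    if description:
        html += f" - {description}"
    return html
```

Keeping keywords in `anchor_text` matters for the reasons discussed in the section on link text below.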



Exchange links through sincere one-to-one contact after carefully reviewing the other party's website, not through Internet-wide spam bombing.


3. Other websites link to or reprint your content on their own initiative
This is the kind of link search engines appreciate most, and the root reason they value external links. If your website has rich, high-quality content, other websites will link to it voluntarily; the chances of being linked and reposted are especially high when your site offers many relevant free resources and knowledge bases.

4. Post professional articles on important websites
Post articles containing your target keywords on important websites, adding your website signature in or at the end of the article, or placing your link and website description in the author profile. This yields both high-quality inbound links and target customers. Do not forget to add your website link to your blog or personal homepage. Note that the title of each article you post should contain keywords, and the hosting sites should be high-traffic, high-prestige websites. In my experience, websites that serve as news sources for mainstream search engines are treated as important: the engines visit these news sources daily and update them frequently, so links placed on them are refreshed constantly, which delivers excellent results.

5. Submit the website to industry directories
Submit your website to as many related web directories, industry directories, business directories, yellow pages, and white pages as possible, to join their enterprise libraries.

Recommended tools:
Query a website's inbound links and their number (comprehensive query: link popularity, PR, Alexa ranking)
PR / WHOIS / Alexa query (can compare several competitor websites at once; checks the indexing status of 10 search engines at the same time)
www.seotoolkit.co.uk/link_popularity_checker.asp
Search box commands:
In Google, enter: link: followed by your domain
In Yahoo, enter: link: followed by your domain
Outbound links and internal links
Outbound links are the links on your website that point to other websites. Besides analyzing your inbound links, search engine robots also analyze the sites you link out to. If the content of an outbound link's target relates to your site's subject, that also helps with search engines; this is another reason to choose subject-related websites for link exchange. Finally, links between pages within a website are also included in link analysis; they affect the rank of the site's pages and ultimately its ranking.


Outbound links:
Adding links to industry/professional resource websites related to your keywords not only enriches your website's content but also improves the search engine's impression of your site.

This deserves particular attention from webmasters who, lacking original material, often republish articles from other websites as their own content without citing the source, for fear that outbound links would help competitors and hurt their own site. In fact, a proper number of outbound links is helpful with search engines.

Of course, outbound links differ from inbound ones: even when the subject is related, their number must be controlled. From the perspective of page rank, the search engine considers that the more outbound links a page provides, the less benefit the site's other pages enjoy. So keep the number of outbound links on a page moderate, preferably no more than 15; keep the home page's outbound links to about 10, and move extra links onto secondary pages. Google holds that a page should have at most 100 outbound links.
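Counting a page's outbound links against these limits can be sketched roughly as follows; this regex-based approach is an illustration only, since a production tool would use a real HTML parser.

```python
import re

def count_outbound_links(html: str, own_domain: str) -> int:
    """Count <a href> links pointing outside own_domain (rough sketch)."""
    hrefs = re.findall(r'<a\s[^>]*href=["\']([^"\']+)["\']', html, re.IGNORECASE)
    external = [h for h in hrefs
                if h.startswith(("http://", "https://")) and own_domain not in h]
    return len(external)
```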



Internal links:
Besides inbound and outbound links, a website's pages often link to one another, such as a "related articles" list at the end of an article linking to other articles on the same topic. This helps both visitors and search engines. A website should deliberately link to its important content pages from other pages, so that search engines know these are its important pages and index them intensively with a higher PR value.

A website can also set up several second-level-domain sub-sites, link the sub-sites to one another, and link them all back to the main site, forming a cluster of sub-sites around the main site; this is quite advantageous for ranking.



For both outbound and internal links, make sure the links are valid rather than dead. Too many dead links not only inconvenience users but also hurt search engine friendliness and rankings.

Attachment: free link-checking tools:
1. Xenu
2. W3C Link Checker
Keyword link text and context semantics
Search engines attach great importance to the keywords in link text. Whether for outbound, inbound, or internal links, it is best to include keywords in the link text. For example, prefer partner websites whose names contain your keywords; a web design studio could add, among its industry resources, an outbound link whose text contains the words "webpage design".


Note that if every inbound link to a website carries identical text, the search engine may ignore or penalize it. The engine wants inbound links that others created naturally, and uniform text makes it suspect the links were manufactured to inflate link popularity. So when exchanging external links, it is best to use varied texts. For example, the link texts for the Online Mail studio include "Online Mail", "Online Mail Design Studio", and "Shenzhen Webpage Design Studio".
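Anchor-text uniformity can be measured very simply; this `anchor_text_diversity` function and its interpretation threshold are assumptions for illustration, not a search engine's actual metric.

```python
from collections import Counter

def anchor_text_diversity(anchor_texts: list[str]) -> float:
    """Fraction of inbound links sharing the single most common anchor text.
    A value near 1.0 suggests suspiciously uniform, manufactured links."""
    if not anchor_texts:
        return 0.0
    counts = Counter(t.strip().lower() for t in anchor_texts)
    return counts.most_common(1)[0][1] / len(anchor_texts)
```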



Besides link text, the semantics of the surrounding context are also very important. The link analysis system examines the content around a link to judge the relevance between a website and what it links to, and thereby to evaluate the page's content quality. Semantic analysis can also reveal keyword fraud.



Part 6: Dangerous search engine optimization: cheating
Because ranking is carried out automatically by spider programs, with no human participation, it is possible to succeed by deceiving the spiders about the ranking rules. During the development of SEO, the discussion of cheating methods has therefore become a hot topic in the industry.


SEO cheating methods
To do SEO, you must understand the basic cheating methods so as to avoid unintentional punishment. The following are common cheating practices:



1. Keyword stuffing:
Deliberately and repeatedly writing a keyword throughout the page code (meta tags, title, comments, image alt text, URL, and so on) to inflate keyword frequency.

2. False keywords:
Setting keywords in the meta tags that are unrelated to the website's content, such as putting popular keywords in the title to mislead users into the website. Link keywords inconsistent with the actual content fall into the same category.

3. Invisible text/links:
Placing dense keyword text in the same color as the page background so that visitors cannot see it but search engines can, to inflate keyword frequency. Similar tricks include ultra-small text and text hidden in layers. An invisible link adds a link to the target optimization page on other pages using the same invisible-text technique.

4. Redirection (re-direct):
Using a refresh tag (meta refresh), CGI program, Java, JavaScript, or another technique to jump the user quickly to a different page after they enter this one. Redirection lets search engines and users see different web pages.


5. Page swapping:
