What is site content aggregation? It means rearranging a site's existing content around a particular topic or keyword to generate a new list page or topic page. The original intention of aggregation was to make it easier for users to read more content on the same topic, but by now it has become a method that many sites use to grab search-engine traffic quickly through SEO. SEO techniques are usually a double-edged sword: used well, the traffic flows in easily; used badly, they cut the other way.
Almost every site aggregates content in some form. The most common forms include site columns, special topics, tags, and combined search-result pages.
Showing aggregated content to users is not a problem in itself; the problems begin once the aggregation is optimized for SEO, especially on content-heavy information sites and large industry portals.
How do SEO practitioners use aggregation pages to get traffic quickly? The usual workflow looks like this (a sketch of the workflow follows the list):
1. Collect and organize the industry's long-tail keywords as comprehensively as possible.
2. Set up rules that improve the site's internal search function.
3. Build a common template for the aggregation pages.
4. Run the keywords through site search and apply the common template to batch-generate a large number of aggregation pages.
5. Use internal and external links to get the aggregation pages indexed by Baidu.
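To make the workflow concrete, here is a minimal sketch of steps 3 and 4. The keyword list, the site_search() lookup, and the HTML template are hypothetical placeholders; a real site would query its own search index and render pages through its CMS.

```python
from string import Template

# Hypothetical page template; a real site would use its CMS templates.
PAGE_TEMPLATE = Template(
    "<html><head><title>$keyword</title></head>"
    "<body><h1>$keyword</h1><ul>$items</ul></body></html>"
)

def site_search(keyword):
    # Placeholder for the site's internal search: returns (title, url) pairs
    # of articles matching the keyword.
    return [("Example article about " + keyword, "/article/1.html")]

def build_aggregation_page(keyword):
    items = "".join(
        '<li><a href="%s">%s</a></li>' % (url, title)
        for title, url in site_search(keyword)
    )
    return PAGE_TEMPLATE.substitute(keyword=keyword, items=items)

# Hypothetical long-tail keyword list collected in step 1.
long_tail_keywords = ["example keyword A", "example keyword B"]
for kw in long_tail_keywords:
    html = build_aggregation_page(kw)
    print(kw, "->", len(html), "bytes")
    # In practice the page would be written to disk or into the CMS, e.g.
    # open("agg/%s.html" % kw, "w").write(html)
```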
To this day many sites still use this approach to pull large amounts of traffic from search engines, and some large sites have artificially generated search-aggregation pages numbering in the tens of millions. On many sites these aggregation pages account for a large share of total traffic. The method looks harmless enough, but we have to face the following issues honestly:
1. Relevance of the content.
How relevant is the content matched by the keyword search to the keyword itself? Most sites' search functions are weak, and search technology has a high technical threshold; even Baidu and Google are still working hard on result accuracy, let alone an ordinary web company. If the match between an aggregation page's keyword and its content is poor, the page loses the original purpose of aggregation and has no value to users, and any SEO behavior done purely for search engines while ignoring users is ultimately hard to keep from being identified as cheating.
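One plausible safeguard, sketched below, is to keep only results whose titles share enough terms with the aggregation keyword. The overlap measure and the 0.5 threshold are illustrative assumptions, not a real retrieval model, and the word-splitting would have to be different for Chinese text.

```python
def term_overlap(keyword, title):
    # Fraction of the keyword's terms that also appear in the title.
    kw_terms = set(keyword.lower().split())
    title_terms = set(title.lower().split())
    if not kw_terms:
        return 0.0
    return len(kw_terms & title_terms) / len(kw_terms)

def filter_relevant(keyword, candidates, threshold=0.5):
    # candidates: list of (title, url) pairs returned by site search.
    return [(t, u) for t, u in candidates if term_overlap(keyword, t) >= threshold]

print(filter_relevant("cloud host pricing",
                      [("Cloud host pricing guide", "/a/1.html"),
                       ("Unrelated gossip news", "/a/2.html")]))
```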
2. Degree of page duplication.
If two very similar long-tail keywords each generate an aggregation page, their search results are likely to be exactly the same, producing completely duplicated pages; with 10, 20 or more similar keywords, the site's aggregation pages become heavily duplicated.
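A simple way to spot this, sketched below with made-up data, is to compare the sets of result URLs behind two similar keywords; the 0.8 cutoff is an arbitrary illustration.

```python
def jaccard(urls_a, urls_b):
    # Similarity of two aggregation pages measured on their result URLs.
    a, b = set(urls_a), set(urls_b)
    return len(a & b) / len(a | b) if a | b else 0.0

page_a = ["/a/1.html", "/a/2.html", "/a/3.html"]   # results for keyword A
page_b = ["/a/1.html", "/a/2.html", "/a/4.html"]   # results for a similar keyword B

score = jaccard(page_a, page_b)
print("overlap: %.2f" % score)
if score >= 0.8:
    print("Near-duplicate aggregation pages: merge these keywords into one page.")
```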
3. Number (and proportion) of aggregation pages.
A normal aggregation page is built on top of content pages, but after SEO the number of aggregation pages may be far larger than you would imagine. Picture a site with 10,000 content pages that generates 20,000 aggregation pages; what is that supposed to be? And it really happens. Dianshui's one-sentence summary: that is just not right. In short, what matters about aggregation pages is not quantity but how precisely the content matches.
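The point reduces to a trivial ratio check; the figures echo the example above, and the 20% ceiling is only an illustrative guess.

```python
content_pages = 10_000       # real terminal content pages
aggregation_pages = 20_000   # generated aggregation pages

ratio = aggregation_pages / content_pages
print("aggregation/content ratio: %.1f" % ratio)   # 2.0 in this example
if ratio > 0.2:
    print("Warning: far more aggregation pages than the content can support.")
```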
4. Conflict between aggregation pages and content pages.
Suppose a user searches for a long-tail keyword and lands on your site, and that keyword appears in the title of both an aggregation page and the final content page. If the page that ranks is the aggregation page, the user has to click through to the aggregation page first, then scan a list of titles and click again to reach the terminal page. Obviously this is unfriendly to the user experience, and as far as Dianshui knows, today's search engines also dislike this kind of user-unfriendly behavior.
Conversely, if you can genuinely solve the problems above, your aggregation pages become more and more meaningful to users, and search engines will have no objection to them.
Some suggestions for building good aggregation pages:
1. Aggregate content through manually assigned tags.
Aggregating content by site search has an advantage in generation speed, but limited technical strength usually produces the shortcomings described above. Having editors add tags to articles by hand solves this series of problems to a large extent: while writing an article, the editor creates a new tag for it (or picks an existing one by searching the old tags), and the aggregation page then only calls content pages that have been manually tagged.
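A minimal sketch of this flow, with made-up table and field names: editors attach tags when publishing, and the aggregation page simply lists the articles carrying its tag.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE articles (id INTEGER PRIMARY KEY, title TEXT, published_at TEXT);
CREATE TABLE tags (id INTEGER PRIMARY KEY, name TEXT UNIQUE);
CREATE TABLE article_tags (article_id INTEGER, tag_id INTEGER);
""")

def tag_article(article_id, tag_name):
    # Reuse an existing tag if one exists; otherwise create it.
    conn.execute("INSERT OR IGNORE INTO tags (name) VALUES (?)", (tag_name,))
    tag_id = conn.execute("SELECT id FROM tags WHERE name = ?", (tag_name,)).fetchone()[0]
    conn.execute("INSERT INTO article_tags VALUES (?, ?)", (article_id, tag_id))

def aggregation_page(tag_name):
    # The aggregation page only calls content that editors tagged by hand.
    return conn.execute("""
        SELECT a.title FROM articles a
        JOIN article_tags at ON at.article_id = a.id
        JOIN tags t ON t.id = at.tag_id
        WHERE t.name = ?""", (tag_name,)).fetchall()

conn.execute("INSERT INTO articles VALUES (1, 'Cloud host pricing guide', '2013-08-20')")
tag_article(1, "cloud host price")
print(aggregation_page("cloud host price"))   # [('Cloud host pricing guide',)]
```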
2. Merge similar tags.
For example, if three very similar long-tail keywords need to rank, there is no need to create three tags at once; create a single core tag and edit the tag page's title so it covers all of the keywords at the same time. In addition, when tagging an article it is recommended to reuse old tags first and create a new tag only when no relevant one exists.
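One way to enforce the "reuse old tags first" rule is to look for a close-enough existing tag before creating a new one. The similarity cutoff and the sample tag library below are illustrative choices.

```python
import difflib

existing_tags = ["cloud host rental", "cloud host price"]  # hypothetical tag library

def pick_tag(candidate):
    # Prefer an existing tag that closely matches the candidate keyword.
    match = difflib.get_close_matches(candidate, existing_tags, n=1, cutoff=0.85)
    if match:
        return match[0]            # merge into the existing core tag
    existing_tags.append(candidate)
    return candidate               # only create a new tag when nothing fits

print(pick_tag("cloud host prices"))   # reuses "cloud host price"
print(pick_tag("virtual host faq"))    # creates a new tag
```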
3. Sort tag-page content by time.
The previous two steps largely solve the relevance and duplication problems of aggregation pages; the third problem to address is keeping the aggregation pages updated. Many sites' aggregation pages go unchanged for years, so their Baidu snapshots refresh slowly, which is bad for long-term rankings. The fix is to have the page call content sorted newest first: when the editor adds the tag to a new article, the aggregation page updates automatically.
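A minimal sketch of the newest-first ordering, with invented article records: adding one newly tagged article is enough to refresh the top of the page.

```python
from datetime import date

# Hypothetical articles that carry the same tag.
tagged_articles = [
    {"title": "Older post", "published": date(2013, 5, 1)},
    {"title": "Newest post", "published": date(2013, 8, 20)},
]

latest_first = sorted(tagged_articles, key=lambda a: a["published"], reverse=True)
for a in latest_first:
    print(a["published"], a["title"])
```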
4. A tag-management backend.
To save effort, most sites write tag keywords straight into the database. That is simple and convenient, but the tags then cannot be managed or optimized. It is better to build a management backend for the tag keywords, so that tags can be modified, merged, and deleted in a timely way.
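The backend boils down to a few operations on the tag-to-article mapping: rename, merge, and delete. The data structures below are illustrative only.

```python
tag_articles = {            # tag name -> ids of articles carrying that tag
    "cloud host price": {1, 2},
    "cloud host prices": {3},
}

def rename_tag(old, new):
    tag_articles[new] = tag_articles.pop(old)

def merge_tags(source, target):
    # Move the source tag's articles under the target tag, then drop it.
    tag_articles[target] |= tag_articles.pop(source)

def delete_tag(name):
    tag_articles.pop(name, None)

merge_tags("cloud host prices", "cloud host price")
print(tag_articles)   # {'cloud host price': {1, 2, 3}}
```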
5. Aggregation-page URL optimization.
After the content problems are solved, the aggregation page URLs need further optimization. Many sites' aggregation pages use dynamic URLs directly, often carrying a long string of parameters, which makes them harder for search engines to index and rank. A better approach is to give these URLs pseudo-static treatment.
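A sketch of the idea: instead of exposing a parameter-laden dynamic URL, publish the aggregation page under a clean, fixed-looking path. The URL patterns here are invented for illustration; in production the mapping is usually handled by a rewrite rule on the web server.

```python
from urllib.parse import urlparse, parse_qs

def pseudo_static(dynamic_url):
    # Map a dynamic aggregation URL with query parameters to a clean path.
    query = parse_qs(urlparse(dynamic_url).query)
    tag_id = query.get("tagid", ["0"])[0]
    return "/tag/%s.html" % tag_id

print(pseudo_static("/aggregate.php?mod=tag&tagid=450&page=1"))
# -> /tag/450.html
```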
When it comes to aggregation-page URL optimization, many classified-information sites do a good job. 58 City, for example, is very consistent in the URL structure of its classified-listing aggregation pages.
In nature, an aggregation page is the same as a site column, category, or topic page: its purpose is to gather the content related to one topic or area of knowledge so that interested users can read it more conveniently. So when aggregating content or producing topic pages, never aggregate for aggregation's sake or purely for SEO; start from the users themselves and aggregate on the basis of a good user experience. That is the right way to do site content aggregation. This article was originally published on the Dianshui blog; please credit the source when reprinting: http://www.dianshui.net/seo/450.html