Use ScrapeBox's Harvest Feature to Collect a Large Number of Related Blogs

Source: Internet
Author: User
Keywords: blog, harvest, ScrapeBox

ScrapeBox is a tool for bulk automated blog commenting and blog address harvesting, widely used in black-hat SEO. At $57 USD it is far cheaper than the $520 XRumer; although its capabilities are weaker, its cost-effectiveness is very good.

Its low price has helped ScrapeBox spread quickly among mass-posting tools. The real art of using the software lies in getting your comments past WordPress's moderation system. ScrapeBox's blog-address harvesting module is genuinely weak: it does not automatically filter duplicates as it collects. Hrefer does much better here, filtering duplicates on the fly as it harvests, with no limit.

Supported blog platforms: WordPress, Movable Type (MT), and BlogEngine, of which WordPress support is the best.

Because of domestic network restrictions and other factors that cannot be discussed here, access to foreign sites is very slow and many sites cannot be opened at all. A good VPS plus private proxies are therefore essential for ScrapeBox to run at full efficiency; using ScrapeBox domestically without a proxy or a VPS is an incomplete experience.

Another of ScrapeBox's strengths is its rich plug-in ecosystem: a large number of plug-ins help link builders with harvesting, backlink analysis, and similar tasks. Because of widespread abuse by too many users, the software's effectiveness has declined considerably. Readers who want to buy it need to invest some money in a good VPS and proxies, and learn the usage techniques, to get real value out of it.

To mass-post with ScrapeBox, you need a large list of sites related to your own topics, and this list must be accumulated continuously.
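ScrapeBox exports harvests as plain-text URL lists, so the continuous accumulation can be scripted outside the tool. A minimal sketch (the file names and function are placeholders of my own, not ScrapeBox conventions) that merges a fresh harvest into a growing master list without duplicates:

```python
# merge_lists.py - maintain a growing, duplicate-free master list of harvested URLs.
# File names are illustrative; ScrapeBox itself just reads/writes plain-text URL lists.
from pathlib import Path

def merge_lists(master_path, harvest_path):
    """Append URLs from harvest_path to master_path, skipping duplicates.

    Returns the number of new URLs added.
    """
    master = Path(master_path)
    seen = set()
    if master.exists():
        # Remember every URL already in the master list.
        seen = {line.strip() for line in master.read_text().splitlines() if line.strip()}
    new_urls = []
    for line in Path(harvest_path).read_text().splitlines():
        url = line.strip()
        if url and url not in seen:
            seen.add(url)
            new_urls.append(url)
    # Append only the genuinely new URLs.
    with master.open("a") as f:
        for url in new_urls:
            f.write(url + "\n")
    return len(new_urls)
```

Run after each harvesting session so the master list only ever grows with URLs it has not seen before.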

Here is one of my collection methods.

1. Prepare 1-5 root keywords that match your site's theme, log in to the Google Keyword Tool, and search for each one separately. This generally yields several groups of up to 800 keywords each;

2. Copy all the keywords into the Harvester module, first remove duplicates (Dups), then use the keyword scrape function to expand the list, and deduplicate again when finished. This produces a large number of keywords related to your topic;

3. For the search type, choose WordPress Blogs;

4. Enable multi-threaded searching: Settings -> Use Multi-Threaded Harvester;

5. If necessary, change the number of proxy retry attempts; the default is 3;

6. Check Google, Yahoo, and AOL, and leave the rest at their defaults. The search will be slower this way, and there will be many duplicates, but the results will certainly be more plentiful;

P.S.: Bing now requires an API key; it is not too much trouble to apply for one;

7. Load a batch of proxies and start the search;

8. When the search finishes, export the keywords whose searches failed, then search them again;

9. The lists are saved in the Harvester_Sessions directory, with up to 1,000,000 URLs per file; anything beyond that is saved to a new list file. Load these lists into the URL list, remove duplicate URLs, and export the final list.
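Steps 2 and 9 both come down to deduplication, which ScrapeBox performs inside its UI. The same idea can be sketched outside the tool; the helper functions below are my own illustrations, not part of any ScrapeBox API:

```python
def dedup_keywords(keywords):
    """Remove duplicate keywords case-insensitively, preserving first-seen order."""
    seen = set()
    result = []
    for kw in keywords:
        key = kw.strip().lower()
        if key and key not in seen:
            seen.add(key)
            result.append(kw.strip())
    return result

def expand_keywords(roots, modifiers):
    """Combine each root keyword with each modifier, then deduplicate.

    A crude stand-in for the Keyword Scraper's expansion step, which
    actually pulls suggestions from search engines.
    """
    combos = [f"{root} {mod}" for root in roots for mod in modifiers]
    return dedup_keywords(list(roots) + combos)
```

For example, `expand_keywords(["wordpress blog"], ["comments", "leave a comment"])` yields the root plus two expanded variants, already deduplicated.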

Author: Catop

Article URL: http://www.xrumer.cn/17
