Web crawler: using a Bloom filter for URL deduplication

Source: Internet
Author: User
Tags: BitSet, memory usage
Preface:

I have recently been plagued by the deduplication problem in my web crawler. I tried various other "ideal" strategies, but none of them behaved well once the crawler was actually running. Then I found the Bloom filter, and it is truly the most reliable method I have come across so far.

If you think URL deduplication is easy, read through the problems below and see whether you still say so.


About the Bloom filter:

The Bloom filter is a bit-vector data structure proposed by Burton Howard Bloom in 1970. It has good space and time efficiency and is used to test whether an element is a member of a set. If the test answers yes, the element is not necessarily in the set; but if the test answers no, the element is definitely not in the set. The Bloom filter therefore has a recall of 100%: every query returns either "possibly in the set (may be wrong)" or "definitely not in the set (never wrong)". Clearly, the Bloom filter sacrifices a little correctness to save space.
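To make that one-sided error concrete, here is a toy sketch (not the crawler's actual code; the URLs and the single hash function are invented for illustration). Anything inserted is always reported present; only the "yes" answers can be wrong:

```java
import java.util.BitSet;

public class OneSidedError {
    static final int M = 1 << 20;        // 2^20 bits for the toy filter
    static BitSet bits = new BitSet(M);

    // single toy hash: mask the String hashCode down to the bit-array size
    static int slot(String s) {
        return (M - 1) & s.hashCode();
    }

    static void add(String s) {
        bits.set(slot(s));
    }

    static boolean mightContain(String s) {
        return bits.get(slot(s));
    }

    public static void main(String[] args) {
        add("http://example.com/a");
        // An inserted element is always reported present: no false negatives.
        System.out.println(mightContain("http://example.com/a")); // true
        // For a URL never inserted, "false" is certain; a "true" here would
        // be a false positive, which the filter explicitly permits.
        System.out.println(mightContain("http://example.com/b"));
    }
}
```

A real filter uses several independent hash functions to drive the false-positive rate down, as in the full listing later in this article.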


Previous deduplication strategies:

1. Common ideas for URL deduplication:

(1) Create a unique index on the URL field in the database, and rely on it to reject duplicates
(2) Check whether the record already exists before each insert
(3) Use a Set or HashSet to hold the URLs, guaranteeing uniqueness
(4) Use a Map or a fixed-length array to record whether a URL has been visited
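Strategy (3) is the simplest to write down; a minimal sketch (the URLs are invented examples) shows both why it works and why it eventually exhausts memory:

```java
import java.util.HashSet;
import java.util.Set;

public class SetDedup {
    private final Set<String> seen = new HashSet<>();

    // returns true the first time a URL is offered, false on every repeat
    public boolean firstVisit(String url) {
        return seen.add(url);   // Set.add reports whether the set changed
    }

    public int size() {
        return seen.size();
    }

    public static void main(String[] args) {
        SetDedup dedup = new SetDedup();
        System.out.println(dedup.firstVisit("http://example.com/page1")); // true
        System.out.println(dedup.firstVisit("http://example.com/page1")); // false
        // Every distinct URL string stays in memory for the life of the
        // crawl -- the growth that leads to the OOM described below.
    }
}
```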


2. Problems with the above strategies

(1) Creating a unique index on a database field does prevent some duplicate inserts. However, after repeated MySQL errors the program may simply crash, so this method is not desirable.

(2) Checking whether the record already exists before every insert hurts the program's efficiency.

(3) This was the first method I tried. The reason I gave it up: OOM. Of course, this was not a memory leak in the program; there really was that much data to hold, because the number of URLs parsed out of pages far exceeds the queue of URLs still waiting to be visited.

(4) In a previous blog post I mentioned using a Map object to record visited URLs. Now I have to take that back: after a long run, the Map also consumes a great deal of memory, although less than method (3). Here is the memory usage of Map<Integer, Integer> over a long run:
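The post does not spell out what Map<Integer, Integer> holds; my reading (an assumption, not stated by the author) is that the key is the URL's hashCode, which keeps the full strings out of memory at the cost of treating hash collisions as duplicates:

```java
import java.util.HashMap;
import java.util.Map;

public class HashCodeDedup {
    private final Map<Integer, Integer> visited = new HashMap<>();

    // record a URL by its hashCode; returns true if it was not seen before
    public boolean firstVisit(String url) {
        // A collision between two different URLs is silently treated as a
        // repeat -- the same trade-off a Bloom filter makes, but with far
        // more overhead per entry (boxed keys, boxed values, table buckets).
        return visited.put(url.hashCode(), 1) == null;
    }
}
```

The per-entry overhead of a HashMap is why this approach still grows painfully in a long crawl, as the author observes.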



Using the Bloom filter: 1. Memory usage of the BloomFilter in an ordinary program:



2. Memory usage of the BloomFilter in the crawler (after 4 hours of running):


3. Program Structure Chart



4. General use of the BloomFilter

Below is the BloomFilter Java code, adapted from: http://www.cnblogs.com/heaad/archive/2011/01/02/1924195.html

If you have read the article above, you will know that the space complexity of the Bloom filter is S(n) = O(n); the memory-usage figures above bear this out. Membership checks are also very efficient: the time complexity is T(n) = O(1). The relevant Java code follows.
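To put numbers on that O(n) space: the listing below uses a fixed 2^25-bit BitSet, i.e. 4 MB no matter how many URLs are inserted. The price is a false-positive rate that rises with n, following the standard estimate p ≈ (1 − e^(−kn/m))^k for m bits and k hash functions (k = 7 here, one per seed; the sample values of n are my own, chosen for illustration):

```java
public class BloomSizing {
    // Standard false-positive estimate for a Bloom filter with m bits,
    // k hash functions, and n inserted elements.
    static double falsePositiveRate(double m, double n, int k) {
        return Math.pow(1.0 - Math.exp(-k * n / m), k);
    }

    public static void main(String[] args) {
        double m = 1L << 25;   // bits in the BitSet: 2^25
        int k = 7;             // one hash function per seed
        System.out.println("memory: " + (m / 8 / 1024 / 1024) + " MB");
        for (double n : new double[] { 100_000, 1_000_000, 5_000_000 }) {
            System.out.printf("n=%,.0f  p=%.2e%n", n, falsePositiveRate(m, n, k));
        }
    }
}
```

Even at five million URLs the estimated false-positive rate stays below a few percent, while the bit array itself never grows.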


BloomFilter.java

import java.util.BitSet;

public class BloomFilter {

    /* BitSet initially allocates 2^25 bits */
    private static final int DEFAULT_SIZE = 1 << 25;

    /* Seeds for the different hash functions; these should generally be primes */
    private static final int[] seeds = new int[] { 5, 7, 11, 13, 31, 37, 61 };

    private BitSet bits = new BitSet(DEFAULT_SIZE);

    /* Hash function objects */
    private SimpleHash[] func = new SimpleHash[seeds.length];

    public BloomFilter() {
        for (int i = 0; i < seeds.length; i++) {
            func[i] = new SimpleHash(DEFAULT_SIZE, seeds[i]);
        }
    }

    /* Mark a string into the bits */
    public void add(String value) {
        for (SimpleHash f : func) {
            bits.set(f.hash(value), true);
        }
    }

    /* Check whether a string might have been added; false means definitely not */
    public boolean contains(String value) {
        if (value == null) {
            return false;
        }
        boolean ret = true;
        for (SimpleHash f : func) {
            ret = ret && bits.get(f.hash(value));
        }
        return ret;
    }

    /* Simple hash function: a seeded polynomial over the characters,
       masked down to the size of the bit array */
    public static class SimpleHash {
        private int cap;
        private int seed;

        public SimpleHash(int cap, int seed) {
            this.cap = cap;
            this.seed = seed;
        }

        public int hash(String value) {
            int result = 0;
            int len = value.length();
            for (int i = 0; i < len; i++) {
                result = seed * result + value.charAt(i);
            }
            return (cap - 1) & result;
        }
    }
}
