Given massive log data, extract the IP that visited Baidu the most times on a given day.
First, take that day's log entries for visits to Baidu and write the IPs out to one large file. Note that an IP address is 32 bits, so there are at most 2^32 distinct IPs. We can again apply the mapping method: for example, take each IP modulo 1000 to split the one large file into 1000 small files, then find the most frequent IP in each small file (a hash_map works for the frequency statistics) along with its count. Finally, among those 1000 candidate IPs, pick the one with the highest count; that is the answer.
Algorithm idea: divide and conquer + hash
1. There are at most 2^32 = 4G distinct IP addresses, so they cannot all be loaded into memory for processing;
2. Consider the idea of "divide and conquer": according to the value of hash(IP) % 1024, store the vast number of IPs from the log into 1024 small files. Each small file then contains at most 4M (2^32 / 1024) distinct IP addresses;
3. For each small file, build a hash_map keyed by IP whose value is the number of occurrences, and track the IP with the most occurrences in that file;
4. This yields the most frequent IP of each of the 1024 small files; an ordinary sort (or a single linear scan) over these 1024 candidates then gives the overall most frequent IP.
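The four steps above can be sketched in Python. This is a minimal in-memory illustration, not a production implementation: the 1024 "small files" are plain lists, the function name `most_frequent_ip` is invented for the example, and `zlib.crc32` stands in for the hash function (Python's built-in `hash()` for strings is salted per process, so it would not be stable if the buckets were written to disk in one run and read back in another).

```python
import zlib
from collections import Counter

K = 1024  # number of partitions, matching the 1024 small files in the text

def bucket_of(ip: str) -> int:
    # Stable hash(IP) % K: every occurrence of a given IP lands in the
    # same bucket, which is what makes the per-bucket counts exact.
    return zlib.crc32(ip.encode()) % K

def most_frequent_ip(log_ips):
    # Step 2: partition the IPs into K buckets (small files on disk in
    # the real setting; in-memory lists here for illustration).
    buckets = [[] for _ in range(K)]
    for ip in log_ips:
        buckets[bucket_of(ip)].append(ip)

    # Steps 3-4: count frequencies per bucket with a hash map (Counter),
    # keep each bucket's winner, then take the maximum over all winners.
    best_ip, best_count = None, 0
    for bucket in buckets:
        if not bucket:
            continue
        ip, count = Counter(bucket).most_common(1)[0]
        if count > best_count:
            best_ip, best_count = ip, count
    return best_ip, best_count
```

For example, `most_frequent_ip(["1.2.3.4", "5.6.7.8", "1.2.3.4"])` returns `("1.2.3.4", 2)`. Only one bucket's Counter is in memory at a time in the disk-backed version, which is the whole point of the partitioning.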
Why the hash step matters: if the log were simply split into 1024 files directly (for example, by position), the same IP could be scattered across many sub-files without being the most frequent in any single one, so the per-file winners could miss the true answer. Hashing guarantees that all occurrences of a given IP land in the same file.
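A small demonstration of this point, under the same assumption that `zlib.crc32` plays the role of the hash function (the IPs and bucket count are made up for the example): hash partitioning keeps all copies of one IP together, while a positional split scatters them.

```python
import zlib

K = 4  # a small bucket count, just for the demo

def stable_bucket(ip: str) -> int:
    # Deterministic hash: identical IPs always map to the same bucket.
    return zlib.crc32(ip.encode()) % K

log = ["10.0.0.1", "10.0.0.2", "10.0.0.1", "10.0.0.3", "10.0.0.1"]

# Hash partition: all three occurrences of 10.0.0.1 share one bucket,
# so that bucket's local count equals the IP's true global count.
hash_buckets = {stable_bucket(ip) for ip in log if ip == "10.0.0.1"}
assert len(hash_buckets) == 1

# Positional split: chunking the log in order scatters 10.0.0.1 across
# several chunks, so no single chunk sees its true count of 3.
chunks = [log[i::K] for i in range(K)]
assert sum("10.0.0.1" in chunk for chunk in chunks) > 1
```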