1. Massive log data: find the IP address that visited Baidu the most times on a given day.
First, extract the IP addresses from that day's log of visits to Baidu and write them to one large file. Note that an IP address is 32 bits, so there are at most 2^32 distinct IPs. A mapping (hashing) method can then be applied: take each IP modulo 1000, for example, to scatter the large file into 1000 small files, and in each small file find the IP with the highest frequency and its count (a hash_map can be used for the frequency statistics, after which the entry with the largest count is selected). Finally, among these per-file winners, the IP with the largest count is the answer. A sketch of the partition step is given below.
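The partition step might look like the following minimal sketch, assuming the day's extracted IPs are already in a plain text file with one address per line. The file and function names (`access_ips.txt`, `partition_ips`, `bucket_<i>.txt`) are hypothetical, and the sketch uses 1024 buckets to match the detailed steps below (the paragraph above uses 1000 as its example; the exact modulus is not important).

```python
# Partition phase: stream the large IP file once and scatter every address
# into one of NUM_BUCKETS small files by a stable hash of the IP string,
# so all occurrences of the same IP land in the same small file.
# File names ("access_ips.txt", "bucket_<i>.txt") are hypothetical.
import zlib

NUM_BUCKETS = 1024  # matches the hash(IP) % 1024 step described below

def partition_ips(log_path="access_ips.txt", out_prefix="bucket_"):
    # One output handle per bucket; the OS file-descriptor limit
    # (e.g. `ulimit -n`) must allow ~1024 open files.
    buckets = [open(f"{out_prefix}{i}.txt", "w") for i in range(NUM_BUCKETS)]
    try:
        with open(log_path) as log:
            for line in log:
                ip = line.strip()
                if not ip:
                    continue
                # crc32 is deterministic across runs, unlike Python's salted hash().
                idx = zlib.crc32(ip.encode()) % NUM_BUCKETS
                buckets[idx].write(ip + "\n")
    finally:
        for f in buckets:
            f.close()
```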
Algorithm idea: divide and conquer + hash
1. There are at most 2^32 ≈ 4G distinct IP addresses, so they cannot all be loaded into memory and processed at once;
2. Partition by hashing: store the massive IP log into 1024 small files according to hash(IP) % 1024 (the same IP always falls into the same file). Each small file then contains at most about 4M distinct IP addresses;
3. For each small file, build a hash map with the IP as the key and its number of occurrences as the value, and keep track of the IP that currently occurs most often;
4. This yields the most frequent IP in each of the 1024 small files; among these candidates, the one with the highest count is the overall answer (a sketch of this counting and reduction step follows the list).
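A minimal sketch of steps 3 and 4, under the same assumptions as the partition sketch above (bucket files named `bucket_<i>.txt`): each bucket is small enough to count in memory with a dict, and because the same IP never spans two buckets, only the per-bucket winners need to be compared at the end.

```python
# Count-and-reduce phase: for each small file, count occurrences with an
# in-memory hash map (collections.Counter), keep that file's most frequent
# IP, then take the maximum over all per-file winners.
from collections import Counter

NUM_BUCKETS = 1024

def most_frequent_ip(out_prefix="bucket_"):
    best_ip, best_count = None, 0
    for i in range(NUM_BUCKETS):
        counts = Counter()
        with open(f"{out_prefix}{i}.txt") as f:
            for line in f:
                ip = line.strip()
                if ip:
                    counts[ip] += 1
        if counts:
            ip, cnt = counts.most_common(1)[0]  # this bucket's winner
            if cnt > best_count:
                best_ip, best_count = ip, cnt
    return best_ip, best_count

# Usage (after partition_ips() has produced the bucket files):
#     ip, count = most_frequent_ip()
```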