QQ Space Python Crawler v2.0 -- Like Data Analysis


Following up on the v1.0 Qzone crawler, I wanted to write another crawler to analyze who liked my posts.

 

First, analyze the JSON:

 

You can find that the node holding the likes is data --> vfeeds (list) --> like --> likemans (list) --> user --> nickname & uin.
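For reference, a single feed item in the response has roughly this shape (the values here are made up for illustration):

feed_sample = {
    'data': {
        'vfeeds': [
            {
                'like': {
                    'num': 13,
                    'likemans': [
                        {'user': {'nickname': 'Alice', 'uin': 123456789}},
                        {'user': {'nickname': 'Bob', 'uin': 987654321}},
                    ],
                },
            },
        ],
    },
}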

The code is as follows:

for i in range(0, page):
    try:
        html = requests.get(url_x + str(numbers) + url_y, headers=headers).content
        data = json.loads(html)

        if 'vfeeds' in data['data']:
            for vfeed in data['data']['vfeeds']:
                if 'like' in vfeed:
                    for like_man in vfeed['like']['likemans']:
                        qq_list.append(int(like_man['user']['uin']))
                        # the dict must be created inside the loop, because list.append() stores a reference
                        like_me_map = dict()
                        like_me_map['nick_name'] = like_man['user']['nickname']
                        like_me_map['qq'] = like_man['user']['uin']
                        like_me_list.append(like_me_map)
        numbers += 40
        time.sleep(10)
        print('Analyzed the first ' + str(numbers) + ' records')
    except:
        numbers += 40
        time.sleep(10)
        print('Parse error near record ' + str(numbers))
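For completeness, the loop above relies on a few variables set up beforehand; a minimal sketch might look like this (the URL fragments, headers, and page count are placeholders, not the real Qzone endpoint):

import json
import time
import requests

# Placeholders -- the real URL pieces and cookie come from a logged-in mobile Qzone session
url_x = 'https://example.com/mqzone/feeds?start='
url_y = '&count=40'
headers = {'Cookie': '<your cookie>', 'User-Agent': '<mobile UA>'}

page = 10                # how many pages of feeds to fetch
numbers = 0              # paging offset, advanced by 40 per request
qq_list = []             # every uin that liked something, with repeats
like_me_list = []        # dicts of {'nick_name': ..., 'qq': ...}
like_me_result = dict()  # per-QQ like counts, filled in later
result = dict()          # final sorted output, filled in later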

 

like_me_list is a list of dicts, and qq_list is a list of all the QQ numbers (with repeats). Now define a dict so we can look up a nickname by QQ number:

# create a map from QQ number to nickname for lookups
qq_name_map = dict()
for man in like_me_list:
    qq_name_map[man['qq']] = man['nick_name']
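The same mapping can also be built in one line with a dict comprehension, if you prefer:

# equivalent one-liner
qq_name_map = {man['qq']: man['nick_name'] for man in like_me_list}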

 

 

Use a set to remove duplicates automatically, then count how many times each QQ number appears:

# count the likes for each QQ number and store the mapping
qq_set = set(qq_list)
for qq in qq_set:
    like_me_result[str(qq)] = qq_list.count(qq)
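As a side note, set() plus list.count() rescans the whole list once per unique QQ number; collections.Counter gets the same counts in a single pass (a sketch, keeping the stringified keys used above):

from collections import Counter

# one pass over qq_list; keys stringified to match like_me_result above
like_me_result = {str(qq): n for qq, n in Counter(qq_list).items()}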

 

 

Then sort by number of likes in descending order (the code here is a bit ugly =. =):

# save a new map as the final result, sorted by the number of likes; not elegant =. =
num_result = sorted(like_me_result.values(), reverse=True)
print(num_result)
for num in num_result:
    for key in like_me_result.keys():
        if like_me_result[key] == num:
            result[qq_name_map[key] + '(' + key + ')'] = num
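A tidier way to get the same descending order is to sort the (qq, count) pairs directly by count, which avoids the nested loop:

# sort pairs by count, descending, then build the labelled result
result = dict()
for qq, num in sorted(like_me_result.items(), key=lambda kv: kv[1], reverse=True):
    result[qq_name_map[qq] + '(' + qq + ')'] = num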

 

Finally, write the results to a file:

try:
    with open(os.getcwd() + '\\' + 'like_me_result.txt', 'wb') as fo:
        for k, v in result.items():
            record = k + ': likes ' + str(v) + '\r\n'
            fo.write(record.encode('utf-8'))
    print("Like data analysis written to file")
except IOError as msg:
    print(msg)
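Opening the file in text mode with an explicit encoding avoids the manual encode() and the backslash path handling; a sketch using os.path.join:

import os

out_path = os.path.join(os.getcwd(), 'like_me_result.txt')
try:
    with open(out_path, 'w', encoding='utf-8') as fo:
        for k, v in result.items():
            fo.write(k + ': likes ' + str(v) + '\n')
    print('Like data written to ' + out_path)
except IOError as msg:
    print(msg)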

 

 

However, I found that the JSON like data returned by Qzone is not complete:

The num field gives the total number of likes on a post (13 in my test), but the likemans list does not contain all of those users.
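One way to see the gap is to compare the reported num with the number of likemans actually returned for each feed (a small check, assuming the same JSON layout as above):

for vfeed in data['data'].get('vfeeds', []):
    if 'like' in vfeed:
        reported = vfeed['like'].get('num', 0)
        returned = len(vfeed['like'].get('likemans', []))
        if returned < reported:
            print('feed reports ' + str(reported) + ' likes but only ' + str(returned) + ' likemans returned')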

My guess is that, because the mobile Qzone page does not need to display the full list of likes, the API simply does not return complete like information.

 

The UI above does not show the like information either =. =

Therefore, this crawler can only be regarded as a semi-finished product.

 
