Python Weibo Crawler (2) -- Getting a Weibo User's Following List


The main goal of this article is to get a Weibo user's following list, along with key information about each user in that list: ID, nickname, profile link, follower count, and following count.
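
For orientation, the JSON returned by the endpoint used below has roughly the following shape. This sketch is inferred from the fields the crawler reads ('cards', and the 'user' object inside each card); the real payload contains more keys, and the values here are illustrative placeholders:

{
    "cards": [
        {
            "user": {
                "id": 1234567890,
                "screen_name": "example_user",
                "profile_url": "https://m.weibo.cn/u/1234567890",
                "followers_count": 1024,
                "follow_count": 256
            }
        }
    ]
}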

The implementation code looks like this:

# -*- coding: utf-8 -*-
"""
Created on Thu Aug  3 20:59:53 2017

@author: Administrator
"""
import requests
import json
import time
import random
import pymysql.cursors

def crawlDetailPage(url, page):
    # Read the JSON data from the Weibo page
    req = requests.get(url)
    jsondata = req.text
    data = json.loads(jsondata)
    # Get the data for this page
    content = data['cards']
    #print(content)
    # Loop over and print each followed user's information on this page
    for i in content:
        followingid = i['user']['id']
        followingname = i['user']['screen_name']
        followingurl = i['user']['profile_url']
        followerscount = i['user']['followers_count']
        followcount = i['user']['follow_count']
        print("---------------------------------")
        print("User id: {}".format(followingid))
        print("User nickname: {}".format(followingname))
        print("User profile link: {}".format(followingurl))
        print("User followers count: {}".format(followerscount))
        print("User following count: {}".format(followcount))

        '''
        Database operations
        '''
        # Get a database connection
        connection = pymysql.connect(host='localhost',
                                     user='root',
                                     password='123456',
                                     db='weibo',
                                     charset='utf8mb4')
        try:
            # Get a cursor
            with connection.cursor() as cursor:
                # Create the SQL statement
                sql = "INSERT INTO `following` (`followingid`, `followingname`, `followingurl`, `followerscount`, `followcount`) VALUES (%s, %s, %s, %s, %s)"
                # Execute the SQL statement
                cursor.execute(sql, (followingid, followingname, followingurl, followerscount, followcount))
            # Commit to the database
            connection.commit()
        finally:
            connection.close()

for i in range(1, 11):
    print("Getting page {} of the following list:".format(i))
    # JSON endpoint for the Weibo user's following list
    url = "https://m.weibo.cn/api/container/getSecond?containerid=1005052164843961_-_followers&page=" + str(i)
    crawlDetailPage(url, i)
    # Set a sleep interval between pages
    t = random.randint(31, 33)
    print("Sleep time: {}s".format(t))
    time.sleep(t)
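
The INSERT above assumes a `following` table already exists in the `weibo` database; the article never shows its schema. A minimal sketch of creating a compatible table with pymysql follows, with column types as assumptions inferred from the values being stored:

# -*- coding: utf-8 -*-
# Sketch: create the `following` table the crawler inserts into.
# The schema here is an assumption; the original article does not show it.
import pymysql

connection = pymysql.connect(host='localhost', user='root',
                             password='123456', db='weibo',
                             charset='utf8mb4')
try:
    with connection.cursor() as cursor:
        cursor.execute("""
            CREATE TABLE IF NOT EXISTS `following` (
                `id` INT AUTO_INCREMENT PRIMARY KEY,
                `followingid` BIGINT,
                `followingname` VARCHAR(255),
                `followingurl` VARCHAR(512),
                `followerscount` INT,
                `followcount` INT
            ) DEFAULT CHARSET=utf8mb4
        """)
    connection.commit()
finally:
    connection.close()

One design note on the crawler itself: crawlDetailPage opens and closes a fresh connection for every followed user, which is simple but wasteful; opening one connection before the page loop and reusing it for all inserts would reduce overhead.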
 

The results of a run are shown in the following illustration:

The data stored in the MySQL database is shown in the following illustration:
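
To verify the stored rows from Python rather than a MySQL client, a short query sketch using the same connection parameters as above could look like this:

# -*- coding: utf-8 -*-
# Sketch: quick sanity check of the rows inserted by the crawler.
import pymysql

connection = pymysql.connect(host='localhost', user='root',
                             password='123456', db='weibo',
                             charset='utf8mb4')
try:
    with connection.cursor() as cursor:
        # Fetch the first few rows for inspection
        cursor.execute("SELECT followingid, followingname, followerscount, followcount "
                       "FROM following LIMIT 5")
        for row in cursor.fetchall():
            print(row)
finally:
    connection.close()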
