Python Crawler Starter Case: Getting Your Baicizhan Word List

Source: Internet
Author: User
Tags: urlencode

Baicizhan is a very good vocabulary-memorization app. As you study, it records every word you learn and how many times you answered it wrong, so from this list you can easily see which words you keep getting wrong and just cannot remember. We will use Python to crawl this information and learn the basics of Python crawlers along the way.

First, go to the Baicizhan login page: http://www.baicizhan.com/login

The site requires logging in, but fortunately there is no CAPTCHA, so we can first watch what data the browser POSTs during login. Open the browser's developer tools (F12) and, taking Chrome as the example, record the network traffic while logging in:

We can see that during login the browser submits data to http://www.baicizhan.com/login via POST. What data was submitted? We can find it in the Form Data section below.

Here email is the user name and raw_pwd is the password. Both values are URL-encoded; we can click "view URL encoded" to see what they look like after encoding. URL encoding is handled by the urllib library.
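For reference, here is what that URL-encoding step produces. This is a hedged Python 3 sketch (the article's code is Python 2, where the function is urllib.urlencode; in Python 3 it moved to urllib.parse.urlencode), and the credential values are placeholders:

```python
from urllib.parse import urlencode

# Placeholder credentials; the '@' characters show why encoding is needed
form = {'email': 'me@example.com', 'raw_pwd': 'p@ssword'}

encoded = urlencode(form)
print(encoded)  # email=me%40example.com&raw_pwd=p%40ssword
```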

In the Request Headers section we also see a Cookie header, so we need the cookielib library to handle cookies as well.

import urllib
import urllib2
import cookielib

email = 'your_email'
pwd = 'your_password'
data = {'email': email, 'raw_pwd': pwd}
post_data = urllib.urlencode(data)

opener = urllib2.build_opener(urllib2.HTTPCookieProcessor(cookielib.CookieJar()))

response = opener.open('http://www.baicizhan.com/login', post_data)
print(response.read())
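If you are on Python 3, urllib2 and cookielib no longer exist: they were merged into urllib.request and renamed http.cookiejar. A hedged sketch of the same login flow under Python 3 (the credentials are placeholders, and the network call is left commented out):

```python
import urllib.parse
import urllib.request
import http.cookiejar

# Placeholder credentials -- replace with your own
email = 'your_email'
pwd = 'your_password'

# Same form fields as above; POST bodies must be bytes in Python 3
post_data = urllib.parse.urlencode({'email': email, 'raw_pwd': pwd}).encode('utf-8')

# An opener that remembers cookies across requests, like the Python 2 version
cookie_jar = http.cookiejar.CookieJar()
opener = urllib.request.build_opener(urllib.request.HTTPCookieProcessor(cookie_jar))

# Uncomment to actually log in:
# response = opener.open('http://www.baicizhan.com/login', post_data)
# print(response.read())
```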

The printed result is the page source after logging in, which shows that the login succeeded.

Next, let's analyze the word-list page: http://www.baicizhan.com/user/words/list

When we click a page number, the browser actually sends a GET request. Looking at the response, we find that it is JSON (we can inspect it with an online JSON viewer such as http://www.json.cn/).
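To get a feel for that response, here is a small parsing sketch. The field names ("list", "word", "word_meaning", "wrong_times") are the ones the article's code reads, but the sample payload itself is made up for illustration:

```python
import json

# Made-up sample payload imitating the word-list response shape
raw = '{"list": [{"word": "abandon", "word_meaning": "to give up", "wrong_times": 3}]}'

content = json.loads(raw)
for entry in content["list"]:
    print(entry["word"], entry["wrong_times"])  # abandon 3
```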

To parse JSON in Python we need the json library. Let's print the words from the first two pages:

import urllib
import urllib2
import cookielib
import json

email = 'your_email'
pwd = 'your_password'
data = {'email': email, 'raw_pwd': pwd}
post_data = urllib.urlencode(data)

opener = urllib2.build_opener(urllib2.HTTPCookieProcessor(cookielib.CookieJar()))

opener.open('http://www.baicizhan.com/login', post_data)

for i in range(1, 3):
    content = json.loads(opener.open("http://www.baicizhan.com/user/all_done_words_list?page=%s" % i).read())
    for word in content["list"]:
        print word["word"]
        print word["word_meaning"].strip()
        print word["wrong_times"]

This prints the words from the first two pages, together with their meanings and wrong-answer counts.

To get all of the learned words, only a small modification is needed; after that we can store the data and do whatever follow-up processing we like.
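One way to make that modification is to keep requesting pages until an empty "list" comes back. The sketch below assumes that stopping condition (the article does not say how the last page is signalled) and takes the page-fetching function as a parameter so it can be tried without a network connection:

```python
def fetch_all_words(fetch_page):
    """Collect word entries from every page until an empty "list" is returned.

    fetch_page is any callable that takes a page number and returns the
    decoded JSON dict for that page.
    """
    words = []
    page = 1
    while True:
        content = fetch_page(page)
        if not content["list"]:  # assumed end-of-data signal
            break
        words.extend(content["list"])
        page += 1
    return words

# Example with a fake two-page fetcher standing in for the real endpoint:
fake_pages = {1: {"list": [{"word": "a"}]}, 2: {"list": [{"word": "b"}]}}
print(fetch_all_words(lambda p: fake_pages.get(p, {"list": []})))
# [{'word': 'a'}, {'word': 'b'}]
```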
