The easiest way to scrape Web table data with Python!

Source: Internet
Author: User
Tags: ISO 8601, ISO 8601 date, ISO 8601 date format

Suppose you're searching the Web for the raw data you need, but the bad news is that the data only exists inside a Web page and there is no API available to retrieve it. pandas can solve this:

import pandas as pd

tables = pd.read_html("https://apps.sandiego.gov/sdfiredispatch/")
print(tables[0])

It's that easy! pandas finds every HTML table on the page and returns each one as a new DataFrame object.
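The list-of-DataFrames behavior is easy to try offline: read_html also accepts a file-like object, so you can feed it a raw HTML string. The table below is a made-up example (and, as with any source, read_html needs lxml or html5lib installed to parse it):

```python
import pandas as pd
from io import StringIO

# A made-up HTML table standing in for a real web page.
html = StringIO("""
<table>
  <tr><th>Unit</th><th>Call Type</th></tr>
  <tr><td>E17</td><td>Medical</td></tr>
  <tr><td>M34</td><td>Medical</td></tr>
</table>
""")

tables = pd.read_html(html)  # one DataFrame per <table> on the page
print(len(tables))           # 1 table found
print(tables[0])
```

The <th> cells become the column headers automatically, so `tables[0]` is a two-row DataFrame with columns `Unit` and `Call Type`.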

Pass header=0 to tell read_html that the table's first row holds the column headers, and parse_dates to convert the text-based dates into datetime objects:

import pandas as pd

calls_df, = pd.read_html("http://apps.sandiego.gov/sdfiredispatch/", header=0, parse_dates=["Call Date"])
print(calls_df)

You get:

            Call Date Call Type       Street              Cross Streets Unit
0 2017-06-02 17:27:58   Medical  HIGHLAND AV  WIGHTMAN ST/UNIVERSITY AV  E17
1 2017-06-02 17:27:58   Medical  HIGHLAND AV  WIGHTMAN ST/UNIVERSITY AV  M34
2 2017-06-02 17:23:51   Medical   EMERSON ST     LOCUST ST/EVERGREEN ST  E22
3 2017-06-02 17:23:51   Medical   EMERSON ST     LOCUST ST/EVERGREEN ST  M47
4 2017-06-02 17:23:15   Medical  MARAUDER WY      BARON LN/FROBISHER ST  E38
5 2017-06-02 17:23:15   Medical  MARAUDER WY      BARON LN/FROBISHER ST  M41
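The same header and date handling can be seen offline with a small hypothetical snippet; note that the parse_dates column comes back as datetime64 rather than plain text:

```python
import pandas as pd
from io import StringIO

# A hypothetical miniature of the dispatch table.
html = StringIO("""
<table>
  <tr><th>Call Date</th><th>Unit</th></tr>
  <tr><td>2017-06-02 17:27:58</td><td>E17</td></tr>
  <tr><td>2017-06-02 17:23:51</td><td>E22</td></tr>
</table>
""")

# The trailing comma unpacks the one-element list read_html returns.
calls_df, = pd.read_html(html, header=0, parse_dates=["Call Date"])
print(calls_df.dtypes)
```

Without parse_dates, the "Call Date" column would stay as plain object (string) dtype.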

And with just one more line of code, the data can be turned into JSON records:

import pandas as pd

calls_df, = pd.read_html("http://apps.sandiego.gov/sdfiredispatch/", header=0, parse_dates=["Call Date"])
print(calls_df.to_json(orient="records", date_format="iso"))

Run this code and you get a nice JSON output (even with the proper ISO 8601 date format):

[{"Call Date":"2017-06-02T17:34:00.000Z","Call Type":"Medical","Street":"ROSECRANS ST","Cross Streets":"HANCOCK ST/ALLEY","Unit":"M21"},{"Call Date":"2017-06-02T17:34:00.000Z","Call Type":"Medical","Street":"ROSECRANS ST","Cross Streets":"HANCOCK ST/ALLEY","Unit":"T20"},{"Call Date":"2017-06-02T17:30:34.000Z","Call Type":"Medical","Street":"SPORTS ARENA BL","Cross Streets":"CAM DEL RIO WEST/EAST DR","Unit":"E20"} // etc...]
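The two keyword arguments do the work here: orient="records" emits one JSON object per row, and date_format="iso" switches dates from epoch milliseconds to ISO 8601 strings. A tiny self-contained sketch with invented data:

```python
import pandas as pd

# Invented one-row frame standing in for the scraped table.
df = pd.DataFrame({
    "Call Date": pd.to_datetime(["2017-06-02 17:34:00"]),
    "Unit": ["M21"],
})

# orient="records" -> a list of row objects;
# date_format="iso" -> ISO 8601 timestamps instead of epoch numbers.
print(df.to_json(orient="records", date_format="iso"))
```

The default date_format for this orient is "epoch", which is why the flag matters when the output is meant to be human-readable.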

You can even save the data to a CSV or Excel file:

import pandas as pd

calls_df, = pd.read_html("http://apps.sandiego.gov/sdfiredispatch/", header=0, parse_dates=["Call Date"])
calls_df.to_csv("calls.csv", index=False)

Run it, then double-click calls.csv to open it in your spreadsheet application:
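to_csv with index=False keeps pandas' 0, 1, 2... row index out of the file. A quick round-trip (using a temporary directory and invented data rather than the real scrape) shows the effect:

```python
import os
import tempfile

import pandas as pd

# Invented frame in place of the scraped one.
df = pd.DataFrame({"Unit": ["E17", "M34"], "Call Type": ["Medical", "Medical"]})

path = os.path.join(tempfile.mkdtemp(), "calls.csv")
df.to_csv(path, index=False)  # index=False omits the index column

back = pd.read_csv(path)
print(back.equals(df))  # True: the file round-trips cleanly
```

to_excel("calls.xlsx", index=False) works the same way, though it needs an engine such as openpyxl installed.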

Of course, pandas also makes it easy to filter, sort, group, or otherwise process the data:

>>> calls_df.describe()
                  Call Date Call Type      Street           Cross Streets Unit
count                    69        69          69                      69   69
unique                  ...         2         ...                     ...   60
top     2017-06-02 16:59:50   Medical  CHANNEL WY  LA SALLE ST/WESTERN ST   E1
freq                      5        66           5                       5    2
first   2017-06-02 16:36:46       NaN         NaN                     NaN  NaN
last    2017-06-02 17:41:30       NaN         NaN                     NaN  NaN

>>> calls_df.groupby("Call Type").count()
                       Call Date  Street  Cross Streets  Unit
Call Type
Medical                       66      66             66    66
Traffic Accident (L1)          3       3              3     3

>>> calls_df["Unit"].unique()
array(['E46', 'MR33', 'T40', 'E201', 'M6', 'E34', 'M34', 'E29', 'M30',
       'M43', 'M21', 'T20', 'E20', 'M20', 'E26', 'M32', 'SQ55', 'E1',
       'M26', 'BLS4', 'E17', 'E22', 'M47', 'E38', 'M41', 'E5', 'M19',
       'E28', 'M1', 'E42', 'M42', 'E23', 'MR9', 'PD', 'LCCNOT', 'M52',
       'E45', 'M12', 'E40', 'MR40', 'M45', 'T1', 'M23', 'E14', 'M2',
       'E39', 'M25', 'E8', 'M17', 'E4', 'M22', 'M37', 'E7', 'M31',
       'E9', 'SQ56', 'E10', 'M44', 'M11'], dtype=object)
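Those operations work on any DataFrame, not just scraped ones. A compact sketch with made-up rows mirroring the shape of the dispatch data:

```python
import pandas as pd

# Made-up rows shaped like the dispatch data.
calls_df = pd.DataFrame({
    "Call Type": ["Medical", "Medical", "Traffic Accident (L1)"],
    "Unit": ["E17", "M34", "E22"],
})

# Rows per call type, a boolean-mask filter, and the distinct units.
print(calls_df.groupby("Call Type").size())
print(calls_df[calls_df["Call Type"] == "Medical"])
print(sorted(calls_df["Unit"].unique()))
```

groupby(...).size() counts rows per group in one step, while the boolean mask `calls_df["Call Type"] == "Medical"` selects matching rows without a loop.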
