Python Crawler: Using the Scrapy Crawler Framework

Source: Internet
Author: User

Question 1: After installing Scrapy with pip at the command prompt, the `scrapy` command cannot be used; Windows reports that "'scrapy' is not recognized as an internal or external command, operable program or batch file."

Resolution: I had installed Python in D:\python, so when Scrapy was installed, its executable went by default into D:\python\Scripts. Commands therefore only worked, and projects could only be created, from inside that directory.

If you want to run the command from any directory in a DOS/command prompt, you need to know where the executable lives and understand the role of the PATH environment variable. The fix is to add Scrapy's location to PATH.

What to do: Start > Control Panel > System > Advanced system settings > Environment Variables > System variables > Path > append every directory that contains the Scrapy-related executables (note that each entry must be separated from the previous one with a semicolon). I added: D:\python;D:\python\Scripts;D:\python\lib\site-packages\openssl
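As a quick sanity check after editing PATH, the sketch below searches the current PATH (plus any extra candidate directories) for the `scrapy` executable. The `D:\python` paths are the ones from this article and are only an example; substitute your own install location.

```python
import os
import shutil

# Directories from this article's setup; adjust for your own Python install.
candidate_dirs = [r"D:\python", r"D:\python\Scripts"]

def find_scrapy(extra_dirs):
    """Return the full path of the scrapy executable, or None if not found.

    Searches the current PATH first, then any extra candidate directories.
    """
    search_path = os.pathsep.join([os.environ.get("PATH", "")] + list(extra_dirs))
    return shutil.which("scrapy", path=search_path)

exe = find_scrapy(candidate_dirs)
if exe:
    print("scrapy found at:", exe)
else:
    print("scrapy not on PATH - add its Scripts directory to PATH")
```

If this prints the "not on PATH" message even after editing the environment variable, remember that already-open command prompt windows do not pick up PATH changes; open a new one.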

Problem solved.

Issue 2: After resolving the above problem, running the Scrapy crawler fails with `ImportError: No module named win32api`.

Workaround: Python does not ship with a library for accessing the Windows system APIs; it has to be installed separately. The library is called pywin32, and it can be downloaded directly from the Internet.

It can be downloaded from the following link: http://sourceforge.net/projects/pywin32/files%2Fpywin32/ (choose the build that matches your Python version).

If running your code raises `ImportError: No module named win32api` or `ImportError: No module named win32con`, the library is not installed.
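A guarded import makes this failure explicit instead of crashing the program at startup. This is a generic sketch, not part of Scrapy itself: `win32api` and `win32con` are provided by pywin32 and exist only on Windows, so on other systems the `except` branch always runs.

```python
# Guarded import: report a missing pywin32 instead of crashing at startup.
try:
    import win32api  # provided by pywin32 (Windows only)
    import win32con
    HAVE_PYWIN32 = True
except ImportError as exc:
    HAVE_PYWIN32 = False
    print("pywin32 is not installed or incomplete:", exc)
    print("Fix: pip install pypiwin32 (or pywin32 on current Python versions)")
```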

The two most important modules in this library are win32api and win32con. A simpler alternative to the manual download is to install the package with pip.

Specific operation: run the command `pip install pypiwin32` (pypiwin32 is a legacy alias; on current Python versions `pip install pywin32` also works).

Then rerun the program; problem solved.

These were the two major problems I ran into today while running a Python crawler with Scrapy. After searching online through many suggested fixes, the two solutions above resolved both issues and the program ran successfully.

