First, you need to install Scrapy on Linux. If your distribution ships only Python 2.4, the built-in Python does not meet Scrapy's requirements, so you need to install a newer Python yourself; the system Python also typically lacks the python-dev headers. After that, install the setuptools tool and run easy_install -U scrapy to install Scrapy. If you have other dependencies, install them the same way. Once you have developed a project with Scrapy on your own machine, note that after uploading the project to the server you must make it importable by adding the project directory to the Python path. There are several ways to do this: for example, write a bash/shell script that uses export to set the environment variable, or do it programmatically in Python by adding it directly in the settings file:
import os
import sys
import time

sys.path.append('%s' % os.getcwd())

BOT_NAME = 'crawl'
BOT_VERSION = '1.0'
SPIDER_MODULES = ['crawl.spiders']
NEWSPIDER_MODULE = 'crawl.spiders'
DEFAULT_ITEM_CLASS = 'crawl.items.CrawlItem'
USER_AGENT = '%s/%s' % (BOT_NAME, BOT_VERSION)
ITEM_PIPELINES = ['crawl.pipelines.CrawlPipeline']
DEPTH_LIMIT = 5
DOWNLOAD_DELAY = 3
LOG_LEVEL = 'ERROR'
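To see what the sys.path.append(os.getcwd()) line in the settings file actually does, here is a minimal, Scrapy-free sketch: it appends the current working directory to the module search path, so a package such as crawl sitting in the project root becomes importable when the process is started from that directory. (The variable name project_root is mine, not part of the settings file.)

```python
import os
import sys

# Same idea as the settings file: make the project root importable.
project_root = os.getcwd()
if project_root not in sys.path:
    sys.path.append(project_root)

print(project_root in sys.path)  # True
```

The guard against appending twice is optional; Python tolerates duplicate sys.path entries, but keeping the path list clean makes debugging import problems on the server easier.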