CentOS 7 Scrapy Installation Process


Without further ado, let's get straight into it.

I. Install the development package group and upgrade the operating system

# yum groupinstall "Development Tools" -y
# yum update -y

Note:

1. If the Python on your system is older than 2.7, upgrade to Python 2.7 or above (Scrapy requires at least Python 2.7).

# Download Python 2.7
# wget http://python.org/ftp/python/2.7.3/Python-2.7.3.tar.bz2
# Extract
# tar -jxvf Python-2.7.3.tar.bz2
# cd Python-2.7.3
# Build and install
# ./configure
# make all
# make install
# make clean
# make distclean
# Check the Python version
# /usr/local/bin/python2.7 -V
# Create a symlink so the system default python points to Python 2.7
# mv /usr/bin/python /usr/bin/python2.6.6
# ln -s /usr/local/bin/python2.7 /usr/bin/python

Because the system python symlink now points to Python 2.7 and yum is not compatible with Python 2.7, yum will stop working; you have to pin yum to the old interpreter. Open /usr/bin/yum with vim and change the #! line at the top of the file from #!/usr/bin/python to #!/usr/bin/python2.6.6.
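
A quick sanity check after these steps (a suggested verification, assuming Python 2.7.3 was installed under /usr/local as above):

# python -V                # should now report Python 2.7.3
# head -1 /usr/bin/yum     # should read #!/usr/bin/python2.6.6
# yum repolist             # yum should still run without a Python version error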

2. After upgrading to Python 2.7, it is strongly recommended to install pip and setuptools. If you skip this step, you will run into a pile of inexplicable problems that will keep you up until dawn!!
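
The note above does not spell out the commands. A minimal sketch, assuming the hand-built interpreter lives at /usr/local/bin/python2.7 and the standard get-pip.py bootstrap script is used, would be:

# wget https://bootstrap.pypa.io/get-pip.py
# /usr/local/bin/python2.7 get-pip.py                             # installs pip for the new interpreter
# /usr/local/bin/python2.7 -m pip install --upgrade setuptools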

3. If you go the upgrade-to-Python-2.7 route, you will most likely have to compile and install everything with python setup.py install, including but not limited to these packages:

lxml, zope.interface, Twisted, characteristic, pyasn1-modules, service-identity, scrapy
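
For reference, the generic python setup.py install pattern mentioned above looks roughly like this (lxml 3.4.4 is used purely as an illustration; the other packages follow the same steps):

# tar -zxvf lxml-3.4.4.tar.gz
# cd lxml-3.4.4
# python setup.py build
# python setup.py install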

PS: I started out compiling and installing everything this way, and most of the errors I hit looked like:

error: command 'gcc' failed with exit status 1

I later found that this message usually means either a missing -devel package or a missing library file. The most annoying part was that Scrapy would report a successful installation but could not create a project, and the test sample would not run. In the end I decisively switched to CentOS 7!
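
When that gcc error is caused by a missing header or library file, yum provides can show which -devel package ships the missing file; the file names here are only examples:

# yum provides '*/ffi.h'                   # usually points to libffi-devel
# yum provides '*/libxml/xmlversion.h'     # usually points to libxml2-devel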


################## Everything below is done on CentOS 7; readers who went the upgrade-to-Python-2.7 route, please detour ##################

II. Edit /etc/yum.repos.d/rpmforge.repo with vim to add the RPMforge repository, which is needed to install libffi-devel (if this source is not configured, yum install libffi-devel will report that the package cannot be found)

[rpmforge]
name = Red Hat Enterprise $releasever - RPMforge.net - dag
#baseurl = http://apt.sw.be/redhat/el5/en/$basearch/dag
mirrorlist = http://apt.sw.be/redhat/el7/en/mirrors-rpmforge
#mirrorlist = file:///etc/yum.repos.d/mirrors-rpmforge
enabled = 1
protect = 0
gpgkey = file:///etc/pki/rpm-gpg/RPM-GPG-KEY-rpmforge-dag
gpgcheck = 1
# rpm --import http://apt.sw.be/RPM-GPG-KEY.dag.txt
# yum install libffi-devel -y
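
Before installing, it is worth confirming that the new repository is actually enabled (a quick optional check):

# yum repolist enabled | grep -i rpmforge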

III. If the audit package is installed on the system, remove it; it interferes with the Scrapy installation

# yum remove audit

IV. Install the development packages required by Scrapy

# yum install -y python-devel openssl-devel libxslt-devel libxml2-devel

V. Install pip and setuptools

# yum install python-pip -y
# pip install setuptools
# pip install setuptools --upgrade

VI. Install Scrapy

# pip install scrapy
Collecting scrapy
  Using cached scrapy-1.0.3-py2-none-any.whl
Requirement already satisfied (use --upgrade to upgrade): cssselect>=0.9 in /usr/lib/python2.7/site-packages (from scrapy)
Requirement already satisfied (use --upgrade to upgrade): queuelib in /usr/lib/python2.7/site-packages (from scrapy)
Requirement already satisfied (use --upgrade to upgrade): pyopenssl in /usr/lib/python2.7/site-packages (from scrapy)
Requirement already satisfied (use --upgrade to upgrade): w3lib>=1.8.0 in /usr/lib/python2.7/site-packages (from scrapy)
Collecting lxml (from scrapy)
  Using cached lxml-3.4.4.tar.gz
Collecting twisted>=10.0.0 (from scrapy)
  Using cached twisted-15.4.0.tar.bz2
Requirement already satisfied (use --upgrade to upgrade): six>=1.5.2 in /usr/lib/python2.7/site-packages (from scrapy)
Collecting service-identity (from scrapy)
  Using cached service_identity-14.0.0-py2.py3-none-any.whl
Requirement already satisfied (use --upgrade to upgrade): cryptography>=0.7 in /usr/lib64/python2.7/site-packages (from pyopenssl->scrapy)
Collecting zope.interface>=3.6.0 (from twisted>=10.0.0->scrapy)
  Using cached zope.interface-4.1.3.tar.gz
Collecting characteristic>=14.0.0 (from service-identity->scrapy)
  Using cached characteristic-14.3.0-py2.py3-none-any.whl
Collecting pyasn1-modules (from service-identity->scrapy)
  Using cached pyasn1_modules-0.0.8-py2.py3-none-any.whl
Requirement already satisfied (use --upgrade to upgrade): pyasn1 in /usr/lib/python2.7/site-packages (from service-identity->scrapy)
Requirement already satisfied (use --upgrade to upgrade): idna>=2.0 in /usr/lib/python2.7/site-packages (from cryptography>=0.7->pyopenssl->scrapy)
Requirement already satisfied (use --upgrade to upgrade): setuptools in /usr/lib/python2.7/site-packages (from cryptography>=0.7->pyopenssl->scrapy)
Requirement already satisfied (use --upgrade to upgrade): enum34 in /usr/lib/python2.7/site-packages (from cryptography>=0.7->pyopenssl->scrapy)
Requirement already satisfied (use --upgrade to upgrade): ipaddress in /usr/lib/python2.7/site-packages (from cryptography>=0.7->pyopenssl->scrapy)
Requirement already satisfied (use --upgrade to upgrade): cffi>=1.1.0 in /usr/lib64/python2.7/site-packages (from cryptography>=0.7->pyopenssl->scrapy)
Requirement already satisfied (use --upgrade to upgrade): pycparser in /usr/lib/python2.7/site-packages (from cffi>=1.1.0->cryptography>=0.7->pyopenssl->scrapy)
Installing collected packages: lxml, zope.interface, twisted, characteristic, pyasn1-modules, service-identity, scrapy
  Running setup.py install for lxml
  Running setup.py install for zope.interface
  Running setup.py install for twisted
Successfully installed scrapy-1.0.3 twisted-15.4.0 characteristic-14.3.0 lxml-3.4.4 pyasn1-modules-0.0.8 service-identity-14.0.0 zope.interface-4.1.3
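
Once pip reports success, a quick way to confirm that Scrapy is really installed and importable (beyond the log above) is:

# scrapy version
# python -c "import scrapy; print(scrapy.__version__)"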

VII. Create a project

[email protected] workspace]# scrapy startproject tutorial
2015-10-15 21:54:24 [scrapy] INFO: Scrapy 1.0.3 started (bot: scrapybot)
2015-10-15 21:54:24 [scrapy] INFO: Optional features available: ssl, http11
2015-10-15 21:54:24 [scrapy] INFO: Overridden settings: {}
New Scrapy project 'tutorial' created in:
    /workspace/tutorial

You can start your first spider with:
    cd tutorial
    scrapy genspider example example.com

VIII. Directory structure

[email protected] workspace]# tree
.
└── tutorial
    ├── scrapy.cfg
    └── tutorial
        ├── __init__.py
        ├── items.py
        ├── pipelines.py
        ├── settings.py
        └── spiders
            └── __init__.py

3 directories, 6 files
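
To make sure the freshly created project actually runs (the very thing that failed on CentOS 6 earlier), a minimal end-to-end check using the spider name suggested by startproject could be:

# cd tutorial
# scrapy genspider example example.com
# scrapy crawl example        # a healthy run ends with 'Spider closed (finished)'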

IX. Scrapy-related documentation

  • http://www.tuicool.com/articles/URNVV3E ("compile and install Scrapy from source; unfortunately, I did not succeed with this approach")

  • https://scrapy-chs.readthedocs.org/zh_CN/0.24/intro/overview.html (an early Chinese translation of the Scrapy documentation)

  • http://scrapy.org/

  • http://doc.scrapy.org/en/master/

X. Summary

    • Proved once again that Baidu search is really not much help;

    • Always read the official documentation; search results are never comprehensive. This saves a lot of detours and unnecessary work;

    • Think a problem through yourself first, cool off for a few seconds (about the time of one Q skill), and only then go searching for answers;

    • After solving a problem, write it up as a document; it helps both yourself and others.

This article comes from the "gentle" blog; please keep the original source: http://essun.blog.51cto.com/721033/1703367

