Scrapy setup on Ubuntu 16.04 (or any other version)



The fresh Ubuntu 16.04 I installed today came with Python preinstalled:

p@Scrapy:~$ python --version
Python 2.7.11+
p@Scrapy:~$ python3 --version
Python 3.5.1+

As described in the install manual at http://doc.scrapy.org/en/latest/intro/install.html, I opened http://doc.scrapy.org/en/latest/topics/ubuntu.html#topics-ubuntu and tried to install Scrapy following the steps described there.

But I got an error after step 3:

sudo apt-get update && sudo apt-get install scrapy

The following packages have unmet dependencies:
scrapy : Depends: python-support (>= 0.90.0) but it is not installable
Recommends: python-setuptools but it is not going to be installed
E: Unable to correct problems, you have held broken packages.

Yesterday I posted a question about a Scrapy error that also looks like a setup problem, but on Windows: Can't set up Scrapy on Windows. So today I tried Ubuntu instead, again with no luck.

So the question is: how do I set up Scrapy on Ubuntu 16.04, or maybe some other version? The Scrapy manual looks outdated. I thought the Scrapy project was dead, but I know people still use it. Could it be that Scrapy only works with Python 2.x? In that case I'll stay on Windows. I can't check every combination; that takes too much time. Can anyone name a stable configuration (OS + Python version) that works with Scrapy?

Thanks.

Update

Here I tried Docker. I created the Dockerfile from the terminal; the other steps:

p@ScrapyPython3:~$ cat Dockerfile
$ cat Dockerfile
FROM ubuntu:xenial
ENV DEBIAN_FRONTEND noninteractive
RUN apt-get update
# Install Python3 and dev headers
RUN apt-get install -y \
    python3 \
    python-dev \
    python3-dev
# Install cryptography
RUN apt-get install -y \
    build-essential \
    libssl-dev \
    libffi-dev
# install lxml
RUN apt-get install -y \
    libxml2-dev \
    libxslt-dev
# install pip
RUN apt-get install -y python-pip
RUN useradd --create-home --shell /bin/bash scrapyuser
USER scrapyuser
WORKDIR /home/scrapyuser
p@ScrapyPython3:~$ sudo docker build -t redapple/scrapy-ubuntu-xenial .
Sending build context to Docker daemon 81.21 MB
Step 1 : $ 
Unknown instruction: $
p@ScrapyPython3:~$ sudo docker run -t -i redapple/scrapy-ubuntu-xenial
Unable to find image 'redapple/scrapy-ubuntu-xenial:latest' locally
Pulling repository docker.io/redapple/scrapy-ubuntu-xenial
docker: Error: image redapple/scrapy-ubuntu-xenial not found.
See 'docker run --help'.
p@ScrapyPython3:~$ pip install scrapy
Requirement already satisfied (use --upgrade to upgrade): scrapy in ./.local/lib/python2.7/site-packages
Requirement already satisfied (use --upgrade to upgrade): queuelib in ./.local/lib/python2.7/site-packages (from scrapy)
Requirement already satisfied (use --upgrade to upgrade): pyOpenSSL in ./.local/lib/python2.7/site-packages (from scrapy)
Requirement already satisfied (use --upgrade to upgrade): Twisted>=10.0.0 in ./.local/lib/python2.7/site-packages (from scrapy)
Requirement already satisfied (use --upgrade to upgrade): six>=1.5.2 in ./.local/lib/python2.7/site-packages (from scrapy)
Requirement already satisfied (use --upgrade to upgrade): w3lib>=1.14.2 in ./.local/lib/python2.7/site-packages (from scrapy)
Requirement already satisfied (use --upgrade to upgrade): service-identity in ./.local/lib/python2.7/site-packages (from scrapy)
Requirement already satisfied (use --upgrade to upgrade): cssselect>=0.9 in ./.local/lib/python2.7/site-packages (from scrapy)
Requirement already satisfied (use --upgrade to upgrade): lxml in ./.local/lib/python2.7/site-packages (from scrapy)
Requirement already satisfied (use --upgrade to upgrade): parsel>=0.9.3 in ./.local/lib/python2.7/site-packages (from scrapy)
Requirement already satisfied (use --upgrade to upgrade): PyDispatcher>=2.0.5 in ./.local/lib/python2.7/site-packages (from scrapy)
Requirement already satisfied (use --upgrade to upgrade): cryptography>=1.3 in ./.local/lib/python2.7/site-packages (from pyOpenSSL->scrapy)
Requirement already satisfied (use --upgrade to upgrade): zope.interface>=3.6.0 in ./.local/lib/python2.7/site-packages (from Twisted>=10.0.0->scrapy)
Requirement already satisfied (use --upgrade to upgrade): pyasn1-modules in ./.local/lib/python2.7/site-packages (from service-identity->scrapy)
Requirement already satisfied (use --upgrade to upgrade): pyasn1 in ./.local/lib/python2.7/site-packages (from service-identity->scrapy)
Requirement already satisfied (use --upgrade to upgrade): attrs in ./.local/lib/python2.7/site-packages (from service-identity->scrapy)
Requirement already satisfied (use --upgrade to upgrade): setuptools>=11.3 in ./.local/lib/python2.7/site-packages (from cryptography>=1.3->pyOpenSSL->scrapy)
Requirement already satisfied (use --upgrade to upgrade): ipaddress in ./.local/lib/python2.7/site-packages (from cryptography>=1.3->pyOpenSSL->scrapy)
Requirement already satisfied (use --upgrade to upgrade): enum34 in ./.local/lib/python2.7/site-packages (from cryptography>=1.3->pyOpenSSL->scrapy)
Requirement already satisfied (use --upgrade to upgrade): idna>=2.0 in ./.local/lib/python2.7/site-packages (from cryptography>=1.3->pyOpenSSL->scrapy)
Requirement already satisfied (use --upgrade to upgrade): cffi>=1.4.1 in ./.local/lib/python2.7/site-packages (from cryptography>=1.3->pyOpenSSL->scrapy)
Requirement already satisfied (use --upgrade to upgrade): pycparser in ./.local/lib/python2.7/site-packages (from cffi>=1.4.1->cryptography>=1.3->pyOpenSSL->scrapy)
p@ScrapyPython3:~$ scrapy version
The program 'scrapy' is currently not installed. You can install it by typing:
sudo apt install python-scrapy
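Note that in the log above pip reported Scrapy as already satisfied under `./.local/lib/python2.7/site-packages`, i.e. a per-user install. In that layout the `scrapy` console script usually lands in `~/.local/bin`, which Ubuntu does not always include on `PATH`, which would explain why the command is not found even though the package is installed. A sketch of the workaround (assuming that is indeed where pip put the script):

```shell
# Per-user pip installs place console scripts in ~/.local/bin;
# add that directory to PATH for the current shell session.
export PATH="$HOME/.local/bin:$PATH"

# Re-check whether the scrapy entry point is now found:
# command -v scrapy && scrapy version
```

To make this permanent, the same `export` line can be appended to `~/.bashrc`.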

Update 1: it looks like the first line in the Dockerfile (the stray `$ cat Dockerfile`) shouldn't be there.

After removing it I can build and run the Docker image, but pip install still had no luck. Also, I realize I'll have to do some extra work to get my scraping files into Docker (I prefer a GUI while coding). Are there any tools installed in this Docker image for that? Install log below:

p@ScrapyPython3:~$ sudo docker run -t -i redapple/scrapy-ubuntu-xenial
scrapyuser@41bef38de45d:~$ python --version   
Python 2.7.11+
scrapyuser@41bef38de45d:~$ python3 --version
Python 3.5.1+
scrapyuser@41bef38de45d:~$ pip install scrapy
Collecting scrapy
Downloading Scrapy-1.1.0-py2.py3-none-any.whl (294kB)
100% |################################| 296kB 245kB/s 
Collecting queuelib (from scrapy)
Downloading queuelib-1.4.2-py2.py3-none-any.whl
Collecting pyOpenSSL (from scrapy)
Downloading pyOpenSSL-16.0.0-py2.py3-none-any.whl (45kB)
100% |################################| 51kB 12.7MB/s 
Collecting Twisted>=10.0.0 (from scrapy)
Downloading Twisted-16.2.0.tar.bz2 (2.9MB)
100% |################################| 2.9MB 472kB/s 
Collecting six>=1.5.2 (from scrapy)
Downloading six-1.10.0-py2.py3-none-any.whl
Collecting w3lib>=1.14.2 (from scrapy)
Downloading w3lib-1.14.2-py2.py3-none-any.whl
Collecting service-identity (from scrapy)
Downloading service_identity-16.0.0-py2.py3-none-any.whl
Collecting cssselect>=0.9 (from scrapy)
Downloading cssselect-0.9.2-py2.py3-none-any.whl
Collecting lxml (from scrapy)
Downloading lxml-3.6.0.tar.gz (3.7MB)
100% |################################| 3.7MB 389kB/s 
Collecting parsel>=0.9.3 (from scrapy)
Downloading parsel-1.0.2-py2.py3-none-any.whl
Collecting PyDispatcher>=2.0.5 (from scrapy)
Downloading PyDispatcher-2.0.5.tar.gz
Collecting cryptography>=1.3 (from pyOpenSSL->scrapy)
Downloading cryptography-1.4.tar.gz (399kB)
100% |################################| 409kB 1.4MB/s 
Collecting zope.interface>=3.6.0 (from Twisted>=10.0.0->scrapy)
Downloading zope.interface-4.2.0.tar.gz (146kB)
100% |################################| 153kB 1.2MB/s 
Collecting pyasn1-modules (from service-identity->scrapy)
Downloading pyasn1_modules-0.0.8-py2.py3-none-any.whl
Collecting pyasn1 (from service-identity->scrapy)
Downloading pyasn1-0.1.9-py2.py3-none-any.whl
Collecting attrs (from service-identity->scrapy)
Downloading attrs-16.0.0-py2.py3-none-any.whl
Collecting idna>=2.0 (from cryptography>=1.3->pyOpenSSL->scrapy)
Downloading idna-2.1-py2.py3-none-any.whl (54kB)
100% |################################| 61kB 10.8MB/s 
Collecting setuptools>=11.3 (from cryptography>=1.3->pyOpenSSL->scrapy)
Downloading setuptools-23.0.0-py2.py3-none-any.whl (435kB)
100% |################################| 440kB 1.2MB/s 
Collecting enum34 (from cryptography>=1.3->pyOpenSSL->scrapy)
Downloading enum34-1.1.6-py2-none-any.whl
Collecting ipaddress (from cryptography>=1.3->pyOpenSSL->scrapy)
Downloading ipaddress-1.0.16-py27-none-any.whl
Collecting cffi>=1.4.1 (from cryptography>=1.3->pyOpenSSL->scrapy)
Downloading cffi-1.6.0.tar.gz (397kB)
100% |################################| 399kB 1.3MB/s 
Collecting pycparser (from cffi>=1.4.1->cryptography>=1.3->pyOpenSSL->scrapy)
Downloading pycparser-2.14.tar.gz (223kB)
100% |################################| 225kB 1.1MB/s 
Building wheels for collected packages: Twisted, lxml, PyDispatcher, cryptography, zope.interface, cffi, pycparser
Running setup.py bdist_wheel for Twisted ... done
Stored in directory: /home/scrapyuser/.cache/pip/wheels/fe/9d/3f/9f7b1c768889796c01929abb7cdfa2a9cdd32bae64eb7aa239
Running setup.py bdist_wheel for lxml ... done
Stored in directory: /home/scrapyuser/.cache/pip/wheels/6c/eb/a1/e4ff54c99630e3cc6ec659287c4fd88345cd78199923544412
Running setup.py bdist_wheel for PyDispatcher ... done
Stored in directory: /home/scrapyuser/.cache/pip/wheels/86/02/a1/5857c77600a28813aaf0f66d4e4568f50c9f133277a4122411
Running setup.py bdist_wheel for cryptography ... done
Stored in directory: /home/scrapyuser/.cache/pip/wheels/f6/6c/21/11ec069285a52d7fa8c735be5fc2edfb8b24012c0f78f93d20
Running setup.py bdist_wheel for zope.interface ... done
Stored in directory: /home/scrapyuser/.cache/pip/wheels/20/a2/bc/74fe87cee17134f5219ba01fe82dd8c10998377e0fb910bb22
Running setup.py bdist_wheel for cffi ... done
Stored in directory: /home/scrapyuser/.cache/pip/wheels/8f/00/29/553c1b1db38bbeec3fec428ae4e400cd8349ecd99fe86edea1
Running setup.py bdist_wheel for pycparser ... done
Stored in directory: /home/scrapyuser/.cache/pip/wheels/9b/f4/2e/d03e949a551719a1ffcb659f2c63d8444f4df12e994ce52112
Successfully built Twisted lxml PyDispatcher cryptography zope.interface cffi pycparser
Installing collected packages: queuelib, idna, pyasn1, six, setuptools, enum34, ipaddress, pycparser, cffi, cryptography, pyOpenSSL, zope.interface, Twisted, w3lib, pyasn1-modules, attrs, service-identity, cssselect, lxml, parsel, PyDispatcher, scrapy
Successfully installed PyDispatcher Twisted attrs cffi cryptography cssselect enum34 idna ipaddress lxml parsel pyOpenSSL pyasn1 pyasn1-modules pycparser queuelib scrapy service-identity setuptools-20.7.0 six w3lib zope.interface
You are using pip version 8.1.1, however version 8.1.2 is available.
You should consider upgrading via the 'pip install --upgrade pip' command.
scrapyuser@41bef38de45d:~$ scrapy version
bash: scrapy: command not found

The Scrapy installation docs do need updating. Really sorry about that.

The Ubuntu packages from http://archive.scrapy.org/ubuntu are not up to date (as of this writing, 2016-06-15), so if you want the latest (Py3-compatible) Scrapy, don't use them.

The page you linked, http://doc.scrapy.org/en/latest/intro/install.html#ubuntu-9-10-or-above, has an alternative setup using pip, with (quite a few) dependencies:

If you prefer to build the python dependencies locally instead of relying on system packages you’ll need to install their required non-python dependencies first:
sudo apt-get install python-dev python-pip libxml2-dev libxslt1-dev zlib1g-dev libffi-dev libssl-dev
You can install Scrapy with pip after that:
pip install Scrapy

Also check out https://stackoverflow.com/a/37677910/2572383

If you want both Python 2 and Python 3, I recommend installing all of the following:

apt-get install -y \
    python3 \
    python-dev \
    python3-dev
# for cryptography
apt-get install -y \
    build-essential \
    libssl-dev \
    libffi-dev
# for lxml
apt-get install -y \
    libxml2-dev \
    libxslt-dev
# install pip (if not already installed)
apt-get install -y python-pip

One more recommendation: install virtualenvwrapper so you can create a local Python 3 virtual environment:

$ mkvirtualenv --python=/usr/bin/python3 scrapy.py3
Already using interpreter /usr/bin/python3
Using base prefix '/usr'
New python executable in /home/paul/.virtualenvs/scrapy.py3/bin/python3
Also creating executable in /home/paul/.virtualenvs/scrapy.py3/bin/python
Installing setuptools, pkg_resources, pip, wheel...done.
virtualenvwrapper.user_scripts creating /home/paul/.virtualenvs/scrapy.py3/bin/predeactivate
virtualenvwrapper.user_scripts creating /home/paul/.virtualenvs/scrapy.py3/bin/postdeactivate
virtualenvwrapper.user_scripts creating /home/paul/.virtualenvs/scrapy.py3/bin/preactivate
virtualenvwrapper.user_scripts creating /home/paul/.virtualenvs/scrapy.py3/bin/postactivate
virtualenvwrapper.user_scripts creating /home/paul/.virtualenvs/scrapy.py3/bin/get_env_details

Then simply pip install scrapy inside the virtualenv:

(scrapy.py3) paul@paul-SATELLITE-R830:~/src/scrapy.org$ pip install --upgrade --no-cache-dir scrapy
Collecting scrapy
Downloading Scrapy-1.1.0-py2.py3-none-any.whl (294kB)
100% |████████████████████████████████| 296kB 1.7MB/s 
Collecting cssselect>=0.9 (from scrapy)
Downloading cssselect-0.9.1.tar.gz
Collecting queuelib (from scrapy)
Downloading queuelib-1.4.2-py2.py3-none-any.whl
Collecting parsel>=0.9.3 (from scrapy)
Downloading parsel-1.0.2-py2.py3-none-any.whl
Collecting Twisted>=10.0.0 (from scrapy)
Downloading Twisted-16.2.0.tar.bz2 (2.9MB)
100% |████████████████████████████████| 2.9MB 1.9MB/s 
Collecting lxml (from scrapy)
Downloading lxml-3.6.0.tar.gz (3.7MB)
100% |████████████████████████████████| 3.7MB 2.0MB/s 
Collecting PyDispatcher>=2.0.5 (from scrapy)
Downloading PyDispatcher-2.0.5.tar.gz
Collecting six>=1.5.2 (from scrapy)
Downloading six-1.10.0-py2.py3-none-any.whl
Collecting pyOpenSSL (from scrapy)
Downloading pyOpenSSL-16.0.0-py2.py3-none-any.whl (45kB)
100% |████████████████████████████████| 51kB 2.1MB/s 
Collecting service-identity (from scrapy)
Downloading service_identity-16.0.0-py2.py3-none-any.whl
Collecting w3lib>=1.14.2 (from scrapy)
Downloading w3lib-1.14.2-py2.py3-none-any.whl
Collecting zope.interface>=4.0.2 (from Twisted>=10.0.0->scrapy)
Downloading zope.interface-4.2.0.tar.gz (146kB)
100% |████████████████████████████████| 153kB 2.1MB/s 
Collecting cryptography>=1.3 (from pyOpenSSL->scrapy)
Downloading cryptography-1.4.tar.gz (399kB)
100% |████████████████████████████████| 409kB 2.0MB/s 
Collecting attrs (from service-identity->scrapy)
Downloading attrs-16.0.0-py2.py3-none-any.whl
Collecting pyasn1-modules (from service-identity->scrapy)
Downloading pyasn1_modules-0.0.8-py2.py3-none-any.whl
Collecting pyasn1 (from service-identity->scrapy)
Downloading pyasn1-0.1.9-py2.py3-none-any.whl
Requirement already up-to-date: setuptools in /home/paul/.virtualenvs/scrapy.py3/lib/python3.5/site-packages (from zope.interface>=4.0.2->Twisted>=10.0.0->scrapy)
Collecting idna>=2.0 (from cryptography>=1.3->pyOpenSSL->scrapy)
Downloading idna-2.1-py2.py3-none-any.whl (54kB)
100% |████████████████████████████████| 61kB 3.1MB/s 
Collecting cffi>=1.4.1 (from cryptography>=1.3->pyOpenSSL->scrapy)
Downloading cffi-1.6.0.tar.gz (397kB)
100% |████████████████████████████████| 399kB 2.1MB/s 
Collecting pycparser (from cffi>=1.4.1->cryptography>=1.3->pyOpenSSL->scrapy)
Downloading pycparser-2.14.tar.gz (223kB)
100% |████████████████████████████████| 225kB 1.9MB/s 
Installing collected packages: cssselect, queuelib, six, w3lib, lxml, parsel, zope.interface, Twisted, PyDispatcher, idna, pyasn1, pycparser, cffi, cryptography, pyOpenSSL, attrs, pyasn1-modules, service-identity, scrapy
Running setup.py install for cssselect ... done
Running setup.py install for lxml ... done
Running setup.py install for zope.interface ... done
Running setup.py install for Twisted ... done
Running setup.py install for PyDispatcher ... done
Running setup.py install for pycparser ... done
Running setup.py install for cffi ... done
Running setup.py install for cryptography ... done
Successfully installed PyDispatcher-2.0.5 Twisted-16.2.0 attrs-16.0.0 cffi-1.6.0 cryptography-1.4 cssselect-0.9.1 idna-2.1 lxml-3.6.0 parsel-1.0.2 pyOpenSSL-16.0.0 pyasn1-0.1.9 pyasn1-modules-0.0.8 pycparser-2.14 queuelib-1.4.2 scrapy-1.1.0 service-identity-16.0.0 six-1.10.0 w3lib-1.14.2 zope.interface-4.2.0
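If virtualenvwrapper isn't an option, the standard library's venv module (Python 3.3+) achieves the same isolation; a minimal sketch, assuming `python3` is on PATH and the environment lives under a hypothetical `~/venvs/scrapy.py3` (on Ubuntu 16.04 you may first need `sudo apt-get install python3-venv`):

```shell
# Create a Python 3 virtual environment with the stdlib venv module.
python3 -m venv ~/venvs/scrapy.py3

# Activate it, then install Scrapy inside the isolated environment.
. ~/venvs/scrapy.py3/bin/activate
pip install --upgrade scrapy

# Verify the entry point resolves to the venv's copy.
scrapy version
```

The activation step prepends the venv's `bin/` directory to `PATH`, so `pip` and `scrapy` refer to the environment's own executables rather than the system ones.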

I'm also using Ubuntu 16.04, and the following worked for me:

  1. Download and install Anaconda, a data-centric Python distribution available for free from Continuum Analytics. Currently (2017-07-09) both Python 3.6 and Python 2.7 variants are available.

  2. Decide which version of Scrapy your project needs and use conda, their package manager, to find out which version is available in the default Anaconda installation:
    conda search scrapy yields

    scrapy 1.3.3 py36_0 defaults

  3. If version 1.3.3 is acceptable to you, just install it: conda install scrapy

  4. According to the Scrapy website, the latest stable release is 1.4.0, and you may want to use that. You could run pip install scrapy to get it, but I recommend a different approach:

  5. Check Anaconda Cloud and you'll find that version 1.4.0 is available via conda-forge, a community-driven project in which many packages have been "forged" to make them installable with conda.

  6. You now have two options for installing from the conda-forge channel:
    • Install the latest version, explicitly stating the channel: conda install -c conda-forge scrapy=1.4.0
    • Add the conda-forge channel as a primary package source with conda config --add channels conda-forge, then install Scrapy with just conda install scrapy.
