The most detailed guide on the web: solving the Scrapy warning ":0: UserWarning: You do not have a working installation of the service_identity module: 'cannot import name 'opentype''. Please install it from ..." (illustrated walkthrough)

    No long preamble; straight to the useful part!

  But when I ran the spider, it errored out as follows:

D:\Code\PycharmProfessionalCode\study\python_spider\30HoursGetWebCrawlerByPython>cd shop

D:\Code\PycharmProfessionalCode\study\python_spider\30HoursGetWebCrawlerByPython\shop>scrapy crawl tb
:0: UserWarning: You do not have a working installation of the service_identity module: 'cannot import name 'opentype''.  Please install it from <https://pypi.python.org/pypi/service_identity> and make sure all of its dependencies are satisfied.  Without the service_identity module, Twisted can perform only rudimentary TLS client hostname verification.  Many valid certificate/hostname mappings may be rejected.
2018-01-19 21:18:52 [scrapy.utils.log] INFO: Scrapy 1.5.0 started (bot: shop)
2018-01-19 21:18:52 [scrapy.utils.log] INFO: Versions: lxml 4.1.1.0, libxml2 2.9.7, cssselect 1.0.3, parsel 1.3.1, w3lib 1.18.0, Twisted 17.9.0, Python 3.5.2 |Anaconda custom (64-bit)| (default, Jul  5 2016, 11:41:13) [MSC v.1900 64 bit (AMD64)], pyOpenSSL 16.2.0 (OpenSSL 1.0.2j  26 Sep 2016), cryptography 1.5, Platform Windows-10-10.0.16299-SP0
2018-01-19 21:18:52 [scrapy.crawler] INFO: Overridden settings: {'NEWSPIDER_MODULE': 'shop.spiders', 'SPIDER_MODULES': ['shop.spiders'], 'ROBOTSTXT_OBEY': True, 'BOT_NAME': 'shop'}
2018-01-19 21:18:52 [scrapy.middleware] INFO: Enabled extensions:
['scrapy.extensions.logstats.LogStats',
 'scrapy.extensions.corestats.CoreStats',
 'scrapy.extensions.telnet.TelnetConsole']
2018-01-19 21:18:52 [scrapy.middleware] INFO: Enabled downloader middlewares:
['scrapy.downloadermiddlewares.robotstxt.RobotsTxtMiddleware',
 'scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware',
 'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware',
 'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware',
 'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware',
 'scrapy.downloadermiddlewares.retry.RetryMiddleware',
 'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware',
 'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware',
 'scrapy.downloadermiddlewares.redirect.RedirectMiddleware',
 'scrapy.downloadermiddlewares.cookies.CookiesMiddleware',
 'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware',
 'scrapy.downloadermiddlewares.stats.DownloaderStats']
2018-01-19 21:18:52 [scrapy.middleware] INFO: Enabled spider middlewares:
['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware',
 'scrapy.spidermiddlewares.offsite.OffsiteMiddleware',
 'scrapy.spidermiddlewares.referer.RefererMiddleware',
 'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware',
 'scrapy.spidermiddlewares.depth.DepthMiddleware']
2018-01-19 21:18:52 [scrapy.middleware] INFO: Enabled item pipelines:
[]
2018-01-19 21:18:52 [scrapy.core.engine] INFO: Spider opened
2018-01-19 21:18:52 [scrapy.extensions.logstats] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2018-01-19 21:18:52 [scrapy.extensions.telnet] DEBUG: Telnet console listening on 127.0.0.1:6023
2018-01-19 21:18:53 [scrapy.core.downloader.tls] WARNING: Remote certificate is not valid for hostname "www.taobao.com"; '*.tmall.com'!='www.taobao.com'
2018-01-19 21:18:53 [scrapy.core.engine] DEBUG: Crawled (200) <GET https://www.taobao.com/robots.txt> (referer: None)
2018-01-19 21:18:53 [scrapy.downloadermiddlewares.robotstxt] DEBUG: Forbidden by robots.txt: <GET https://www.taobao.com/>
2018-01-19 21:18:53 [scrapy.core.engine] INFO: Closing spider (finished)
2018-01-19 21:18:53 [scrapy.statscollectors] INFO: Dumping Scrapy stats:
{'downloader/exception_count': 1,
 'downloader/exception_type_count/scrapy.exceptions.IgnoreRequest': 1,
 'downloader/request_bytes': 223,
 'downloader/request_count': 1,
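The UserWarning at the top of this log is emitted by Twisted at import time, when it tries to load service_identity and the import fails. Roughly (this is a sketch, not Twisted's actual code), the check behaves like this:

```python
import warnings


def check_service_identity():
    """Roughly mimic Twisted's startup check that produced the warning
    above: return None on success, or the warning text on failure."""
    try:
        # pyasn1.type.opentype is the piece the error message complains
        # about; it only exists in pyasn1 0.3.x and later.
        from pyasn1.type import opentype  # noqa: F401
        import service_identity  # noqa: F401
    except ImportError as exc:
        return ("You do not have a working installation of the "
                "service_identity module: {!r}".format(str(exc)))
    return None


msg = check_service_identity()
print("service_identity OK" if msg is None else msg)
```

If this prints the warning text on your machine, you have reproduced the problem outside of Scrapy entirely.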

  Following the warning's hint, I went to download and install service_identity from https://pypi.python.org/pypi/service_identity#downloads (the .whl file):

PS C:\Anaconda3\Lib\site-packages> pip install service_identity-17.0.0-py2.py3-none-any.whl
Requirement already satisfied: service-identity==17.0.0 from file:///C:/Anaconda3/Lib/site-packages/service_identity-17.0.0-py2.py3-none-any.whl in c:\anaconda3\lib\site-packages
Requirement already satisfied: pyopenssl>=0.12 in c:\anaconda3\lib\site-packages (from service-identity==17.0.0)
Requirement already satisfied: pyasn1-modules in c:\anaconda3\lib\site-packages (from service-identity==17.0.0)
Requirement already satisfied: attrs in c:\anaconda3\lib\site-packages (from service-identity==17.0.0)
Requirement already satisfied: pyasn1 in c:\anaconda3\lib\site-packages (from service-identity==17.0.0)
Requirement already satisfied: cryptography>=1.3.4 in c:\anaconda3\lib\site-packages (from pyopenssl>=0.12->service-identity==17.0.0)
Requirement already satisfied: six>=1.5.2 in c:\anaconda3\lib\site-packages (from pyopenssl>=0.12->service-identity==17.0.0)
Requirement already satisfied: idna>=2.0 in c:\anaconda3\lib\site-packages (from cryptography>=1.3.4->pyopenssl>=0.12->service-identity==17.0.0)
Requirement already satisfied: setuptools>=11.3 in c:\anaconda3\lib\site-packages\setuptools-27.2.0-py3.5.egg (from cryptography>=1.3.4->pyopenssl>=0.12->service-identity==17.0.0)
Requirement already satisfied: cffi>=1.4.1 in c:\anaconda3\lib\site-packages (from cryptography>=1.3.4->pyopenssl>=0.12->service-identity==17.0.0)
Requirement already satisfied: pycparser in c:\anaconda3\lib\site-packages (from cffi>=1.4.1->cryptography>=1.3.4->pyopenssl>=0.12->service-identity==17.0.0)
PS C:\Anaconda3\Lib\site-packages>
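pip reports every requirement as "already satisfied", which hides the versions actually installed. A quick stdlib-only way to list them (assumes Python 3.8+ for `importlib.metadata`; on the Python 3.5 used in this post, `pip show service-identity pyasn1 pyasn1-modules` gives the same information):

```python
from importlib import metadata

# Print the installed version of each package the warning involves.
for pkg in ("service-identity", "pyasn1", "pyasn1-modules"):
    try:
        print(pkg, metadata.version(pkg))
    except metadata.PackageNotFoundError:
        print(pkg, "is not installed")
```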

Microsoft Windows [Version 10.0.16299.98]
(c) 2017 Microsoft Corporation. All rights reserved.

C:\Users\lenovo>scrapy version
:0: UserWarning: You do not have a working installation of the service_identity module: 'cannot import name 'opentype''.  Please install it from <https://pypi.python.org/pypi/service_identity> and make sure all of its dependencies are satisfied.  Without the service_identity module, Twisted can perform only rudimentary TLS client hostname verification.  Many valid certificate/hostname mappings may be rejected.
Scrapy 1.5.0

C:\Users\lenovo>

    As you can see, something was still not quite right with the Scrapy installation.

  At this point Scrapy itself is actually installed and usable; only the functionality that depends on the service_identity module is affected. And the module is in fact already installed, so why does the warning persist? It cost me two hours of posting and searching before I finally found the answer.
  The cause: for whatever reason, the copy of service_identity (and its pyasn1 dependencies) on this machine is too old, and a plain `pip install` will not upgrade it to the latest version, because pip already considers the requirement satisfied.
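Concretely, the stale piece here is typically pyasn1/pyasn1-modules: the `opentype` submodule that service_identity needs only exists from pyasn1 0.3.x onward. A small stdlib-only sketch to confirm whether it can even be located by the current interpreter:

```python
import importlib.util


def module_available(name):
    """True if the (possibly dotted) module name can be located for import."""
    try:
        return importlib.util.find_spec(name) is not None
    except ModuleNotFoundError:
        # A parent package in the dotted path is missing entirely.
        return False


for mod in ("service_identity", "pyasn1.type.opentype"):
    status = "found" if module_available(mod) else "missing or too old"
    print(mod, "->", status)
```

If `pyasn1.type.opentype` comes back missing while `pyasn1` itself imports fine, the installed pyasn1 is simply too old, which matches the diagnosis above.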

    Then, run the following:

Microsoft Windows [Version 10.0.16299.98]
(c) 2017 Microsoft Corporation. All rights reserved.

C:\Users\lenovo>scrapy version
:0: UserWarning: You do not have a working installation of the service_identity module: 'cannot import name 'opentype''.  Please install it from <https://pypi.python.org/pypi/service_identity> and make sure all of its dependencies are satisfied.  Without the service_identity module, Twisted can perform only rudimentary TLS client hostname verification.  Many valid certificate/hostname mappings may be rejected.
Scrapy 1.5.0

C:\Users\lenovo>pip install service_identity
Requirement already satisfied: service_identity in c:\anaconda3\lib\site-packages
Requirement already satisfied: pyasn1-modules in c:\anaconda3\lib\site-packages (from service_identity)
Requirement already satisfied: attrs in c:\anaconda3\lib\site-packages (from service_identity)
Requirement already satisfied: pyopenssl>=0.12 in c:\anaconda3\lib\site-packages (from service_identity)
Requirement already satisfied: pyasn1 in c:\anaconda3\lib\site-packages (from service_identity)
Requirement already satisfied: cryptography>=1.3.4 in c:\anaconda3\lib\site-packages (from pyopenssl>=0.12->service_identity)
Requirement already satisfied: six>=1.5.2 in c:\anaconda3\lib\site-packages (from pyopenssl>=0.12->service_identity)
Requirement already satisfied: idna>=2.0 in c:\anaconda3\lib\site-packages (from cryptography>=1.3.4->pyopenssl>=0.12->service_identity)
Requirement already satisfied: setuptools>=11.3 in c:\anaconda3\lib\site-packages\setuptools-27.2.0-py3.5.egg (from cryptography>=1.3.4->pyopenssl>=0.12->service_identity)
Requirement already satisfied: cffi>=1.4.1 in c:\anaconda3\lib\site-packages (from cryptography>=1.3.4->pyopenssl>=0.12->service_identity)
Requirement already satisfied: pycparser in c:\anaconda3\lib\site-packages (from cffi>=1.4.1->cryptography>=1.3.4->pyopenssl>=0.12->service_identity)

C:\Users\lenovo>pip3 install service_identity --force --upgrade
Collecting service_identity
  Using cached service_identity-17.0.0-py2.py3-none-any.whl
Collecting attrs (from service_identity)
  Using cached attrs-17.4.0-py2.py3-none-any.whl
Collecting pyasn1-modules (from service_identity)
  Using cached pyasn1_modules-0.2.1-py2.py3-none-any.whl
Collecting pyasn1 (from service_identity)
  Downloading pyasn1-0.4.2-py2.py3-none-any.whl (71kB)
    100% |████████████████████████████████| 71kB 8.3kB/s
Collecting pyopenssl>=0.12 (from service_identity)
  Downloading pyOpenSSL-17.5.0-py2.py3-none-any.whl (53kB)
    100% |████████████████████████████████| 61kB 9.0kB/s
Collecting six>=1.5.2 (from pyopenssl>=0.12->service_identity)
  Cache entry deserialization failed, entry ignored
  Cache entry deserialization failed, entry ignored
  Downloading six-1.11.0-py2.py3-none-any.whl
Collecting cryptography>=2.1.4 (from pyopenssl>=0.12->service_identity)
  Downloading cryptography-2.1.4-cp35-cp35m-win_amd64.whl (1.3MB)
    100% |████████████████████████████████| 1.3MB 9.5kB/s
Collecting idna>=2.1 (from cryptography>=2.1.4->pyopenssl>=0.12->service_identity)
  Downloading idna-2.6-py2.py3-none-any.whl (56kB)
    100% |████████████████████████████████| 61kB 15kB/s
Collecting asn1crypto>=0.21.0 (from cryptography>=2.1.4->pyopenssl>=0.12->service_identity)
  Downloading asn1crypto-0.24.0-py2.py3-none-any.whl (101kB)
    100% |████████████████████████████████| 102kB 10kB/s
Collecting cffi>=1.7; platform_python_implementation != "PyPy" (from cryptography>=2.1.4->pyopenssl>=0.12->service_identity)
  Downloading cffi-1.11.4-cp35-cp35m-win_amd64.whl (166kB)
    100% |████████████████████████████████| 174kB 9.2kB/s
Collecting pycparser (from cffi>=1.7; platform_python_implementation != "PyPy"->cryptography>=2.1.4->pyopenssl>=0.12->service_identity)
  Downloading pycparser-2.18.tar.gz (245kB)
    100% |████████████████████████████████| 256kB 8.2kB/s


Original article: https://www.cnblogs.com/zlslch/p/8318942.html
