The moment I found the problem, I cursed my pig-brained self N times!!!
Problem: the crawler script had been working perfectly, then I got pulled away by other things for a while. When I came back and ran it for real, it kept failing with [twisted] CRITICAL: Unhandled error in Deferred. The error looked like the following (it is the same error everyone gets, so I pasted it straight from the web):
2016-03-13 08:50:50 [scrapy] INFO: Enabled extensions: CloseSpider, TelnetConsole, LogStats, CoreStats, SpiderState
Unhandled error in Deferred:
2016-03-13 08:50:50 [twisted] CRITICAL: Unhandled error in Deferred:
Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/scrapy/cmdline.py", line 150, in _run_command
    cmd.run(args, opts)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/commands/crawl.py", line 57, in run
    self.crawler_process.crawl(spname, **opts.spargs)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/crawler.py", line 153, in crawl
    d = crawler.crawl(*args, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/twisted/internet/defer.py", line 1274, in unwindGenerator
    return _inlineCallbacks(None, gen, Deferred())
--- <exception caught here> ---
  File "/usr/local/lib/python2.7/dist-packages/twisted/internet/defer.py", line 1128, in _inlineCallbacks
    result = g.send(result)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/crawler.py", line 70, in crawl
    self.spider = self._create_spider(*args, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/crawler.py", line 80, in _create_spider
    return self.spidercls.from_crawler(self, *args, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/spiders/crawl.py", line 91, in from_crawler
    spider = super(CrawlSpider, cls).from_crawler(crawler, *args, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/spiders/__init__.py", line 50, in from_crawler
    spider = cls(*args, **kwargs)
exceptions.TypeError: __init__() takes at least 3 arguments (1 given)
2016-03-13 08:50:50 [twisted] CRITICAL:
Troubleshooting steps:
1. Some posts online suggest the Twisted version may be too high. My Scrapy version is 1.3.3 and my Twisted version is 13.1.0, which is already the minimum Twisted version that Scrapy 1.3.3 requires, yet it still didn't work. (A quick way to confirm the installed versions is sketched right below.)
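As a side note, here is a minimal snippet for printing the Scrapy and Twisted versions actually installed in the environment, so you can compare them against what your Scrapy release requires (assuming both packages import cleanly):

# print the versions of Scrapy and Twisted installed in this environment
import scrapy
import twisted

print("Scrapy:", scrapy.__version__)
print("Twisted:", twisted.__version__)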
2. I also wondered whether the pip index was the problem, so I switched to another index, but that didn't work either. (That said, when I installed the cryptography library earlier, the problem really was the PyPI index, and switching to the unofficial package source fixed it. A rough example of pointing pip at a different index follows the links below.)
Official package index: https://pypi.org/
Unofficial package download page: https://www.lfd.uci.edu/~gohlke/pythonlibs/#numpy
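For reference, telling pip to install from a specific index looks roughly like this; the URL below is the official PyPI index, so substitute whatever mirror you actually want to try, and the .whl path is only a placeholder for a file downloaded from the unofficial page above:

pip install scrapy --index-url https://pypi.org/simple/
pip install path\to\downloaded_package.whl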
3. Run the post-install script from the Scripts directory of the Python installation:
python C:\Python36\Scripts\pywin32_postinstall.py -install
Still no luck!!!
The fix:
I was about to lose it. Then I saw how someone on Stack Overflow (https://stackoverflow.com/questions/35970518/scrapy-twisted-critical-unhandled-error-in-deferred) worked through this problem, had a flash of insight, compared my own code and config files, and discovered that during my last file cleanup I had clumsily deleted the scrapy.cfg config file!!!
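In case it helps anyone in the same spot: scrapy.cfg sits at the project root and is what the scrapy command uses to locate the project's settings module, so deleting it breaks crawling in confusing ways. A freshly generated project contains roughly the following (myproject is a placeholder for your actual project name):

# scrapy.cfg -- created at the project root by `scrapy startproject`
[settings]
default = myproject.settings

[deploy]
project = myproject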
So when a bizarre error shows up, it is still worth carefully checking your configuration files.