Use case: sending an email notification when the spider closes or goes idle.

This works through twisted's non-blocking IO, so the sender can live directly in the spider, in a middleware, or in an extension, depending on your needs.

Most of the tutorials I found online are either years out of date or copied straight from the official docs, with no actual working code, so I tried it myself. I'm also new to scraping, so go easy on me!

Before reading the example code below, skim the official docs first to get familiar with the basic settings.
Official docs on sending e-mail:
<https://docs.scrapy.org/en/latest/topics/email.html?highlight=MailSender>
First, create an extendions (extensions) folder in the same directory as settings.py. The code is as follows:
```python
import logging

from scrapy import signals
from scrapy.exceptions import NotConfigured
from scrapy.mail import MailSender

logger = logging.getLogger(__name__)


class SendEmail(object):
    def __init__(self, sender, crawler):
        self.sender = sender
        crawler.signals.connect(self.spider_idle, signal=signals.spider_idle)
        crawler.signals.connect(self.spider_closed, signal=signals.spider_closed)

    @classmethod
    def from_crawler(cls, crawler):
        if not crawler.settings.getbool('MYEXT_ENABLED'):
            raise NotConfigured
        mail_host = crawler.settings.get('MAIL_HOST')  # SMTP server used to send the mail
        mail_port = crawler.settings.get('MAIL_PORT')  # SMTP port
        mail_user = crawler.settings.get('MAIL_USER')  # sender account
        mail_pass = crawler.settings.get('MAIL_PASS')  # the authorization code, NOT your login password -- remember this!
        # The sender address and the account are the same here, so mail_user is passed twice
        sender = MailSender(mail_host, mail_user, mail_user, mail_pass, mail_port)
        h = cls(sender, crawler)
        return h

    def spider_idle(self, spider):
        logger.info('idle spider %s' % spider.name)

    def spider_closed(self, spider):
        logger.info("closed spider %s", spider.name)
        body = 'spider[%s] is closed' % spider.name
        subject = '[%s] good!!!' % spider.name
        # self.sender.send(to={'zfeijun@foxmail.com'}, subject=subject, body=body)
        return self.sender.send(to={'zfeijun@foxmail.com'}, subject=subject, body=body)
```
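The extension reads its SMTP configuration from the project settings. A minimal fragment for settings.py might look like this (the host, account, and authorization code below are placeholders, not real values — substitute your own provider's details):

```python
# settings.py -- SMTP settings read by the SendEmail extension above
MAIL_HOST = 'smtp.example.com'    # placeholder: your provider's SMTP server
MAIL_PORT = 25                    # placeholder: some providers use 465/587 for SSL/TLS
MAIL_USER = 'you@example.com'     # placeholder: sender account
MAIL_PASS = 'authorization-code'  # the authorization code, not the login password
```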
Why `return self.sender.send(...)` here? Because calling `sender.send(...)` directly without the `return` raises `builtins.AttributeError: 'NoneType' object has no attribute 'bio_read'` (the mail is still sent successfully). I don't fully understand the cause; if any expert knows, please enlighten me. The workaround comes from:

<https://github.com/scrapy/scrapy/issues/3478>

Adding `return` before `sender.send` is all it takes.
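Some context on why the `return` helps: `MailSender.send()` does not block; it returns a twisted `Deferred`, and Scrapy waits on a `Deferred` returned from a `spider_closed` handler before tearing down the reactor. If nothing is returned, shutdown can race the in-flight SMTP connection. The underlying "hand the pending result back so the caller can wait on it" pattern can be sketched with the standard library (a hedged analogy using `concurrent.futures`, not Scrapy's actual API):

```python
from concurrent.futures import ThreadPoolExecutor
import time

executor = ThreadPoolExecutor(max_workers=1)

def send_mail_stub():
    # stand-in for the SMTP round-trip that MailSender performs in the background
    time.sleep(0.1)
    return 'sent'

def spider_closed_handler():
    # returning the pending future lets the framework wait for it,
    # just like returning the Deferred from spider_closed does in Scrapy
    return executor.submit(send_mail_stub)

def shutdown():
    future = spider_closed_handler()
    result = future.result()  # the framework blocks here before tearing down
    executor.shutdown()
    return result
```

Dropping the `return` in `spider_closed_handler` would be the analogue of the bug: the caller has no handle to wait on, and teardown proceeds while the send is still in flight.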
After the extension code is written, it must be enabled in settings.py:
```python
EXTENSIONS = {
    # 'scrapy.extensions.telnet.TelnetConsole': 300,
    'bukalapak.extendions.sendmail.SendEmail': 300,
}

MYEXT_ENABLED = True
```
Please credit the source when reposting!