1. Set the log level in settings: add one line to settings.py:
Scrapy provides 5 logging levels:
CRITICAL - critical errors
ERROR - regular errors
WARNING - warning messages
INFO - informational messages
DEBUG - debugging messages
By default, Scrapy displays log messages at the DEBUG level.
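For example, a minimal sketch of the line you might add (the level 'WARNING' here is an assumed example value; any of the five levels above works):

# settings.py -- assumed example: suppress DEBUG and INFO output
LOG_LEVEL = 'WARNING'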
2. Save the output to a log file: add the file path in settings.py:
LOG_FILE = './log.log'
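With LOG_FILE set, Scrapy writes its log output to that file instead of standard error; a relative path like the one above is resolved against the directory from which you run the scrapy command.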
3. Show where a log message comes from, in pipelines.py:
import logging

logger = logging.getLogger(__name__)  # logger named after this module

class DcdAppPipeline:
    def process_item(self, item, spider):
        logger.warning(item)  # emit the item at WARNING level
        return item           # a pipeline must return the item for later stages
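Because the logger is created with __name__, each record is tagged with the module path (here, autospider.pipelines), which is what reveals where the message came from. Scrapy's default log format already prints that name, so you normally don't need to set it yourself; it is shown below only for reference (this is the documented default of the LOG_FORMAT setting):

# settings.py -- Scrapy's default; %(name)s prints the logger/module name
LOG_FORMAT = '%(asctime)s [%(name)s] %(levelname)s: %(message)s'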
4. Use logging in the spider file:
import scrapy

class DcdappSpider(scrapy.Spider):
    name = 'dcdapp'
    allowed_domains = ['m.dcdapp.com']
    custom_settings = {
        # enable the item pipeline for this spider only
        'ITEM_PIPELINES': {
            'autospider.pipelines.DcdAppPipeline': 300,
        },
        # per-spider log settings
        'LOG_LEVEL': 'DEBUG',
        'LOG_FILE': './././Log/dcdapp_log.log',
    }
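For completeness, a minimal sketch of how such a spider can emit log messages through the self.logger attribute that Scrapy attaches to every spider (the parse callback and start URL below are assumptions, not part of the original code):

import scrapy

class DcdappSpider(scrapy.Spider):
    name = 'dcdapp'
    allowed_domains = ['m.dcdapp.com']
    start_urls = ['https://m.dcdapp.com/']  # hypothetical start URL

    def parse(self, response):
        # self.logger is named after the spider ('dcdapp'),
        # so these lines are easy to find in the log file
        self.logger.info('parsed %s (status %d)', response.url, response.status)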