[Scrapy Notes] scrapyd configuration file

scrapyd configuration file: if no configuration file is present, scrapyd falls back to its built-in defaults, for example running at most 4 Scrapy processes per CPU. Environment: CentOS 6.5 64-bit, Scrapy 1.3.3, scrapyd 1.1.1.

If a configuration file is provided, scrapyd searches for it in the following locations:

• /etc/scrapyd/scrapyd.conf (Unix)
• c:\scrapyd\scrapyd.conf (Windows)
• /etc/scrapyd/conf.d/* (in alphabetical order, Unix)
• scrapyd.conf (current working directory)
• ~/.scrapyd.conf (users home directory)
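The search order above behaves like Python's standard `configparser` reading several candidate files, where missing files are skipped and later files override earlier ones. A minimal sketch (the `6800` fallback mirrors scrapyd's default port; only the Unix paths are listed):

```python
from configparser import ConfigParser

# Candidate paths from the search order above; ConfigParser.read()
# silently skips files that do not exist, and values read from later
# files override those from earlier ones.
paths = [
    "/etc/scrapyd/scrapyd.conf",
    "scrapyd.conf",  # current working directory
    # "~/.scrapyd.conf" would need os.path.expanduser() first
]

cp = ConfigParser()
loaded = cp.read(paths)  # returns the list of files actually parsed

# Fall back to the built-in default when no file defines the key.
port = cp.getint("scrapyd", "http_port", fallback=6800)
```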

My configuration file lives at /etc/scrapyd/scrapyd.conf:

[scrapyd]
eggs_dir = /usr/scrapyd/eggs
logs_dir = /usr/scrapyd/logs
# number of finished-job logs to keep per spider
jobs_to_keep = 100
dbs_dir = /usr/scrapyd/dbs
# 0 = no fixed limit; scrapyd uses CPU count * max_proc_per_cpu instead
max_proc = 0
max_proc_per_cpu = 800
# finished processes kept for the web UI / listjobs.json
finished_to_keep = 100
# seconds between polls of the spider queues
poll_interval = 5.0
bind_address = 192.168.17.30
http_port = 6800
debug = off
runner = scrapyd.runner
application = scrapyd.app.application
launcher = scrapyd.launcher.Launcher
webroot = scrapyd.website.Root

[services]
schedule.json     = scrapyd.webservice.Schedule
cancel.json       = scrapyd.webservice.Cancel
addversion.json   = scrapyd.webservice.AddVersion
listprojects.json = scrapyd.webservice.ListProjects
listversions.json = scrapyd.webservice.ListVersions
listspiders.json  = scrapyd.webservice.ListSpiders
delproject.json   = scrapyd.webservice.DeleteProject
delversion.json   = scrapyd.webservice.DeleteVersion
listjobs.json     = scrapyd.webservice.ListJobs
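Each entry in [services] maps a URL path to its handler class, so with the config above the endpoints are served at http://192.168.17.30:6800/. A minimal sketch of calling schedule.json with only the standard library (the project and spider names here are placeholders):

```python
from urllib.parse import urlencode

# bind_address and http_port taken from the [scrapyd] section above.
SCRAPYD = "http://192.168.17.30:6800"

def schedule_request(project, spider, **spider_args):
    """Build the POST target URL and form-encoded body for schedule.json."""
    params = {"project": project, "spider": spider, **spider_args}
    return f"{SCRAPYD}/schedule.json", urlencode(params)

url, body = schedule_request("myproject", "myspider")
# To actually fire the request against a running scrapyd:
#   import urllib.request
#   resp = urllib.request.urlopen(url, data=body.encode())
#   print(resp.read())  # JSON with "status" and the job id
```

The same pattern works for the other endpoints; cancel.json, for example, takes `project` and `job` as its form parameters.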

Note that if the web interface is left open with no activity for a long time, the backend logs a "Timing out.." message.
