When developing web applications with Python, you often end up needing scripts that run on a schedule. I used to handle this with the crontab that ships with Linux, but it never felt like a good fit; later I found that celery + django makes this easy to do.
The installed software environment is as follows:
python 2.7.5
Django==1.8.2
celery==3.1.18
celery-with-redis==3.0
django-celery==3.1.16
MySQL-python==1.2.3
supervisor==3.1.3
Install all of the software above with pip; it is assumed that the Redis and MySQL servers are already installed on the system.
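For reference, the Python packages above can be installed in one go with something like:

pip install Django==1.8.2 celery==3.1.18 celery-with-redis==3.0 django-celery==3.1.16 MySQL-python==1.2.3 supervisor==3.1.3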
1. First, create the project:
django-admin.py startproject picha
Then create an app named demo:
django-admin.py startapp demo
The project directory structure is as follows:
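With the default Django 1.8 layout it should look roughly like this (demo/tasks.py and picha/celery.py are files we add in the steps below):

picha/
├── manage.py
├── picha/
│   ├── __init__.py
│   ├── settings.py
│   ├── celery.py
│   ├── urls.py
│   └── wsgi.py
└── demo/
    ├── __init__.py
    ├── admin.py
    ├── migrations/
    ├── models.py
    ├── tasks.py
    ├── tests.py
    └── views.py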
2. Next, add the Celery-related configuration to the settings file:
# CELERY STUFF
import djcelery
djcelery.setup_loader()
BROKER_URL = 'redis://localhost:6379'
CELERYBEAT_SCHEDULER = 'djcelery.schedulers.DatabaseScheduler'  # periodic tasks
# CELERY_RESULT_BACKEND = 'djcelery.backends.database:DatabaseBackend'
CELERY_RESULT_BACKEND = 'redis://localhost:6379'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TIMEZONE = 'Asia/Shanghai'
INSTALLED_APPS = (
    'django.contrib.admin',
    'django.contrib.auth',
    'django.contrib.contenttypes',
    'django.contrib.sessions',
    'django.contrib.messages',
    'django.contrib.staticfiles',
    'demo',
    'djcelery',
)
Edit the __init__.py file in the same directory as settings.py:
#! /usr/bin/env python
# coding: utf-8
from __future__ import absolute_import

# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app as celery_app
Then change the time zone:
TIME_ZONE = 'Asia/Shanghai'
If the time zone is wrong, the scheduled tasks will not run at the expected times!
In addition, we need to create a celery.py file (next to settings.py); it will automatically discover the tasks under our apps.
#! /usr/bin/env python
# coding: utf-8
from __future__ import absolute_import

import os

from celery import Celery
from django.conf import settings

# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'picha.settings')

app = Celery('picha')

# Using a string here means the worker will not have to
# pickle the object when using Windows.
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)


@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
Now let's create some test tasks under the demo app (in demo/tasks.py):
from __future__ import absolute_import
from celery import shared_task, task


@shared_task()
def add(x, y):
    # return x + y
    print x + y


@shared_task()
def mul(x, y):
    print "%d * %d = %d" % (x, y, x * y)
    return x * y


@shared_task()
def sub(x, y):
    print "%d - %d = %d" % (x, y, x - y)
    return x - y


@task(ignore_result=True, max_retries=1, default_retry_delay=10)
def just_print():
    print "Print from celery task"
At this point, the Django and Celery parts are fully set up!
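Before wiring everything into Supervisor, a quick sanity check is possible (this assumes Redis is already running). Start a worker in one terminal, then call a task from the Django shell in another; the worker log should print 14:

python manage.py celery worker --loglevel=INFO

python manage.py shell
>>> from demo.tasks import add
>>> add.delay(9, 5)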
3. Now let's configure Supervisor, which is used to start the Celery processes:
1) Generate the initial Supervisor configuration file:
echo_supervisord_conf > /etc/supervisord.conf
2) Then append the following configuration to the end of supervisord.conf:
[program:djangoproject.celeryd]
command=/usr/local/pyenv/shims/python /usr/local/coding/pythoner/picha/manage.py celeryd --concurrency=1
user=root
numprocs=1
directory=/usr/local/coding/pythoner/picha
stdout_logfile=/var/log/celery_worker.log
stderr_logfile=/var/log/celery_worker.log
autostart=true
autorestart=true
startsecs=10
stopwaitsecs=120
priority=998

[program:djangoproject.celerybeat]
command=/usr/local/pyenv/shims/python /usr/local/coding/pythoner/picha/manage.py celery beat --schedule=/tmp/celerybeat-schedule --pidfile=/tmp/django_celerybeat.pid --loglevel=INFO
user=root
numprocs=1
directory=/usr/local/coding/pythoner/picha
stdout_logfile=/var/log/celery_beat.log
stderr_logfile=/var/log/celery_beat.log
autostart=true
autorestart=true
startsecs=10
stopwaitsecs=120
priority=998

[program:djangoproject.celerycam]
command=/usr/local/pyenv/shims/python /usr/local/coding/pythoner/picha/manage.py celerycam --frequency=10.0
user=root
numprocs=1
directory=/usr/local/coding/pythoner/picha
stdout_logfile=/var/log/celerycam.log
stderr_logfile=/var/log/celerycam.log
autostart=true
autorestart=true
startsecs=10
stopwaitsecs=120
priority=998
4. Now we need to sync the Celery-related tables into MySQL, using the command:
python manage.py syncdb
Then create a superuser:
python manage.py createsuperuser
Start Supervisor:
supervisord -c /etc/supervisord.conf
To check whether the services started successfully, run supervisorctl status:
djangoproject.celerybeat RUNNING pid 3061, uptime 1:03:27
djangoproject.celerycam RUNNING pid 3063, uptime 1:03:27
djangoproject.celeryd RUNNING pid 3062, uptime 1:03:27
Next, let's go into the Django admin backend.
First, start Django:
python manage.py runserver 0.0.0.0:8008
Once in the admin backend, click on "Periodic tasks":
You can see that the functions written in tasks.py all appear in the drop-down menu; we just need to pick the corresponding schedule.
Now let's choose the schedule for the periodic task:
We create a periodic task that prints a value every 10 seconds, and check the output in the log file:
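Clicking through the admin is equivalent to creating records in the django-celery tables, so the same 10-second schedule could also be created from code. This is only a rough sketch, assuming the task was registered as demo.tasks.just_print and using an arbitrary display name:

from djcelery.models import IntervalSchedule, PeriodicTask

# run every 10 seconds
schedule, _ = IntervalSchedule.objects.get_or_create(every=10, period='seconds')

# the task field must match the registered task name (app.module.function);
# the name field is just a label shown in the admin
PeriodicTask.objects.get_or_create(
    name='just print every 10s',
    defaults={'task': 'demo.tasks.just_print', 'interval': schedule},
)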
Let's check the log file:
Next we set up an addition task that runs every 15 seconds; moreover, the arguments it receives can be changed dynamically from the web admin backend.
The first time, we pass in the arguments 9 and 5, so the result should be 14. Here are the settings and the log:
Checking the log again:
Then we change the arguments to 10 and 7 in the web backend; without restarting the service, the computed result changes to 17!
As we can see, the result has changed dynamically.
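This works because the arguments are stored as a JSON string in the django-celery PeriodicTask table, and the DatabaseScheduler picks up changes without a restart. As a rough sketch (the name used here is hypothetical and should match whatever was entered in the admin form), the same edit could also be made from code:

from djcelery.models import PeriodicTask

pt = PeriodicTask.objects.get(name='add every 15s')  # hypothetical task name
pt.args = '[10, 7]'  # args are stored as a JSON-encoded list
pt.save()            # the DatabaseScheduler notices the change and uses the new arguments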
If we have started the celerycam program from the Supervisor config (/usr/local/coding/pythoner/picha/manage.py celerycam --frequency=10.0),
we can check in the admin backend whether the workers are online:
PS: During this setup, the results of the periodic tasks could only be seen in the log files; I don't know how to display them in the admin backend. If anyone knows how, please tell me, thanks!