Celery documentation: http://docs.jinkan.org/docs/celery/getting-started/first-steps-with-celery.html
$ pip install Celery
$ pip install celery-with-redis
# tasks.py
import time
from celery import Celery

celery = Celery('tasks', broker='redis://localhost:6379/0')

@celery.task
def sendmail(mail):
    print('sending mail to %s...' % mail['to'])
    time.sleep(2.0)
    print('mail sent.')
$ celery -A tasks worker --loglevel=info
What the command line above actually starts is the Worker. To run it in the background, you can hand it over to supervisor.
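A minimal supervisor program section for this could look like the sketch below; the file location, project directory, and log paths are assumptions you would adapt to your own deployment:

```ini
; /etc/supervisor/conf.d/celery.conf -- a minimal sketch; all paths are assumptions
[program:celery]
command=celery -A tasks worker --loglevel=info
directory=/path/to/project
autostart=true
autorestart=true
stdout_logfile=/var/log/celery/worker.log
stderr_logfile=/var/log/celery/worker.err.log
```

After adding the file, `supervisorctl update` picks up the new program and keeps the Worker running across crashes and reboots.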
>>> from tasks import sendmail
>>> sendmail.delay(dict(to='celery@python.org'))
<AsyncResult: 1a0a9262-7858-4192-9981-b7bf0ea7483b>
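Note that delay() returns immediately with an AsyncResult handle while the actual work happens in the Worker. The same fire-and-forget-with-a-handle pattern can be sketched with the standard library alone (no broker or Worker process; the sendmail function here is a local stand-in for the Celery task, not Celery itself):

```python
from concurrent.futures import ThreadPoolExecutor
import time

def sendmail(mail):
    # Local stand-in for the Celery task: pretend to send, then report.
    time.sleep(0.1)
    return 'mail sent to %s' % mail['to']

executor = ThreadPoolExecutor(max_workers=4)

# Like sendmail.delay(...): submit() returns a handle at once,
# and the work runs on a pool thread in the background.
future = executor.submit(sendmail, dict(to='celery@python.org'))

result = future.result()  # blocks until done, like AsyncResult.get()
print(result)
```

The difference in Celery is that the "pool" lives in a separate Worker process (or machine) and the call travels through the broker, so the producer never blocks on the work unless it explicitly asks for the result.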
As you can see, Celery's API design is really quite simple. Then, in the Worker, you can see the messages for the task being processed:
[2013-08-27 19:20:23,363: WARNING/MainProcess] celery@MichaeliMac.local ready.
[2013-08-27 19:20:23,367: INFO/MainProcess] consumer: Connected to redis://localhost:6379/0.
[2013-08-27 19:20:45,618: INFO/MainProcess] Got task from broker: tasks.sendmail[1a0a9262-7858-4192-9981-b7bf0ea7483b]
[2013-08-27 19:20:45,655: WARNING/PoolWorker-4] sending mail to celery@python.org...
[2013-08-27 19:20:47,657: WARNING/PoolWorker-4] mail sent.
[2013-08-27 19:20:47,658: INFO/MainProcess] Task tasks.sendmail[1a0a9262-7858-4192-9981-b7bf0ea7483b] succeeded in 2.00266814232s: None
Celery's default settings are enough for basic needs: the Worker starts in Pool mode with a default pool size equal to the number of CPU cores, and the default serialization mechanism is pickle, though it can be switched to json. Since invoking UNIX/Linux programs from Python is so easy, Celery is an excellent fit as an asynchronous task framework.
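Switching the serializer to json, for example, is a configuration change on the app object. A sketch under the assumption of a Celery 4+ install (older 3.x releases used uppercase setting names such as CELERY_TASK_SERIALIZER instead):

```python
from celery import Celery

celery = Celery('tasks', broker='redis://localhost:6379/0')

# Emit and accept JSON instead of the default pickle.
celery.conf.update(
    task_serializer='json',
    result_serializer='json',
    accept_content=['json'],
)
```

json is safer than pickle when the broker is reachable by untrusted parties, at the cost of only supporting JSON-representable task arguments.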