RQ (Redis Queue) is a simple Python library for queueing jobs and processing them in the background with workers. It is backed by Redis and is designed to have a low barrier to entry. It can be integrated into your web stack easily.
RQ requires Redis >= 2.6.0.
## Getting started
First, run a Redis server, of course:
```
$ redis-server
```
To put jobs on queues, you don't have to do anything special, just define your typically lengthy or blocking function:
```python
import requests

def count_words_at_url(url):
    """Just an example function that's called async."""
    resp = requests.get(url)
    return len(resp.text.split())
```
You do use the excellent requests package, don't you?
Then, create an RQ queue:
```python
from rq import Queue, use_connection

use_connection()
q = Queue()
```
And enqueue the function call:
```python
from my_module import count_words_at_url

result = q.enqueue(count_words_at_url, 'http://nvie.com')
```
For a more complete example, refer to the docs. But this is the essence.
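If you prefer to bind the queue to an explicit Redis connection and inspect the job's result later, the following minimal sketch shows one way to do it. The localhost Redis connection, the two-second wait, and the `job` variable name are assumptions for illustration:

```python
import time

from redis import Redis
from rq import Queue

from my_module import count_words_at_url

# Bind the queue to an explicit Redis connection (assumed to be localhost:6379).
q = Queue(connection=Redis())

# enqueue() returns a Job; its result is None until a worker has processed it.
job = q.enqueue(count_words_at_url, 'http://nvie.com')
print(job.result)   # => None, not processed yet

# Give a running rqworker a moment to pick up the job, then read the result.
time.sleep(2)
print(job.result)   # => e.g. 818 once the worker has finished
```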
## The worker
To start executing enqueued function calls in the background, start a worker from your project's directory:
```
$ rqworker
*** Listening for work on default
Got count_words_at_url('http://nvie.com') from default
Job result = 818
*** Listening for work on default
```
That's about it.
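Workers can also be started from Python code instead of the command line. Here is a minimal sketch, assuming Redis runs locally on the default port:

```python
from redis import Redis
from rq import Queue, Worker

# Connect to the local Redis server (assumed to be localhost:6379).
redis_conn = Redis()

# Listen on the 'default' queue; work() blocks and processes jobs as they arrive.
worker = Worker([Queue(connection=redis_conn)], connection=redis_conn)
worker.work()
```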
## Installation
Simply use the following command to install the latest released version:
```
pip install rq
```
If you want the cutting edge version (that may well be broken), use this:
```
pip install -e git+git@github.com:nvie/rq.git@master#egg=rq
```
## Project history
This project has been inspired by the good parts of Celery, Resque and this snippet, and has been created as a lightweight alternative to the heaviness of Celery or other AMQP-based queueing implementations.