RQ (_Redis Queue_) is a simple Python library for queueing jobs and processing them in the background with workers. It is backed by Redis and is designed to have a low barrier to entry. It can be integrated into your web stack easily.

## Getting started

First, run a Redis server, of course:

    $ redis-server

To put jobs on queues, you don't have to do anything special, just define your typically lengthy or blocking function:

    import requests

    def count_words_at_url(url):
        resp = requests.get(url)
        return len(resp.text.split())

You do use the excellent [requests][r] package, don't you?

Then, create an RQ queue:

    from rq import Queue, use_connection
    use_connection()
    q = Queue()

And enqueue the function call:

    from my_module import count_words_at_url
    result = q.enqueue(count_words_at_url, 'http://nvie.com')

For a more complete example, refer to the [docs][d]. But this is the essence.

### The worker

To start executing enqueued function calls in the background, start a worker from your project's directory:

    $ rqworker
    *** Listening for work on default
    Got count_words_at_url('http://nvie.com') from default
    Job result = 818
    *** Listening for work on default

That's about it.

## Installation

Simply use the following command to install the latest released version:

    pip install rq

If you want the cutting edge version (that may well be broken), use this:

    pip install -e git+git@github.com:nvie/rq.git@master#egg=rq

## Project history

This project has been inspired by the good parts of [Celery][1], [Resque][2] and [this snippet][3], and has been created as a lightweight alternative to the heaviness of Celery or other AMQP-based queueing implementations.

[r]: http://python-requests.org
[d]: http://nvie.github.com/rq/docs/
[m]: http://pypi.python.org/pypi/mailer
[p]: http://docs.python.org/library/pickle.html
[1]: http://www.celeryproject.org/
[2]: https://github.com/defunkt/resque
[3]: http://flask.pocoo.org/snippets/73/
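
## Checking on a job's result

`q.enqueue()` hands you back a job object right away, and the function's return value becomes available on it once a worker has processed the job. Below is a minimal sketch of checking that result from the enqueueing side; it assumes a Redis server and an `rqworker` process are running, that `count_words_at_url` lives in `my_module` as in the example above, and that the job object exposes a `result` attribute as described in the RQ docs.

    import time

    from rq import Queue, use_connection
    from my_module import count_words_at_url

    # Connect to the local Redis server and use the "default" queue.
    use_connection()
    q = Queue()

    # enqueue() returns immediately; the actual work happens in the worker.
    job = q.enqueue(count_words_at_url, 'http://nvie.com')
    print(job.result)   # => None, the worker hasn't finished yet

    # Give the worker a moment to fetch and run the job, then check again.
    time.sleep(2)
    print(job.result)   # => e.g. 818, the value count_words_at_url returned

The `time.sleep()` here is only for illustration; in a real application you would either not care about the return value at all, or look the job up again later rather than block the enqueueing process.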