
RQ (Redis Queue) is a simple Python library for queueing jobs and processing them in the background with workers. It is backed by Redis and is designed to have a low barrier to entry. It can be integrated into your web stack easily.

RQ requires Redis >= 2.7.0.


Full documentation can be found at http://python-rq.org/.

Getting started

First, run a Redis server, of course:

```console
$ redis-server
```

To put jobs on queues, you don't have to do anything special; just define your typically lengthy or blocking function:

```python
import requests

def count_words_at_url(url):
    """Just an example function that's called async."""
    resp = requests.get(url)
    return len(resp.text.split())
```

You do use the excellent requests package, don't you?

Then, create an RQ queue:

```python
from redis import Redis
from rq import Queue

q = Queue(connection=Redis())
```

And enqueue the function call:

```python
from my_module import count_words_at_url

# enqueue() returns a Job, not the function's return value
job = q.enqueue(count_words_at_url, 'http://nvie.com')
```

For a more complete example, refer to the docs. But this is the essence.
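Note that `q.enqueue()` returns immediately with a Job object; the actual return value only becomes available once a worker has processed the job, at which point it appears on the job's `result` attribute. A minimal polling helper could look like the sketch below (the `wait_for_result` function is a hypothetical convenience for illustration, not part of RQ's API):

```python
import time

def wait_for_result(job, timeout=10.0, interval=0.1):
    """Poll a job until its result is available, or raise on timeout.

    Assumes the job exposes a `result` attribute that is None until
    a worker has finished it, as RQ's Job objects do.
    """
    deadline = time.time() + timeout
    while time.time() < deadline:
        if job.result is not None:
            return job.result
        time.sleep(interval)
    raise RuntimeError('job did not finish within %.1f seconds' % timeout)
```

In a web request you would normally not block like this at all; instead, store the job ID and check back on it later.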

The worker

To start executing enqueued function calls in the background, start a worker from your project's directory:

```console
$ rq worker
*** Listening for work on default
Got count_words_at_url('http://nvie.com') from default
Job result = 818
*** Listening for work on default
```

That's about it.

Installation

Simply use the following command to install the latest released version:

```
pip install rq
```

If you want the cutting edge version (that may well be broken), use this:

```
pip install -e git+git@github.com:nvie/rq.git@master#egg=rq
```

Project history

This project was inspired by the good parts of Celery, Resque and this snippet, and was created as a lightweight alternative to the heaviness of Celery and other AMQP-based queueing implementations.