RQ (Redis Queue) is a simple Python library for queueing jobs and processing them in the background with workers. It is backed by Redis and designed to have a low barrier to entry, so it can be integrated into your web stack easily.

Getting started

First, run a Redis server, of course:

$ redis-server

To put jobs on queues, you don't have to do anything special, just define your typically lengthy or blocking function:

import requests

def count_words_at_url(url):
    """Just an example function that's called async."""
    resp = requests.get(url)
    return len(resp.text.split())

You do use the excellent requests package, don't you?

Then, create an RQ queue:

from rq import Queue, use_connection
use_connection()
q = Queue()
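
By default, use_connection() connects to a Redis server running on localhost. A minimal sketch of pointing RQ at a different server and using named queues (the host name and queue names here are just placeholders):

from redis import Redis
from rq import Queue, use_connection

# Use a specific Redis server instead of the default localhost one
use_connection(Redis(host='example.com', port=6379))

# Queues are named; Queue() with no arguments is the 'default' queue
high = Queue('high')
low = Queue('low')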

And enqueue the function call:

from my_module import count_words_at_url
result = q.enqueue(count_words_at_url, 'http://nvie.com')
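
Note that the value returned by enqueue() is a job handle, not the function's return value; the worker stores the result once it has run the job. A minimal sketch of polling for it (the sleep is only illustrative and assumes a worker is running):

import time

job = q.enqueue(count_words_at_url, 'http://nvie.com')
print(job.result)   # None for now; no worker has finished the job yet

time.sleep(2)       # give a running worker some time to finish it
print(job.result)   # the word count, e.g. 818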

For a more complete example, refer to the docs. But this is the essence.

The worker

To start executing enqueued function calls in the background, start a worker from your project's directory:

$ rqworker
*** Listening for work on default
Got count_words_at_url('http://nvie.com') from default
Job result = 818
*** Listening for work on default
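
The rqworker script also accepts queue names as arguments if you want a single worker to listen on more than one queue (for example, rqworker high default low). A worker can be started from Python as well; a minimal sketch, assuming the default connection set up with use_connection():

from rq import Queue, Worker, use_connection

use_connection()

# Listen on the 'default' queue; pass several Queue objects to drain them in the order given
worker = Worker([Queue('default')])
worker.work()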

That's about it.

Installation

Simply use the following command to install the latest released version:

pip install rq

If you want the cutting edge version (that may well be broken), use this:

pip install -e git+git@github.com:nvie/rq.git@master#egg=rq

Project history

This project has been inspired by the good parts of Celery, Resque and this snippet, and has been created as a lightweight alternative to the heaviness of Celery or other AMQP-based queueing implementations.