
RQ (Redis Queue) is a simple Python library for queueing jobs and processing them in the background with workers. It is backed by Redis and designed to have a low barrier to entry. It can be integrated into your web stack easily.

RQ requires Redis >= 3.0.0.


Full documentation can be found here.

Support RQ

If you find RQ useful, please consider supporting this project via Tidelift.

Getting started

First, run a Redis server, of course:

$ redis-server

To put jobs on queues, you don't have to do anything special; just define your typically lengthy or blocking function:

import requests

def count_words_at_url(url):
    """Just an example function that's called async."""
    resp = requests.get(url)
    return len(resp.text.split())

You do use the excellent requests package, don't you?

Then, create an RQ queue:

from redis import Redis
from rq import Queue

q = Queue(connection=Redis())

And enqueue the function call:

from my_module import count_words_at_url
job = q.enqueue(count_words_at_url, 'http://nvie.com')

For a more complete example, refer to the docs. But this is the essence.
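
Enqueueing is asynchronous, so the return value isn't available right away; it is stored once a worker has processed the job. A minimal sketch of checking on a job after enqueueing it (the two-second sleep is just an arbitrary wait for illustration):

import time

from redis import Redis
from rq import Queue

from my_module import count_words_at_url

q = Queue(connection=Redis())
job = q.enqueue(count_words_at_url, 'http://nvie.com')

print(job.get_status())  # most likely 'queued' right after enqueueing

time.sleep(2)            # give a running worker a moment to pick up the job

print(job.get_status())  # 'finished' once a worker has processed the job
print(job.result)        # the function's return value, e.g. 818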

The worker

To start executing enqueued function calls in the background, start a worker from your project's directory:

$ rq worker
*** Listening for work on default
Got count_words_at_url('http://nvie.com') from default
Job result = 818
*** Listening for work on default
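
A worker can also be started from Python code instead of the CLI. A minimal sketch, assuming a Redis server on localhost and the default queue:

from redis import Redis
from rq import Queue, Worker

redis_conn = Redis()
queue = Queue(connection=redis_conn)

# burst=True makes the worker quit once the queue is drained;
# omit it to keep listening for new jobs.
worker = Worker([queue], connection=redis_conn)
worker.work(burst=True)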

That's about it.

Installation

Simply use the following command to install the latest released version:

pip install rq

If you want the cutting edge version (that may well be broken), use this:

pip install -e git+https://github.com/nvie/rq.git@master#egg=rq

Project history

This project has been inspired by the good parts of Celery, Resque and this snippet, and has been created as a lightweight alternative to the heaviness of Celery or other AMQP-based queueing implementations.