Fix error in example in the documentation (#1870)

The solution that @rpkak proposes works. Closes #1524
Joris committed 2 years ago (via GitHub)
parent 04722339d7
commit a9fae76e88

@@ -111,8 +111,8 @@ You can also enqueue multiple jobs in bulk with `queue.enqueue_many()` and `Queue.prepare_data()`:
 ```python
 jobs = q.enqueue_many(
     [
-        Queue.prepare_data(count_words_at_url, 'http://nvie.com', job_id='my_job_id'),
-        Queue.prepare_data(count_words_at_url, 'http://nvie.com', job_id='my_other_job_id'),
+        Queue.prepare_data(count_words_at_url, ('http://nvie.com',), job_id='my_job_id'),
+        Queue.prepare_data(count_words_at_url, ('http://nvie.com',), job_id='my_other_job_id'),
     ]
 )
 ```
@@ -123,8 +123,8 @@ which will enqueue all the jobs in a single redis `pipeline` which you can optionally pass in yourself:
 with q.connection.pipeline() as pipe:
     jobs = q.enqueue_many(
         [
-            Queue.prepare_data(count_words_at_url, 'http://nvie.com', job_id='my_job_id'),
-            Queue.prepare_data(count_words_at_url, 'http://nvie.com', job_id='my_other_job_id'),
+            Queue.prepare_data(count_words_at_url, ('http://nvie.com',), job_id='my_job_id'),
+            Queue.prepare_data(count_words_at_url, ('http://nvie.com',), job_id='my_other_job_id'),
         ],
         pipeline=pipe
     )
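The reason the fix wraps the URL in a one-element tuple is that the worker ultimately invokes the job as `func(*args)`, so the `args` parameter must be a tuple or list of positional arguments, not a bare value. A minimal sketch of that calling convention (using a stand-in invoker and a local `count_words_at_url`, since this does not talk to Redis or the network) shows why the bare string fails:

```python
def call_with_args(func, args=None, kwargs=None):
    # Mimics how the worker eventually runs a job: func(*args, **kwargs).
    # `args` must therefore be a tuple/list of positional arguments.
    return func(*(args or ()), **(kwargs or {}))

def count_words_at_url(url):
    # Stand-in for the docs' example job (counts path segments locally).
    return len(url.split('/'))

# Correct: a single positional argument packed in a one-element tuple.
result = call_with_args(count_words_at_url, ('http://nvie.com',))

# Incorrect: a bare string is unpacked character by character, so
# call_with_args(count_words_at_url, 'http://nvie.com') raises a
# TypeError (the function receives one positional argument per character).
```

The trailing comma in `('http://nvie.com',)` is what makes it a tuple; `('http://nvie.com')` would just be a parenthesized string and hit the same problem.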
