
Django – Celery 4.1 with django-celery-beat/rabbitmq : Nothing?

I followed the tutorial at http://docs.celeryproject.org/en/latest/ and I am running in VirtualBox (Xubuntu 16.XX LTS) with Django 1.11.3, Celery 4.1, RabbitMQ 3.6.14 and Python 2.7.

When I start the daemon with the celerybeat init script (using the /etc/default/celeryd config file), all I get is:

[2017-11-19 01:13:00,912: INFO/MainProcess] beat: Starting…

and nothing more after that. Do you see what I could be doing wrong?

My celery.py:

from __future__ import absolute_import, unicode_literals
import os
from celery import Celery

# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'oscar.settings')

app = Celery('oscar')

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Broker settings
app.conf.broker_url = 'amqp://oscar:oscar@localhost:5672/oscarRabbit'

# Load task modules from all registered Django app configs.
app.autodiscover_tasks()
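
Because of namespace='CELERY', config_from_object() will also pick up any Celery option defined in Django's settings.py with a CELERY_ prefix. A minimal sketch of what such settings could look like (illustrative values only; the broker URL is already set directly in celery.py above):

# oscar/settings.py (sketch) - read by app.config_from_object(..., namespace='CELERY')
CELERY_TASK_SERIALIZER = 'json'      # maps to the task_serializer setting
CELERY_RESULT_BACKEND = 'rpc://'     # maps to result_backend (optional)
CELERY_BEAT_SCHEDULER = 'django_celery_beat.schedulers:DatabaseScheduler'  # maps to beat_scheduler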

some_app/tasks.py:

from __future__ import absolute_import, unicode_literals
from oscar import celery_app
from celery.schedules import crontab
from .models import HelpRequest
from datetime import datetime, timedelta
import logging

""" CONSTANTS FOR THE TIMER """
# Can be changed  (by default 1 week)
WEEKS_BEFORE_PENDING = 0
DAYS_BEFORE_PENDING = 0
HOURS_BEFORE_PENDING = 0
MINUTES_BEFORE_PENDING = 1

# http://docs.celeryproject.org/en/latest/userguide/periodic-tasks.html
# for schedule : http://docs.celeryproject.org/en/latest/userguide/periodic-tasks.html#crontab-schedules
@celery_app.on_after_configure.connect
def setup_periodic_tasks(sender, **kwargs):

    sender.add_periodic_task(
        crontab(minute=2),
        set_open_help_request_to_pending
    )

@celery_app.task(name="HR_OPEN_TO_PENDING")
def set_open_help_request_to_pending():
    """ For timedelay idea : https://stackoverflow.com/a/27869101/6149867  """
    logging.info("RUNNING CRON TASK FOR STUDENT COLLABORATION : set_open_help_request_to_pending")
    request_list = HelpRequest.objects.filter(
        state=HelpRequest.OPEN,
        timestamp__gte=datetime.now() - timedelta(hours=HOURS_BEFORE_PENDING,
                                                  minutes=MINUTES_BEFORE_PENDING,
                                                  days=DAYS_BEFORE_PENDING,
                                                  weeks=WEEKS_BEFORE_PENDING)
    )

    if request_list:
        logging.info("FOUND ", request_list.count(), "  Help request(s) => PENDING")
        for help_request in request_list.all():
            help_request.change_state(HelpRequest.PENDING)

/etc/default/celeryd:

# Names of nodes to start
#   most people will only start one node:
CELERYD_NODES="worker1"
#   but you can also start multiple and configure settings
#   for each in CELERYD_OPTS
#CELERYD_NODES="worker1 worker2 worker3"
#   alternatively, you can specify the number of nodes to start:
#CELERYD_NODES=10

# Absolute or relative path to the 'celery' command:
CELERY_BIN="/home/jy95/Documents/oscareducation/ve/local/bin/celery"

# App instance to use
# comment out this line if you don't use an app
CELERY_APP="oscar"


# Where to chdir at start.
CELERYD_CHDIR="/home/jy95/Documents/oscareducation"

# Extra command-line arguments to the worker
# django_celery_beat for admin purposes
CELERYD_OPTS="--scheduler django_celery_beat.schedulers:DatabaseScheduler -f /var/log/celery/celery_tasks.log"

# Set logging level to DEBUG
#CELERYD_LOG_LEVEL="DEBUG"

# %n will be replaced with the first part of the nodename.
CELERYD_LOG_FILE="/var/log/celery/%n%I.log"
CELERYD_PID_FILE="/var/run/celery/%n.pid"

# Workers should run as an unprivileged user.
#   You need to create this user manually (or you can choose
#   a user/group combination that already exists, e.g., nobody).
CELERYD_USER="celery"
CELERYD_GROUP="celery"

# If enabled pid and log directories will be created if missing,
# and owned by the userid/group configured.
CELERY_CREATE_DIRS=1
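
For the DatabaseScheduler referenced in CELERYD_OPTS, django_celery_beat also needs to be installed on the Django side; a minimal sketch of that part of oscar/settings.py (assuming the package is pip-installed and python manage.py migrate has been run to create its tables):

# oscar/settings.py (sketch)
INSTALLED_APPS = [
    # ... Django and project apps ...
    'django_celery_beat',  # provides the database-backed periodic task schedule (editable in the admin)
]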

My RabbitMQ setup:

$ sudo rabbitmqctl add_user oscar oscar
$ sudo rabbitmqctl add_vhost oscarRabbit
$ sudo rabbitmqctl set_user_tags oscar administrator
$ sudo rabbitmqctl set_permissions -p oscarRabbit oscar ".*" ".*" ".*"
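
A quick way to confirm that the user, vhost and permissions actually work is to open a broker connection from the Django shell; a sketch (ensure_connection() raises an error if RabbitMQ is unreachable or the credentials/vhost are wrong):

# python manage.py shell (sketch)
from oscar import celery_app

conn = celery_app.connection()         # uses amqp://oscar:oscar@localhost:5672/oscarRabbit
conn.ensure_connection(max_retries=3)  # raises if the broker, vhost or credentials are wrong
print(conn.as_uri())                   # prints the broker URI with the password hidden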

The commands I run to start everything, and their output:

sudo rabbitmq-server -detached
sudo /etc/init.d/celerybeat start

Warning: PID file not written; -detached was passed.
/etc/init.d/celerybeat: lerybeat: not found
celery init v10.1.
Using configuration: /etc/default/celeryd
Starting celerybeat…

sudo /etc/init.d/celerybeat start
source ve/bin/activate
python manage.py runserver

Performing system checks…

System check identified no issues (0 silenced).
November 19, 2017 - 01:49:22
Django version 1.11.3, using settings 'oscar.settings'
Starting development server at http://127.0.0.1:8000/
Quit the server with CONTROL-C.

Thanks in advance for your answers.


Answer

It looks like you've started a celerybeat process and your Django development server, but haven't started a Celery worker process. You can start one with:

celery -A proj worker -B

(where proj is the name of your project, oscar in your case).

Note that the -B option gives the worker an embedded beat process, so you don't need to run celerybeat separately.
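
Once the worker is running you can confirm it is reachable over the broker from python manage.py shell; a small sketch using the app instance from the question:

# python manage.py shell (sketch)
from oscar import celery_app

# one entry per responding worker, e.g. [{'worker1@hostname': {'ok': 'pong'}}]
print(celery_app.control.ping(timeout=2.0))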
