Description
I'm using Celery to parallelize a process that ingests data into a Kafka topic. What I've noticed is that some messages never end up in Kafka, even though the library reports no errors. I've tried several configuration changes to mitigate this, but to no avail. I'm using version 2.2.0 of the library with Kafka 3.0.1 and Celery 5.3.4. Both Kafka and the producer run in Docker.
How to reproduce
Celery config:
import os

from celery import Celery

from producer import Updater

app = Celery('kafka-producer', broker=os.environ["CELERY_BROKER"], backend=os.environ["CELERY_RESULTS"])

@app.task
def send_update(key: str, value: str, queue: str):
    kafka_con = Updater()
    response = kafka_con.push_to_kafka(key, value, queue)
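For context, the task is dispatched from the ingestion service roughly like this (the key, value, and topic name below are illustrative, not our real data):

from main import send_update

# Each record is queued as its own Celery task; the Kafka produce happens inside the worker.
send_update.delay("record-123", '{"field": "value"}', "ingest-topic")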
Producer code:
import os

from confluent_kafka import Producer, KafkaException

# build_logger is a small helper from our code base that returns a configured logging.Logger.

class Updater:
    def __init__(self):
        self.logger = build_logger("producer")
        self.producer = Producer(self.get_config())
        self.poll_delay = float(os.environ.get("KAFKA_POLL_INTERVAL", 0))

    def get_config(self) -> dict:
        base = {
            'bootstrap.servers': os.environ["KAFKA_BROKER"],
            "security.protocol": "sasl_plaintext",
            "sasl.mechanism": "PLAIN",
            "client.id": "client_id",
            "enable.idempotence": True,
            "queue.buffering.max.messages": 1,  ## Desperate attempt #1
            "sasl.username": os.environ.get("KAFKA_USER", "client_id"),
            "sasl.password": os.environ["KAFKA_PASSWORD"],
        }
        return base

    def delivery_report(self, err, msg):
        if err:
            raise Exception("Delivery failed for User record {}: {}".format(msg.key(), err))
        else:
            self.logger.debug(
                f'Log key {msg.key()} successfully produced to {msg.topic()} [{msg.partition()}] at offset {msg.offset()}')

    def push_to_kafka(self, key: str, value: str, queue_name: str):
        try:
            self.producer.produce(queue_name, key=key, value=value, callback=self.delivery_report)
            self.producer.poll(self.poll_delay)  ## Desperate attempt #2
        except KafkaException as ke:
            self.logger.warning(f"Encountered Kafka error for key {key}, {ke}.")
            raise
        except Exception as e:
            self.logger.warning(f"Generic error for key {key}, {e}.")
        else:
            self.producer.flush(self.poll_delay)
            self.logger.debug(f"Key {key} successfully pushed to Kafka topic {queue_name}: {value}")
Celery run command
celery -A main worker --autoscale=20,0 --loglevel=INFO -O fair --max-tasks-per-child 50
Checklist
Please provide the following information:
- confluent-kafka-python and librdkafka version (confluent_kafka.version() and confluent_kafka.libversion()) -- see the snippet after this list:
- Apache Kafka broker version:
- Client configuration: {...}
- Operating system:
- Provide client logs (with 'debug': '..' as necessary)
- Provide broker log excerpts
- Critical issue
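For the version fields, I'll collect them with this snippet:

import confluent_kafka

# Both calls return a (version_string, version_int) tuple.
print("confluent-kafka-python:", confluent_kafka.version())
print("librdkafka:", confluent_kafka.libversion())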