@@ -24,7 +24,7 @@ with Apache Kafka at its core. It's high priority for us that client features ke
pace with core Apache Kafka and components of the [Confluent Platform](https://www.confluent.io/product/compare/).

The Python bindings provide a high-level Producer and Consumer with support
- for the balanced consumer groups of Apache Kafka 0.9.
+ for the balanced consumer groups of Apache Kafka >= 0.9.

See the [API documentation](http://docs.confluent.io/current/clients/confluent-kafka-python/index.html) for more info.
@@ -40,11 +40,26 @@ Usage
from confluent_kafka import Producer

- p = Producer({'bootstrap.servers': 'mybroker,mybroker2'})
+ p = Producer({'bootstrap.servers': 'mybroker1,mybroker2'})
+
+ def delivery_report(err, msg):
+     """ Called once for each message produced to indicate delivery result.
+         Triggered by poll() """
+     if err is not None:
+         print('Message delivery failed: {}'.format(err))
+     else:
+         print('Message delivered to {} [{}]'.format(msg.topic(), msg.partition()))

for data in some_data_source:
-     p.produce('mytopic', data.encode('utf-8'))
+     # Trigger delivery report callbacks from previous produce() calls
+     p.poll(0)
+
+     # Asynchronously produce a message, the optional but recommended
+     # delivery report callback will be triggered from poll()
+     # when the message has been successfully delivered or failed permanently.
+     p.produce('mytopic', data.encode('utf-8'), callback=delivery_report)

+ # Wait for any outstanding messages to be delivered
p.flush()
```
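The produce/poll pattern added in this hunk is asynchronous: if messages are generated faster than they can be delivered, the client's local queue fills up and `produce()` raises `BufferError`. A minimal sketch, not part of this diff, of one way to absorb that backpressure; `produce_with_retry` is a hypothetical helper and assumes `p` and `delivery_report` are defined as in the example above:

```python
# Sketch only (not from the diff): retry produce() when the local queue is full.
# Assumes `p` (Producer) and `delivery_report` exist as in the README example.
def produce_with_retry(p, topic, value, callback):
    while True:
        try:
            p.produce(topic, value, callback=callback)
            return
        except BufferError:
            # Local queue is full: serve delivery report callbacks to free
            # up space, then try again.
            p.poll(1)
```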
@@ -66,8 +81,10 @@ c = Consumer({
c.subscribe(['mytopic'])

while True:
-     msg = c.poll()
+     msg = c.poll(1.0)

+     if msg is None:
+         continue
    if msg.error():
        if msg.error().code() == KafkaError._PARTITION_EOF:
            continue
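With the timeout added here, `poll(1.0)` returns `None` when no message arrives within one second, which is why the `msg is None` check is needed. A minimal sketch, not part of this diff, of the same loop wrapped for a clean shutdown, assuming `c` is the `Consumer` configured earlier:

```python
# Sketch only: the poll loop from above, with the consumer closed on exit
# so it commits final offsets (if auto-commit is enabled) and leaves the group.
try:
    while True:
        msg = c.poll(1.0)
        if msg is None:
            continue
        if msg.error():
            # Handle the error as in the README example
            print(msg.error())
            continue
        print('Received: {}'.format(msg.value().decode('utf-8')))
except KeyboardInterrupt:
    pass
finally:
    c.close()
```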
@@ -171,6 +188,24 @@ See the [examples](examples) directory for more examples, including [how to conf
[Confluent Cloud](https://www.confluent.io/confluent-cloud/).

+ Install
+ =======
+
+ **Install self-contained binary wheels for OSX and Linux from PyPi:**
+
+     $ pip install confluent-kafka
+
+ **Install AvroProducer and AvroConsumer:**
+
+     $ pip install confluent-kafka[avro]
+
+ **Install from source from PyPi** *(requires librdkafka + dependencies to be installed separately)*:
+
+     $ pip install --no-binary :all: confluent-kafka
+
+ For source install, see *Prerequisites* below.
+
+
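The `confluent-kafka[avro]` extra mentioned above pulls in the `confluent_kafka.avro` submodule. A minimal sketch, not part of this diff, of producing an Avro-encoded message with it; the topic name, schema, broker, and Schema Registry URL below are placeholders:

```python
from confluent_kafka import avro
from confluent_kafka.avro import AvroProducer

# Hypothetical value schema and record; replace with your own.
value_schema = avro.loads('{"type": "record", "name": "User", '
                          '"fields": [{"name": "name", "type": "string"}]}')

avro_producer = AvroProducer({
    'bootstrap.servers': 'mybroker1',                   # placeholder broker
    'schema.registry.url': 'http://schema-registry:8081'  # placeholder URL
}, default_value_schema=value_schema)

avro_producer.produce(topic='mytopic', value={'name': 'alice'})
avro_producer.flush()
```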
Broker Compatibility
====================
The Python client (as well as the underlying C library librdkafka) supports
@@ -200,7 +235,7 @@ Prerequisites
* Python >= 2.7 or Python 3.x
* [librdkafka](https://github.com/edenhill/librdkafka) >= 0.9.5 (latest release is embedded in wheels)

- librdkafka is embedded in the manylinux wheels, for other platforms or
+ librdkafka is embedded in the macosx and manylinux wheels; for other platforms or
when a specific version of librdkafka is desired, follow these guidelines:

* For **Debian/Ubuntu** based systems, add this APT repo and then do `sudo apt-get install librdkafka-dev python-dev`:
@@ -211,24 +246,10 @@ http://docs.confluent.io/current/installation.html#rpm-packages-via-yum

* On **OSX**, use **homebrew** and do `brew install librdkafka`

-
- Install
- =======
-
- **Install from PyPi:**
-
-     $ pip install confluent-kafka
-
-     # for AvroProducer or AvroConsumer
-     $ pip install confluent-kafka[avro]
-
-
- **Install from source / tarball:**
-
-     $ pip install .
-
-     # for AvroProducer or AvroConsumer
-     $ pip install .[avro]
+ **NOTE:** The pre-built Linux wheels do NOT contain SASL Kerberos support.
+ If you need SASL Kerberos support you must install librdkafka and
+ its dependencies using the above repositories and then build
+ confluent-kafka from source.
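To illustrate what the note above refers to: once librdkafka is built with SASL support and confluent-kafka is installed from source, Kerberos is enabled purely through client configuration. A rough sketch, not part of this diff, using standard librdkafka configuration properties; the broker, principal, and keytab values are placeholders:

```python
from confluent_kafka import Producer

# Sketch only: example SASL/GSSAPI (Kerberos) client configuration.
# All values below are placeholders; requires librdkafka built with SASL support.
p = Producer({
    'bootstrap.servers': 'mybroker1:9093',
    'security.protocol': 'SASL_SSL',
    'sasl.mechanisms': 'GSSAPI',
    'sasl.kerberos.service.name': 'kafka',
    'sasl.kerberos.principal': 'myclient@EXAMPLE.COM',
    'sasl.kerberos.keytab': '/etc/security/keytabs/myclient.keytab',
})
```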
Build