
Commit ef3e2bc

Merge tag 'v0.9.1.2' into 3.0.x

2 parents: 253b6c4 + 8a7b837

File tree: 6 files changed (+90 −39 lines)

README.md

Lines changed: 75 additions & 24 deletions
@@ -1,57 +1,108 @@
 Confluent's Apache Kafka client for Python
 ==========================================
 
+Confluent's Kafka client for Python wraps the librdkafka C library, providing
+full Kafka protocol support with great performance and reliability.
 
-Prerequisites
-===============
+The Python bindings provide a high-level Producer and Consumer with support
+for the balanced consumer groups of Apache Kafka 0.9.
 
-librdkafka >=0.9.1 (or master>=2016-04-13)
-py.test (pip install pytest)
+See the [API documentation](http://docs.confluent.io/current/clients/confluent-kafka-python/index.html) for more info.
 
+**License**: [Apache License v2.0](http://www.apache.org/licenses/LICENSE-2.0)
 
-Build
+
+Usage
 =====
 
-python setup.py build
+**Producer:**
+
+```python
+from confluent_kafka import Producer
+
+p = Producer({'bootstrap.servers': 'mybroker,mybroker2'})
+for data in some_data_source:
+    p.produce('mytopic', data.encode('utf-8'))
+p.flush()
+```
+
+
+**High-level Consumer:**
+
+```python
+from confluent_kafka import Consumer
+
+c = Consumer({'bootstrap.servers': 'mybroker', 'group.id': 'mygroup',
+              'default.topic.config': {'auto.offset.reset': 'smallest'}})
+c.subscribe(['mytopic'])
+while running:
+    msg = c.poll()
+    if not msg.error():
+        print('Received message: %s' % msg.value().decode('utf-8'))
+c.close()
+```
 
+See [examples](examples) for more examples.
+
+
+Prerequisites
+=============
+
+ * Python >= 2.7 or Python 3.x
+ * [librdkafka](https://github.com/edenhill/librdkafka) >= 0.9.1
 
 
 Install
 =======
-Preferably in a virtualenv:
 
-pip install .
+**Install from PyPI:**
+
+    $ pip install confluent-kafka
+
+
+**Install from source / tarball:**
+
+    $ pip install .
 
 
-Run unit-tests
-==============
+Build
+=====
+
+    $ python setup.py build
+
+If librdkafka is installed in a non-standard location, provide the include and library directories with:
+
+    $ CPLUS_INCLUDE_PATH=/path/to/include LIBRARY_PATH=/path/to/lib python setup.py ...
+
+
+Tests
+=====
+
 
-py.test
+**Run unit-tests:**
 
+    $ py.test
+
+**NOTE**: Requires `py.test`; install with `pip install pytest`
+
+
+**Run integration tests:**
+
+    $ examples/integration_test.py <kafka-broker>
 
-Run integration tests
-=====================
 **WARNING**: These tests require an active Kafka cluster and will make use of a topic named 'test'.
 
-examples/integration_test.py <kafka-broker>
 
 
 
 Generate documentation
 ======================
 Install sphinx and sphinx_rtd_theme packages and then:
 
-make docs
+    $ make docs
 
 or:
 
-python setup.py build_sphinx
-
-
-Documentation will be generated in `docs/_build/`
-
-
-Examples
-========
+    $ python setup.py build_sphinx
 
-See [examples](examples)
+Documentation will be generated in `docs/_build/`.
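The producer example in the README enqueues messages asynchronously and only flushes at the end; the `on_delivery(err,msg)` callback documented in `Producer.c` below can be used to confirm per-message delivery. A minimal sketch, not part of this commit (broker and topic names are placeholders):

```python
from confluent_kafka import Producer

def delivery_report(err, msg):
    # Invoked from poll()/flush() once per produced message,
    # on successful delivery or permanent failure.
    if err is not None:
        print('Delivery failed: %s' % err)
    else:
        print('Delivered to %s [%d]' % (msg.topic(), msg.partition()))

p = Producer({'bootstrap.servers': 'mybroker'})
p.produce('mytopic', 'some data'.encode('utf-8'), on_delivery=delivery_report)
p.flush()
```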

confluent_kafka/src/Producer.c

Lines changed: 2 additions & 2 deletions
@@ -370,8 +370,8 @@ static PyMethodDef Producer_methods[] = {
   "message has been successfully delivered or permanently fails delivery.\n"
   "\n"
   " :param str topic: Topic to produce message to\n"
-  " :param str value: Message payload\n"
-  " :param str key: Message key\n"
+  " :param str|bytes value: Message payload\n"
+  " :param str|bytes key: Message key\n"
   " :param int partition: Partition to produce to, else uses the "
   "configured partitioner.\n"
   " :param func on_delivery(err,msg): Delivery report callback to call "

confluent_kafka/src/confluent_kafka.c

Lines changed: 5 additions & 5 deletions
@@ -339,12 +339,12 @@ static PyMethodDef Message_methods[] = {
 
   { "value", (PyCFunction)Message_value, METH_NOARGS,
     " :returns: message value (payload) or None if not available.\n"
-    " :rtype: str or None\n"
+    " :rtype: str|bytes or None\n"
     "\n"
   },
   { "key", (PyCFunction)Message_key, METH_NOARGS,
     " :returns: message key or None if not available.\n"
-    " :rtype: str or None\n"
+    " :rtype: str|bytes or None\n"
     "\n"
   },
   { "topic", (PyCFunction)Message_topic, METH_NOARGS,
@@ -486,10 +486,10 @@ PyObject *Message_new0 (const rd_kafka_message_t *rkm) {
   self->topic = cfl_PyUnistr(
       _FromString(rd_kafka_topic_name(rkm->rkt)));
   if (rkm->payload)
-      self->value = cfl_PyUnistr(_FromStringAndSize(rkm->payload,
-                                                    rkm->len));
+      self->value = cfl_PyBin(_FromStringAndSize(rkm->payload,
+                                                 rkm->len));
   if (rkm->key)
-      self->key = cfl_PyUnistr(
+      self->key = cfl_PyBin(
           _FromStringAndSize(rkm->key, rkm->key_len));
 
   self->partition = rkm->partition;
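The switch from `cfl_PyUnistr` to `cfl_PyBin` above means message payloads and keys reach Python as binary data (`bytes` on Python 3). A consumer-side sketch, assuming a consumer `c` configured and subscribed as in the README example:

```python
msg = c.poll()
if msg is not None and not msg.error():
    raw = msg.value()  # bytes on Python 3 (str on Python 2)
    # Decode explicitly only when the payload is known to be UTF-8 text.
    print('Received message: %s' % raw.decode('utf-8'))
```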

confluent_kafka/src/confluent_kafka.h

Lines changed: 5 additions & 4 deletions
@@ -41,12 +41,12 @@
  *
  ****************************************************************************/
 
-#ifdef PY3
+#ifdef PY3 /* Python 3 */
 /**
  * @brief Binary type, use as cfl_PyBin(_X(A,B)) where _X() is the type-less
- *        suffix of a PyBinary/Str_X() function
+ *        suffix of a PyBytes/Str_X() function
  */
-#define cfl_PyBin(X) PyBinary ## X
+#define cfl_PyBin(X) PyBytes ## X
 
 /**
  * @brief Unicode type, same usage as PyBin()
@@ -62,7 +62,8 @@
  * @returns Unicode Python string object
  */
 #define cfl_PyObject_Unistr(X) PyObject_Str(X)
-#else
+
+#else /* Python 2 */
 
 /* See comments above */
 #define cfl_PyBin(X) PyString ## X
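At the Python level, the `cfl_PyBin` fix above simply selects the interpreter's binary string type. An illustrative sketch (not project code; the variable names are made up):

```python
import sys

# cfl_PyBin expands to PyBytes_* on Python 3 and PyString_* on Python 2,
# so objects built with it are the interpreter's binary string type:
binary_type = bytes if sys.version_info[0] >= 3 else str

# Message.value() / Message.key() are created via cfl_PyBin(_FromStringAndSize(...)),
# so, when present, they are instances of binary_type.
```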

docs/conf.py

Lines changed: 2 additions & 2 deletions
@@ -55,9 +55,9 @@
 # built documents.
 #
 # The short X.Y version.
-version = '0.9.1'
+version = '0.9.1.2'
 # The full version, including alpha/beta/rc tags.
-release = '0.9.1'
+release = '0.9.1.2'
 
 # The language for content autogenerated by Sphinx. Refer to documentation
 # for a list of supported languages.

setup.py

Lines changed: 1 addition & 2 deletions
@@ -5,14 +5,13 @@
 
 
 module = Extension('confluent_kafka.cimpl',
-                   include_dirs = ['/usr/local/include'],
                    libraries= ['rdkafka'],
                    sources=['confluent_kafka/src/confluent_kafka.c',
                             'confluent_kafka/src/Producer.c',
                             'confluent_kafka/src/Consumer.c'])
 
 setup(name='confluent-kafka',
-      version='0.9.1.1',
+      version='0.9.1.2',
       description='Confluent\'s Apache Kafka client for Python',
       author='Confluent Inc',
       author_email='support@confluent.io',
