Commit bbf61f7

docs: reinstate class members and Avro, improve docs, fix doc warnings.
1 parent 82f32ac commit bbf61f7

6 files changed: +81 -61 lines changed

confluent_kafka/admin/__init__.py

Lines changed: 8 additions & 5 deletions
@@ -184,8 +184,8 @@ def set_config(self, name, value, overwrite=True):

 class AdminClient (_AdminClientImpl):
     """
-    The Kafka AdminClient provides admin operations for Kafka brokers,
-    topics, groups, and other resource types supported by the broker.
+    AdminClient provides admin operations for Kafka brokers, topics, groups,
+    and other resource types supported by the broker.

     The Admin API methods are asynchronous and returns a dict of
     concurrent.futures.Future objects keyed by the entity.

@@ -519,10 +519,13 @@ class TopicMetadata (object):

     This class is typically not user instantiated.

-    :ivar str topic: Topic name.
+    :ivar str -topic: Topic name.
     :ivar dict partitions: Map of partitions indexed by partition id. Value is PartitionMetadata object.
-    :ivar KafkaError error: Topic error, or None. Value is a KafkaError object.
+    :ivar KafkaError -error: Topic error, or None. Value is a KafkaError object.
     """
+    # The dash in "-topic" and "-error" is needed to circumvent a
+    # Sphinx issue where it tries to reference the same instance variable
+    # on other classes which raises a warning/error.
     def __init__(self):
         self.topic = None
         self.partitions = {}

@@ -548,7 +551,7 @@ class PartitionMetadata (object):
     :ivar int leader: Current leader broker for this partition, or -1.
     :ivar list(int) replicas: List of replica broker ids for this partition.
     :ivar list(int) isrs: List of in-sync-replica broker ids for this partition.
-    :ivar KafkaError error: Partition error, or None. Value is a KafkaError object.
+    :ivar KafkaError -error: Partition error, or None. Value is a KafkaError object.

     :warning: Depending on cluster state the broker ids referenced in
         leader, replicas and isrs may temporarily not be reported
confluent_kafka/src/Admin.c

Lines changed: 1 addition & 1 deletion
@@ -1540,7 +1540,7 @@ PyTypeObject AdminType = {
 "\n"
 ".. py:function:: Admin(**kwargs)\n"
 "\n"
-" Create new AdminClient instance using provided configuration dict.\n"
+" Create a new AdminClient instance using the provided configuration dict.\n"
 "\n"
 "This class should not be used directly, use confluent_kafka.AdminClient\n."
 "\n"

confluent_kafka/src/Consumer.c

Lines changed: 7 additions & 18 deletions
@@ -1421,27 +1421,16 @@ PyTypeObject ConsumerType = {
 0, /*tp_as_buffer*/
 Py_TPFLAGS_DEFAULT | Py_TPFLAGS_BASETYPE |
 Py_TPFLAGS_HAVE_GC, /*tp_flags*/
-"High-level Kafka Consumer\n"
+"A high-level Apache Kafka Consumer\n"
 "\n"
 ".. py:function:: Consumer(config)\n"
 "\n"
-" :param dict config: Configuration properties. "
-"At a minimum ``group.id`` **must** be set,"
-" ``bootstrap.servers`` **should** be set."
-"\n"
-"Create new Consumer instance using provided configuration dict.\n"
-"\n"
-" Special configuration properties:\n"
-" ``on_commit``: Optional callback will be called when a commit "
-"request has succeeded or failed.\n"
-"\n"
-"\n"
-".. py:function:: on_commit(err, partitions)\n"
-"\n"
-" :param KafkaError err: Commit error object, or None on success.\n"
-" :param list(TopicPartition) partitions: List of partitions with "
-"their committed offsets or per-partition errors.\n"
-"\n"
+"Create a new Consumer instance using the provided configuration *dict* ("
+"including properties and callback functions). "
+"See :ref:`pythonclient_configuration` for more information."
+"\n\n"
+":param dict config: Configuration properties. At a minimum "
+"``group.id`` **must** be set, ``bootstrap.servers`` **should** be set."
 "\n", /*tp_doc*/
 (traverseproc)Consumer_traverse, /* tp_traverse */
 (inquiry)Consumer_clear, /* tp_clear */
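
The reworked Consumer docstring points to the configuration section for the full set of properties and callbacks. A minimal consume loop, assuming a local broker and a topic named 'example-topic' (both illustrative):

    from confluent_kafka import Consumer

    consumer = Consumer({
        'bootstrap.servers': 'localhost:9092',  # illustrative broker address
        'group.id': 'example-group',            # group.id must be set
        'auto.offset.reset': 'earliest',
    })
    consumer.subscribe(['example-topic'])

    try:
        while True:
            msg = consumer.poll(1.0)  # serve callbacks and fetch at most one message
            if msg is None:
                continue
            if msg.error():
                print("Consumer error: {}".format(msg.error()))
                continue
            print("Received: {}".format(msg.value()))
    finally:
        consumer.close()  # commit final offsets and leave the group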

confluent_kafka/src/Producer.c

Lines changed: 1 addition & 1 deletion
@@ -552,7 +552,7 @@ PyTypeObject ProducerType = {
 "\n"
 " :param dict config: Configuration properties. At a minimum ``bootstrap.servers`` **should** be set\n"
 "\n"
-" Create new Producer instance using provided configuration dict.\n"
+" Create a new Producer instance using the provided configuration dict.\n"
 "\n"
 "\n"
 ".. py:function:: len()\n"

confluent_kafka/src/confluent_kafka.c

Lines changed: 10 additions & 10 deletions
@@ -486,21 +486,21 @@ static PyMethodDef Message_methods[] = {
 },
 { "timestamp", (PyCFunction)Message_timestamp, METH_NOARGS,
 "Retrieve timestamp type and timestamp from message.\n"
-"The timestamp type is one of:\n"
-" * :py:const:`TIMESTAMP_NOT_AVAILABLE`"
-" - Timestamps not supported by broker\n"
-" * :py:const:`TIMESTAMP_CREATE_TIME` "
-" - Message creation time (or source / producer time)\n"
-" * :py:const:`TIMESTAMP_LOG_APPEND_TIME` "
-" - Broker receive time\n"
+"The timestamp type is one of:\n\n"
+" * :py:const:`TIMESTAMP_NOT_AVAILABLE` "
+"- Timestamps not supported by broker.\n"
+" * :py:const:`TIMESTAMP_CREATE_TIME` "
+"- Message creation time (or source / producer time).\n"
+" * :py:const:`TIMESTAMP_LOG_APPEND_TIME` "
+"- Broker receive time.\n"
 "\n"
-"The returned timestamp should be ignored if the timestamp type is "
+" The returned timestamp should be ignored if the timestamp type is "
 ":py:const:`TIMESTAMP_NOT_AVAILABLE`.\n"
 "\n"
 " The timestamp is the number of milliseconds since the epoch (UTC).\n"
 "\n"
-" Timestamps require broker version 0.10.0.0 or later and \n"
-" ``{'api.version.request': True}`` configured on the client.\n"
+" Timestamps require broker version 0.10.0.0 or later and "
+"``{'api.version.request': True}`` configured on the client.\n"
 "\n"
 " :returns: tuple of message timestamp type, and timestamp.\n"
 " :rtype: (int, int)\n"

docs/index.rst

Lines changed: 54 additions & 26 deletions
@@ -1,79 +1,98 @@
+The confluent_kafka API
+=======================

-Welcome to Confluent's Python client for Apache Kafka documentation
-===================================================================
+A reliable, performant and feature rich Python client for Apache Kafka v0.8 and above.

-Indices and tables
-==================
+Clients
+   - :ref:`Consumer <pythonclient_consumer>`
+   - :ref:`Producer <pythonclient_producer>`
+   - :ref:`AdminClient <pythonclient_adminclient>`
+
+
+Supporting classes
+   - :ref:`Message <pythonclient_message>`
+   - :ref:`TopicPartition <pythonclient_topicpartition>`
+   - :ref:`KafkaError <pythonclient_kafkaerror>`
+   - :ref:`KafkaException <pythonclient_kafkaexception>`
+   - :ref:`ThrottleEvent <pythonclient_throttleevent>`
+   - :ref:`Avro <pythonclient_avro>`

-* :ref:`genindex`

-:mod:`confluent_kafka` --- Confluent's Python client for Apache Kafka
-*********************************************************************
+:ref:`genindex`

-.. automodule:: confluent_kafka
-   :synopsis: Confluent's Python client for Apache Kafka.

+.. _pythonclient_consumer:

-********
 Consumer
-********
+========

 .. autoclass:: confluent_kafka.Consumer
+   :members:

+.. _pythonclient_producer:

-********
 Producer
-********
+========

 .. autoclass:: confluent_kafka.Producer
+   :members:

+.. _pythonclient_adminclient:

-*****
-Admin
-*****
+AdminClient
+===========

 .. automodule:: confluent_kafka.admin
+   :members:

+.. _pythonclient_avro:

-.. autoclass:: confluent_kafka.admin.NewTopic
-.. autoclass:: confluent_kafka.admin.NewPartitions
-
-****
 Avro
-****
+====

 .. automodule:: confluent_kafka.avro
+   :members:

+Supporting Classes
+==================

-.. autoclass:: confluent_kafka.avro.CachedSchemaRegistryClient
-
+.. _pythonclient_message:

 *******
 Message
 *******

 .. autoclass:: confluent_kafka.Message
+   :members:

+.. _pythonclient_topicpartition:

 **************
 TopicPartition
 **************

 .. autoclass:: confluent_kafka.TopicPartition
+   :members:


+.. _pythonclient_kafkaerror:
+
 **********
 KafkaError
 **********

 .. autoclass:: confluent_kafka.KafkaError
+   :members:
+

+.. _pythonclient_kafkaexception:

 **************
 KafkaException
 **************

 .. autoclass:: confluent_kafka.KafkaException
+   :members:


 ******
@@ -87,16 +106,22 @@ Logical offset constants:
 * :py:const:`OFFSET_STORED` - Use stored/committed offset
 * :py:const:`OFFSET_INVALID` - Invalid/Default offset

+
+.. _pythonclient_throttleevent:
+
 *************
 ThrottleEvent
 *************

 .. autoclass:: confluent_kafka.ThrottleEvent
+   :members:


+.. _pythonclient_configuration:

 Configuration
 =============
+
 Configuration of producer and consumer instances is performed by
 providing a dict of configuration properties to the instance constructor, e.g.::

@@ -117,7 +142,7 @@ The Python bindings also provide some additional configuration properties:
   properties that are applied to all used topics for the instance. **DEPRECATED:**
   topic configuration should now be specified in the global top-level configuration.

-* ``error_cb(kafka.KafkaError)``: Callback for generic/global error events. This callback is served upon calling
+* ``error_cb(kafka.KafkaError)``: Callback for generic/global error events, these errors are typically to be considered informational since the client will automatically try to recover. This callback is served upon calling
   ``client.poll()`` or ``producer.flush()``.

 * ``throttle_cb(confluent_kafka.ThrottleEvent)``: Callback for throttled request reporting.
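
The ``error_cb`` and ``throttle_cb`` properties described above are passed in the configuration dict like any other property. A brief sketch, assuming a local broker (the callback bodies are illustrative):

    from confluent_kafka import Consumer

    def on_error(err):
        # Typically informational; the client tries to recover automatically.
        print("Client error: {}".format(err))

    def on_throttle(ev):
        # ev is a confluent_kafka.ThrottleEvent describing the throttled request.
        print("Request throttled: {}".format(ev))

    consumer = Consumer({
        'bootstrap.servers': 'localhost:9092',  # illustrative
        'group.id': 'example-group',
        'error_cb': on_error,
        'throttle_cb': on_throttle,
    })
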
@@ -138,8 +163,11 @@ The Python bindings also provide some additional configuration properties:
   callback. The ``msg.headers()`` will return None even if the original message
   had headers set. This callback is served upon calling ``producer.poll()`` or ``producer.flush()``.

-* ``on_commit(kafka.KafkaError, list(kafka.TopicPartition))`` (**Consumer**): Callback used to indicate success or failure
-  of asynchronous and automatic commit requests. This callback is served upon calling ``consumer.poll()``. Is not triggered for synchronous commits.
+* ``on_commit(kafka.KafkaError, list(kafka.TopicPartition))`` (**Consumer**): Callback used to indicate
+  success or failure of asynchronous and automatic commit requests. This callback is served upon calling
+  ``consumer.poll()``. Is not triggered for synchronous commits. Callback arguments: *KafkaError* is the
+  commit error, or None on success. *list(TopicPartition)* is the list of partitions with their committed
+  offsets or per-partition errors.

 * ``logger=logging.Handler`` kwarg: forward logs from the Kafka client to the
   provided ``logging.Handler`` instance.
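
The expanded ``on_commit`` entry above spells out the callback arguments. A brief sketch of wiring it up (broker address and group name are illustrative):

    from confluent_kafka import Consumer

    def on_commit(err, partitions):
        # err: KafkaError, or None on success.
        # partitions: list(TopicPartition) with committed offsets or per-partition errors.
        if err is not None:
            print("Commit failed: {}".format(err))
        else:
            print("Committed: {}".format(partitions))

    consumer = Consumer({
        'bootstrap.servers': 'localhost:9092',  # illustrative
        'group.id': 'example-group',
        'enable.auto.commit': True,
        'on_commit': on_commit,
    })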
