Commit b3ede8d

Added documentation with an example of FIPS compliant communication with Kafka cluster.

1) Added documentation with an example of FIPS compliant communication with Kafka cluster. (confluentinc#1582) 2) Fixed wrong error code parameter name in KafkaError.

1 parent 96d48e2 commit b3ede8d

File tree

11 files changed: +372 −2 lines changed

CHANGELOG.md

Lines changed: 2 additions & 0 deletions

@@ -8,6 +8,8 @@ v2.2.0 is a feature release with the following features, fixes and enhancements:
   IncrementalAlterConfigs API (#1517).
 - [KIP-554](https://cwiki.apache.org/confluence/display/KAFKA/KIP-554%3A+Add+Broker-side+SCRAM+Config+API):
   User SASL/SCRAM credentials alteration and description (#1575).
+- Added documentation with an example of FIPS compliant communication with Kafka cluster.
+- Fixed wrong error code parameter name in KafkaError.
 
 confluent-kafka-python is based on librdkafka v2.2.0, see the
 [librdkafka release notes](https://github.com/confluentinc/librdkafka/releases/tag/v2.2.0)

examples/fips/README.md

Lines changed: 90 additions & 0 deletions

# FIPS Compliance

We tested FIPS compliance for the client using OpenSSL 3.0. To use the client in FIPS-compliant mode, use OpenSSL 3.0. Older versions of OpenSSL have not been verified (although they may work).
4+
5+
## Communication between client and Kafka cluster
6+
7+
### Installing client using OpenSSL and librdkafka bundled in wheels
8+
9+
If you install this client through prebuilt wheels using `pip install confluent_kafka`, OpenSSL 3.0 is already statically linked with the librdkafka shared library. To enable this client to communicate with the Kafka cluster using the OpenSSL FIPS provider and FIPS-approved algorithms, you must enable the FIPS provider. You can find steps to enable the FIPS provider in section [Enabling FIPS provider](#enabling-fips-provider).
10+
11+
You should follow the same above steps if you install this client from the source using `pip install confluent_kafka --no-binary :all:` with prebuilt librdkafka in which OpenSSL is statically linked

### Installing client using system OpenSSL with librdkafka shared library

When you build librdkafka from source, it dynamically links to the OpenSSL present on the system (unless static linking is explicitly requested at build time). If the installed OpenSSL is already working in FIPS mode, you can jump directly to the section [Client configuration to enable FIPS provider](#client-configuration-to-enable-fips-provider) and enable the `fips` provider. If you don't have OpenSSL working in FIPS mode, use the steps in the section [Enabling FIPS provider](#enabling-fips-provider) to make the OpenSSL on your system FIPS compliant, and then enable the `fips` provider. Once OpenSSL is working in FIPS mode and the `fips` provider is enabled, librdkafka and the Python client will use FIPS-approved algorithms for communication between the client and the Kafka cluster, whether through the producer, consumer, or admin client.

### Enabling FIPS provider

To enable the FIPS provider, you must have the FIPS module available on your system, plug the module into OpenSSL, and then configure OpenSSL to use the module.

You can plug the FIPS provider into OpenSSL in two ways: 1) put the module in the default module folder of OpenSSL, or 2) point to the module with the environment variable `OPENSSL_MODULES`. For example: `OPENSSL_MODULES="/path/to/fips/module/lib/folder/"`

You configure OpenSSL to use the FIPS provider through the FIPS configuration in the OpenSSL config. You can 1) modify the default configuration file to include the FIPS-related config, or 2) create a new configuration file and point to it using the environment variable `OPENSSL_CONF`. For example: `OPENSSL_CONF="/path/to/fips/enabled/openssl/config/openssl.cnf"` For an example of an OpenSSL config, see below.

**NOTE:** You need to specify both the `OPENSSL_MODULES` and `OPENSSL_CONF` environment variables when installing the client from prebuilt wheels or when OpenSSL is statically linked to librdkafka.
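As an illustration of the note above, the two variables can be exported in the shell before launching the client, or, as a sketch with placeholder paths, set from Python before the first client object is created (OpenSSL reads them when it is first initialized by librdkafka):

```python
import os

# Placeholder paths -- point these at your actual FIPS module directory and
# FIPS-enabled OpenSSL config. They must be set before the first
# Producer/Consumer/AdminClient is created, because OpenSSL reads the
# environment when it is first loaded.
os.environ["OPENSSL_MODULES"] = "/usr/local/lib/ossl-modules"
os.environ["OPENSSL_CONF"] = "/usr/local/ssl/openssl.cnf"

# from confluent_kafka import Producer
# p = Producer({'bootstrap.servers': 'localhost:9092', 'ssl.providers': 'fips,base'})
```

Exporting the variables in the shell is the safer option, since it guarantees they are visible to OpenSSL regardless of import order.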

#### Steps to build FIPS provider module

You can find steps to generate the FIPS provider module in the [README-FIPS doc](https://github.com/openssl/openssl/blob/openssl-3.0.8/README-FIPS.md).

In short, you need to perform the following steps:

1) Clone OpenSSL from the [OpenSSL GitHub repo](https://github.com/openssl/openssl)
2) Check out the correct version. (v3.0.8 is the current FIPS-compliant version for OpenSSL 3.0 at the time of writing this doc.)
3) Run `./Configure enable-fips`
4) Run `make install_fips`

After the last step, two files are generated:

* The FIPS module (`fips.dylib` on macOS, `fips.so` on Linux, and `fips.dll` on Windows)
* The FIPS config file (`fipsmodule.cnf`), which needs to be used with OpenSSL

#### Referencing FIPS provider in OpenSSL

As mentioned earlier, you can dynamically plug the FIPS module built above into OpenSSL by putting the FIPS module into the default OpenSSL module folder (only when installing from source). For default locations of OpenSSL on various operating systems, see the SSL section of the [Introduction to librdkafka - the Apache Kafka C/C++ client library](https://github.com/confluentinc/librdkafka/blob/master/INTRODUCTION.md#ssl). It should look something like `...lib/ossl-modules/`.

You can also point to this module with the environment variable `OPENSSL_MODULES`. For example: `OPENSSL_MODULES="/path/to/fips/module/lib/folder/"`

#### Linking FIPS provider with OpenSSL

To enable FIPS in OpenSSL, you must include `fipsmodule.cnf` in the `openssl.cnf` file. The `fipsmodule.cnf` file includes `fips_sect`, which OpenSSL requires to enable FIPS. See the example below.

Some algorithms might have different implementations in the FIPS and other providers. If you load two different providers, such as default and fips, either implementation could be used. To make sure you fetch only the FIPS-compliant version of an algorithm, set the `fips=yes` default property in the config file.

After applying the above changes, the OpenSSL config should look something like this:
```
config_diagnostics = 1
openssl_conf = openssl_init

.include /usr/local/ssl/fipsmodule.cnf

[openssl_init]
providers = provider_sect
alg_section = algorithm_sect

[provider_sect]
fips = fips_sect

[algorithm_sect]
default_properties = fips=yes
.
.
.
```

### Client configuration to enable FIPS provider

OpenSSL requires some non-crypto algorithms as well. These algorithms are not included in the FIPS provider, so you need to use the `base` provider in conjunction with the `fips` provider. The base provider comes with OpenSSL by default. You must enable the `base` provider in the client configuration.

To make a client (consumer, producer, or admin client) FIPS compliant, you must enable only the `fips` and `base` providers in the client using the `ssl.providers` configuration property, i.e. `'ssl.providers': 'fips,base'`.
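Put together, a minimal client configuration enabling only these two providers might look like the sketch below. The broker address, credentials, and certificate paths are placeholders; the full working configurations are in `examples/fips/fips_producer.py` and `examples/fips/fips_consumer.py`.

```python
# Sketch: FIPS-oriented client configuration; all values are placeholders.
conf = {
    'bootstrap.servers': 'localhost:9092',
    'security.protocol': 'SASL_SSL',
    'sasl.mechanism': 'PLAIN',
    'sasl.username': 'client',
    'sasl.password': 'client-secret',
    # PKCS#12 keystores are not FIPS compliant, so reference the key and
    # certificate as separate PEM files instead of using ssl.keystore.location.
    'ssl.key.location': '/path/to/client.key',
    'ssl.certificate.location': '/path/to/client.crt',
    'ssl.ca.location': '/path/to/ca-root.crt',
    # Load only the FIPS provider plus the non-crypto base provider.
    'ssl.providers': 'fips,base',
}

# from confluent_kafka import Producer
# p = Producer(conf)  # provider loading fails here if OpenSSL cannot find them
```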

## Communication between client and Schema Registry

This part is not tested for FIPS compliance right now.

## References

* [Generating FIPS module and config file](https://github.com/openssl/openssl/blob/openssl-3.0.8/README-FIPS.md)
* [How to use FIPS Module](https://www.openssl.org/docs/man3.0/man7/fips_module.html)
* [librdkafka SSL Information](https://github.com/confluentinc/librdkafka/blob/master/INTRODUCTION.md#ssl)

examples/fips/docker/README.md

Lines changed: 5 additions & 0 deletions

Use `generate_certificates.sh` in the secrets folder to generate the certificates.
Bring up the server using `docker-compose up`.
Use the example producer and consumer to test FIPS compliance. Note that you might need to point to the FIPS module and the FIPS-enabled OpenSSL 3.0 config using environment variables, e.g. `OPENSSL_CONF="/path/to/fips/enabled/openssl/config/openssl.cnf" OPENSSL_MODULES="/path/to/fips/module/lib/folder/" ./examples/fips/fips_producer.py localhost:9092 test-topic`

Uncomment `KAFKA_SSL_CIPHER.SUITES: TLS_CHACHA20_POLY1305_SHA256` in `docker-compose.yml` to enable a non-FIPS-compliant algorithm. Use this to verify that only FIPS-compliant algorithms are used.
Lines changed: 51 additions & 0 deletions

version: "3.9"
services:

  zookeeper:
    hostname: zookeeper
    container_name: zookeeper
    restart: always
    image: confluentinc/cp-zookeeper:7.4.0
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
      KAFKA_OPTS: -Djava.security.auth.login.config=/etc/kafka/secrets/zookeeper_jaas.conf
        -Dzookeeper.authProvider.1=org.apache.zookeeper.server.auth.SASLAuthenticationProvider
        -DrequireClientAuthScheme=sasl
        -Dzookeeper.allowSaslFailedClients=false
    volumes:
      - ./secrets:/etc/kafka/secrets

  broker:
    image: confluentinc/cp-kafka:7.4.0
    hostname: broker
    container_name: broker
    restart: always
    ports:
      - 29092:29092
      - 9092:9092
    volumes:
      - ./secrets:/etc/kafka/secrets
    environment:
      KAFKA_ZOOKEEPER_CONNECT: 'zookeeper:2181'
      KAFKA_INTER_BROKER_LISTENER_NAME: SASL_SSL
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: SASL_SSL:SASL_SSL,SASL_SSL_HOST:SASL_SSL
      KAFKA_ADVERTISED_LISTENERS: SASL_SSL://localhost:9092,SASL_SSL_HOST://broker:29092
      KAFKA_LISTENERS: SASL_SSL://:9092,SASL_SSL_HOST://:29092
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
      CONFLUENT_METRICS_REPORTER_SECURITY_PROTOCOL: SASL_SSL
      CONFLUENT_METRICS_REPORTER_SASL_JAAS_CONFIG: "org.apache.kafka.common.security.plain.PlainLoginModule required \
        username=\"client\" \
        password=\"client-secret\";"
      KAFKA_SASL_ENABLED_MECHANISMS: PLAIN
      KAFKA_SASL_MECHANISM_INTER_BROKER_PROTOCOL: PLAIN
      KAFKA_SSL_KEYSTORE_FILENAME: server.keystore.jks
      KAFKA_SSL_KEYSTORE_CREDENTIALS: creds
      KAFKA_SSL_KEY_CREDENTIALS: creds
      KAFKA_SSL_TRUSTSTORE_FILENAME: server.truststore.jks
      KAFKA_SSL_TRUSTSTORE_CREDENTIALS: creds
      # KAFKA_SSL_CIPHER.SUITES: TLS_CHACHA20_POLY1305_SHA256 # FIPS non-compliant algo.
      # enables 2-way authentication
      KAFKA_SSL_CLIENT_AUTH: "required"
      KAFKA_SSL_ENDPOINT_IDENTIFICATION_ALGORITHM: ""
      KAFKA_OPTS: -Djava.security.auth.login.config=/etc/kafka/secrets/broker_jaas.conf
      KAFKA_SSL_PRINCIPAL_MAPPING_RULES: RULE:^CN=(.*?),OU=TEST.*$$/$$1/,DEFAULT
Lines changed: 17 additions & 0 deletions

KafkaServer {
    org.apache.kafka.common.security.plain.PlainLoginModule required
    username="broker"
    password="broker-secret"
    user_broker="broker-secret"
    user_client="client-secret"
    user_schema-registry="schema-registry-secret"
    user_restproxy="restproxy-secret"
    user_clientrestproxy="clientrestproxy-secret"
    user_badclient="badclient-secret";
};

Client {
    org.apache.zookeeper.server.auth.DigestLoginModule required
    username="kafka"
    password="kafkasecret";
};
Lines changed: 33 additions & 0 deletions

#!/bin/bash
set -e
server_hostname=${server_hostname:-localhost}
client_hostname=${client_hostname:-localhost}
cert_password=${cert_password:-111111}
openssl=${openssl:-openssl}
keytool=${keytool:-keytool}
echo openssl version
$openssl version
echo creates server keystore
$keytool -keystore server.keystore.jks -storepass $cert_password -alias ${server_hostname} -validity 365 -genkey -keyalg RSA -dname "cn=$server_hostname"
echo creates root CA
$openssl req -nodes -new -x509 -keyout ca-root.key -out ca-root.crt -days 365 -subj "/C=US/ST=CA/L=MV/O=CFLT/CN=CFLT"
echo creates CSR
$keytool -keystore server.keystore.jks -alias ${server_hostname} -certreq -file ${server_hostname}_server.csr -storepass $cert_password
echo sign CSR
$openssl x509 -req -CA ca-root.crt -CAkey ca-root.key -in ${server_hostname}_server.csr -out ${server_hostname}_server.crt -days 365 -CAcreateserial
echo import root CA
$keytool -keystore server.keystore.jks -alias CARoot -import -noprompt -file ca-root.crt -storepass $cert_password
echo import server certificate
$keytool -keystore server.keystore.jks -alias ${server_hostname} -import -file ${server_hostname}_server.crt -storepass $cert_password
echo create client CSR
$openssl req -newkey rsa:2048 -nodes -keyout ${client_hostname}_client.key -out ${client_hostname}_client.csr -subj "/C=US/ST=CA/L=MV/O=CFLT/CN=CFLT" -passin pass:$cert_password
echo sign client CSR
$openssl x509 -req -CA ca-root.crt -CAkey ca-root.key -in ${client_hostname}_client.csr -out ${client_hostname}_client.crt -days 365 -CAcreateserial
echo create client keystore
$openssl pkcs12 -export -in ${client_hostname}_client.crt -inkey ${client_hostname}_client.key -name ${client_hostname} -out client.keystore.p12 -passin pass:$cert_password \
    -passout pass:$cert_password
echo create truststore
$keytool -noprompt -keystore server.truststore.jks -alias CARoot -import -file ca-root.crt -storepass $cert_password
echo create creds file
echo "$cert_password" > ./creds
echo verify with: openssl pkcs12 -info -nodes -in client.keystore.p12
Lines changed: 5 additions & 0 deletions

Server {
    org.apache.zookeeper.server.auth.DigestLoginModule required
    user_super="adminsecret"
    user_kafka="kafkasecret";
};

examples/fips/fips_consumer.py

Lines changed: 84 additions & 0 deletions

#!/usr/bin/env python
#
# Copyright 2023 Confluent Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#

#
# Example FIPS Compliant Consumer
#
from confluent_kafka import Consumer, KafkaException
import sys


def print_usage_and_exit(program_name):
    sys.stderr.write('Usage: %s [options..] <bootstrap-brokers> <group> <topic1> <topic2> ..\n' % program_name)
    sys.exit(1)


if __name__ == '__main__':
    if len(sys.argv) < 4:
        print_usage_and_exit(sys.argv[0])

    broker = sys.argv[1]
    group = sys.argv[2]
    topics = sys.argv[3:]
    conf = {'bootstrap.servers': broker,
            'group.id': group,
            'auto.offset.reset': 'earliest',
            'security.protocol': 'SASL_SSL',
            'sasl.mechanism': 'PLAIN',
            'sasl.username': 'broker',
            'sasl.password': 'broker-secret',
            # PKCS#12 keystores are not FIPS compliant and hence you will need
            # to use paths to the key and certificate separately in FIPS mode
            # 'ssl.keystore.location': './docker/secrets/client.keystore.p12',
            # 'ssl.keystore.password': '111111',
            'ssl.key.location': './docker/secrets/localhost_client.key',
            'ssl.key.password': '111111',
            'ssl.certificate.location': './docker/secrets/localhost_client.crt',
            'ssl.ca.location': './docker/secrets/ca-root.crt',
            'ssl.providers': 'fips,base'
            }

    # Create Consumer instance
    c = Consumer(conf)

    def print_assignment(consumer, partitions):
        print('Assignment:', partitions)

    # Subscribe to topics
    c.subscribe(topics, on_assign=print_assignment)

    # Read messages from Kafka, print to stdout
    try:
        while True:
            msg = c.poll(timeout=1.0)
            if msg is None:
                continue
            if msg.error():
                raise KafkaException(msg.error())
            else:
                # Proper message
                sys.stderr.write('%% %s [%d] at offset %d with key %s:\n' %
                                 (msg.topic(), msg.partition(), msg.offset(),
                                  str(msg.key())))
                print(msg.value())

    except KeyboardInterrupt:
        sys.stderr.write('%% Aborted by user\n')

    finally:
        # Close down consumer to commit final offsets.
        c.close()
examples/fips/fips_producer.py

Lines changed: 83 additions & 0 deletions

#!/usr/bin/env python
#
# Copyright 2023 Confluent Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#

#
# Example Kafka FIPS Compliant Producer.
# Reads lines from stdin and sends to Kafka.
#

from confluent_kafka import Producer
import sys

if __name__ == '__main__':
    if len(sys.argv) != 3:
        sys.stderr.write('Usage: %s <bootstrap-brokers> <topic>\n' % sys.argv[0])
        sys.exit(1)

    broker = sys.argv[1]
    topic = sys.argv[2]

    # Producer configuration
    # See https://github.com/confluentinc/librdkafka/blob/master/CONFIGURATION.md
    conf = {'bootstrap.servers': broker,
            'security.protocol': 'SASL_SSL',
            'sasl.mechanism': 'PLAIN',
            'sasl.username': 'broker',
            'sasl.password': 'broker-secret',
            # PKCS#12 keystores are not FIPS compliant and hence you will need
            # to use paths to the key and certificate separately in FIPS mode
            # 'ssl.keystore.location': './docker/secrets/client.keystore.p12',
            # 'ssl.keystore.password': '111111',
            'ssl.key.location': './docker/secrets/localhost_client.key',
            'ssl.key.password': '111111',
            'ssl.certificate.location': './docker/secrets/localhost_client.crt',
            'ssl.ca.location': './docker/secrets/ca-root.crt',
            'ssl.providers': 'fips,base'
            }

    # Create Producer instance
    p = Producer(**conf)

    # Optional per-message delivery callback (triggered by poll() or flush())
    # when a message has been successfully delivered or permanently
    # failed delivery (after retries).
    def delivery_callback(err, msg):
        if err:
            sys.stderr.write('%% Message failed delivery: %s\n' % err)
        else:
            sys.stderr.write('%% Message delivered to %s [%d] @ %d\n' %
                             (msg.topic(), msg.partition(), msg.offset()))

    # Read lines from stdin, produce each line to Kafka
    for line in sys.stdin:
        try:
            # Produce line (without newline)
            p.produce(topic, line.rstrip(), callback=delivery_callback)

        except BufferError:
            sys.stderr.write('%% Local producer queue is full (%d messages awaiting delivery): try again\n' %
                             len(p))

        # Serve delivery callback queue.
        # NOTE: Since produce() is an asynchronous API this poll() call
        # will most likely not serve the delivery callback for the
        # last produce()d message.
        p.poll(0)

    # Wait until all messages have been delivered
    sys.stderr.write('%% Waiting for %d deliveries\n' % len(p))
    p.flush()

src/confluent_kafka/src/confluent_kafka.c

Lines changed: 1 addition & 1 deletion

@@ -310,7 +310,7 @@ static PyTypeObject KafkaErrorType = {
     " - Exceptions\n"
     "\n"
     "Args:\n"
-    "  error_code (KafkaError): Error code indicating the type of error.\n"
+    "  error (KafkaError): Error code indicating the type of error.\n"
     "\n"
     "  reason (str): Alternative message to describe the error.\n"
     "\n"

0 commit comments
