This project is a hands-on learning environment for Apache Kafka. It is designed to help you understand and experiment with Kafka and its ecosystem. As part of this project, you will also create a demo application featuring a Kafka producer and consumer to illustrate real-world usage.
This environment provides a ready-to-use Apache Kafka setup with Schema Registry, ksqlDB, Kafka Connect, and a modern Kafka UI, all orchestrated via Docker Compose. It is ideal for development, testing, and learning purposes.
- Kafka: Distributed event streaming platform
- Schema Registry: Manages Avro/JSON/Protobuf schemas for Kafka topics
- ksqlDB: Streaming SQL engine for Apache Kafka
- Kafka Connect: Integration framework for connecting Kafka with external systems
- Kafka UI: Web UI for managing Kafka topics, schemas, and more
- Docker (v20+ recommended)
- Docker Compose (v2+ recommended)
- Clone this repository:

  ```sh
  git clone <your-repo-url>
  cd hands-on-kafka
  ```
- Start the environment:

  ```sh
  docker-compose up -d
  ```

  This starts all services in the background.
- Access the Kafka UI:
  - Open your browser and go to http://localhost:8080
  - You can view topics, consumers, schemas, and more.
- Access the Schema Registry API:
  - Schema Registry REST endpoint: http://localhost:8081
  - Example: list all registered subjects:

    ```sh
    curl http://localhost:8081/subjects
    ```
- Access ksqlDB:
  - ksqlDB server endpoint: http://localhost:8088
  - Use the ksqlDB CLI container for interactive queries:

    ```sh
    docker-compose exec ksqldb-cli ksql http://ksqldb-server:8088
    ```
- Access Kafka Connect:
  - Kafka Connect REST endpoint: http://localhost:8083

To stop all services:

```sh
docker-compose down
```
Register the schema with Schema Registry:

```sh
curl -X POST http://localhost:8081/subjects/users-value/versions \
  -H "Content-Type: application/vnd.schemaregistry.v1+json" \
  -d '{"schema": "{\"type\":\"record\",\"name\":\"User\",\"namespace\":\"com.example\",\"fields\":[{\"name\":\"id\",\"type\":\"int\"},{\"name\":\"name\",\"type\":\"string\"},{\"name\":\"email\",\"type\":\"string\"}]}"}'
```
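Hand-escaping the embedded Avro JSON is error-prone. As a sketch, the same request body can be built with `jq` (already used elsewhere in this setup), letting it do the escaping; the schema definition matches the `User` record above:

```sh
# Build the Schema Registry request body with jq instead of hand-escaping.
# --arg embeds the Avro definition as the string-valued "schema" field.
jq -n --arg schema '{"type":"record","name":"User","namespace":"com.example","fields":[{"name":"id","type":"int"},{"name":"name","type":"string"},{"name":"email","type":"string"}]}' '{schema: $schema}'
```

The output can be piped straight into the curl call above via `-d @-`.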
- All services are networked together on Docker's user-defined bridge network (`kafka-net`).
- Default ports are exposed for local development. Change them in `docker-compose.yml` if needed.
- For advanced configuration, refer to the official documentation of each component.
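For example, remapping a host port can be done in a `docker-compose.override.yml` along these lines. The service name `kafka-ui` is an assumption for illustration — check your `docker-compose.yml` for the actual names:

```yaml
# docker-compose.override.yml (hypothetical service name "kafka-ui")
services:
  kafka-ui:
    ports:
      - "9090:8080"   # expose the UI on host port 9090 instead of 8080
```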
Enjoy your hands-on Kafka environment!
Create the `transactions` topic:

```sh
docker-compose exec kafka kafka-topics --create --topic transactions \
  --bootstrap-server kafka:9092 --partitions 1 --replication-factor 1
```
Schema (`schemas/transaction.proto`):

```proto
syntax = "proto3";

package com.ortisan.kafkahandson;

import "google/protobuf/timestamp.proto";

message Transaction {
  string id = 1;
  string userId = 2;
  double amount = 3;
  string currency = 4;
  google.protobuf.Timestamp timestamp = 5;
}
```
Register on Schema Registry:

```sh
curl -X POST http://localhost:8081/subjects/transactions-value/versions \
  -H "Content-Type: application/vnd.schemaregistry.v1+json" \
  -d @- <<EOF
{
  "schemaType": "PROTOBUF",
  "schema": $(jq -Rs . < schemas/transaction.proto)
}
EOF
```
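The `jq -Rs .` step reads the raw `.proto` file and emits it as a single JSON-encoded string, which is the form the `schema` field expects. A minimal illustration:

```sh
# -R: read raw text (not JSON); -s: slurp the whole input into one value.
# The result is the input escaped as a JSON string literal.
printf 'syntax = "proto3";' | jq -Rs .
# -> "syntax = \"proto3\";"
```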
Generate TypeScript code from the schema (the `protoc-gen-es` plugin is resolved from the producer's `node_modules`):

```sh
PATH=$PATH:$(pwd)/kafka-producer/node_modules/.bin \
protoc -I . \
  --es_out schemas/gen/ \
  --es_opt target=ts \
  schemas/transaction.proto
```