How to choose a Kafka client library?
Most libraries support the producer and consumer APIs. The decision points usually come down to the following:
- Pure implementation or librdkafka-based: librdkafka is a C/C++ library that implements the Kafka protocol, and many client libraries are built as wrappers around it. This may or may not be acceptable to you; the main issues tend to arise when you build and deploy your software.
- Does it support your security mechanism? Virtually every library supports PLAINTEXT, but make sure it supports the mechanism your Kafka deployment actually needs: SSL and SASL (SASL/SCRAM, SASL/OAUTHBEARER, SASL/GSSAPI, SASL/PLAIN, or even external SASL mechanisms such as SASL IAM for MSK).
- Does it support the Confluent Schema Registry? The Schema Registry is a common component of Kafka deployments, so making sure your library ships the proper serializers and deserializers for it is very important. Confluent's own libraries usually integrate well with it. If it does support the Schema Registry, does it handle the format you chose: Avro, JSON, or Protobuf? (See the configuration sketch after this list.)
- Does it support the performance you need? It usually does, but run performance tests before committing to a full-blown implementation; it may save you time later.
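To make the security and Schema Registry questions concrete, here is a minimal configuration sketch using the official Java client together with Confluent's Avro serializer. The broker address, credentials, and Schema Registry URL are placeholders, and the property names are specific to the Java client and Confluent serializers; other libraries expose equivalent settings under different names.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;

public class SecureProducerConfig {
    public static KafkaProducer<String, Object> create() {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "broker-1.example.com:9093"); // placeholder

        // Security: TLS + SASL/SCRAM; verify that your library of choice exposes these settings
        props.put("security.protocol", "SASL_SSL");
        props.put("sasl.mechanism", "SCRAM-SHA-512");
        props.put("sasl.jaas.config",
            "org.apache.kafka.common.security.scram.ScramLoginModule required "
                + "username=\"my-user\" password=\"my-password\";"); // placeholder credentials

        // Schema Registry: Confluent's Avro serializer registers and fetches schemas for you
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
            "io.confluent.kafka.serializers.KafkaAvroSerializer");
        props.put("schema.registry.url", "https://schema-registry.example.com"); // placeholder

        return new KafkaProducer<>(props);
    }
}
```

If a library cannot express the equivalent of these few settings, that is usually a sign it will not fit your deployment.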
Kafka client libraries SDK list
Here is a compiled list of libraries, by language, to give you some pointers when starting your implementation:
Java
- The official client library: low-level client (see the minimal producer sketch after this list)
- The official Kafka Streams client library: to create your Kafka Streams application
- Kafka for Spring Boot: applies Spring concepts to Kafka development
- Spring Cloud Stream: bindings for Kafka Streams
- Akka Streams & Alpakka Kafka
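As a baseline to compare the other languages against, here is a minimal, hedged sketch of producing a single record with the official Java client; the broker address and topic name are placeholders.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class HelloProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder broker
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        // try-with-resources closes the producer, which flushes any buffered records
        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("demo-topic", "key-1", "hello kafka"),
                (metadata, exception) -> {
                    if (exception != null) {
                        exception.printStackTrace();
                    } else {
                        System.out.printf("Sent to %s-%d@%d%n",
                            metadata.topic(), metadata.partition(), metadata.offset());
                    }
                });
        }
    }
}
```

Most of the libraries below offer a near-identical produce/consume surface, so this is a useful yardstick when reading their documentation.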
Scala
- FS2 Kafka: Functional Kafka Producer and Consumer
- ZIO Kafka: Kafka Client for ZIO
- Kafka Streams Scala: the official kafka-streams library has had a Scala API since Kafka 2.0
- Alpakka Kafka
C/C++
- librdkafka: low-level implementation of the Kafka client; many higher-level language libraries are wrappers around it. Supports all kinds of security mechanisms and most KIPs
- CPP Kafka: based on librdkafka
- Modern CPP Kafka: based on librdkafka
Golang
- Confluent Kafka Go: wrapper around librdkafka, currently no Schema Registry support
- Schema Registry Client for Go: compatible with Confluent Kafka Go
- Segment’s Kafka Go: pure Go implementation of the Kafka Client, good support
- Franz Go: pure Go implementation, supports most KIPs
- others if you’re curious: Sarama, Goka
Python
- Confluent Kafka Python: based on librdkafka, includes an Admin client, Avro support with the Confluent Schema Registry
- Kafka Python: not actively maintained, use at your own risk
Javascript / Node.js
- KafkaJS: no external dependencies, good performance, support for Schema Registry
- Blizzard Node rdkafka: Node.js wrapper for librdkafka, low maintenance activity
.NET / C#
- Confluent Kafka DotNet: wrapper around librdkafka, full Schema Registry support (Avro, JSON, Protobuf)
Rust
- Rust rdkafka: Rust wrapper for librdkafka, good performance
- Rust Schema Registry Converter: compatible with Rust rdkafka
- Kafka Rust: pure Rust implementation, low maintenance activity
REST API
- Confluent REST Proxy: REST API for Kafka. Deploy it with a reverse proxy sidecar to implement the security mechanism you need. It was historically necessary, but now that client library support for Kafka is quite good, you might not have a use case for it (a produce call sketch follows).
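If you do end up behind the REST Proxy, producing is a plain HTTP call. Below is a hedged sketch using Java's built-in HttpClient against the v2 produce endpoint; the proxy address and topic are placeholders.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class RestProxyProduce {
    public static void main(String[] args) throws Exception {
        // Placeholder REST Proxy address and topic; POST /topics/{topic} is the v2 produce endpoint
        String url = "http://rest-proxy.example.com:8082/topics/demo-topic";
        String body = "{\"records\":[{\"value\":{\"greeting\":\"hello kafka\"}}]}";

        HttpRequest request = HttpRequest.newBuilder(URI.create(url))
            .header("Content-Type", "application/vnd.kafka.json.v2+json")
            .header("Accept", "application/vnd.kafka.v2+json")
            .POST(HttpRequest.BodyPublishers.ofString(body))
            .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
            .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}
```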
Kotlin
- Simply use the standard Java library
Haskell
- HW Kafka Client: based on librdkafka
- HW Kafka Avro: support for Schema Registry
Ruby
- rdkafka-ruby: based on librdkafka
- Ruby Kafka: limited support for newer APIs