These documents allow you to set up, configure and explore Conduktor Gateway.
If you are setting up your Gateway, review the installation options.
Need to configure Gateway for your specific setup? Review the configuration options.
Adding a plugin, a.k.a. an interceptor? View the interceptors to see what's available, with examples to copy.
If you want to walk through a demo yourself, or use it to copy configuration examples, check out the demos.
Something missing, or want more? Contact us; we're always reviewing and expanding our documentation.
Happy reading 😃.
What is Conduktor Gateway?
A vendor-agnostic Apache Kafka proxy, adding governance, security, cost optimisation, and much more.
Kafka is a powerful tool with great flexibility. That power and flexibility can create challenges in managing and bringing structure to your Kafka ecosystem, especially as that ecosystem grows.
With Conduktor Gateway you can more easily follow the path to a mature Kafka set-up, avoiding the pitfalls and common challenges that come with this progression. Conduktor Gateway gives you the power to add structure, organisation, enhanced functionality, and therefore confidence in your Kafka environment. Gateway is fully Apache Kafka protocol compliant and vendor agnostic: it supports Kafka wherever it is hosted.
The core of Conduktor Gateway is the transport layer between your Kafka client applications and your Kafka clusters.
This transport layer adds value by interacting with the Kafka traffic, modifying the data or performing logical operations on it. Gateway itself is made of two conceptual parts: the Gateway core and interceptors. There is so much you can do with Conduktor Gateway; just some of the features include:
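Because Gateway is Kafka protocol compliant, a client connects to it exactly as it would to a broker: typically only the bootstrap address changes. The sketch below uses a hypothetical address and authentication settings for illustration; the actual values depend on how your Gateway deployment is configured.

```properties
# Standard Kafka client configuration, pointed at Gateway instead of a broker.
# "gateway.internal:6969" is a hypothetical address, not a documented default.
bootstrap.servers=gateway.internal:6969

# Authentication depends on your Gateway setup; SASL/PLAIN is shown purely
# as an example of an unchanged, standard client-side setting.
security.protocol=SASL_PLAINTEXT
sasl.mechanism=PLAIN
```

The client is unaware it is talking to a proxy; Gateway forwards the traffic to the backing Kafka cluster, applying any configured interceptors along the way.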
Virtual clusters for your clients through multi-tenancy
Encryption, at the field level within your Kafka records, to aid compliance around confidential, personal, or high-value data
Chaos engineering, which enables you to develop against failure scenarios, and then prove that your Kafka applications can handle them
Safeguarding, which puts structure and guards in place to ensure your Kafka environment is used in the right way