Sending Events via Kafka Native
Overview
Kafka Native ingestion allows you to produce events directly to a dedicated Future Anthem Kafka cluster using standard Kafka producer libraries.
This approach is optimised for high-throughput, low-latency streaming and is designed for customers with existing Kafka infrastructure or advanced streaming requirements.
When to Use Kafka Native
Kafka Native is recommended if:
- You already operate Kafka producers
- You require very high throughput
- You need fine-grained control over producer configuration
- You want native Kafka semantics and behaviour
Kafka Native is suitable for production workloads where scale, performance, and control are key requirements.
How Kafka Native Ingestion Works
At a high level:
- Future Anthem provisions a dedicated Kafka cluster
- Kafka topics are created and managed by Future Anthem
- Your producers authenticate using secure credentials
- Events are produced directly to agreed topics
- Events enter the same downstream processing pipeline as REST Proxy ingestion
Once ingested, events are processed identically across the platform.
Kafka Cluster Model
Kafka Native ingestion uses:
- A dedicated Amazon MSK cluster per customer
- TLS-encrypted communication
- SASL-based authentication
- Topic-level access controls
Kafka topics must be provisioned in advance by Future Anthem as part of onboarding.
Prerequisites
Before producing events, the following must be completed:
- Source IPs shared with Future Anthem for whitelisting
- API key issued for secure access
- Kafka credentials (MSK username and password) provisioned
- Required Kafka topics created by Future Anthem
Connection Setup
1. IP Whitelisting
Access to Kafka brokers is restricted via IP whitelisting.
- Provide your source IP addresses to Future Anthem
- Access will only be permitted from approved IPs
2. Retrieve Bootstrap Server Endpoint
Use the API to retrieve the broker discovery endpoint:
curl -s --location 'https://producer.msk-rest-proxy.dev.<#CLIENT#>.future-anthem-lz-pr.com/v3/clusters' \
--header 'x-api-key: <#API_KEY#>' | jq -r '.data[0].brokers.related'
3. Retrieve Kafka Bootstrap Server URIs
Use the endpoint returned by the previous request to obtain the broker list:
curl -s --location '<URI FROM PREVIOUS REQUEST>' \
--header 'x-api-key: <#API_KEY#>' | jq -r '.data[] | "\(.host):\(.port)"'
This returns the Kafka bootstrap servers required for your producer configuration.
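The two discovery calls above can also be scripted. Below is a minimal Python sketch with the HTTP call abstracted behind a `fetch` function (so any HTTP client can supply it); the endpoint URL, `x-api-key` header, and response shapes are taken from the curl examples above:

```python
import json

def discover_bootstrap_servers(fetch, clusters_url, api_key):
    """Two-step broker discovery: list clusters, then resolve the broker list.

    `fetch(url, api_key)` should perform an HTTP GET with the
    `x-api-key` header and return the response body as a string.
    """
    # Step 1: the clusters endpoint links to the broker resource.
    clusters = json.loads(fetch(clusters_url, api_key))
    brokers_url = clusters["data"][0]["brokers"]["related"]

    # Step 2: the broker resource lists host/port pairs.
    brokers = json.loads(fetch(brokers_url, api_key))
    return ",".join(f'{b["host"]}:{b["port"]}' for b in brokers["data"])
```

The returned comma-separated string is in the form expected by the `bootstrap.servers` producer setting.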
Producer Configuration
Configure your Kafka producer as follows:
bootstrap.servers=<BOOTSTRAP_SERVERS_FROM_API>
security.protocol=SASL_SSL
sasl.mechanism=SCRAM-SHA-512
sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required username="producer" password="<#MSK_PWD#>";
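The same settings can be expressed as a Python dictionary in the form accepted by librdkafka-based clients such as `confluent-kafka` (the package choice and the `build_producer_config` helper are illustrative, not a platform requirement; note that these clients take SCRAM credentials as `sasl.username` / `sasl.password` rather than a JAAS string):

```python
def build_producer_config(bootstrap_servers, msk_password):
    """Producer settings mirroring the properties shown above."""
    return {
        "bootstrap.servers": bootstrap_servers,
        "security.protocol": "SASL_SSL",
        "sasl.mechanism": "SCRAM-SHA-512",
        "sasl.username": "producer",
        "sasl.password": msk_password,
    }

# With confluent-kafka installed, a producer could then be created as:
#   from confluent_kafka import Producer
#   producer = Producer(build_producer_config(servers, password))
```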
Topic Provisioning and Constraints
- Kafka topics must be created in advance by Future Anthem
- Producers cannot create or modify topics
- Topic configuration, schema setup, and access must be requested via Future Anthem
This ensures consistency, governance, and compatibility across the platform.
Producer Responsibilities
When using Kafka Native, customers are responsible for:
- Sending JSON-encoded event payloads
- Following agreed topic naming and schema conventions
- Managing producer retries, batching, and acknowledgements
Future Anthem manages cluster provisioning, topic lifecycle, and platform-side validation and consumption.
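The customer-side responsibilities above can be sketched in Python: JSON-encode each payload, and track acknowledgements through a delivery callback, as librdkafka-based clients report the final outcome of a message there after exhausting their own retries. The topic name and callback wiring are illustrative, not a platform requirement:

```python
import json

def encode_event(event):
    """Serialise an event dict to the JSON bytes sent to the topic."""
    return json.dumps(event, separators=(",", ":")).encode("utf-8")

def on_delivery(err, msg):
    """Delivery callback: transient errors are retried by the client
    itself; an error surfacing here is final and must be handled."""
    if err is not None:
        print(f"delivery failed: {err}")

# Sketch of the produce loop (topic name "player-events" is hypothetical):
#   producer.produce("player-events", value=encode_event(event), on_delivery=on_delivery)
#   producer.flush()  # block until all outstanding acknowledgements arrive
```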