Decoding Data with Confluent Schema Registry Support in Eventstream (Preview)
We are pleased to announce that Eventstream’s Confluent Cloud for Apache Kafka streaming connector now supports decoding data from Confluent Cloud for Apache Kafka topics that are associated with a data contract in Confluent Schema Registry.
The Challenge with Schema Registry Encoded Data
The Confluent Schema Registry serves as a centralized service for managing and validating schemas used in Kafka topics, ensuring that both producers and consumers maintain a consistent data structure. Data produced using the Confluent serializer and its Schema Registry is serialized according to a defined format (a sketch of that format follows the list below). Before decoding support was available, Eventstream’s Confluent Cloud for Apache Kafka streaming connector could ingest such data, but several challenges arose when processing it within Eventstream:
- The data cannot be processed or routed to Fabric destinations such as Eventhouse and Lakehouse.
- The data cannot be previewed properly inside Eventstream.
- The data cannot be previewed properly at its source.
These challenges prevented customers from using Fabric Real-Time Intelligence to build streaming scenarios on this data.
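To make the encoding concrete, here is a minimal sketch of the Confluent wire format that schema-registry-encoded messages follow: a magic byte, a 4-byte schema ID, and the serialized payload. This is illustrative only (it is not Eventstream code), and the function name is hypothetical.

```python
import struct

def parse_confluent_wire_format(raw: bytes) -> tuple[int, bytes]:
    """Split a Confluent-framed message into its schema ID and payload.

    Messages produced with the Confluent serializers are framed as:
      byte 0     : magic byte (always 0)
      bytes 1-4  : schema ID, big-endian 32-bit integer
      bytes 5... : the serialized record (e.g. Avro binary)
    """
    if len(raw) < 5 or raw[0] != 0:
        raise ValueError("Not Confluent Schema Registry framed data")
    schema_id = struct.unpack(">I", raw[1:5])[0]
    return schema_id, raw[5:]

# Without looking up the schema registered under `schema_id`, the payload
# bytes cannot be interpreted -- which is why a generic consumer only sees
# opaque binary data.
```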
How Eventstream Solves It
The essential problem to solve is decoding data that was produced with the Confluent Schema Registry. To achieve this, the Eventstream streaming connector retrieves the relevant schema (data contract) associated with the Kafka topic from the Confluent Schema Registry, uses that schema to deserialize the encoded data, and then serializes the decoded data into Eventstream.
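Conceptually, this decoding flow resembles what the confluent-kafka Python client does when an AvroDeserializer is paired with a SchemaRegistryClient. The sketch below only illustrates that flow; the endpoint, credentials, and topic name are placeholders, not Eventstream internals.

```python
from confluent_kafka import Consumer
from confluent_kafka.schema_registry import SchemaRegistryClient
from confluent_kafka.schema_registry.avro import AvroDeserializer
from confluent_kafka.serialization import SerializationContext, MessageField

# Placeholder Confluent Cloud Schema Registry connection details.
schema_registry = SchemaRegistryClient({
    "url": "https://psrc-xxxxx.region.aws.confluent.cloud",
    "basic.auth.user.info": "SR_API_KEY:SR_API_SECRET",
})

# The deserializer looks up the schema referenced in each message header
# and uses it to turn the Avro-encoded bytes back into a dict.
avro_deserializer = AvroDeserializer(schema_registry)

consumer = Consumer({
    "bootstrap.servers": "pkc-xxxxx.region.aws.confluent.cloud:9092",
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "KAFKA_API_KEY",
    "sasl.password": "KAFKA_API_SECRET",
    "group.id": "decode-demo",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["orders"])  # hypothetical topic name

while True:
    msg = consumer.poll(1.0)
    if msg is None or msg.error():
        continue
    # Decode the value with the schema fetched from Schema Registry.
    record = avro_deserializer(
        msg.value(), SerializationContext(msg.topic(), MessageField.VALUE)
    )
    print(record)  # the decoded event, ready for downstream processing
```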

Start Using It Today
To properly consume data from a Confluent Cloud for Apache Kafka topic associated with a schema registry, you first need to establish a connection to your Confluent Schema Registry server.
In the Advanced settings of the Confluent Cloud for Apache Kafka source, select ‘Yes’ for both ‘Define and serialize data’ and ‘Is data encoded with schema registry’, then create a connection reference to your Confluent Schema Registry server. For detailed configuration instructions, refer to our public documentation: Add Confluent Cloud for Apache Kafka source to an eventstream.
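Before creating the connection in Eventstream, you may want to confirm that your topic actually has a schema registered in Confluent Schema Registry. The following illustrative check uses the confluent-kafka client with placeholder credentials and assumes the default TopicNameStrategy subject naming (`<topic>-value`); it is independent of Eventstream itself.

```python
from confluent_kafka.schema_registry import SchemaRegistryClient

# Placeholder Schema Registry endpoint and API key/secret.
client = SchemaRegistryClient({
    "url": "https://psrc-xxxxx.region.aws.confluent.cloud",
    "basic.auth.user.info": "SR_API_KEY:SR_API_SECRET",
})

topic = "orders"  # hypothetical topic name
subject = f"{topic}-value"  # default TopicNameStrategy subject

latest = client.get_latest_version(subject)
print(f"Subject:   {subject}")
print(f"Schema ID: {latest.schema_id}")
print(f"Version:   {latest.version}")
print(f"Schema:    {latest.schema.schema_str}")
```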

Once configured, the streaming connector retrieves the schema associated with the Kafka topic from the Confluent Schema Registry server. The data in the source can then be previewed in both Edit mode and Live view; you will see the decoded data in the Data preview tab.

After this eventstream is published, the streaming connector decodes the data from the Kafka topic with the schema obtained from the Confluent Schema Registry server as the data is brought into the eventstream. You can then preview the ingested data in this eventstream.

The data can now be processed using Eventstream’s operators and routed to Fabric destinations, such as Eventhouse, for purposes like reporting and monitoring.

Conclusion and Feedback
With the introduction of this new capability in Eventstream, customers can process, preview, and route streaming data produced using the Confluent Schema Registry in their Confluent Cloud for Apache Kafka, while maintaining a consistent data structure across their Kafka producers and consumers.
Get started with a free trial of Microsoft Fabric today! If you have any questions, please contact us via email at askeventstreams@microsoft.com. You are also welcome to provide feedback or submit feature requests on Fabric Ideas, and participate in discussions with other users in the Fabric Community.