Microsoft Fabric Updates Blog

Acquiring Real-Time Data from New Sources with Enhanced Eventstream

Real-time data is generally classified into stream events and discrete events (also called notification events).

  • Stream event: Represents a continuous and unbounded series of events that may never end. Examples include sensor data from IoT devices, stock market tick data, or social media posts in a real-time feed.
  • Discrete event: Often referred to as a notification event, this is an individual occurrence that happens at a specific point in time. Each event is independent of others and has a clear start and end point. Examples of discrete events include users placing orders on a website or users making changes to a database.
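
To make the distinction concrete, here is a minimal Python sketch (the field names are illustrative, not an Eventstream schema): a stream source behaves like an unbounded generator of records, while a discrete event is a single, self-contained record.

```python
# Illustrative sketch only; names and fields are hypothetical, not an Eventstream schema.
import random
import time
from datetime import datetime, timezone

def sensor_stream():
    """A stream event source: an unbounded generator of sensor readings."""
    while True:
        yield {
            "deviceId": "sensor-01",  # hypothetical device id
            "temperature": round(random.uniform(18.0, 25.0), 2),
            "timestamp": datetime.now(timezone.utc).isoformat(),
        }
        time.sleep(1)

# A discrete (notification) event: one self-contained occurrence with a clear start and end.
order_placed_event = {
    "eventType": "OrderPlaced",  # hypothetical event type
    "orderId": "12345",
    "occurredAt": datetime.now(timezone.utc).isoformat(),
}
```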

Streaming event sources

The enhanced Eventstream offers an intuitive way to connect external stream event sources with a ready-to-use setup, supporting well-known cloud services such as Google Cloud and Amazon Kinesis, as well as database change data capture (CDC) streams, through our newly introduced messaging connectors. These connectors leverage Kafka Connect and Camel Kafka connectors to provide a flexible, declarative approach to data integration, ensuring broad connectivity across leading data platforms. Eventstream also integrates Debezium to set up database connections and accurately capture CDC streams. Below is the list of new sources recently added to Eventstream, enabled by the messaging connectors:

  • Confluent Cloud Kafka
  • Amazon Kinesis Data Streams
  • Google Cloud Pub/Sub
  • Azure SQL Server DB (CDC)
  • PostgreSQL DB (CDC)
  • Azure Cosmos DB (CDC)
  • MySQL DB (CDC)
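
For context on what these sources carry before Eventstream ingests them, the sketch below publishes a few test messages to a Confluent Cloud Kafka topic with the confluent-kafka Python client. This is a minimal, hedged example; the bootstrap server, topic name, and API key/secret are placeholders you would replace with your own cluster details.

```python
# Hedged sketch: publish a few test messages to a Confluent Cloud Kafka topic
# that an eventstream can later ingest. Broker, topic, and credentials are placeholders.
import json
from confluent_kafka import Producer

producer = Producer({
    "bootstrap.servers": "<your-cluster>.confluent.cloud:9092",  # placeholder
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "<api-key>",      # placeholder Confluent API key
    "sasl.password": "<api-secret>",   # placeholder Confluent API secret
})

def on_delivery(err, msg):
    # Report whether each message reached the topic.
    if err is not None:
        print(f"Delivery failed: {err}")
    else:
        print(f"Delivered to {msg.topic()} [partition {msg.partition()}]")

for i in range(5):
    event = {"deviceId": f"sensor-{i}", "temperature": 20.0 + i}
    producer.produce("telemetry", key=str(i), value=json.dumps(event), callback=on_delivery)
    producer.poll(0)  # serve delivery callbacks

producer.flush()  # block until all messages are delivered
```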

To use these new sources in Eventstream, simply create an eventstream and choose “Enhanced Capabilities (preview)”.

Preview Toggle in Eventstream Creation

You will see the new Eventstream homepage, which gives you a few options to begin with. By selecting Add source on the ribbon or the Add external source card on the homepage, you will find these sources in a wizard that helps you set up the source in a few steps.

New Sources in Eventstream

After you add the source to your eventstream, you can publish it to stream the data into your eventstream. To transform the stream data or route it to different Fabric destinations based on its content, click Edit on the ribbon to enter Edit mode. There you can add event processing operators and destinations.

For the step-by-step guide, please refer to Add and manage eventstream sources – Microsoft Fabric | Microsoft Learn

Discrete event sources

The enhanced Eventstream lets you acquire and route real-time data not only from stream sources but also from discrete event sources, such as Azure Blob Storage events and Fabric workspace item events.

Azure blob storage events

These events are produced whenever a blob storage account changes, for example blob changes (creates, renames, deletes), directory changes, and so on. The new Eventstream can link these events to Fabric events so that you can subscribe to them in Real-Time hub with Data Activator triggers or create triggers that execute Fabric job items, such as pipelines and notebooks. You can accomplish this with a few clicks: select the blob storage account in the wizard and publish your eventstream. Then you can go to Real-Time hub to subscribe to these events. At this stage, the events remain discrete events and have not yet been transformed into a continuous stream (unstreamed).
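
To see these events flow, you can trigger one yourself by creating a blob in the connected storage account. Below is a minimal sketch using the azure-storage-blob Python SDK; the connection string, container, and blob name are placeholders, and the upload simply produces a blob-created event for the linked eventstream to surface.

```python
# Hedged sketch: create a blob so the storage account emits a blob-created event
# that the linked eventstream can surface. Connection string and names are placeholders.
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string("<storage-connection-string>")
blob_client = service.get_blob_client(container="sample-container", blob="hello.txt")

# Uploading (creating) the blob is the change that produces the discrete event.
blob_client.upload_blob(b"hello eventstream", overwrite=True)
print("Blob uploaded; a blob-created event should now reach the eventstream.")
```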

Linked Azure Blob Storage Events to Fabric Events in Eventstream (Unstreamed)

You can also use the new Eventstream to turn these events into streams for further analysis. It enables you to transform the events and send the streams to different Fabric data destinations, such as Lakehouse and KQL Database. After the events are converted to streamed events, a default stream appears in Real-Time hub. To convert them, click Edit on the ribbon, select Stream events on the source node, and publish your eventstream.

Convert Azure Blob Storage Discrete Events to Stream (Streamed)

For the step-by-step guide, please refer to Add Azure Blob Storage event source to an eventstream – Microsoft Fabric | Microsoft Learn

Fabric workspace item events

These are discrete events that occur whenever a workspace changes, for example when an item in the workspace is created, updated, deleted, or read. With the new Eventstream, you can turn these events into streams to perform further analysis and extract more business value from them (streamed events). To do this, simply set up this source by following the wizard and publish your eventstream. Unlike Azure Blob Storage events, you don’t need to link these events to Fabric events before converting them, since they are already in Fabric.
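
As an illustration of what produces these events, creating any item in the workspace, whether through the portal or programmatically, raises an item-created workspace event. The sketch below calls the Fabric REST API items endpoint with a bearer token; the endpoint path, payload shape, and token acquisition are assumptions to verify against the official API reference, and the workspace ID is a placeholder.

```python
# Hedged sketch: programmatically create a workspace item, which should raise an
# item-created workspace event. Endpoint and payload are assumptions to verify against
# the Fabric REST API reference; workspace id and token are placeholders.
import requests

workspace_id = "<workspace-id>"      # placeholder
token = "<azure-ad-access-token>"    # placeholder Microsoft Entra token for the Fabric API

response = requests.post(
    f"https://api.fabric.microsoft.com/v1/workspaces/{workspace_id}/items",
    headers={"Authorization": f"Bearer {token}", "Content-Type": "application/json"},
    json={"displayName": "sample-notebook", "type": "Notebook"},  # hypothetical item
)
response.raise_for_status()
print("Item created:", response.json())
```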

Convert Fabric Workspace Item Discrete Events to Stream (Streamed)

For the step-by-step guide, please refer to Add Fabric workspace item event source to an eventstream – Microsoft Fabric | Microsoft Learn

Transform and route the streamed events

To transform the streamed events from Azure Blob Storage events or Fabric workspace item events, click Edit on the ribbon to enter Edit mode. Here you can add event processing operations and route the streamed events to various destinations in Fabric. For the step-by-step guide, please refer to Edit and publish Microsoft Fabric eventstreams – Microsoft Fabric | Microsoft Learn
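
The processing operators themselves are configured in the no-code editor, but conceptually they apply familiar stream transformations. The Python sketch below is only an analogy, not the operator implementation: it filters events and then aggregates them per fixed window, which is the kind of logic the filter and group-by style operators express.

```python
# Conceptual analogy only, not the Eventstream operator implementation:
# filter events, then aggregate them in fixed (tumbling) one-minute windows.
from collections import defaultdict

events = [
    {"deviceId": "sensor-01", "temperature": 22.5, "minute": 0},
    {"deviceId": "sensor-01", "temperature": 31.0, "minute": 0},
    {"deviceId": "sensor-02", "temperature": 30.2, "minute": 1},
]

# "Filter" analogy: keep only readings above a threshold.
hot = [e for e in events if e["temperature"] > 30.0]

# "Group by" analogy: average temperature per device per window.
windows = defaultdict(list)
for e in hot:
    windows[(e["deviceId"], e["minute"])].append(e["temperature"])

for (device, minute), temps in windows.items():
    print(device, f"window {minute}:", sum(temps) / len(temps))
```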

Learn more, and help us with your feedback

Access Eventstream how-tos, tutorials, and other helpful resources at Microsoft Fabric event streams overview – Microsoft Fabric | Microsoft Learn.

To find out more about Real-Time Intelligence, read Yitzhak Kesselman’s announcement. As we launch our preview, we’d love to hear what you think and how you’re using the product. The best way to get in touch with us is through our community forum, or by submitting an idea. For detailed how-tos, tutorials, and other resources, check out the documentation.

This is part of a series of blog posts that dive into all the capabilities of Real-Time Intelligence. Stay tuned for more!
