
Explore Data Transformation in Eventstream for KQL Database Integration

At Microsoft Ignite this year, we were thrilled to announce the General Availability of Eventstream. One of its most exciting features is real-time processing that integrates seamlessly with the KQL Database, giving you a smooth experience for ingesting and transforming data streams into your KQL table.

In this blog post, we’ll dive into a practical scenario using real-world bike-sharing data and learn to compute the number of bikes rented every minute on each street. You’ll gain hands-on experience using Eventstream’s powerful event processor, mastering real-time data transformations, and effortlessly directing the processed data to your KQL Database.

Prerequisites

  • Access to a premium workspace in Fabric with Contributor or above permissions.
  • A KQL Database created in your workspace.

Step 1: Create an eventstream and add sample bike data

1. Switch your Power BI experience to Real-time Analytics and select Eventstream to create a new one. Name your eventstream, for example, “eventstream-1”.

2. On the Eventstream canvas, expand New source and select Sample data. Give a name to the source and select Bicycles as the sample data.

3. Preview the data in the eventstream to verify that the sample bike data was added successfully.

Here’s the description of the columns:

Column            Description
BikepointID       ID for the bike docking point
Street            Name of the street where the dock is located
Neighbourhood     Neighbourhood where the dock is located
Latitude          Latitude of the docking point
Longitude         Longitude of the docking point
No_Bikes          Number of bikes currently rented
No_Empty_Docks    Number of available empty docks at the docking point
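
If you want to picture this schema in KQL terms, the raw sample stream corresponds roughly to the table sketched below. This is only an illustration: Eventstream creates the destination table for you, the table name and column types here are assumptions based on the descriptions above, and in this tutorial the event processor aggregates the stream before it reaches the KQL table, so the destination table’s schema will differ.

// Sketch of the raw sample schema as a KQL table (hypothetical name, assumed types)
.create table BikeRawData (
    BikepointID: string,
    Street: string,
    Neighbourhood: string,
    Latitude: real,
    Longitude: real,
    No_Bikes: long,
    No_Empty_Docks: long
)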

Step 2: Add a KQL destination with the event processor

1. On the Eventstream canvas, expand the New destination drop-down menu and choose KQL Database.

2. Choose a data ingestion mode. There are two ways to ingest data into the KQL Database:

  • Direct ingestion: Ingest data directly to a KQL table without any transformation.
  • Event processing before ingestion: Transform the data with the Event Processor before sending it to a KQL table.

 Note: You CANNOT edit the ingestion mode after the KQL destination is added to the eventstream.

Select Event processing before ingestion and enter the necessary details of your KQL Database.

3. Scroll down the right panel and select Open event processor. This opens a no-code editor where you can add real-time operations to your data streams.

4. In the editor, add a Group by operation between the Eventstream and the KQL Database. We want to calculate the number of bikes rented every minute on each street, so under the Aggregation section, select SUM for the aggregation and No_Bikes for the field.

5. Further down in the Settings section, select Street for the “Group aggregation by”, choose Tumbling for the “Time window”, and enter 1 Minute for the “Duration”. Then Save the Group by configuration. (A KQL sketch of what this aggregation computes follows step 7.)

6. Back in the editor, select the Group by operation and preview the processing result to make sure the operation is configured correctly. Then select Save to close the Event processor.

7. Finally, select Add to finish the configuration for the KQL database destination.
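
As noted in step 5, here is a rough mental model of what the Group by operation computes, expressed as a KQL query. It is only a sketch: the event processor runs on the stream before ingestion rather than on a table, the output column names SUM_No_Bikes and Window_End_Time are assumptions about how it labels its results, and bin(ingestion_time(), 1m) merely stands in for the 1-minute tumbling window.

// Illustrative KQL equivalent of the 1-minute tumbling-window Group by
BikeRawData
| summarize SUM_No_Bikes = sum(No_Bikes)
    by Street, Window_End_Time = bin(ingestion_time(), 1m)

Each row that lands in the destination table therefore represents one street within one one-minute window.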

Step 3: View result in the KQL table

1. On the Eventstream canvas, select the KQL destination, and select Open item to access your KQL Database.

2. Within the KQL Database interface, locate the bike-count table. Select Query table and choose Show any 100 records. This opens the right panel, where you can examine 100 records from the table and observe the count of bikes rented on each street, minute by minute.
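
If you prefer to type the query yourself, Show any 100 records corresponds to a simple take on the destination table, and you can also sort by the window end to see the most recent minutes first (as before, Window_End_Time is an assumed column name in the Group by output):

// Any 100 records from the destination table
['bike-count']
| take 100

// Most recent windows first (assumes a Window_End_Time column)
['bike-count']
| top 100 by Window_End_Time desc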

Congratulations!

You successfully completed the tutorial on exploring and transforming bike-sharing data using Eventstream. Keep exploring Eventstream’s capabilities and continue your journey with real-time data processing.
