Microsoft Fabric Updates Blog

Explore Data Transformation in Eventstream for KQL Database Integration

At Microsoft Ignite this year, we were thrilled to announce the General Availability of Eventstream. One of the exciting features is Eventstream’s real-time processing tailored for seamless integration with the KQL Database. This feature ensures a smooth experience for ingesting and transforming data streams into your KQL table.

In this blog post, we’ll dive into a practical scenario using real-world bike-sharing data and learn to compute the number of bikes rented every minute on each street. You’ll gain hands-on experience using Eventstream’s powerful event processor, mastering real-time data transformations, and effortlessly directing the processed data to your KQL Database.

Prerequisites

  • Access to a premium workspace in Fabric with Contributor or above permissions.
  • A KQL Database created in your workspace.

Step 1: Create an eventstream and add sample bike data

1. Switch your Power BI experience to Real-time Analytics and select Eventstream to create a new one. Name your eventstream, for example "eventstream-1".

2. On the Eventstream canvas, expand New source and select Sample data. Give the source a name and select Bicycles as the sample data.

3. Preview the data in the eventstream to verify that the sample bike data was added successfully.

Here’s the description of the columns:

  Column           Description
  BikepointID      ID for the bike docking point
  Street           Name of the street where the dock is located
  Neighbourhood    Neighbourhood where the dock is located
  Latitude         Latitude of the docking point
  Longitude        Longitude of the docking point
  No_Bikes         Number of bikes currently rented
  No_Empty_Docks   Number of available empty docks at the docking point
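
To make the schema concrete, here is a minimal KQL sketch of what a single event might look like; the column types and sample values below are illustrative assumptions, not the actual sample dataset:

  // Illustrative sketch: approximate shape of one bike-sharing event.
  // Column types and sample values are assumptions for demonstration.
  datatable (
      BikepointID: string,
      Street: string,
      Neighbourhood: string,
      Latitude: real,
      Longitude: real,
      No_Bikes: long,
      No_Empty_Docks: long
  )
  [
      "BikePoint_001", "Baker Street", "Marylebone", 51.5226, -0.1571, 7, 12
  ]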

Step 2: Add a KQL destination with the event processor

1. On the Eventstream canvas, expand the New destination drop-down menu and choose KQL Database.

2. Choose the data ingestion mode. There are two ways of ingesting data into the KQL Database:

  • Direct ingestion: Ingest data directly to a KQL table without any transformation.
  • Event processing before ingestion: Transform the data with the Event Processor before sending it to a KQL table.

Note: You cannot edit the ingestion mode after the KQL destination is added to the eventstream.

Select Event processing before ingestion and enter the necessary details of your KQL Database.

3. Scroll down the right panel and select Open event processor. This opens a no-code editor where you can add real-time operations to your data streams.

4. In the editor, add a Group by operation between the Eventstream and the KQL Database. We want to calculate the number of bikes rented every minute on each street, so under the Aggregation section, select SUM as the aggregation and No_Bikes as the field.

5. Further down in the Settings section, select Street for "Group aggregation by", choose Tumbling for the "Time window", and enter 1 Minute for the "Duration". Then save the Group by configuration. (A conceptual KQL equivalent of this aggregation is sketched after these steps.)

6. Back in the editor, select the Group by operation and preview the processing result to confirm the operation is configured correctly. Then select Save to close the Event processor.

7. Finally, select Add to finish the configuration for the KQL database destination.
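
For intuition, the Group by operation configured in steps 4 and 5 behaves much like the following KQL aggregation over 1-minute tumbling windows. The table name BikeEvents, the timestamp column EventTimestamp, and the output column names are placeholders, not the exact schema the event processor emits:

  // Conceptual KQL equivalent of the Group by operation:
  // sum of No_Bikes per Street over 1-minute tumbling windows.
  // BikeEvents and EventTimestamp are hypothetical placeholders.
  BikeEvents
  | summarize SUM_No_Bikes = sum(No_Bikes)
      by Street, Window_End = bin(EventTimestamp, 1m)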

Step 3: View result in the KQL table

1. On the Eventstream canvas, select the KQL destination, and select Open item to access your KQL Database.

2. Within the KQL Database interface, locate the bike-count table. Select Query table and choose Show any 100 records. This opens the right panel, where you can examine 100 records from the table and observe the count of bikes rented on each street, minute by minute.
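
If you prefer to query the destination table yourself instead of using the Show any 100 records shortcut, a query along these lines returns a sample of rows; adjust the table name to match your own destination table:

  // Take a sample of 100 rows from the destination table used in this tutorial.
  // The name is bracket-quoted because it contains a hyphen.
  ['bike-count']
  | take 100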

Congratulations!

You successfully completed the tutorial on exploring and transforming bike-sharing data using Eventstream. Keep exploring Eventstream’s capabilities and continue your journey with real-time data processing.
