Stream Real-time Events to Microsoft Fabric with Event Streams from Custom Application
At Build 2023, Microsoft unveiled Microsoft Fabric – an end-to-end, unified analytics platform that brings together all the data and analytics tools organizations need. It is an all-in-one analytics product that addresses every data analytics scenario through its core workloads: Data Factory, Data Engineering, Data Science, Data Warehousing, Real-Time Analytics, Power BI, and Data Activator (coming soon).
One of the critical data scenarios is real-time event data streaming and processing. For instance, if you have real-time event data such as weather data and you want to either develop your own application or utilize an existing one to stream this event data into the Fabric platform for further analytics, how can you accomplish this?
Now, with the Fabric event streams feature under Real-Time Analytics, you can easily achieve this goal. It serves as a centralized platform within Fabric, allowing you to capture, transform, and route real-time events to multiple destinations effortlessly, all through a user-friendly, no-code experience.
In this blog article, we will guide you through the process of writing a simple application that streams your real-time event data into your eventstream, and then routing and/or transforming the data to Fabric data sinks, such as Lakehouse and KQL Database, using the Fabric event streams feature. Please note that this blog assumes you have access to a premium workspace with Contributor or above permissions, where your eventstream, lakehouse, and KQL database items are already created.
Get Eventstream item prepared
The Eventstream item in Fabric provides an endpoint that can receive real-time event data from your application. To set this up, you will need to create an Eventstream item called ‘citytempdata-es’ and a “Custom App” source named ‘citytemp-custapp’ within your eventstream.
In the “Custom App” source, you can obtain the endpoint connection string, which is an Event Hubs-compatible connection string. This connection string will be used in your application at a later stage. Below is an example of the connection string:
Endpoint=sb://eventstream-xxxxxxxx.servicebus.windows.net/;SharedAccessKeyName=key_xxxxxxxx;SharedAccessKey=xxxxxxxx;EntityPath=es_xxxxxxxx
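The entity name your application needs later is carried in the EntityPath segment of this string. As a rough illustration (a hand-rolled helper, not part of any official SDK), you can split the string into its named parts like this:

```javascript
// Minimal illustrative parser for an Event Hubs-style connection string.
// It simply splits on ";" and "=" to show where each value lives; the
// @azure/event-hubs client itself just takes the full string plus the entity name.
function parseConnectionString(connectionString) {
  const parts = {};
  for (const segment of connectionString.split(";")) {
    if (!segment) continue; // skip any empty trailing segment
    const idx = segment.indexOf("=");
    parts[segment.slice(0, idx)] = segment.slice(idx + 1);
  }
  return parts;
}

// Sample string with the same placeholder values as shown above.
const sample =
  "Endpoint=sb://eventstream-xxxxxxxx.servicebus.windows.net/;" +
  "SharedAccessKeyName=key_xxxxxxxx;" +
  "SharedAccessKey=xxxxxxxx;" +
  "EntityPath=es_xxxxxxxx";

const parsed = parseConnectionString(sample);
console.log(parsed.EntityPath); // prints "es_xxxxxxxx"
```

With real values, `parsed.EntityPath` is what you would use as the entity name in the application below.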
Create an application to send events to eventstream
With the connection string available in the “Custom App” source, you can begin creating an application that sends events to your eventstream. In this example, the application simulates 10 sensor devices transmitting temperature and humidity data nearly every second.
- Create a file called sendtoes.js, and paste the following code into it.
- In the above code, replace “CONNECTION STRING” and “ENTITY NAME” with the actual values from the “Connection string-primary key” of your Custom App source.
- Install the required packages. Open your PowerShell command prompt, navigate to the same folder as sendtoes.js and execute the following commands.
- Run node sendtoes.js in a PowerShell command prompt to execute this file. The window should display messages about sending events, as follows:
- Go to your eventstream main editor, select your eventstream in the middle, and then select the Data preview tab in the bottom pane to view the data streamed into your eventstream:
const { EventHubProducerClient } = require("@azure/event-hubs");
const moment = require("moment");

const connectionString = "CONNECTION STRING";
const entityName = "ENTITY NAME";

// Generate one event record for the given device index.
function getRowData(id) {
    const time = moment().toISOString();
    const deviceID = id + 100;
    const humidity = Math.round(Math.random() * (65 - 35) + 35);
    const temperature = Math.round(Math.random() * (37 - 20) + 20);
    return { "entryTime": time, "messageId": id, "temperature": temperature, "humidity": humidity, "deviceID": deviceID };
}

function sleep(ms) {
    return new Promise(resolve => setTimeout(resolve, ms));
}

async function main() {
    // Create a producer client to send messages to the eventstream.
    const producer = new EventHubProducerClient(connectionString, entityName);

    // Each of the 10 simulated devices sends one event per batch, roughly once
    // per second, so each batch holds 10 events.
    const batchSize = 10;
    // The number of batches to send. Increase this to keep sending events for longer.
    const batchCount = 5;

    // Generate and send the events.
    for (let j = 0; j < batchCount; ++j) {
        const eventDataBatch = await producer.createBatch();
        for (let k = 0; k < batchSize; ++k) {
            eventDataBatch.tryAdd({ body: getRowData(k) });
        }
        // Send the batch to the eventstream.
        await producer.sendBatch(eventDataBatch);
        console.log(moment().format('YYYY/MM/DD HH:mm:ss'), `[Send events to Fabric Eventstream]: batch#${j} (${batchSize} events) has been sent to eventstream`);
        // Sleep for 1 second between batches.
        await sleep(1000);
    }

    // Close the producer client.
    await producer.close();
    console.log(moment().format('YYYY/MM/DD HH:mm:ss'), `[Send events to Fabric Eventstream]: All ${batchCount} batches have been sent to eventstream`);
}

main().catch((err) => {
    console.log("Error occurred: ", err);
});
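One caveat with the listing above: `tryAdd` returns `false` when an event does not fit into the current batch, and the sample can ignore that return value only because its ten small events comfortably fit. If your payloads grow, a defensive pattern is to flush the full batch and start a new one whenever `tryAdd` refuses an event. The sketch below (our own helper, not an SDK API) abstracts over the client so it can be exercised without a live eventstream; in the real application you would pass `producer.createBatch.bind(producer)` and `producer.sendBatch.bind(producer)`:

```javascript
// Distribute events across batches, respecting tryAdd's boolean result.
// createBatch() must resolve to an object with tryAdd(eventData) -> boolean;
// sendBatch(batch) sends one filled batch. Returns the number of batches sent.
async function sendAllEvents(events, createBatch, sendBatch) {
  let batch = await createBatch();
  let batchesSent = 0;
  for (const event of events) {
    if (!batch.tryAdd({ body: event })) {
      // Current batch is full: flush it and retry the event in a fresh batch.
      await sendBatch(batch);
      batchesSent++;
      batch = await createBatch();
      if (!batch.tryAdd({ body: event })) {
        throw new Error("Event is too large to fit even in an empty batch");
      }
    }
  }
  // Flush the final, partially filled batch.
  await sendBatch(batch);
  return batchesSent + 1;
}

module.exports = { sendAllEvents };
```

This keeps the send loop correct regardless of event size, at the cost of a little bookkeeping.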
npm install @azure/event-hubs
npm install moment
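If you prefer to declare the dependencies in a manifest instead of installing them ad hoc, an equivalent package.json might look like the following (the name and version ranges are illustrative, not prescribed by the tutorial):

```json
{
  "name": "sendtoes-sample",
  "version": "1.0.0",
  "main": "sendtoes.js",
  "dependencies": {
    "@azure/event-hubs": "^5.0.0",
    "moment": "^2.29.0"
  }
}
```

With this file in place, a single `npm install` pulls in both packages.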
C:\wa>node sendtoes.js
2023/06/12 20:35:39 [Send events to Fabric Eventstream]: batch#0 (10 events) has been sent to eventstream
2023/06/12 20:35:41 [Send events to Fabric Eventstream]: batch#1 (10 events) has been sent to eventstream
2023/06/12 20:35:42 [Send events to Fabric Eventstream]: batch#2 (10 events) has been sent to eventstream
2023/06/12 20:35:44 [Send events to Fabric Eventstream]: batch#3 (10 events) has been sent to eventstream
2023/06/12 20:35:45 [Send events to Fabric Eventstream]: batch#4 (10 events) has been sent to eventstream
2023/06/12 20:35:47 [Send events to Fabric Eventstream]: All 5 batches have been sent to eventstream
You can also view the data metrics to confirm that the data has been streamed into this eventstream by selecting the “Data insights” tab:
As you can see, your event data has now been successfully streamed into your eventstream using your simple application.
Route event data to Fabric sinks
Now that your event data has been successfully streamed into your eventstream, you can proceed to route it to various Fabric sinks, such as Lakehouse and KQL Database, for further analytics and processing.
Lakehouse
The Lakehouse destination offers the capability to transform your real-time events before they are ingested into your lakehouse. It converts the event data into Delta Lake format and stores it in the designated lakehouse tables. Because the lakehouse exposes a default SQL endpoint, this destination also supports data warehousing scenarios.
To route the event data to your lakehouse, simply add a Lakehouse destination to your eventstream. You can follow the step-by-step guide “Add a lakehouse as a destination” to accomplish this.
KQL Database
The KQL Database destination enables you to ingest your real-time events into a powerful, scalable, and highly available database known as KQL Database. This database is designed to handle large volumes of data, allowing you to efficiently query, analyze, and visualize the data in real time.
To route the event data to KQL Database, simply add a KQL Database destination in your eventstream. For a detailed guide on how to create this destination, refer to “Add a KQL database as a destination”.
The Fabric event streams documentation offers two comprehensive tutorials that guide you through achieving the following end-to-end scenarios:
- Ingest, filter, and transform real-time events and send them in Delta Lake format to Microsoft Fabric Lakehouse, then build a Power BI report to visualize your business insights from this event data. See: Ingest, filter, and transform real-time events and send them in Delta Lake format to Microsoft Fabric Lakehouse – Microsoft Fabric.
- Stream real-time events from your custom application into Microsoft Fabric KQL Database, and then create a near real-time Power BI report to effectively monitor your business data. See: Stream real-time events from a custom application to Microsoft Fabric KQL Database for real-time reporting – Microsoft Fabric | Microsoft Learn.
Summary
The Microsoft Fabric event streams feature offers the Custom App source, which provides an endpoint for receiving your real-time event data. The connection string exposed by this endpoint is compatible with Azure Event Hubs and can be used in your application to push event data to your eventstream. The eventstream then enables you to route your data to different Fabric sinks. If you already have an application that sends event data to Azure Event Hubs or a similar target, you can effortlessly reuse it by simply updating the endpoint connection string.