
Continuous Ingestion from Azure Storage to Eventhouse (Preview)

Continuous ingestion from Azure Storage to Fabric Eventhouse significantly simplifies the data ingestion process in Fabric Real-Time Intelligence. It automates extraction and loading from Azure Storage and enables near real-time updates to Eventhouse KQL DB tables. With this feature, organizations striving for streamlined, efficient data operations can configure and maintain data pipelines from Azure Storage with far less effort. By utilizing Azure Events, continuous ingestion replaces the conventional data pipeline model, offering ease of setup and scalability for modern data-driven environments.

What is Continuous Ingestion from Azure Storage to Eventhouse?

Fabric Real-Time Intelligence is a powerful service that empowers everyone in your organization to extract insights and visualize their data in motion. It offers an end-to-end solution for event-driven scenarios, streaming data, and data logs. Eventhouse provides a solution for handling and analyzing large volumes of data, particularly in scenarios requiring real-time analytics and exploration. It is designed to handle real-time data streams efficiently, which lets organizations ingest, process, and analyze data in near real-time.

Continuous ingestion configuration.

The Get Data wizard in Real-Time Intelligence Eventhouse offers a step-by-step process that guides you from importing and inspecting the incoming data, through creating or editing the destination table schema, to exploring the ingested results from multiple sources.

One of the sources from which users can bring data into an Eventhouse table using the Get Data wizard is Azure Storage, which allows users to ingest one or more blobs or files from a storage account. This capability is now enhanced with continuous ingestion: once the connection between the Azure Storage account and Eventhouse has been established, any new blob or file uploaded to the storage account is automatically ingested into the destination table.
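For illustration, suppose the incoming blobs are CSV log files. The wizard can create the destination table and infer its schema for you, and the result is equivalent to a KQL control command like the following sketch (the table and column names here are hypothetical):

// Hypothetical destination table for CSV log blobs; the Get Data
// wizard can infer this schema from a sample file automatically.
.create table RawLogs (
    Timestamp: datetime,
    Level: string,
    Source: string,
    Message: string
)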

How does it work?

Continuous ingestion from Azure Storage to Eventhouse utilizes Azure Events in Fabric to listen to Azure Storage account events. Fabric events let users subscribe to events produced by Fabric and Azure resources, and downstream processes that have subscribed to these events are triggered in near real-time, providing an efficient, highly scalable, and fault-tolerant communication model. Azure Blob Storage events are generated when actions such as creating, updating, or deleting blobs occur within an Azure storage account.

Based on the subscribed events, Eventhouse pulls the corresponding newly created or renamed file from the connected Azure Storage account. This simplifies bringing data in from your Azure Storage account as it is generated and eliminates the need to create and maintain long, complicated ETL pipelines. It also removes the need to define time-based triggers for fetching new data from Azure Storage, making ingestion into Eventhouse near real-time.
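A quick way to observe this near real-time behavior is to check how fresh the newest records are. Here is a minimal sketch, assuming the hypothetical RawLogs table from above (ingestion_time() relies on the IngestionTime policy, which is enabled by default on new tables):

// How long ago was the newest record ingested?
RawLogs
| summarize LatestIngestion = max(ingestion_time())
| extend LagSeconds = datetime_diff('second', now(), LatestIngestion)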

When to use Continuous Ingestion?

Continuous Ingestion from Azure Storage is particularly beneficial in scenarios where real-time processing and analysis of blob storage data are crucial. For instance, in log ingestion, continuous ingestion from Azure Storage allows for the seamless and immediate ingestion of log data from various sources that write to a storage account, enabling timely monitoring and troubleshooting. In environments utilizing external data lakes, such as those where sources write to Azure Data Lake Storage (ADLS) Gen2, continuous ingestion ensures that any new data added to the storage is automatically ingested into the destination KQL DB table, facilitating up-to-date data analysis without manual intervention.

Additionally, in Industry 4.0 scenarios, where machines and IoT devices continuously generate data and write to a common blob storage account, continuous ingestion provides an efficient way to handle the high volume of data streams. Once the data has been ingested into Eventhouse, you can query it using a KQL Queryset, build Real-Time Dashboards for analytics, or use Activator for real-time alerts when a value crosses a specific threshold. This setup allows for near real-time insights and operational intelligence, enhancing decision-making processes and overall efficiency.
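For example, an Activator condition or a Real-Time Dashboard tile for the Industry 4.0 scenario above could be driven by a KQL query like this sketch, where DeviceTelemetry, DeviceId, Timestamp, and Temperature are hypothetical names:

// Devices whose average temperature crossed a threshold in the last 5 minutes.
DeviceTelemetry
| where Timestamp > ago(5m)
| summarize AvgTemperature = avg(Temperature) by DeviceId
| where AvgTemperature > 75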

How to enable continuous ingestion?

Enabling continuous ingestion from Azure Storage to Eventhouse is a simple process. To give you a comprehensive end-to-end understanding, I have broken the steps down into three sections below: Prerequisites, Configure, and Validate.

Prerequisites

Step 1: Create a storage account with hierarchical namespace enabled – Create a storage account for Azure Data Lake Storage.

Hierarchical namespace setting

Step 2: Add a container and upload at least one file to it. This file will be used to infer the schema of the Eventhouse KQL DB destination table – Quickstart: Upload, download, and list blobs.
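The sample file can be as simple as a small CSV whose header row drives the schema inference. The contents below are purely illustrative, matching the hypothetical RawLogs schema sketched earlier:

Timestamp,Level,Source,Message
2025-06-17T10:00:00Z,INFO,web-01,Service started
2025-06-17T10:00:05Z,WARN,web-01,High request latency detected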

Step 3: In a new or existing workspace (other than ‘My Workspace’), create a workspace identity and copy its ID – Workspace identity.

Screenshot showing workspace identity details.

Step 4: Navigate to Access Control (IAM) in the storage account and click ‘Add role assignment.’

Step 5: Assign the Storage Blob Data Reader role to the workspace identity on the storage account.

Step 6: Create an Eventhouse – Create an eventhouse

Screenshot of creating a new eventhouse item in Real-Time Intelligence.

Configure

Step 1: Navigate to ‘Get Data’ in the KQL database and select Azure Storage in the Source step.

Step 2: Create a new table or select an existing one as the destination table.

Step 3: Select the Subscription, Storage Account, and Container.

Step 4: In the Connection drop-down, click ‘+ New connection’, then save and close the dialog to create the connection between Azure Storage and Eventhouse.

Additionally, you can choose a file filter, change the Eventstream name or workspace, and select event filters – refer to Get data from Azure storage.

Step 5: Inspect the schema and finish the wizard. This ingests the existing files and creates the connection that will ingest any new file uploaded to the storage account.

Validate

Step 1: Explore the results to see that the existing file in the container has been ingested into the destination table. This data is now available to query.
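A quick sanity check at this point is to sample a few rows from the destination table, again using the hypothetical RawLogs table name:

// Peek at the ingested rows.
RawLogs
| take 10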

Step 2: Upload a new file to the storage container. This emulates a source writing new files to the Azure Storage account. The file will be ingested into the KQL DB destination table in the Eventhouse automatically.

Step 3: Query the table again. The new file added to storage should have been automatically ingested into the destination table – Use example queries in Real-Time Intelligence.
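To confirm the new file arrived, compare the record count or the latest ingestion time before and after the upload. A minimal sketch, once more with the hypothetical RawLogs table:

// Both values should increase after the new file is uploaded.
RawLogs
| summarize TotalRecords = count(), LatestIngestion = max(ingestion_time())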

Step 4: Navigate to Data streams in the KQL database tree to see the status of the data connection. From here, you can filter the data streams, view their status and details, and even delete a data stream, which stops the continuous ingestion.
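If an uploaded file does not show up in the table, ingestion failures are a good place to look. Eventhouse KQL databases support standard Kusto management commands, so a check like the following sketch should apply, though treat its availability here as an assumption:

// List recent ingestion failures for troubleshooting.
.show ingestion failures
| where FailedOn > ago(1h)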

Billing

Charges for the Azure Events used for continuous ingestion from Azure Storage will appear in the Microsoft Fabric Capacity Metrics app under the Eventhouse where continuous ingestion has been set up.

What next?

Continuous Ingestion from Azure Storage to Eventhouse is now available in preview in Microsoft Fabric. Please refer to Get data from Azure storage to learn more and get started today.

Need help or want to suggest an improvement?

Reach out to us on RTI Forum: Get Help with Real-Time Intelligence.

Request or upvote a suggestion on Fabric Ideas RTI: Fabric Ideas.
