
Operational Reporting with Microsoft Fabric Real-Time Intelligence

Operational reporting and historical reporting serve distinct purposes in organizations. Historically, data teams have leaned heavily on historical reporting, because reporting on live operational business processes has proved elusive.

As a result, organizations have created reports directly against the operational database for operational needs, or have spent significant effort trying to get analytical tools to refresh faster through ‘micro-batching’ and/or by keeping a tool like Power BI in DirectQuery mode. These efforts share a single goal: moving data through the system as fast as possible.

Consider a typical modern data warehouse architecture, where data moves from source systems through a series of staged jobs into a warehouse before it ever reaches a report.

To avoid conflicts during updates and improve timeliness, organizations often daisy-chain Extract, Load, Transform (ELT) jobs together. However, there is a limit to how much these processes can be compressed or sped up. The simple fact is that there are too many steps in the pipeline to deliver timely insights, so organizations are forced to accept daily, hourly, or every-few-minutes refreshes as ‘good enough’. This architecture also gives up the ability to truly integrate data into the fabric of the business: leveraging it as a semi-autonomous agent that can take actions, call external systems, and notify users of exceptions as they occur.

Report faster with Real-Time Intelligence 

Real-time data integration is where Microsoft Fabric Real-Time Intelligence (RTI) shines. RTI solves traditional data processing challenges and allows data to be leveraged as events within the organization. Real-Time Intelligence is not just for Internet of Things (IoT) data or logs: while the engine is powerful enough to handle these workloads, it offers many additional benefits. RTI fosters partnerships between data and business teams, moving the conversation from ‘create me a report’ to ‘what do we want our data to do?’.

The power of RTI lies in how data is ingested and loaded into Fabric. Before diving deeper, let’s review the four main components of RTI: 

  • Eventstreams are event listeners: they wait for messages to be sent to them (data is pushed). This is unlike pipelines, notebooks, and other traditional data processing tools, which pull data from their sources.
  • Eventhouse is Microsoft’s massively scalable ingestion engine, able to handle millions of events per hour.
  • Real-Time Dashboards query the data in real time as it’s being loaded. Every time a query is run, it leverages the latest data available in an Eventhouse or OneLake. If you are from the Power BI world, think of Real-Time Dashboards like DirectQuery, but without the need to load data into a semantic model (a sample tile query appears after this list).
  • Activator takes events as they are being processed into Eventstreams or Eventhouses and connects them to downstream systems to make data actionable.   
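
For example, here is a minimal sketch of the kind of KQL query a Real-Time Dashboard tile might run. The WorkOrders table and its columns are hypothetical stand-ins:

```kql
// Hypothetical dashboard tile: completed work orders over the last hour,
// bucketed into 5-minute bins. Every refresh re-runs this query against
// the latest ingested data; no semantic model sits in between.
WorkOrders
| where Status == "Completed" and CompletedAt > ago(1h)
| summarize CompletedOrders = count() by bin(CompletedAt, 5m)
| render timechart
```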

Since all of this is integrated into the Fabric experience, you can also use Power BI reports, send data to Lakehouses, and leverage this data with other Fabric workloads, such as machine learning training. 

Scenario 

Consider the following scenario. An organization has a work order system, and events in the work order lifecycle drive many different downstream business processes:

  • The work order is dispatched to a service technician to complete a repair or an installation.  
  • The service technician must manually check available inventory and order new parts.
  • There are sensors on all the trucks in the fleet, showing their current location.  
  • A custom messaging application shows customers how close their technician is to arriving. This legacy system runs separately and is not integrated with the rest of the organization due to speed and volume requirements. 
  • Finance needs to create an invoice and prepare it for processing. They also need this data for monthly reporting.   
  • Inventory needs to be ordered as supplies are depleted. 
  • Upon dispatch, the technician goes to the location to fulfill the work order. 
  • After completion, a customer survey is sent along with an invoice.  

In the existing cloud world, a typical implementation of the scenario looks like this:

  • A data lake is created, and the data is moved through various layers (Bronze, Silver, Gold) before being made available in a data warehouse. In reality, this is usually 3-4 steps spread over multiple storage accounts.
  • Data is loaded from Gold to a compute engine such as Azure Synapse, SQL, or Postgres. 
  • Data is moved from the staging tables in the compute layer to cleaned tables for the particular business process. 
  • Data from the cleaned tables is exposed through views for final consumption.
  • This is all orchestrated by Azure Data Factory or Apache Airflow.
  • The data is loaded hourly, and Power BI reports built on it are sent to the various departments.
  • The work order team gets a Power BI report that shows which orders came in recently, along with the latest truck location data (also batched hourly through a cloud process). A user then manually reconciles the nearest technician against available dispatch data in another report.
  • The finance team gets a report, updated the next day, showing the previous day’s orders, from which they start creating invoices in a CRM system. A separate monthly report shows which work orders were loaded last month.
  • The inventory team gets another report (updated daily) that shows what is running low on stock and what to order.  
  • The inventory manager checks the report and orders a new supply of the product.   
  • Specialized logic apps are built by the data engineering team based on carefully curated requirements from the business to create alerts for high-impact criteria. 

In Azure, these processes are spread across multiple teams within an organization. To work on the same dataset, it is often copied into storage, moved to another resource group, and then re-loaded into a new compute engine. Financially, this solution requires multiple components that each come with their own cost: storage, notebooks, compute, orchestration, Logic Apps, and Power BI. At a minimum, that is six different services, each with its own bill.

Additionally, these data stores need to be kept in sync, lineage must be tracked, and time to insight slows. There is operational overhead as well: service principals, app registrations, and authentication must be established across these endpoints to ensure proper access controls.

Using Fabric Real-Time Intelligence, the scenario can be solved more simply, as in the following example:

  • A Fabric workspace is created for the work order team.  
  • An eventstream is created for incoming work orders.  
  • The existing application sends data to this eventstream via an event/change feed, change data capture (CDC), or a Kafka message/topic.
  • The work order is loaded into an eventhouse, where it is processed and updated via an update policy. Update policies transform events in-flight before loading them into the next table in the process (a minimal sketch appears after this list).
  • A KQL query is written in this eventhouse that leverages inventory data ingested from OneLake.
  • Another eventstream is created that loads the vehicle telemetry information into the same eventhouse.
  • Geospatial mapping is applied during ingestion to identify truck locations (see the nearest-technician sketch after this list).
  • A real-time dashboard provides an up-to-date view of the work order system: which work orders have been processed today, how that compares to historical averages (a sample comparison query appears after this list), and a map showing where orders are being placed and where the nearest technician is.
  • An activator creates a dispatch record for the nearest available technician. It triggers a Fabric notebook, which makes outgoing API calls to dispatch the technician and send the customer an ETA.
  • A link is provided to the customer to track their technician’s arrival time in real time.
  • Another workspace is created for the finance team.  
  • A database shortcut is created from the eventhouse, surfacing the latest work order data to the finance team (a sample cross-database query appears after this list).
  • The finance team creates an activator that calls a Power Automate job which creates and stores the invoice.  
  • After the technician marks the job complete, the finance team has another activator process that runs which sends the invoice to the customer.  
  • Leveraging OneLake, finance also loads the data into a SQL database or processes it via Spark, applying team-specific transformations and custom business logic to create Power BI reports. There’s no need to copy data into different resource groups as in the solution above!
  • A workspace is created for the inventory team, which leverages an eventstream to bring in inventory data and store it in an eventhouse. The inventory team runs its own activators that automatically alert inventory managers about low-running items, and the inventory manager orders new stock from a custom outside application.
  • An eventhouse shortcut is created for the work order team, which uses the data to check inventory before dispatching the technician. 
  • A Power BI report is created directly against the eventhouse by the inventory team to see current stock across all products in the organization.
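
To make the update policy step concrete, here is a minimal sketch. The table names, function name, and payload fields are illustrative, not taken from a real deployment:

```kql
// Raw landing table: the eventstream writes each event as a dynamic blob.
.create table WorkOrdersRaw (RawEvent: dynamic)

// Curated table populated automatically by the update policy.
.create table WorkOrders (OrderId: string, Status: string, CompletedAt: datetime)

// Transformation applied to each ingested batch, in-flight.
.create-or-alter function ExpandWorkOrders() {
    WorkOrdersRaw
    | project OrderId = tostring(RawEvent.orderId),
              Status = tostring(RawEvent.status),
              CompletedAt = todatetime(RawEvent.completedAt)
}

// Wire the policy: every batch landing in WorkOrdersRaw is transformed
// and appended to WorkOrders.
.alter table WorkOrders policy update
@'[{"IsEnabled": true, "Source": "WorkOrdersRaw", "Query": "ExpandWorkOrders()", "IsTransactional": false}]'
```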
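The nearest-technician lookup could be sketched as follows, assuming a hypothetical TruckTelemetry table of per-truck GPS readings; names and coordinates are illustrative:

```kql
// Take each truck's latest reported position, compute its distance in
// meters to the work order location, and keep the closest truck.
let orderLongitude = -122.13;  // hypothetical work order coordinates
let orderLatitude = 47.64;
TruckTelemetry
| summarize arg_max(Timestamp, Longitude, Latitude) by TruckId
| extend DistanceMeters = geo_distance_2points(Longitude, Latitude, orderLongitude, orderLatitude)
| top 1 by DistanceMeters asc
```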
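The dashboard’s today-versus-history comparison might look like the sketch below, again against the hypothetical WorkOrders table:

```kql
// Compare today's completed orders with the trailing 30-day daily average.
let completedToday = toscalar(
    WorkOrders
    | where Status == "Completed" and CompletedAt > startofday(now())
    | count);
WorkOrders
| where Status == "Completed" and CompletedAt between (ago(30d) .. startofday(now()))
| summarize DailyCount = count() by bin(CompletedAt, 1d)
| summarize HistoricalDailyAvg = avg(DailyCount)
| extend CompletedToday = completedToday
```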
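And the finance team’s query over the database shortcut could be as simple as the following; the WorkOrdersEventhouse database name is hypothetical:

```kql
// Read the work order team's data through a database shortcut:
// no copy, no pipeline, just a cross-database reference.
database('WorkOrdersEventhouse').WorkOrders
| where CompletedAt > startofday(now())
| project OrderId, Status, CompletedAt
```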

To the business, this provides much tighter integration with everyday business processes and fuels partnership between the data team and the business teams to embed analytics where the work happens. The setup experience is a bit different, as you can see above; however, for the business user it leads to an implementation that is aligned directly with their objectives.

Conclusion

Implementing RTI removes traditional data bottlenecks and accelerates innovation. Each business team can move at their own pace and create reports as needed. The native integration of Purview means governance is built in. Additional performance metrics can be monitored in a centralized analytics workspace that reports on telemetry from various workspaces. Real-Time Intelligence in Microsoft Fabric shifts the organizational data paradigm, allowing efficient and effective data use.

