Microsoft Fabric Updates Blog

How Stibo Systems’ MDM powers trusted data for analytics and AI in Microsoft Fabric (Preview)

The Stibo Systems Master Data Management (MDM) workload on Microsoft Fabric is now available in preview. Through its Data as a Service (DaaS) feature, the workload integrates enterprise customers' master data and ingests it directly into Fabric OneLake, unlocking analytics and AI use cases.

Unlocking financial insights with Capital Markets DataHub workload—A partner-led innovation in Microsoft Fabric (Preview)

We are releasing the preview of a domain-native data solution built as a workload in Fabric to power analytics and AI-enabled use cases for capital markets and hedge fund customers in the finance industry. Financial institutions are moving fast to modernize their analytics stacks and accelerate AI adoption; in capital markets, however, data complexity remains the single biggest blocker to achieving that goal.

Industrial Analytics delivered at scale: Powered by Fabric Real-Time Intelligence and Fusion Data Hub

Industrial organizations generate a continuous stream of operational signals—temperature, pressure, flow, vibration, energy, and more. Much of that data is captured in plant historians, systems built to collect and store sensor and equipment data over long periods, often driven by compliance needs and operational reporting.
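Conceptually, a historian reduces those continuous sensor streams to windowed aggregates before long-term storage. A minimal sketch of that reduction in Python, with hypothetical tag names and sample values (not drawn from any real historian):

```python
from collections import defaultdict
from statistics import mean

# Hypothetical raw operational signals: (timestamp_seconds, tag, value).
readings = [
    (0, "temperature", 71.2), (15, "temperature", 71.8),
    (30, "temperature", 72.5), (75, "temperature", 73.1),
    (5, "pressure", 101.3), (40, "pressure", 101.1), (70, "pressure", 100.9),
]

def downsample(readings, window_s=60):
    """Aggregate raw readings into per-tag averages per time window,
    the kind of reduction a historian applies before long-term storage."""
    buckets = defaultdict(list)
    for ts, tag, value in readings:
        buckets[(tag, ts // window_s)].append(value)
    return {
        (tag, window * window_s): round(mean(values), 2)
        for (tag, window), values in sorted(buckets.items())
    }

print(downsample(readings))
```

Real-Time Intelligence applies the same windowing idea continuously over live event streams rather than over a static batch.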

Integrating Dynamics 365 Business Central with Microsoft Fabric using Open Mirroring with BC2Fab workload (Generally Available)

Integrating Dynamics 365 Business Central with Microsoft Fabric is a common requirement as organizations modernize their analytics platforms. The primary challenge is not connectivity, but establishing an architecture that scales predictably, protects the ERP system, and enables analytics teams to focus on insights rather than maintaining ingestion pipelines.

Third-party support for OneLake security

As modern data lakes are built on open-source table formats such as Delta Lake and Apache Iceberg, customers expect to use the analytics engines and services that best fit their needs, without copying data or redefining security. This creates a clear requirement: security must be defined once and enforced consistently everywhere data is consumed.
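The "define once, enforce everywhere" requirement can be illustrated with a toy model in which every consuming engine consults the same policy before reading a table. The policy shape, role names, and table names below are hypothetical, not OneLake's actual security model:

```python
# A single security policy consulted by every engine that reads the data.
POLICY = {
    # role -> set of tables that role may read (illustrative names)
    "analyst": {"sales", "customers"},
    "auditor": {"sales"},
}

def can_read(role: str, table: str) -> bool:
    """One authorization decision shared by all consuming engines."""
    return table in POLICY.get(role, set())

def sql_engine_read(role: str, table: str) -> str:
    # Engine A enforces the shared policy before serving rows.
    if not can_read(role, table):
        raise PermissionError(f"{role} may not read {table}")
    return f"rows of {table}"

def spark_engine_read(role: str, table: str) -> str:
    # Engine B is a different runtime, but makes the same policy decision.
    if not can_read(role, table):
        raise PermissionError(f"{role} may not read {table}")
    return f"dataframe over {table}"

print(sql_engine_read("analyst", "sales"))
print(spark_engine_read("auditor", "sales"))
```

Because both engines delegate to one `can_read` check, changing the policy in one place changes what every engine allows, which is the property third-party engines need when honoring OneLake security.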