Gain End-to-End Visibility into Data Activity Using OneLake diagnostics (Generally Available)

OneLake diagnostics makes it simple to answer ‘who accessed what, when, and how’ across your Fabric workspaces, providing capabilities similar to Azure Blob Storage monitoring.

OneLake is the data lake powering Microsoft Fabric. OneLake unifies all your data into a single virtual lake and brings together powerful features for enterprise data management:

  • Shortcuts and Mirroring let you bring data into OneLake without complex ETL—Shortcuts create virtual links without copying data, while Mirroring keeps source data continuously replicated in analytics-ready Delta format.
  • Shortcuts make it simple to connect and reuse data products across your organization.
  • OneLake security ensures only authorized users can access sensitive data.
  • The OneLake catalog makes it easy to discover and leverage data assets for analytics and AI.
  • With OneLake diagnostics, you gain deep visibility into how your data is accessed and used—no matter where it’s consumed.

Alongside Workspace monitoring and user activity tracking accessible through Microsoft Purview, these capabilities make federated data governance a reality at enterprise scale.

Enable diagnostics at the workspace level, and OneLake streams diagnostic events as JSON into a Lakehouse you choose—within the same capacity. You can use these events to unlock usage insights, provide operational visibility, and support compliance reporting.

Why use OneLake diagnostics?

Data governance, operations teams, and data product owners need a unified, trustworthy record of data activity—not just for compliance, but to understand how data products are used and valued across the organization.

With OneLake diagnostics, you gain a consistent log of data access and operations, spanning everything from user actions in the Fabric web experience to programmatic access via APIs, pipelines, and analytics engines.

Even cross-workspace shortcuts are covered: events are captured when diagnostics is enabled in the source workspace, ensuring visibility no matter where data is consumed.

Because events are stored in your Lakehouse as open JSON files, you can analyze them with the tools you already use—Spark, SQL, Eventhouse, Power BI, or any solution that ingests JSON logs. Because OneLake implements the ADLS and Azure Blob Storage APIs, all that data is accessible outside of Fabric too!
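
As a minimal sketch of that external access, the snippet below lists and reads the diagnostic JSON files over the ADLS Gen2 API. The workspace name, lakehouse name, and folder path are placeholders, not values from this post; it assumes the azure-identity and azure-storage-file-datalake packages and that each file contains one JSON event per line.

```python
# Minimal sketch: reading OneLake diagnostic JSON files from outside Fabric via the ADLS Gen2 API.
# The workspace, lakehouse, and folder names are placeholders; substitute your own.
import json

from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

service = DataLakeServiceClient(
    account_url="https://onelake.dfs.fabric.microsoft.com",
    credential=DefaultAzureCredential(),
)

# In OneLake, the workspace acts as the file system (container).
fs = service.get_file_system_client("MyDiagnosticsWorkspace")

# List the JSON event files under the lakehouse Files area (hypothetical subfolder).
for path in fs.get_paths(path="DiagnosticsLakehouse.Lakehouse/Files"):
    if path.is_directory or not path.name.endswith(".json"):
        continue
    raw = fs.get_file_client(path.name).download_file().readall()
    for line in raw.decode("utf-8").splitlines():
        event = json.loads(line)  # assumed: one JSON event per line
        print(event)
```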

How it works

  • In each workspace, a Workspace Admin can enable OneLake diagnostics from the OneLake section of the workspace settings pane and select any Lakehouse in the same capacity to store events. (We recommend a dedicated ‘diagnostics’ workspace with restricted access.) Events are written as append-only JSON files.
  • Convert the JSON diagnostic events into a Delta table using Spark (see the sketch below), import into an Eventhouse, or consume directly via Power BI.
  • Build dashboards that track access frequency, top items, and trends in minutes.
  • Everyday actions—uploading files in the portal, programmatic reads/writes via SDKs, OneLake file explorer, and more—generate events that land in the selected Lakehouse.

Note: Allow ~1 hour after enabling for event flow to begin.
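
To make the landed JSON easy to query, here is a minimal Spark sketch, assuming a Fabric notebook with the diagnostics Lakehouse attached as the default lakehouse; the folder path, table name, and event field names (time, operationName) are assumptions rather than the documented schema.

```python
# Minimal sketch: convert OneLake diagnostic JSON into a Delta table and aggregate for a dashboard.
# The folder path, table name, and field names below are placeholders.
from pyspark.sql import functions as F

# Read the append-only JSON event files from the Lakehouse Files area.
events = spark.read.json("Files/OneLakeDiagnostics/")

# Persist as a Delta table so SQL, Eventhouse, and Power BI can consume it.
events.write.format("delta").mode("append").saveAsTable("onelake_diagnostic_events")

# Example dashboard query: daily access counts per operation.
daily_activity = (
    spark.table("onelake_diagnostic_events")
    .groupBy(F.to_date("time").alias("day"), "operationName")
    .agg(F.count("*").alias("events"))
    .orderBy("day")
)
display(daily_activity)
```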

When are diagnostic events captured?

OneLake diagnostic events capture both internal and external access to data in OneLake.

  • For external access—including activity via the Fabric user experience, or when OneLake is protected by workspace private links—OneLake captures all Azure Blob storage operations.
  • For internal Fabric access, OneLake diagnostics records that access was temporarily provided to the Fabric workload.

This means you get detailed logs where trust is lowest (external access), and high-level logs for internal access that point you to the more detailed records kept by the Fabric experience you’re using.

As a result, retaining diagnostic data is more efficient and cost-effective, with no duplication—yet you still have enough information to understand how your data products are being consumed across your organization.

How does this impact consumption?

The consumption cost of OneLake diagnostics is equivalent to that of Azure Storage diagnostics when diagnostic events are captured into an Azure Storage account.

OneLake diagnostics has two associated meters. For the latest details, refer to the official pricing page: OneLake compute and storage consumption.

  • ‘OneLake Diagnostics Event Operation’: Charges apply as data is written to the destination Lakehouse, at the same rate as OneLake Write operations (or OneLake BCDR Write, if BCDR is enabled).
  • ‘OneLake Diagnostics Data Transfer’: Diagnostic event ingestion is charged at 1.389 CU hours per GB, calculated from the size of the diagnostic events.
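
As a rough worked example, if your workloads generate about 10 GB of diagnostic events in a billing period, the data transfer meter alone would consume roughly 10 × 1.389 ≈ 13.9 CU hours, in addition to the per-write charges on the event operation meter.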

Best practices for configuring OneLake diagnostics

  • Use a dedicated diagnostics workspace: While it’s possible to capture diagnostic events in the same workspace that generates them, we recommend capturing diagnostic events in a different workspace. This allows you to set separate permissions, so users accessing the data do not automatically have access to the diagnostic logs.
  • Centralize diagnostics for multiple workspaces: If you are enabling diagnostics for multiple workspaces within the same capacity, consider capturing all events in the same destination Lakehouse. This consolidates your diagnostic data and makes analysis much simpler.
  • Leverage shortcuts for flexible analysis: You can create shortcuts to diagnostic data, allowing you to compose diagnostics across capacities and regions as needed. This makes it easy to build a unified view of data activity, regardless of where the data or diagnostics are stored (see the sketch after this list).
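
As an illustrative sketch of that last point, a shortcut pointing at a diagnostics Lakehouse in another workspace can be created with the Fabric REST shortcuts endpoint. The IDs, names, folder path, and token acquisition below are placeholders, so verify the request shape against the Fabric REST API reference before relying on it.

```python
# Hypothetical sketch: create a OneLake shortcut that points at diagnostic data in another workspace.
# All IDs, names, and the token are placeholders; check the Fabric REST API docs for the exact schema.
import requests

FABRIC_API = "https://api.fabric.microsoft.com/v1"
token = "<bearer token with Fabric API scope>"          # e.g. acquired via azure-identity / MSAL

consumer_workspace_id = "<workspace that will host the shortcut>"
consumer_lakehouse_id = "<lakehouse item id in that workspace>"

body = {
    "path": "Files",                                     # where the shortcut appears in the consumer lakehouse
    "name": "diagnostics_shortcut",
    "target": {
        "oneLake": {
            "workspaceId": "<diagnostics workspace id>",
            "itemId": "<diagnostics lakehouse id>",
            "path": "Files/OneLakeDiagnostics",          # hypothetical folder holding the JSON events
        }
    },
}

resp = requests.post(
    f"{FABRIC_API}/workspaces/{consumer_workspace_id}/items/{consumer_lakehouse_id}/shortcuts",
    headers={"Authorization": f"Bearer {token}"},
    json=body,
)
resp.raise_for_status()
print(resp.json())
```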

Personal Data

OneLake diagnostic events capture executingUPN and callerIpAddress information. A forthcoming tenant admin setting will provide the option to disable collection of these details. Prior to the availability of this setting, which is expected within the next few weeks, these fields will be redacted.

Monitoring when OneLake diagnostics is enabled

You can use the Microsoft 365 security logs to monitor when OneLake diagnostics is enabled or disabled for your workspaces, via the new ModifyOneLakeDiagnosticSettings event.
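
For instance, if you export those logs to a JSON file, a quick filter on the operation name surfaces the configuration changes. This is a minimal sketch; the file name and record field names are assumptions about a typical export, not a documented schema.

```python
# Minimal sketch: find OneLake diagnostics configuration changes in an exported audit log.
# "audit_export.json" and the record field names are placeholders for your own export format.
import json

with open("audit_export.json", encoding="utf-8") as f:
    records = json.load(f)

changes = [r for r in records if r.get("Operation") == "ModifyOneLakeDiagnosticSettings"]

for r in changes:
    print(r.get("CreationTime"), r.get("UserId"), r.get("Operation"))
```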

Get started today!
