Microsoft Fabric Updates Blog

Zero-copy access to OneLake data in Azure Databricks (Preview)

Most data estates are not single platform, and that is not a problem. The challenge is what usually comes next: extra copies, extra pipelines, extra refresh schedules, and endless debates about which version is the truth.

Today, we are introducing OneLake catalog federation (Beta) in Azure Databricks Lakehouse Federation, which simplifies multi-engine analytics by enabling Unity Catalog in Azure Databricks to query data stored in OneLake. This allows you to analyze Fabric tables without copying the data.

Figure: Microsoft Fabric and Azure Databricks integration diagram showing metadata syncing from Fabric items to Unity Catalog and zero-copy query access from Databricks compute to data in OneLake, without moving or duplicating data.

Why this matters

Fabric is built around a simple idea: your data should be usable the moment it lands, and it should stay governed and consistent wherever it is consumed. OneLake is the foundation that makes that real by giving your organization a single, shared data lake that is easy to manage, secure, and scale.

OneLake catalog federation extends that promise even further. It allows more teams to use the same curated data products in OneLake without creating additional copies or building parallel pipelines just to satisfy different tools. That means fewer moving parts, fewer refresh problems, and far less time spent reconciling datasets. You keep OneLake as the source of truth and the Fabric experience intact, while expanding how broadly those OneLake data products can be used across the organization.

What you can do

  • Discover Fabric tables from Unity Catalog
    Once connected, your Fabric schemas and tables show up in Unity Catalog through a foreign catalog that stays in sync.
  • Query OneLake data from Databricks compute
    You run Databricks SQL and notebooks as usual, using catalog.schema.table naming.
  • Keep OneLake as the source of truth
    No extra storage copies. No extra refresh jobs.
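To illustrate the three-level `catalog.schema.table` naming, the sketch below builds a fully qualified Unity Catalog identifier and a query string. The catalog, schema, and table names are hypothetical, and the final `spark.sql` call shown in the comment assumes a Databricks session with access to the foreign catalog:

```python
def qualified_name(catalog: str, schema: str, table: str) -> str:
    """Build a Unity Catalog three-level identifier, backtick-quoting
    each part so names with special characters remain valid SQL."""
    def quote(part: str) -> str:
        return "`" + part.replace("`", "``") + "`"
    return ".".join(quote(p) for p in (catalog, schema, table))

# Hypothetical names for a foreign catalog backed by a Fabric item.
table = qualified_name("fabric_sales", "dbo", "orders")
query = f"SELECT order_id, amount FROM {table} LIMIT 10"
print(query)
# In a Databricks notebook you would then run: spark.sql(query).show()
```

The backtick quoting mirrors how Databricks SQL escapes identifiers, so the same helper works for workspace or table names that contain spaces or hyphens.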

Conceptual overview

  1. You create a OneLake connection in Unity Catalog.
  2. You create a foreign catalog that points to a specific Fabric item.
  3. Databricks syncs the metadata, so schemas and tables appear in Unity Catalog.
  4. Queries run on Databricks compute while reading the data in OneLake.
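The first two steps above map onto Lakehouse Federation's standard `CREATE CONNECTION` / `CREATE FOREIGN CATALOG` DDL. The sketch below only assembles the statements as strings; the connection type and option keys shown are illustrative assumptions, not the documented values (see the linked walkthrough for those), and in Databricks you would execute each string with `spark.sql(...)`:

```python
def federation_setup_sql(conn_name: str, catalog_name: str,
                         workspace: str, item: str) -> tuple[str, str]:
    """Return the two DDL statements for Lakehouse Federation:
    a connection to OneLake, then a foreign catalog bound to one
    Fabric item. Option keys are placeholders for illustration."""
    create_connection = (
        f"CREATE CONNECTION {conn_name} TYPE onelake "   # type name is an assumption
        f"OPTIONS (workspace '{workspace}')"
    )
    create_catalog = (
        f"CREATE FOREIGN CATALOG {catalog_name} "
        f"USING CONNECTION {conn_name} OPTIONS (item '{item}')"
    )
    return create_connection, create_catalog

conn_sql, cat_sql = federation_setup_sql(
    "onelake_conn", "fabric_sales", "Contoso-Workspace", "SalesLakehouse")
print(conn_sql)
print(cat_sql)
```

After the foreign catalog exists, steps 3 and 4 need no user action: Databricks syncs the metadata, and queries against `fabric_sales.*` read directly from OneLake.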

Follow the full walkthrough in the documentation: Enable OneLake catalog federation – Azure Databricks | Microsoft Learn

Beta limitations

This capability is in Beta. Explore the specific requirements and supported configurations detailed in the Beta limitations documentation.

Try it out

OneLake is about removing friction between data and value. OneLake catalog federation is another step in that direction: fewer copies, simpler architecture, and broader reuse of the data products you already build in Fabric.

A shared OneLake foundation unlocks new possibilities for what teams can build next.
