
Announcing: Fabric Warehouse publishing full DML to Delta Lake Logs

We are excited to announce that the Data Warehouse now publishes all Inserts, Updates, and Deletes for each table to its Delta Lake Log in OneLake!

Our vision is to break down data silos and make it easy to share data from your Data Warehouses with teams that use different services, without having to create copies of your data in different formats.

What does this mean?

Today, teams have a wide set of skills and varying comfort levels with different tools and query languages such as Python, T-SQL, KQL, and DAX. Instead of requiring copies of your data in different formats for each tool and service, Fabric leverages Delta Lake as a common format across all of its services. Keeping a single copy of your data makes it more secure, easier to manage, and faster to share, and it ensures the data stays consistent across reports.

The Data Warehouse supports this by publishing a Delta Lake Log for every table that you create in your Data Warehouses. When you modify data within a Data Warehouse table, those changes become visible in the Delta Lake Log within one minute of the transaction being committed.
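Because the published log is standard Delta Lake, any Delta-capable client can watch those commits arrive. As a minimal sketch, assuming placeholder workspace, warehouse, and table names, the usual OneLake path layout, and the `spark` session that Fabric Notebooks provide, you could inspect a table's history from a Notebook:

```python
# A minimal sketch: inspect the Delta Lake Log that the Warehouse publishes
# for a table in OneLake. "MyWorkspace", "MyWarehouse", and "Sales" are
# placeholders; the path follows the usual OneLake workspace/item pattern.
path = (
    "abfss://MyWorkspace@onelake.dfs.fabric.microsoft.com/"
    "MyWarehouse.Datawarehouse/Tables/dbo/Sales"
)

# DESCRIBE HISTORY reads the Delta log itself, so every committed insert,
# update, and delete surfaces as a new table version shortly after commit.
history = spark.sql(f"DESCRIBE HISTORY delta.`{path}`")
history.select("version", "timestamp", "operation").show(5)
```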

For example, say you want to use Python to query a Data Warehouse table from a Notebook in a Lakehouse. All you need to do is create a new shortcut in the Lakehouse and point it at the Data Warehouse Table. That table is now directly accessible from your Notebook, and no data has been copied or duplicated! Data Scientists and Data Engineers will love how easy it is to incorporate Data Warehouse Tables into projects such as machine learning and training AI models.
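As a hedged sketch of that flow, with "Sales" standing in for whatever you name the shortcut:

```python
# A minimal sketch: query the shortcut like any other Lakehouse table.
# "Sales" is a placeholder for the shortcut name; `spark` is the session
# every Fabric Notebook provides.
df = spark.read.table("Sales")
df.limit(10).show()

# Spark SQL works against the shortcut as well.
spark.sql("SELECT COUNT(*) AS row_count FROM Sales").show()
```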

To learn more about how to create shortcuts that point to Data Warehouse Tables, please see this documentation article: Create a OneLake shortcut – Microsoft Fabric | Microsoft Learn

Conclusion

You might wonder: how do I enable this? The answer is that you do not have to do anything! It all happens automatically for your Data Warehouses.

Note that only tables created from this point forward will have all DML published. If you have an older table that you want fully published, you will need to use CTAS (CREATE TABLE AS SELECT) to create a new copy of the table with all of its data, or drop the table and reload it.
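If you are unsure whether a given table predates full DML publishing, one rough check (a sketch, not an official procedure: "Sales" is a placeholder shortcut name, and `notebookutils` is the file-system helper preinstalled in Fabric Notebooks) is to list the table's `_delta_log` folder and see whether recent commits appear:

```python
# A rough, hedged check: list the _delta_log folder behind a Lakehouse
# shortcut to a Warehouse table. "Sales" is a placeholder shortcut name.
# One JSON file is written per committed transaction, so a stale or sparse
# listing after recent DML suggests the table predates full publishing
# and should be recreated with CTAS as described above.
for f in notebookutils.fs.ls("Tables/Sales/_delta_log"):
    print(f.name)
```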

To learn more about how to leverage your Data Warehouse’s data through its published Delta Lake Logs, please see our documentation Delta Lake logs in Warehouse – Microsoft Fabric | Microsoft Learn.
