Microsoft Fabric Updates Blog

Announcing: Fabric Warehouse publishing full DML to Delta Lake Logs

We are excited to announce that the Data Warehouse now publishes all Inserts, Updates, and Deletes for each table to its Delta Lake Log in OneLake!

Our vision is to break down data silos and make it easy to share data from your Data Warehouses with other teams who use different services, without having to create copies of your data in different formats.

What does this mean?

Today, teams have a wide range of skills and varying comfort levels with different tools and query languages such as Python, T-SQL, KQL, and DAX. Instead of requiring copies of your data in a different format for each tool and service, Fabric leverages Delta Lake as a common format across all of its services. Keeping only one copy of your data makes it more secure and easier to manage, keeps it consistent across reports, and makes it faster and simpler to share.

The Data Warehouse supports this by publishing Delta Lake Logs for every table that you create in your Data Warehouses. When you modify data in a Data Warehouse table, those changes become visible in the Delta Lake Log within one minute of the transaction being committed.
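To see what this looks like in practice, here is a minimal sketch of reading a Warehouse table's published Delta data directly from a Spark notebook. The workspace, warehouse, and table names are hypothetical placeholders, and the OneLake path layout shown is an assumption based on the documented pattern for Warehouse tables; adjust everything to match your own items.

```python
from pyspark.sql import SparkSession

# In a Fabric notebook, `spark` is already provided; getOrCreate() reuses it.
spark = SparkSession.builder.getOrCreate()

# Hypothetical workspace, warehouse, and table names -- substitute your own.
# The path follows the OneLake layout for Warehouse tables.
path = (
    "abfss://MyWorkspace@onelake.dfs.fabric.microsoft.com/"
    "MyWarehouse.Datawarehouse/Tables/dbo/dim_customer"
)

# Spark discovers the table's current state from the published Delta Lake Log,
# so inserts, updates, and deletes appear within about a minute of commit.
df = spark.read.format("delta").load(path)
df.show(10)
```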

For example, say you want to use Python to query a Data Warehouse table from a Notebook in a Lakehouse. All you need to do is create a new shortcut in the Lakehouse and point it at the Data Warehouse table. That table is now directly accessible from your Notebook, and no data has been copied or duplicated! Data Scientists and Data Engineers will love how easy it is to incorporate Data Warehouse tables into projects such as machine learning and AI model training.
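As a quick illustration, here is a minimal sketch of querying such a shortcut from a Fabric notebook. The shortcut name dim_customer and the Region column are hypothetical placeholders for this example.

```python
from pyspark.sql import SparkSession

# `spark` is predefined in Fabric notebooks; getOrCreate() reuses that session.
spark = SparkSession.builder.getOrCreate()

# "dim_customer" is a hypothetical shortcut in the Lakehouse's Tables section
# that points at the Warehouse table -- no data is copied or duplicated.
df = spark.read.table("dim_customer")
df.printSchema()

# The same shortcut is also queryable through Spark SQL ("Region" is a
# hypothetical column in the Warehouse table):
spark.sql(
    "SELECT Region, COUNT(*) AS customers FROM dim_customer GROUP BY Region"
).show()
```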

To learn more about how to create shortcuts that point to Data Warehouse Tables, please see this documentation article: Create a OneLake shortcut – Microsoft Fabric | Microsoft Learn

Conclusion

You might be wondering: how do I enable this? The answer is that you do not have to do anything! It all happens automatically for your Data Warehouses.

Note that only tables created from this point forward will have all DML published. If you have an older table that you want fully published, use CTAS (CREATE TABLE AS SELECT) to create a new copy of the table with all of its data, or drop the table and reload it.
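If it helps, here is a minimal T-SQL sketch of that CTAS approach. The table names are hypothetical, and the final rename step assumes sp_rename is available in your Warehouse; verify before running in your environment.

```sql
-- Hypothetical names: dbo.SalesOrders is an older table created before
-- full DML publishing. Recreate it with CTAS so the new copy is published.
CREATE TABLE dbo.SalesOrders_new
AS
SELECT * FROM dbo.SalesOrders;

-- After validating the copy, retire the old table.
DROP TABLE dbo.SalesOrders;

-- If sp_rename is available in your Warehouse, restore the original name.
EXEC sp_rename 'dbo.SalesOrders_new', 'SalesOrders';
```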

To learn more about how to leverage your Data Warehouse’s data through its published Delta Lake Logs, please see our documentation Delta Lake logs in Warehouse – Microsoft Fabric | Microsoft Learn.

Related blog posts


November 5, 2025 by Pradeep Srikakolapu

In our earlier announcement, we shared that newly created data warehouses, lakehouses and other items in Microsoft Fabric would no longer automatically generate default semantic models. This change allows customers to have more control over their modeling experience and to explicitly choose when and how to create semantic models. Starting November 20, 2025, Power BI … Continue reading “Decoupling Default Semantic Models for Existing Items in Microsoft Fabric”

November 3, 2025 by Jovan Popovic

Data ingestion is one of the most important actions in Data Warehouse solutions. In Microsoft Fabric Data Warehouse, the OPENROWSET function provides a powerful and flexible way to read data from files stored in Fabric OneLake or external Azure Storage accounts. Whether you're working with Parquet, CSV, TSV, or JSONL files, the OPENROWSET function … Continue reading “Ingest files into your Fabric Data Warehouse using the OPENROWSET function”