Microsoft Fabric Updates Blog

Updates to default data destination behavior in Dataflow Gen2

When you have a Lakehouse or Warehouse and you want to load data into it, you can use Dataflow Gen2 as an easy, low-code way for landing your data with the right shape.

You can always create a stand-alone Dataflow Gen2 and use the data destinations to load your data into any Fabric Lakehouse or Warehouse, but to speed up your development, there are other easy ways to land your data faster.

This blog post walks you through that experience and highlights important changes that were recently made.

Within the Lakehouse or Warehouse experience, you can get data through a variety of options.

Screenshot: selecting New Dataflow Gen2 from the Get data dropdown in the Lakehouse or Warehouse.

When you choose Dataflow Gen2 from either the Lakehouse or the Warehouse, the data destination experience is slightly different from a 'standard' Dataflow Gen2 created from the workspace.

By default, any query that you create will have the Lakehouse or Warehouse you started from set as the data destination. If you hover over the data destination icon, you can see that the destination is labeled as 'default destination'. This differs from a standard Dataflow Gen2, where you have to explicitly assign a data destination to a query.

Screenshot: hovering over the data destination icon shows the destination labeled as 'default destination'.

With the default destination, the settings are set to a default behavior that cannot be changed. These are the behaviors for both the Lakehouse and Warehouse default destinations:

Behavior                   Lakehouse   Warehouse
Update method              Replace     Append
Schema change on publish   Dynamic     Fixed

Note: Previously, update method for Lakehouse was append. This is now changed to replace.
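To make the difference between the two update methods concrete, here is an illustrative Python sketch of what 'replace' and 'append' mean when a refresh lands new rows in a destination table. This is only a conceptual model of the behavior described above, not Fabric's implementation; the function and data are hypothetical.

```python
def load(existing_rows, new_rows, update_method):
    """Simulate a dataflow refresh writing new_rows into a destination table.

    'replace' (Lakehouse default): existing rows are dropped, then the new
    rows are written. 'append' (Warehouse default): new rows are added after
    the existing ones.
    """
    if update_method == "replace":
        return list(new_rows)
    elif update_method == "append":
        return list(existing_rows) + list(new_rows)
    raise ValueError(f"unknown update method: {update_method}")


existing = [{"id": 1}, {"id": 2}]
incoming = [{"id": 3}]

print(load(existing, incoming, "replace"))  # [{'id': 3}]
print(load(existing, incoming, "append"))   # [{'id': 1}, {'id': 2}, {'id': 3}]
```

Note how 'replace' means the destination table only ever contains the output of the latest refresh, which is why the change from append to replace for Lakehouse default destinations matters if you were relying on rows accumulating across refreshes.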

To edit the settings of an individual data destination, use the gear icon on the destination. When you edit an individual data destination, the change affects only that specific query. It is currently not possible to change the behavior of the default destination itself.
