Microsoft Fabric Updates Blog

Boost Performance with Fast Copy in Dataflows Gen2 for Snowflake

Fast Copy in Dataflows Gen2 is a game-changer for the performance and cost-efficiency of your dataflows. By leveraging the same optimized backend as the Copy activity in data pipelines, Fast Copy significantly reduces data processing time and cost.

Fast Copy in Dataflows Gen2 is generally available and enabled by default in all newly created Dataflows Gen2, making it the recommended choice for your production workloads.

In this blog post, you’ll discover how Fast Copy dramatically improves performance and efficiency when loading data from Snowflake into Microsoft Fabric.

We’ll walk through a real-world example using Dataflows Gen2 to load 180 million rows from Snowflake into a Lakehouse table. By comparing performance before and after enabling Fast Copy, you’ll see the substantial impact it can make.

Case 1: Dataflow Gen2 without Fast Copy

Configuration steps to reproduce this scenario:

  1. Create a table in Snowflake with a sales dataset containing approximately 180 million rows.

Tip: You can use any similar dataset, such as the NYC Taxi dataset.

  2. Create a Dataflow Gen2 to load data from Snowflake.
  3. Disable Fast Copy in the Options settings.
  4. Set Lakehouse as the output destination.
  5. Publish and refresh the Dataflow Gen2.
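Step 1 above requires a large sales-style dataset to seed the Snowflake table. As a rough sketch, you could generate synthetic rows to a CSV file and then bulk-load that file into Snowflake (for example with PUT and COPY INTO). The column names here are illustrative assumptions, not the exact schema used in this post:

```python
import csv
import random
from datetime import date, timedelta

# Illustrative columns only; substitute your own sales schema.
FIELDNAMES = ["order_id", "order_date", "product", "quantity", "unit_price"]

def generate_sales_rows(n, start=date(2024, 1, 1)):
    """Yield n synthetic sales records."""
    products = ["Widget", "Gadget", "Gizmo"]
    for i in range(n):
        yield {
            "order_id": i,
            "order_date": (start + timedelta(days=random.randrange(365))).isoformat(),
            "product": random.choice(products),
            "quantity": random.randint(1, 10),
            "unit_price": round(random.uniform(1.0, 500.0), 2),
        }

def write_sales_csv(path, n):
    """Write n synthetic rows to a CSV file suitable for bulk loading."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDNAMES)
        writer.writeheader()
        writer.writerows(generate_sales_rows(n))
```

For the full 180M-row scale, you would write the data in chunks (or generate it directly in Snowflake) rather than holding it in memory.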

Performance Result

The Dataflow Gen2 refresh took approximately 42 minutes to ingest 180M rows.

Case 2: Dataflow Gen2 with Fast Copy

Configuration steps to reproduce this scenario:

  1. Create a Dataflow Gen2 to load the same data from Snowflake (Fast Copy is enabled by default).
  2. Set Lakehouse as the output destination.
  3. Publish and refresh the Dataflow Gen2.

Performance Result

With Fast Copy enabled, the same scenario completes in about 5 minutes.

Summary

This table compares the performance of the two scenarios:

  Scenario                            Refresh duration
  Case 1: without Fast Copy           ~42 minutes
  Case 2: with Fast Copy              ~5 minutes

With Fast Copy enabled in Dataflows Gen2, data processing times are significantly reduced. In this example, loading 180 million rows from Snowflake into a Lakehouse in Microsoft Fabric resulted in an 8X improvement in performance.
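As a quick sanity check, the 8X figure follows directly from the two refresh times measured above:

```python
# Refresh durations observed in the two scenarios (in minutes).
minutes_without_fast_copy = 42
minutes_with_fast_copy = 5

# 42 / 5 = 8.4, which the summary rounds to "8X".
speedup = minutes_without_fast_copy / minutes_with_fast_copy
print(f"Speedup: {speedup:.1f}x")  # Speedup: 8.4x
```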

Give it a try yourself and experience the performance boost firsthand!

November 5, 2025 by Pradeep Srikakolapu