Microsoft Fabric Updates Blog

Boost Performance with Fast Copy in Dataflows Gen2 for Snowflake

Fast Copy is a game-changer for the performance and cost-efficiency of your Dataflows Gen2. By leveraging the same optimized backend as the Copy activity in data pipelines, it significantly reduces data processing time and cost.

Fast Copy in Dataflows Gen2 (Generally Available) is enabled by default in all newly created Dataflows Gen2, making it the recommended choice for your production workloads.

In this blog post, you’ll discover how Fast Copy dramatically improves performance and efficiency when loading data from Snowflake into Microsoft Fabric.

We’ll walk through a real-world example using Dataflows Gen2 to load 180 million rows from Snowflake into a Lakehouse table. By comparing performance before and after enabling Fast Copy, you’ll see the substantial impact it can make.

Case 1: Dataflow Gen2 without Fast Copy

Configuration steps to reproduce this scenario:

  1. Create a table in Snowflake with a Sales Data dataset containing approximately 180 million rows, using a typical sales schema.

Tip: You can use any similar dataset, such as the NYC Taxi dataset.
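If you don't have a comparable dataset on hand, the following Python sketch generates a synthetic sales dataset you can bulk-load into Snowflake. The schema and column names here are illustrative assumptions, not the exact schema used in the test; scale the row count up toward 180 million for a realistic run.

```python
import csv
import random
from datetime import date, timedelta

# Hypothetical sales schema; column names are illustrative assumptions.
COLUMNS = ["order_id", "order_date", "customer_id", "product_id", "quantity", "unit_price"]

def generate_rows(n, seed=42):
    """Yield n synthetic sales rows with a fixed seed for reproducibility."""
    rng = random.Random(seed)
    start = date(2020, 1, 1)
    for i in range(n):
        yield [
            i + 1,                                                      # order_id
            (start + timedelta(days=rng.randrange(1461))).isoformat(),  # date in 2020-2023
            rng.randrange(1, 100_001),                                  # customer_id
            rng.randrange(1, 5_001),                                    # product_id
            rng.randrange(1, 11),                                       # quantity
            round(rng.uniform(1.0, 500.0), 2),                          # unit_price
        ]

def write_csv(path, n):
    """Write n rows plus a header to CSV, ready for a Snowflake bulk load."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(COLUMNS)
        writer.writerows(generate_rows(n))

write_csv("sales_sample.csv", 1_000)  # small sample; the test above uses ~180M rows
```

From there, you can stage the file and load it with Snowflake's standard bulk-load commands (PUT and COPY INTO).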

  2. Create a Dataflow Gen2 to load data from Snowflake.
  3. Disable Fast Copy in the Options settings.
  4. Set Lakehouse as the output destination.
  5. Publish and refresh the Dataflow Gen2.

Performance Result

The Dataflow Gen2 refresh took approximately 42 minutes to ingest 180M rows.

Case 2: Dataflow Gen2 with Fast Copy

Configuration steps to reproduce this scenario:

  1. Create a Dataflow Gen2 to load the same data from Snowflake. (Fast Copy is enabled by default.)
  2. Set Lakehouse as the output destination.
  3. Publish and refresh the Dataflow Gen2.

Performance Result

With Fast Copy enabled, the same scenario completes in about 5 minutes.

Summary

This table compares the performance of the two scenarios:

| Scenario | Refresh duration |
| --- | --- |
| Dataflow Gen2 without Fast Copy | ~42 minutes |
| Dataflow Gen2 with Fast Copy | ~5 minutes |

With Fast Copy enabled in Dataflows Gen2, data processing times are significantly reduced. In this example, loading 180 million rows from Snowflake into a Lakehouse in Microsoft Fabric resulted in an 8X improvement in performance.

Give it a try yourself and experience the performance boost firsthand!

