Microsoft Fabric Updates Blog

Accelerating Data Movement by using Fast Copy to unlock performance and efficiency during data ingestion from SQL database in Fabric

Accelerating data movement: leveraging the Fast Copy feature in Dataflow to ingest data from SQL database faster, unlocking performance and efficiency for modern data workloads.

The Fast Copy feature in Dataflow is designed to dramatically reduce data movement latency. It does so primarily through bulk loading and parallel processing: large datasets are divided into manageable chunks that are copied concurrently. This parallelism not only shortens the overall copy duration but also makes efficient use of both source and destination compute resources.
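Conceptually, the chunk-and-parallelize approach works like the sketch below. This is a minimal Python illustration of the idea, not the actual Fast Copy implementation; the chunk size, worker count, and in-memory "destination" are illustrative assumptions.

```python
from concurrent.futures import ThreadPoolExecutor

def chunked(rows, chunk_size):
    """Divide the source rows into manageable chunks."""
    return [rows[i:i + chunk_size] for i in range(0, len(rows), chunk_size)]

def copy_chunk(chunk):
    """Stand-in for bulk-loading one chunk into the destination."""
    return list(chunk)

def fast_copy(rows, chunk_size=1000, workers=4):
    """Copy rows by bulk-loading chunks in parallel.

    pool.map yields results in submission order, so the destination
    receives rows in the same order as the source.
    """
    destination = []
    with ThreadPoolExecutor(max_workers=workers) as pool:
        for chunk in pool.map(copy_chunk, chunked(rows, chunk_size)):
            destination.extend(chunk)
    return destination
```

Because each chunk is copied independently, the workers can saturate source reads and destination writes at the same time, which is where the latency reduction comes from.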

Now, the Fast Copy capability can also be used to ingest large volumes of data from SQL database in Fabric.

Note: Fast Copy can use SQL database as a source only.


Implementing Fast Copy in Dataflow

Step 1 – Create a new dataflow by selecting the Dataflow Gen2 item.

Step 2 – Select the Options icon.


Step 3 – Selecting the Options icon opens the Options dialog. From the left menu, select Scale under Dataflow and make sure the ‘Allow use of fast copy connectors’ check box is selected.


Step 4 – Select a SQL database as the source and OneLake as the destination to build your dataflow.

Step 5 – Publish and run the Dataflow.

Note: Dataflow automatically switches to Fast Copy when the data size exceeds 100 MB or 1 million rows.

To force the use of Fast Copy when the data is below 100 MB or 1 million rows, right-click the query and select the ‘Require fast copy’ option.
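The automatic switch described above amounts to a simple threshold check. A hedged sketch of the decision logic (the 100 MB and 1-million-row thresholds come from the note above; the `require_fast_copy` flag mirrors the per-query ‘Require fast copy’ setting, and the exact comparison the engine uses is an assumption):

```python
SIZE_THRESHOLD_BYTES = 100 * 1024 * 1024  # 100 MB
ROW_THRESHOLD = 1_000_000                 # 1 million rows

def uses_fast_copy(size_bytes, row_count, require_fast_copy=False):
    """Decide whether a query's data movement runs through Fast Copy."""
    if require_fast_copy:
        # 'Require fast copy' forces Fast Copy regardless of data size.
        return True
    return size_bytes > SIZE_THRESHOLD_BYTES or row_count > ROW_THRESHOLD
```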


Step 6 – Navigate to the Refresh history of the Dataflow to verify whether Fast Copy was used. Select the start time of the run.


Select an activity from the Activities list.


Screenshot example – the engine shows ‘CopyActivity’, which means this data movement used Fast Copy.


Screenshot example – with ‘Allow use of fast copy connectors’ disabled, the engine shows ‘-’, which means the Dataflow didn’t use Fast Copy.


The Fast Copy feature in Dataflow reduces data movement latency and enables high-throughput, low-touch migrations. If your organization is striving for operational excellence and analytics at the speed of business, now is the time to explore Fast Copy in your Dataflow pipelines—and experience the future of data integration today.

To learn more, refer to the Fast copy in Dataflow Gen2 documentation.

 

November 21, 2025, by Sravani Saluru