
Fast Copy with On-premises Data Gateway Support in Dataflow Gen2

Fast Copy boosts the speed and cost-effectiveness of your Dataflow Gen2 data loads. In our earlier test, we loaded a 6 GB CSV file into a Lakehouse table in Microsoft Fabric 8x faster and 3x cheaper with Fast Copy enabled. See our previous post for details.

Today, we’re excited to announce that Fast Copy in Dataflow Gen2 now supports high-performance data transfers from on-premises data stores through a gateway. You can use the existing on-premises data gateway registered in your Data Factory to access on-premises stores such as SQL Server with Fast Copy in Dataflow Gen2.

Let’s see how it works. 

Assume you have an on-premises data gateway installed and registered in your Data Factory, as shown below.

Create your Dataflow Gen2 and connect to data in SQL Server through that on-premises data gateway.
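Before pointing the dataflow at the source, you may want to confirm that the SQL Server instance is reachable from the gateway machine. The sketch below is an optional sanity check and is not part of Dataflow Gen2 itself; the driver, server, database, table name, and credentials are placeholder assumptions, not values from this walkthrough.

```python
# Optional check, run on the gateway machine: can we reach the on-premises
# SQL Server source that the gateway will broker for Dataflow Gen2?
import pyodbc

conn_str = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=onprem-sql01;"        # hypothetical server name
    "DATABASE=SalesDB;"           # hypothetical database
    "Trusted_Connection=yes;"     # or UID/PWD, matching the gateway credentials
)

conn = pyodbc.connect(conn_str)
cursor = conn.cursor()
# dbo.Orders is a placeholder for the table your dataflow will read.
row_count = cursor.execute("SELECT COUNT(*) FROM dbo.Orders").fetchone()[0]
print("Row count in source table:", row_count)
conn.close()
```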

Click the Options button in the Dataflow Gen2 pane to enable the Fast Copy feature.

After your data loading step is evaluated, an indicator also shows that your data will be loaded via Fast Copy.

After you finish configuring the Dataflow Gen2, publish it. You will see that the copy activity engine is used to transfer your data quickly from the on-premises SQL Server.
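Once the refresh completes, you can optionally verify the result from a Fabric notebook attached to the destination Lakehouse. This is a minimal sketch; the table name OrdersFastCopy is a placeholder for whatever destination table your dataflow writes to.

```python
from pyspark.sql import SparkSession

# In a Fabric notebook a Spark session already exists; getOrCreate() simply reuses it.
spark = SparkSession.builder.getOrCreate()

# "OrdersFastCopy" is a placeholder; use the Lakehouse table your dataflow writes to.
df = spark.read.table("OrdersFastCopy")
print("Rows loaded:", df.count())
df.printSchema()
```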


Have any questions or feedback? Leave a comment below!
