Microsoft Fabric Updates Blog

Simplifying Data Ingestion with Copy job – Replicate data from Dataverse through Fabric to multiple destinations

Copy job is the recommended approach in Microsoft Fabric Data Factory for moving data from any source to any destination in a simplified and efficient way—whether you’re transferring data across clouds, from on-premises systems, or between services. With native support for multiple delivery patterns, including bulk copy, incremental copy, and change data capture (CDC) replication, Copy job provides the flexibility to handle a wide range of data movement scenarios, all through an intuitive and easy-to-use experience.

Copy job already supports a range of CDC-enabled sources and destinations, allowing changed data—including inserts, updates, and deletions—to be automatically captured and replicated to supported targets. You can find more details in Change data capture (CDC) in Copy Job – Microsoft Fabric | Microsoft Learn.
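Conceptually, CDC replication means applying a stream of change events to a target table so that it stays in sync with the source. The following is a minimal illustrative sketch of that apply logic, not Fabric's internal implementation—Copy job handles all of this for you; the `apply_changes` helper and the event shape are hypothetical:

```python
# Minimal sketch of CDC-style replication: apply a feed of
# insert/update/delete events to a target table keyed by "id".
# This models the concept only; Copy job performs the real work in Fabric.

def apply_changes(target, changes, key="id"):
    """Apply insert/update/delete events to a dict-of-rows keyed by `key`."""
    for change in changes:
        op, row = change["op"], change["row"]
        if op in ("insert", "update"):
            target[row[key]] = row          # upsert the changed row
        elif op == "delete":
            target.pop(row[key], None)      # remove the deleted row
    return target

target = {1: {"id": 1, "name": "Contoso"}}
feed = [
    {"op": "insert", "row": {"id": 2, "name": "Fabrikam"}},
    {"op": "update", "row": {"id": 1, "name": "Contoso Ltd"}},
    {"op": "delete", "row": {"id": 2}},
]
print(apply_changes(target, feed))
# → {1: {'id': 1, 'name': 'Contoso Ltd'}}
```

Note that deletions are propagated as removals at the target, which is why a CDC-supported destination is needed to replicate all three kinds of changes faithfully.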

For source systems that Copy job does not yet natively support for CDC, such as Dataverse, you can still achieve CDC replication by using Copy job together with Fabric Link. Copy job becomes essential when you need to move data across regions or tenants, or when you want to replicate Dataverse data to multiple destinations beyond Fabric.

This blog provides step-by-step guidance on how to use Copy job to replicate data—including inserts, updates, and deletions—from Dataverse through Fabric to multiple destinations.

How it works

Prerequisites 

Your Dynamics 365 Finance & Operations (F&O) ERP environment is always linked to an associated Dataverse environment. You can view this mapping in the Power Platform Admin Center, where you can check which F&O environment is connected to which Dataverse environment.

1. Go to the Power Apps maker portal (https://make.powerapps.com), and select “Link to Microsoft Fabric” to replicate data from your Dataverse environment to a Fabric Lakehouse.

2. Enter the required connection information to connect Dataverse to a Fabric Lakehouse.

You can either create a new workspace or use an existing one in Microsoft Fabric, and a Fabric Lakehouse will be created in the selected workspace.

3. After you click “Review and Create,” a Fabric Lakehouse will be provisioned.

4. Select the tables from Dataverse and F&O that you want to synchronize to Fabric.

5. Access your newly created Fabric Lakehouse by clicking “View in Microsoft Fabric.”

6. Now, go to Microsoft Fabric (https://app.powerbi.com/) to create a Copy job to replicate the same data to multiple destinations.

7. From the creation wizard of Copy job, select the Fabric Lakehouse you just created via the Fabric Link (in step 3) as the data source, and then choose any tables from the Lakehouse that you want to copy from.

8. Select the destination where you want to copy the data to. To replicate all changes—including inserts, updates, and deletions—choose a CDC-supported destination.

You can get the concrete supported connector list from Change data capture (CDC) in Copy Job – Microsoft Fabric | Microsoft Learn. For example, you can replicate data into an Azure SQL Database.

9. Select incremental copy.

10. Choose an update method to merge data into your destination.

11. After completing the creation wizard, a new Copy job will be created. When you run it, the first run will perform an initial full copy, and subsequent runs will replicate changes, including inserts, updates, and deletions.
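The run pattern described in the last step—an initial full copy followed by incremental runs that merge only changed rows—can be sketched as follows. This is a conceptual illustration only; the `run_copy_job` function and watermark field names are hypothetical, not Fabric APIs:

```python
# Illustrative sketch of the Copy job run pattern: the first run copies
# everything (watermark 0), and later runs merge only rows modified since
# the previous run's watermark. Not a Fabric API; names are hypothetical.

def run_copy_job(source_rows, destination, last_watermark):
    """Copy rows modified after `last_watermark`; return the new watermark."""
    changed = [r for r in source_rows if r["modified"] > last_watermark]
    for row in changed:
        if row.get("deleted"):
            destination.pop(row["id"], None)  # propagate deletions
        else:
            destination[row["id"]] = row      # upsert inserts and updates
    return max((r["modified"] for r in source_rows), default=last_watermark)

source = [
    {"id": 1, "name": "Order A", "modified": 10, "deleted": False},
    {"id": 2, "name": "Order B", "modified": 12, "deleted": False},
]
dest = {}
wm = run_copy_job(source, dest, last_watermark=0)   # first run: full copy
source[0] = {"id": 1, "name": "Order A2", "modified": 15, "deleted": False}
wm = run_copy_job(source, dest, wm)                 # next run: one change
print(dest[1]["name"])
# → Order A2
```

The chosen update method (step 10) determines how those changed rows are merged at the destination, which is why only CDC-supported destinations can replicate deletions as well as inserts and updates.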

Summary

You can easily use Copy job from Fabric Data Factory, via the Fabric Link, to replicate data—including inserts, updates, and deletions—from Dataverse into multiple destinations. This also enables you to replicate Dataverse data across different tenants or regions.

Additional Resources

To learn more, explore Microsoft Fabric Copy job documentation.

Submit your feedback on Fabric Ideas and join the conversation in the Fabric Community.

If you have a question or want to share your feedback, please leave us a comment below!
