Microsoft Fabric Updates Blog

Simplifying Data Ingestion with Copy job – Replicate data from Dataverse through Fabric to multiple destinations

Copy job is the recommended approach in Microsoft Fabric Data Factory for moving data from any source to any destination in a simplified and efficient way—whether you’re transferring data across clouds, from on-premises systems, or between services. With native support for multiple delivery patterns, including bulk copy, incremental copy, and change data capture (CDC) replication, Copy job provides the flexibility to handle a wide range of data movement scenarios, all through an intuitive and easy-to-use experience.

Copy job already supports a list of CDC-enabled sources and destinations, allowing changed data—including inserts, updates, and deletions—to be automatically captured and replicated to supported targets. You can get more details in Change data capture (CDC) in Copy Job – Microsoft Fabric | Microsoft Learn.

For source systems that Copy job does not yet natively support for CDC, such as Dataverse, you can still achieve CDC replication by using Copy job together with Fabric Link. Copy job becomes essential when you need to move data across regions or tenants, or when you want to replicate Dataverse data to multiple destinations beyond Fabric.

This blog provides step-by-step guidance on how to use Copy job to replicate data—including inserts, updates, and deletions—from Dataverse through Fabric to multiple destinations.

How it works

Prerequisites 

Your Dynamics 365 Finance & Operations (F&O) ERP environment is always linked to an associated Dataverse environment. You can view this mapping in the Power Platform admin center, where you can check which F&O environment is connected to which Dataverse environment.

1. Go to the Power Apps maker portal (https://make.powerapps.com) and select “Link to Microsoft Fabric” to replicate data from your Dataverse environment to a Fabric Lakehouse.

2. Enter the required connection information to connect Dataverse to a Fabric Lakehouse.

You can either create a new workspace or use an existing one in Microsoft Fabric, and a Fabric Lakehouse will be created in the selected workspace.

3. After you click “Review and Create,” a Fabric Lakehouse will be provisioned.

4. Select the tables from Dataverse and F&O that you want to synchronize to Fabric.

5. Access your newly created Fabric Lakehouse by clicking “View in Microsoft Fabric.”

6. Now, go to Microsoft Fabric (https://app.powerbi.com/) to create a Copy job to replicate the same data to multiple destinations.

7. From the Copy job creation wizard, select the Fabric Lakehouse you just created via Fabric Link (in step 3) as the data source, and then choose any tables from the Lakehouse that you want to copy from.

8. Select the destination where you want to copy the data to. To replicate all changes—including inserts, updates, and deletions—choose a CDC-supported destination.

You can find the full list of supported connectors in Change data capture (CDC) in Copy Job – Microsoft Fabric | Microsoft Learn. For example, you can replicate data into an Azure SQL Database.

9. Select incremental copy.

10. Choose an update method to merge data into your destination.

11. After you complete the creation wizard, a new Copy job is created. The first run performs an initial full copy, and subsequent runs replicate only the changes, including inserts, updates, and deletions.
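Conceptually, each incremental run behaves like merging a change feed into the destination table, keyed on the primary key. The sketch below is illustrative only—Copy job performs this merge natively at the destination, and the table rows and operation names here are hypothetical—but it shows the upsert-and-delete semantics you should expect:

```python
# Conceptual model of a CDC-style merge: a change feed of inserts, updates,
# and deletes is applied to a destination table (modeled as a {key: row} dict).
# Copy job handles this merge for you; this is only a mental model.

def apply_change_feed(destination, changes, key="id"):
    """Apply a batch of captured changes to the destination table."""
    for change in changes:
        op, row = change["op"], change["row"]
        if op in ("insert", "update"):
            destination[row[key]] = row      # upsert: insert or overwrite by key
        elif op == "delete":
            destination.pop(row[key], None)  # remove the row if it exists
    return destination

# First run: initial full copy of the source snapshot.
destination = {r["id"]: r for r in [
    {"id": 1, "name": "Contoso"},
    {"id": 2, "name": "Fabrikam"},
]}

# Subsequent run: only the changes captured since the last run are replicated.
changes = [
    {"op": "update", "row": {"id": 1, "name": "Contoso Ltd"}},
    {"op": "insert", "row": {"id": 3, "name": "Northwind"}},
    {"op": "delete", "row": {"id": 2}},
]
apply_change_feed(destination, changes)
```

After the incremental run, the destination reflects the update to row 1, the new row 3, and the deletion of row 2—the same end state you would see in a CDC-enabled destination such as Azure SQL Database.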

Summary

You can easily use Copy job in Fabric Data Factory, together with Fabric Link, to replicate data—including inserts, updates, and deletions—from Dataverse into multiple destinations. This also enables replicating Dataverse data across different tenants or regions.

Additional Resources

To learn more, explore the Microsoft Fabric Copy job documentation.

Submit your feedback on Fabric Ideas and join the conversation in the Fabric Community.

If you have a question or want to share your feedback, please leave us a comment below!
