Microsoft Fabric Updates Blog

Simplifying Data Ingestion with Copy Job – Connection Parameterization, Expanded CDC and Connectors

Copy job is the go-to solution in Microsoft Fabric Data Factory for simplified data movement. With native support for multiple delivery styles, including bulk copy, incremental copy, and change data capture (CDC) replication, Copy job offers the flexibility to handle a wide range of scenarios—all through an intuitive, easy-to-use experience.

This update introduces several enhancements, including connection parameterization, expanded CDC capabilities, new connectors, and a streamlined Copy Assistant powered by Copy job.

Connection Parameterization with Variables Library for CI/CD (Preview)

This powerful capability helps automate your CI/CD processes by externalizing connection values. With it, you can deploy the same Copy job across multiple environments while relying on the Variable Library to inject the correct connection for each stage. This means you can seamlessly use different data stores for development, testing, and production, without modifying your Copy job each time.

Capabilities:

  • Parameterize connections using variables stored in the Variable Library in Fabric.
  • Promote Copy job seamlessly across environments—for example, from Dev to Test to Production—without hardcoding or manually editing data store connections.
  • Centralize configuration management, reducing duplication with a unified approach that makes it easier to manage configurations consistently across environments.

To learn more, refer to the CI/CD for Copy job in Data Factory in Microsoft Fabric documentation.
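To illustrate the idea, here is a minimal, purely conceptual sketch of stage-based connection resolution. The environment names and the `resolve_connection` helper are hypothetical, not part of the Fabric API; the Variable Library performs the equivalent lookup for you at deployment time.

```python
# Hypothetical sketch: one Copy job definition, with the connection value
# resolved per deployment stage instead of being hardcoded in the job.
ENVIRONMENTS = {
    "dev":  {"sql_connection": "dev-sqlserver.example.com"},
    "test": {"sql_connection": "test-sqlserver.example.com"},
    "prod": {"sql_connection": "prod-sqlserver.example.com"},
}

def resolve_connection(variable_name: str, stage: str) -> str:
    """Look up a connection value for the active deployment stage."""
    try:
        return ENVIRONMENTS[stage][variable_name]
    except KeyError as exc:
        raise KeyError(
            f"No value for {variable_name!r} in stage {stage!r}"
        ) from exc
```

Because the job only references the variable name, promoting it from Dev to Test to Production changes which value is injected, never the job itself.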

Data Replication from Fabric Lakehouse with Delta Change Data Feed (Preview)

Copy job now supports the Fabric Lakehouse Table connector with native CDC support. This connector enables efficient, automated replication of changed data—including inserts, updates, and deletes—from a Fabric Lakehouse via Delta Change Data Feed (CDF) to supported destinations. With this enhancement, your destination data stays continuously up to date—no manual refreshes, no extra effort—making your data integration workflows more efficient and reliable.

  • Replicate data changes seamlessly from your Lakehouse after processing is completed in OneLake.
  • Distribute updated data to supported destinations outside Fabric, such as SQL or Snowflake.
  • Save time and reduce complexity with automated, change-aware data movement.

This new CDC connector gives you the flexibility to keep downstream systems in multi-cloud environments in sync, ensuring your data is always accurate, timely, and ready for action.

To learn more, refer to the Change data capture (CDC) in Copy Job (Preview) documentation.
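For a feel of what change-aware replication means, here is a simplified sketch of how Delta CDF change rows are applied to a downstream table. The `_change_type` values (insert, update_preimage, update_postimage, delete) come from Delta's Change Data Feed; the in-memory `dict` target and the `id` key are illustrative stand-ins for a real destination table.

```python
def apply_changes(target: dict, changes: list) -> dict:
    """Apply Delta Change Data Feed rows (keyed by 'id') to a target table
    represented as {id: row}. Only post-images matter for updates."""
    for row in changes:
        change = row["_change_type"]
        if change in ("insert", "update_postimage"):
            # Upsert the row, dropping CDF metadata columns.
            target[row["id"]] = {
                k: v for k, v in row.items() if not k.startswith("_")
            }
        elif change == "delete":
            target.pop(row["id"], None)
        # 'update_preimage' rows describe the old state and are skipped.
    return target
```

Copy job performs this kind of change application for you against supported destinations, so only the rows that actually changed are moved.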

Merge data into Snowflake

You can now merge changed data (inserts, updates, and deletes) into Snowflake when the data originates from any CDC source connector, such as Azure SQL DB, SQL Server, SQL MI, or Fabric Lakehouse tables.

What’s more, with Storage Integration support in the Snowflake connector for Copy job, you gain enhanced security through a Snowflake-assigned role. This eliminates the need to expose sensitive credentials and allows you to implement more secure authentication methods when connecting to Azure Blob Storage.

Find more details in the CREATE STORAGE INTEGRATION or Change data capture (CDC) in Copy Job documentation.
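To show the shape of such a merge, here is an illustrative generator for a Snowflake-style MERGE statement that covers all three change types. The `op` flag column, the table names, and the helper itself are assumptions for the sketch; Copy job generates and runs the equivalent logic internally.

```python
def build_merge_sql(target: str, staging: str, key: str, columns: list) -> str:
    """Build an illustrative Snowflake-style MERGE that upserts changed
    rows and deletes rows flagged with op = 'delete' in the staging table."""
    set_clause = ", ".join(f"t.{c} = s.{c}" for c in columns)
    cols = ", ".join(columns)
    vals = ", ".join(f"s.{c}" for c in columns)
    return (
        f"MERGE INTO {target} t USING {staging} s ON t.{key} = s.{key}\n"
        f"WHEN MATCHED AND s.op = 'delete' THEN DELETE\n"
        f"WHEN MATCHED THEN UPDATE SET {set_clause}\n"
        f"WHEN NOT MATCHED AND s.op <> 'delete' THEN "
        f"INSERT ({key}, {cols}) VALUES (s.{key}, {vals})"
    )
```

The three WHEN branches map one-to-one onto the CDC change types: delete, update, and insert.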

More Connectors, More Possibilities

More source and destination connections are now available, giving you greater flexibility for data ingestion with Copy job. We’re not stopping here—even more connectors are coming soon!

Newly supported connectors:

  • Folder
  • REST
  • SAP Table
  • SAP BW Open Hub
  • Amazon RDS for Oracle
  • Cassandra
  • Greenplum
  • Informix
  • Microsoft Access database
  • Presto

Incremental copy now supported for more connectors:

  • SAP HANA
  • MariaDB
  • MySQL
  • SFTP
  • FTP
  • Oracle Cloud Storage
  • Amazon S3 Compatible

Learn more in What is Copy job in Data Factory – Microsoft Fabric | Microsoft Learn.
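Incremental copy for these connectors rests on a simple idea: track a watermark and move only rows modified since the last run. The sketch below, with a hypothetical `modified_at` column, shows that pattern; Copy job manages the watermark state for you automatically.

```python
def incremental_rows(rows: list, last_watermark: int):
    """Watermark-based incremental copy: select only rows changed since
    the last run, then advance the watermark to the newest change seen."""
    new_rows = [r for r in rows if r["modified_at"] > last_watermark]
    new_watermark = max(
        (r["modified_at"] for r in new_rows), default=last_watermark
    )
    return new_rows, new_watermark
```

Each run copies just the delta and persists the returned watermark for the next run.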

Simplified Copy Assistant, Powered by Copy Job

Access the full power of Copy job by selecting Copy Assistant from a pipeline, eliminating the need for the parameterized ForEach loops and copy activities previously required for simple data copying. You also benefit from all Copy job capabilities, including native incremental copy and change data capture (CDC).

To learn more, refer to the What is Copy job in Data Factory for Microsoft Fabric? documentation.

Additional Resources

To learn more, explore Microsoft Fabric Copy Job documentation.

Submit your feedback on Fabric Ideas and join the conversation in the Fabric Community.

To get into the technical details, check out the Fabric documentation.  

If you have a question or want to share your feedback, please leave us a comment.
