Microsoft Fabric Updates Blog

Simplify Your Data Ingestion with Copy Job (Generally Available)

Copy Job is now generally available, bringing you a simpler, faster, and more intuitive way to move data! Whether you’re handling batch transfers or need the efficiency of incremental data movement, Copy Job gives you the flexibility and reliability to get the job done. 

Since its preview last September, we’ve received incredible feedback from you. Your insights have been our driving force in making Copy Job even better. To address the top requests, Copy Job has rapidly evolved with several powerful enhancements. Let’s dive into what’s new! 

More connectors, more possibilities 

More source connections are now available, giving you greater flexibility for data ingestion with Copy Job. And we're not stopping here: even more connectors are coming soon!

  • Oracle
  • Snowflake
  • SAP HANA
  • Google Cloud Storage
  • MySQL
  • PostgreSQL
  • Fabric SQL
  • Amazon S3 compatible
  • Azure SQL MI
  • Azure MySQL
  • Azure PostgreSQL
  • Azure Data Explorer

Copy Job can now also connect to any ODBC-supported data source by installing the appropriate ODBC driver alongside the on-premises data gateway.
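To make the ODBC path concrete, here is a minimal sketch of how a DSN-less ODBC connection string is assembled. This is the generic `key=value;` format that any ODBC driver consumes; the driver name, host, and credentials below are hypothetical placeholders, not values from Copy Job itself.

```python
def build_odbc_conn_str(driver, server, database, uid, pwd):
    """Assemble a DSN-less ODBC connection string.

    This is the generic format an ODBC driver installed alongside the
    on-premises data gateway would consume. All values passed in here
    are illustrative placeholders.
    """
    parts = {
        "Driver": "{%s}" % driver,  # driver name is wrapped in braces
        "Server": server,
        "Database": database,
        "Uid": uid,
        "Pwd": pwd,
    }
    return ";".join(f"{k}={v}" for k, v in parts.items())

# Hypothetical example values:
conn_str = build_odbc_conn_str(
    "Example ODBC Driver",
    "db.internal.contoso.com",
    "sales",
    "copyjob_user",
    "<secret>",
)
```

In practice the gateway holds the driver and credentials, so the connection details you enter when creating the Copy Job connection map onto a string like this one.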

We’d love to hear from you—share your connector requirements and feedback on Fabric Ideas to help shape the future of Copy Job.

Public API & CI/CD support  

Fabric Data Factory now offers a robust Public API to automate and manage Copy Job efficiently. Plus, with Git Integration and Deployment pipelines, you can leverage your own Git repositories in Azure DevOps or GitHub and seamlessly deploy Copy Job with Fabric’s built-in CI/CD workflows.  
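As a rough illustration of what API automation can look like, the sketch below builds a create-item request in the shape of Fabric's generic workspace-items REST endpoint. The item type string, definition part name, and payload layout are assumptions for illustration; check the official API reference for the exact contract.

```python
import base64
import json

FABRIC_API = "https://api.fabric.microsoft.com/v1"

def create_copy_job_request(workspace_id, display_name, definition):
    """Build the URL and JSON body for creating a Copy Job item.

    Follows the general pattern of Fabric's workspace-items endpoint.
    The "CopyJob" type string and the definition part name are
    assumptions here, not confirmed values.
    """
    url = f"{FABRIC_API}/workspaces/{workspace_id}/items"
    payload = {
        "displayName": display_name,
        "type": "CopyJob",
        "definition": {
            "parts": [
                {
                    "path": "copyjob-content.json",  # assumed part name
                    # Item definitions are sent base64-encoded inline
                    "payload": base64.b64encode(
                        json.dumps(definition).encode()
                    ).decode(),
                    "payloadType": "InlineBase64",
                }
            ]
        },
    }
    return url, payload

url, payload = create_copy_job_request("ws-123", "Nightly copy", {"source": "sql"})
# You would then POST this with an AAD bearer token, e.g.:
# requests.post(url, json=payload, headers={"Authorization": f"Bearer {token}"})
```

Because the item definition travels as base64-encoded JSON, the same request shape also works well in Git-backed CI/CD, where the definition file lives in your repository.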

Learn more in Simplifying Data Ingestion with Copy Job – CI/CD is now available

VNET Gateway support 

Copy Job now supports the VNet data gateway in preview! The VNet data gateway enables secure connections to data sources within your virtual network or behind firewalls. With this new capability, you can now execute Copy Job directly on the VNet data gateway, ensuring secure data movement. 

Upsert to Azure SQL Database & Overwrite to Fabric Lakehouse table

By default, Copy Job appends data so that no changed rows are lost. Now you can also choose to upsert data directly into Azure SQL Database or SQL Server, or overwrite data in Fabric Lakehouse tables. These options give you greater flexibility to tailor data ingestion to your specific needs.
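The difference between the three update methods is easiest to see on a toy example. The sketch below applies each method to in-memory rows keyed by an ID column; it is a conceptual illustration of the semantics, not how Copy Job is implemented.

```python
def apply_update_method(existing, incoming, key, method):
    """Illustrate append / overwrite / upsert semantics on lists of dict rows.

    `key` names the column that identifies a row. Conceptual sketch only.
    """
    if method == "append":
        # Keep everything; incoming rows are simply added.
        return existing + incoming
    if method == "overwrite":
        # Destination is replaced wholesale by the incoming data.
        return list(incoming)
    if method == "upsert":
        # Matching keys are updated in place; new keys are inserted.
        merged = {row[key]: row for row in existing}
        for row in incoming:
            merged[row[key]] = row
        return list(merged.values())
    raise ValueError(f"unknown method: {method}")

existing = [{"id": 1, "qty": 5}]
incoming = [{"id": 1, "qty": 7}, {"id": 2, "qty": 3}]
result = apply_update_method(existing, incoming, "id", "upsert")
# row 1 is updated to qty=7, row 2 is inserted
```

With append you would end up with three rows (including the stale `id=1`), with overwrite only the two incoming rows, and with upsert exactly one current row per key, which is why upsert is the natural fit for keeping a SQL destination in sync.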

Learn more in Simplifying Data Ingestion with Copy Job: Upsert to Azure SQL Database & Overwrite to Fabric Lakehouse

More experience improvements

We’ve made Copy Job even more intuitive based on your feedback, with the following improvements:

  • Better flexibility to edit Copy Job.
  • Column mapping for simple data modifications on the way to the destination store.
  • Data preview to help select the right incremental column.
  • Search box to quickly find tables or columns.
  • Real-time monitoring with in-progress view.
  • Customizable update methods & schedules before job creation.

Learn more in Powerful improvements for Copy Job.

What’s next? 

The journey doesn’t stop here! You can choose either a full copy or incremental data copy (currently in preview), and we can’t wait to see how you use it. We will also introduce powerful features like Change Data Capture (CDC), which automatically captures and replicates inserts, updates, and deletes to keep your data in sync across supported stores. If you’re interested in early access, let us know by filling out this form!
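For intuition on what incremental copy does under the hood, here is a minimal sketch of the watermark pattern it is commonly built on: remember the highest value of an incremental column seen so far, and on the next run pick up only rows beyond it. The column name and checkpoint handling here are simplified assumptions, not Copy Job internals.

```python
def incremental_rows(rows, watermark_col, last_watermark):
    """Select rows newer than the stored watermark, then advance it.

    `rows` is a list of dict rows; `watermark_col` is an ever-increasing
    column such as a modified timestamp. Simplified illustration of the
    watermark pattern behind incremental copy.
    """
    new_rows = [r for r in rows if r[watermark_col] > last_watermark]
    next_watermark = max(
        (r[watermark_col] for r in new_rows),
        default=last_watermark,  # nothing new: watermark stays put
    )
    return new_rows, next_watermark

source = [
    {"id": 1, "modified": 5},
    {"id": 2, "modified": 12},
    {"id": 3, "modified": 15},
]
new_rows, watermark = incremental_rows(source, "modified", 10)
# only rows 2 and 3 are copied; the watermark advances to 15
```

This is also why picking the right incremental column matters (and why the data preview mentioned above helps): the column must increase monotonically for changed rows, or updates will be silently skipped.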

We’re dedicated to making Copy Job even smarter and faster for your data ingestion needs. Stay tuned—exciting enhancements are on the way!

Learn more about Copy Job in: What is Copy job in Data Factory.

Submit your feedback on Fabric Ideas and join the conversation on the Fabric Community. To get into the technical details, head over to the Fabric documentation.  

Related blog posts

Simplify Your Data Ingestion with Copy Job (Generally Available)

July 10, 2025, by Matthew Hicks
