
Announcing Copy Job Activity in Data Factory Pipeline (Generally Available)

This milestone marks a major step forward in unifying and simplifying data movement experiences across Data Factory. With the Copy job activity, users can now enjoy the simplicity and speed of Copy job while leveraging the orchestration power and flexibility of Data Factory pipelines.

What is the Copy job activity?

The Copy job activity allows you to run Copy jobs as native activities inside Data Factory pipelines.

Copy jobs are created and managed independently in Data Factory for quick data movement between supported sources and destinations. With Copy job Activity, that same fast, lightweight experience is now embedded within pipelines, making it easier to automate, schedule, and chain Copy jobs as part of broader data workflows.

Copy job activity in a pipeline
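
For reference, a standalone Copy job item can already be run on demand outside of any pipeline. The Python sketch below queues such a run through the Fabric REST API's on-demand job endpoint; the `jobType=CopyJob` value, the placeholder GUIDs, and the token handling are illustrative assumptions, not details from this announcement.

```python
import requests

FABRIC_API = "https://api.fabric.microsoft.com/v1"
WORKSPACE_ID = "<workspace-guid>"          # placeholder
COPY_JOB_ITEM_ID = "<copy-job-item-guid>"  # placeholder
TOKEN = "<entra-access-token>"             # e.g. acquired via azure-identity

def run_copy_job_on_demand() -> str:
    """Queue a standalone Copy job run and return the job-instance URL."""
    # Fabric's on-demand job endpoint; the "CopyJob" jobType is an assumption.
    url = (f"{FABRIC_API}/workspaces/{WORKSPACE_ID}"
           f"/items/{COPY_JOB_ITEM_ID}/jobs/instances?jobType=CopyJob")
    resp = requests.post(url, headers={"Authorization": f"Bearer {TOKEN}"})
    resp.raise_for_status()  # 202 Accepted means the run was queued
    # The Location header points at the created job instance for monitoring.
    return resp.headers["Location"]

if __name__ == "__main__":
    print("Queued Copy job run:", run_copy_job_on_demand())
```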

Within Microsoft Fabric Data Factory pipelines, the Copy job activity orchestrates the Copy job item, which is designed to make data ingestion easier and more powerful:

  • No-code simplicity: Select an existing Copy job or create a new one right from your pipeline canvas. 
  • Flexible orchestration: Chain Copy jobs with other activities like notebooks, dataflows, conditional logic, and even Power BI refresh, which is ideal for building medallion architectures or orchestrating downstream processing (see the sketch after this list).
  • Email support: With the Copy job activity + Outlook activity, you can trigger Copy jobs and send notifications or status updates via email directly from your pipeline.
  • Comprehensive Copy job capabilities: Incorporates all Copy job features, including both batch and incremental copy with CDC, along with built-in telemetry for intuitive monitoring.
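
To make the chaining concrete, here is a minimal sketch of a pipeline that runs a Copy job and then sends an email, expressed as a Python dict mirroring a pipeline definition. The activity type names, property layout, and IDs are illustrative assumptions rather than the exact Fabric pipeline schema; on the canvas you would assemble the same flow visually.

```python
# Illustrative only: the activity type names ("CopyJob", "Office365Outlook")
# and property layout below are assumptions, not the exact Fabric schema.
pipeline_sketch = {
    "name": "ingest-and-notify",
    "activities": [
        {
            "name": "RunCopyJob",
            "type": "CopyJob",  # assumed type name for the Copy job activity
            "typeProperties": {
                "copyJobId": "<copy-job-item-guid>",  # hypothetical reference
            },
        },
        {
            "name": "NotifyTeam",
            "type": "Office365Outlook",  # assumed type name for the Outlook activity
            # Run only after the Copy job activity succeeds.
            "dependsOn": [
                {"activity": "RunCopyJob", "dependencyConditions": ["Succeeded"]},
            ],
            "typeProperties": {
                "to": "data-team@contoso.com",
                "subject": "Copy job run completed",
            },
        },
    ],
}
```

The dependsOn entry is what encodes the conditional logic here: the notification fires only when the Copy job activity reports success.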

Whether you’re building a quick prototype or orchestrating complex data workflows, the Copy job activity helps you do more with less.

The Copy job activity builds on the Copy job item, which was created in response to customer feedback asking for a simpler way to ingest data, with native support for multiple delivery styles, including bulk copy, incremental copy, and change data capture (CDC) replication. Now, with the Copy job activity, you can reuse that logic inside pipelines, adding conditional execution, retries, and chaining with other activities.
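
As a hedged sketch of what that orchestration can look like from the outside, the snippet below polls a pipeline run (one containing a Copy job activity) until it reaches a terminal state, using the Fabric REST API's job-instance endpoint. The status strings and GUIDs are assumptions and placeholders for illustration.

```python
import time
import requests

FABRIC_API = "https://api.fabric.microsoft.com/v1"
WORKSPACE_ID = "<workspace-guid>"     # placeholder
PIPELINE_ITEM_ID = "<pipeline-guid>"  # placeholder
TOKEN = "<entra-access-token>"

def wait_for_pipeline_run(job_instance_id: str, poll_seconds: int = 30) -> str:
    """Poll a pipeline job instance until it reaches a terminal state."""
    url = (f"{FABRIC_API}/workspaces/{WORKSPACE_ID}"
           f"/items/{PIPELINE_ITEM_ID}/jobs/instances/{job_instance_id}")
    while True:
        resp = requests.get(url, headers={"Authorization": f"Bearer {TOKEN}"})
        resp.raise_for_status()
        # Terminal status names are assumptions based on typical job APIs.
        status = resp.json().get("status", "Unknown")
        if status in ("Completed", "Failed", "Cancelled"):
            return status
        time.sleep(poll_seconds)
```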

Check out our blog post on what Copy job has to offer: Simplifying Data Ingestion with Copy job.

Learn more with our documentation
