
New Innovations for Fabric Data Factory Orchestration at Fabric Conference Europe 2025

The Microsoft Data Integration team has been hard at work improving the Data Factory orchestration experience across all of Fabric! We are excited to announce the many orchestration innovations to Data Factory unveiled at Fabric Conference Europe 2025.

As the community gathers in Vienna, we’re excited to unveil a suite of pipeline and Airflow features, shaped by your incredible feedback, that empower data engineers, architects, and integration professionals to build, orchestrate, and manage data solutions with greater agility. These innovations represent our ongoing commitment to making Fabric Data Factory orchestration versatile and user-friendly, and we can’t wait for you to experience the future of data orchestration firsthand.

Data Movement

Data movement has always been a central part of efficient data pipeline design, and the Copy job item in Fabric Data Factory is the next step in Microsoft’s work to simplify it. We are excited to introduce the new Copy job activity for pipelines, which lets you orchestrate powerful, easy-to-build Copy job items from your design-first pipelines to move data quickly and easily.

Whether you are synchronizing complete datasets, running incremental loads across cloud environments, or orchestrating intricate ETL processes, the Copy job activity delivers performance and reliability you can trust. The intuitive interface, coupled with advanced performance optimizations, ensures that your data reaches the right destination, at the right time, with minimal friction. It’s never been easier to keep your data flowing and your operations resilient.

Copy Job activity in pipelines

Seamlessly Integrate Fabric Notebooks into your Airflow DAGs

This enhancement opens up powerful new pathways for collaboration, exploration, and automation. With just a few clicks, users can now easily embed Python code to call Fabric Notebooks directly within their Airflow workflows, leveraging rich data exploration and transformation capabilities right where orchestration happens.
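
To give a feel for what this looks like in code, here is a minimal sketch of an Airflow DAG that starts a Fabric notebook run through the Fabric REST API. The workspace and notebook IDs, the token handling, and the jobType value are illustrative assumptions rather than part of this announcement, and your environment may offer dedicated operators that wrap this call; adjust the details to match the Fabric REST API reference.

```python
# A minimal sketch, not the official integration: trigger a Fabric notebook run
# from an Airflow task via the Fabric REST API. IDs, auth handling, and the
# jobType value are assumptions; verify them against the Fabric REST API reference.
import os
from datetime import datetime

import requests
from airflow.decorators import dag, task

WORKSPACE_ID = "<your-workspace-id>"     # placeholder
NOTEBOOK_ID = "<your-notebook-item-id>"  # placeholder


@dag(schedule="@daily", start_date=datetime(2025, 9, 1), catchup=False, tags=["fabric"])
def run_fabric_notebook():
    @task
    def start_notebook_run() -> str:
        # Assumes a pre-acquired Microsoft Entra ID token; swap in your own auth flow.
        token = os.environ["FABRIC_API_TOKEN"]
        url = (
            f"https://api.fabric.microsoft.com/v1/workspaces/{WORKSPACE_ID}"
            f"/items/{NOTEBOOK_ID}/jobs/instances?jobType=RunNotebook"
        )
        response = requests.post(url, headers={"Authorization": f"Bearer {token}"}, timeout=60)
        response.raise_for_status()
        # The job-instance URL for polling the run status is usually returned
        # in the Location header of the 202 response.
        return response.headers.get("Location", "")

    start_notebook_run()


run_fabric_notebook()
```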

In our drive to make Fabric Data Factory pipelines more inclusive and adaptable, we’re also simplifying our language and expanding our mission. ‘Data pipelines’ are now simply ‘Pipelines’, reflecting our commitment to extending Fabric Data Factory’s capabilities to a wider range of use cases. This update isn’t just semantic: it signals a broadened horizon where pipelines can orchestrate not just data, but also services, applications, and business processes. By embracing a more unified terminology, we invite our community to imagine new possibilities, integrating data engineering with broader business and workflow automation in Fabric Data Factory.

Data Pipelines are now simply Pipelines

Generally Available pipeline activities

These activities are designed to unlock fresh capabilities for your projects in complex production scenarios:

  • The Invoke Pipeline activity allows you to trigger other pipelines from within a pipeline, enabling modular design and promoting reuse.
  • Communication and collaboration step up a notch with integrated Teams and Outlook email activities, so you can notify stakeholders, send reports, or trigger workflows across your organization effortlessly.
  • The Dataflow activity brings seamless integration with code-free data transformation logic.
  • The Functions activity empowers you to execute custom code for advanced processing with the newly generally available Fabric Functions (see the sketch below).

These additions ensure that your pipelines are more connected, flexible, and capable—whether you’re orchestrating data, automating workflows, or enabling business-critical communications.
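
To illustrate the kind of custom code the Functions activity can run, here is a minimal sketch of a Fabric user data function, assuming the Python programming model with the fabric.functions module; the function name and its logic are purely illustrative.

```python
# A minimal sketch of a user data function that a pipeline's Functions activity
# could invoke. Assumes the Python fabric.functions programming model; the
# function name and the classification logic are illustrative only.
import fabric.functions as fn

udf = fn.UserDataFunctions()


@udf.function()
def classify_order_volume(order_count: int) -> str:
    """Return a simple size label that a downstream pipeline can branch on."""
    if order_count > 1000:
        return "large"
    if order_count > 100:
        return "medium"
    return "small"
```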

Our commitment to enterprise-grade automation continues with new features designed to deliver confidence and control. Continuous Integration and Continuous Deployment (CICD) is now supported for your Fabric Airflow projects, making it easier than ever to scale, update, and manage your data workflows with best-in-class DevOps practices.

Pipeline monitoring is enhanced, as pipelines now support workspace monitoring, providing real-time insights into performance and reliability. Additionally, you can now assign multiple schedules to a single pipeline, giving you unprecedented flexibility in how and when your pipelines run. These features work in concert to deliver a robust, secure, and agile automation environment where your data strategies can thrive.

Apply multiple schedules to your pipelines
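
For teams that manage schedules as code rather than through the UI, the Fabric job scheduler REST API can attach more than one schedule to the same pipeline. The endpoint path, the jobType segment, and the payload fields below are assumptions based on the public Fabric REST API surface; confirm the exact schema in the API reference before use.

```python
# A hedged sketch of assigning two schedules to a single pipeline via the
# Fabric job scheduler REST API. Endpoint path, jobType, and payload fields
# are assumptions; verify them against the Fabric REST API reference.
import os

import requests

WORKSPACE_ID = "<your-workspace-id>"     # placeholder
PIPELINE_ID = "<your-pipeline-item-id>"  # placeholder
BASE = (
    f"https://api.fabric.microsoft.com/v1/workspaces/{WORKSPACE_ID}"
    f"/items/{PIPELINE_ID}/jobs/Pipeline/schedules"  # jobType assumed to be "Pipeline"
)
HEADERS = {"Authorization": f"Bearer {os.environ['FABRIC_API_TOKEN']}"}  # pre-acquired token


def create_schedule(configuration: dict) -> None:
    """Create one schedule on the pipeline; called once per schedule."""
    response = requests.post(
        BASE, headers=HEADERS, json={"enabled": True, "configuration": configuration}, timeout=60
    )
    response.raise_for_status()


# Example: a daily early-morning load plus a frequent intraday refresh.
create_schedule({
    "type": "Daily",
    "times": ["06:00"],
    "startDateTime": "2025-10-01T00:00:00",
    "endDateTime": "2026-10-01T00:00:00",
    "localTimeZoneId": "Central Europe Standard Time",
})
create_schedule({
    "type": "Cron",
    "interval": 30,  # minutes between runs
    "startDateTime": "2025-10-01T00:00:00",
    "endDateTime": "2026-10-01T00:00:00",
    "localTimeZoneId": "Central Europe Standard Time",
})
```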

For our pro-dev pipeline users who write a lot of pipeline expression code, we are bringing a preview of an expression evaluator capability and a Copilot for pipeline expressions, both of which work inline at design time. Additionally, we are announcing the general availability of pipeline variable libraries for building powerful metadata-driven pipeline patterns and enabling CICD collaboration.

Streamline the work of writing orchestration logic using our expression editor:

Pipeline expression Copilot private preview

First, we have incorporated Copilot, Microsoft’s AI assistant, into the pipeline expression editor. This capability is currently in private preview, so you’ll need to sign up for access. Imagine the possibilities: using natural language, you can quickly build an expression from a prompt like ‘Write an expression that can tell me the day of the week from an integer pipeline parameter called days’.

Data Factory will build the appropriate pipeline expression code for you, and you can now also test it in design mode. While running a test pipeline remains a solid end-to-end practice for validating functionality, in this case the goal is simply to test the expression generated by Copilot. The new ‘Evaluate expression’ feature in Data Factory enables real-time testing and debugging of expressions directly within the pipeline design experience.
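
For reference, one expression that satisfies that prompt might look like the snippet below. The prompt is slightly ambiguous, so this sketch treats days as an offset from today and relies on the pipeline expression functions addDays and formatDateTime; Copilot’s actual suggestion may differ.

```
@formatDateTime(addDays(utcNow(), pipeline().parameters.days), 'dddd')
```

Pasting this into the expression editor and using ‘Evaluate expression’ with a few sample values for days is enough to confirm it returns day names such as ‘Monday’ without running the whole pipeline.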

Sign up for the private preview of the pipeline expression Copilot feature.


We want to extend our heartfelt gratitude to the entire Fabric Data Factory community, whose feedback and creativity fuel our innovation. As we celebrate these exciting advancements at the Fabric Conference Europe 2025, we invite you to explore, experiment, and build with the new features.

Our journey together is just beginning, and we are committed to supporting your ambitions with tools that spark creativity, enhance productivity, and inspire success. On behalf of everyone on the Microsoft Fabric Data Factory team, thank you for joining us as we take another bold step forward, shaping the future of Fabric orchestration together!
