Microsoft Fabric Updates Blog

New Features for Fabric Data Factory Pipelines Announced at Ignite

Table and partition refreshes added to Semantic Model Refresh activity

One of the most popular features we've built in Fabric Data Factory, the Semantic Model Refresh activity, grew out of patterns we saw customers using in ADF and out of community feedback. After the activity's initial release, we heard your requests to improve ELT pipeline processing with an option to refresh specific tables and partitions in your semantic models. We're pleased to announce that this capability is now available, making the pipeline activity the most effective way to refresh your Fabric semantic models!

You can now refresh specific tables in your semantic model
Optionally, you can also refresh specific partitions within those tables
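Scoped refreshes like this correspond to the Power BI enhanced refresh REST API, where the request body lists the tables (and optionally partitions) to process. The sketch below is illustrative only: the workspace and dataset IDs, table names, and the `build_refresh_request` helper are hypothetical, and the pipeline activity configures all of this for you in the designer.

```python
import json

# Hypothetical IDs -- replace with your own workspace (group) and semantic model IDs.
GROUP_ID = "00000000-0000-0000-0000-000000000000"
DATASET_ID = "11111111-1111-1111-1111-111111111111"

def build_refresh_request(tables_and_partitions):
    """Build an enhanced-refresh request body scoped to specific
    tables and, optionally, specific partitions within them."""
    objects = []
    for table, partition in tables_and_partitions:
        obj = {"table": table}
        if partition is not None:
            obj["partition"] = partition  # omit to refresh the whole table
        objects.append(obj)
    return {
        "type": "full",              # refresh type, as in the activity's settings
        "commitMode": "transactional",
        "objects": objects,          # only these objects are refreshed
    }

# Refresh one partition of Sales, and all of Customers.
body = build_refresh_request([("Sales", "Sales_2025"), ("Customers", None)])
url = (f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}"
       f"/datasets/{DATASET_ID}/refreshes")
print(json.dumps(body, indent=2))
# POSTing this body to `url` with a bearer token would start the scoped refresh.
```

The key point is the `objects` array: listing only the tables and partitions that your ELT run actually changed avoids reprocessing the entire model.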

Import and export your Fabric Data Factory pipelines

As a Data Factory pipeline developer, you often want to export a pipeline definition to share it with other developers or to reuse it in other workspaces. We've now added the capability to export and import Data Factory pipelines from your Fabric workspace. This feature enables more collaborative workflows and is invaluable when troubleshooting pipelines with our support teams. Click the new Export button on the pipeline canvas designer to download the JSON definition of your pipeline, which you can then share with users inside or outside your organization. Your collaborators can import that JSON definition into their own workspaces using the Import button.

Export and Import Fabric Pipelines from the Data Factory Pipeline Designer
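For orientation, the exported file is a JSON document describing the pipeline's activities and their dependencies. The snippet below is a simplified, hypothetical sketch of that general shape; the activity names and type strings are illustrative, not the exact exported schema.

```json
{
  "name": "IngestAndRefresh",
  "properties": {
    "activities": [
      {
        "name": "Copy sales data",
        "type": "Copy",
        "dependsOn": []
      },
      {
        "name": "Refresh semantic model",
        "type": "SemanticModelRefresh",
        "dependsOn": [
          { "activity": "Copy sales data", "dependencyConditions": [ "Succeeded" ] }
        ]
      }
    ]
  }
}
```

Because the whole pipeline is captured in one JSON file, it can be versioned, diffed, and attached to support cases like any other text artifact.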
