
Passing parameter values to refresh a Dataflow Gen2 (Preview)


Parameters in Dataflow Gen2 enhance flexibility by allowing dynamic adjustments without altering the dataflow itself. They simplify organization, reduce redundancy, and centralize control, making workflows more efficient and adaptable to varying inputs and scenarios.

Leveraging query parameters while authoring a Dataflow Gen2 has been possible for a long time; however, there was no way to override the parameter values when refreshing the dataflow. The ability to pass values from a pipeline into a Dataflow parameter for refresh has been one of the top ideas in the Fabric Ideas portal since Dataflow Gen2 was released.

Screenshot of the idea for "Enable to pass variables from pipelines as parameters into a Dataflow Gen2" from the Fabric ideas portal

We are happy to announce the public preview of the public parameters capability for Dataflow Gen2 with CI/CD support, along with support for this new mode in the Dataflow refresh activity in Data Pipelines.

Public parameters in Dataflow Gen2 with CI/CD support allow users to refresh their Dataflows by passing parameter values from outside the Power Query editor, through the Fabric REST API or native Fabric experiences. This enables a dynamic experience with Dataflows, where each refresh can run with different parameter values that change how the Dataflow behaves.

This new capability is rolling out today and will be available in all production environments in the coming days.

How to: pass parameters to a Dataflow refresh

As a prerequisite, note that only Dataflow Gen2 (CI/CD) items can leverage this new functionality.

To leverage this new functionality, you’ll need to:

  1. Ensure the public parameter mode is enabled for the dataflow item
  2. Use the updated Dataflow refresh activity to pass parameters via Data Pipelines

The following sections showcase how this experience works, and you can also learn more about this feature from the official documentation on using public parameters in Dataflow Gen2.

Enabling the new public parameters mode

Once you have your Dataflow Gen2 with CI/CD support open, open the Options dialog from the Home tab of the ribbon and navigate to the Parameters section.

Screenshot of the Power Query editor inside of a Dataflow Gen2 with CI/CD enabled showing the Options entry point in the Home tab of the ribbon

Here you can turn on the Enable parameters to be discovered and overridden for execution option.

Screenshot of the Options dialog showing the new Parameters section and the new setting to "Enable parameters to be discovered and overridden for execution"

This changes the behavior of your dataflow so that it accepts parameter values when a refresh is invoked.

When you open the Manage Parameters dialog, you’ll also notice a new tooltip at the top of the dialog reminding you that the public parameter mode is enabled. In the screenshot, you can see that a parameter named Region has been defined as required, with the Text data type and a current value of Eastern.

Screenshot of the Manage Parameters dialog showing a tooltip that reads "Public parameter mode is enabled"
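
Under the hood, each dataflow parameter is just a query flagged with parameter metadata in M. As a rough illustration, the Region parameter above might look something like the sketch below; the editor generates this metadata for you, and the exact record can vary by version:

    // Sketch of the M behind the Region parameter:
    // required, data type Text, current value "Eastern".
    // The editor generates this metadata record for you.
    Region = "Eastern" meta [
        IsParameterQuery = true,
        Type = "Text",
        IsParameterQueryRequired = true
    ]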

How this parameter is referenced within your dataflow is completely up to you, and you can always choose whether the parameter is required for refresh invocation. You can find more details about how to create and reference parameters within your dataflow queries in this article: Parameters – Power Query | Microsoft Learn
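
As an illustration, a query can use the parameter anywhere a value of its type is expected. The sketch below filters a source table by the Region parameter; the server, database, table, and column names are hypothetical:

    // Hypothetical query referencing the Region parameter.
    // Source names are illustrative only.
    let
        Source = Sql.Database("contoso.database.windows.net", "SalesDb"),
        Orders = Source{[Schema = "dbo", Item = "Orders"]}[Data],
        // Keep only the rows for the region value supplied at refresh time
        FilteredOrders = Table.SelectRows(Orders, each [SalesRegion] = Region)
    in
        FilteredOrders

At refresh time, whatever value is passed for Region flows through this query, so the same dataflow can load a different slice of data on every run.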

Once the new mode and the parameters are set, you can go ahead and save your dataflow.

Screenshot of a Dataflow Gen2 editor and the save submenu within the home tab of the ribbon

Parameters section for Dataflow refresh activity in a pipeline

You can use Data Pipelines and the Dataflow refresh activity to orchestrate the execution of the Dataflow Gen2 (CI/CD) item that you’ve just saved. When you select that dataflow in the activity, a new Dataflow parameters section will appear.

You can manually enter the name of a parameter that you created in your Dataflow, set the data type of the value, and then enter the value that you wish to pass for the refresh. For example, for the Region parameter shown earlier, you would enter Region as the name, choose Text as the type, and supply a value such as Western. Note that the parameter name and type you specify in this activity must match those in the original Dataflow item. In the future, we aim to simplify this experience by having the Data Pipeline UI discover the parameter names and types automatically.

Screenshot of Dataflow Gen2 refresh activity in Fabric pipelines with the new Dataflow parameters section enabled

Now you can save and run your pipeline to refresh a Dataflow, passing your parameter values to it.

Concurrency support for Dataflow refreshes

You can also create more dynamic, metadata-driven scenarios. For example, you can use the Dataflow refresh activity inside a ForEach activity in your Data Pipeline and pass in an array of values with which to refresh your Dataflow. Those refreshes can run sequentially or in parallel, because Dataflow Gen2 with CI/CD support and the public parameters mode bring concurrency support to refreshes.

Notice how the screenshot below shows a pipeline that uses a Lookup activity and passes its result to a ForEach activity, where each value triggers a separate Dataflow refresh, all running in parallel.

Screenshot of a Data pipeline run showing a Lookup activity that passes values to a For each where a Dataflow Gen2 is running with parallel execution

Plans ahead

You can expect support for more data types in parameters, new enhancements to observability surfaces such as the refresh history and the Monitoring hub, an improved experience for the pipeline activity, REST API support to trigger refreshes with parameters, and much more.

We’re eager to hear about your experience with this new preview feature. Feel free to share your feedback in the Fabric Data Factory Forum.

More resources

Share your feedback for this new feature in the Data Factory forum
