Microsoft Fabric Updates Blog

Utilize User Data Functions in Data pipelines with the Functions activity (Preview)

User Data Functions are now available in preview within the data pipeline's Functions activity! This exciting new feature enhances your data processing capabilities by allowing you to create and manage custom functions tailored to your specific needs.

What is a functions activity?

The functions activity in data pipelines is a powerful tool designed to enhance data processing capabilities by allowing you to execute custom logic and calculations within your data pipelines.

Data pipeline Functions activity named My UDF Function.

What is a User Data Functions item?

The User Data Functions item is a Fabric-native artifact that holds a collection of individual functions sharing the same code file and configuration settings. From your functions, you can also access the Fabric resources available in your workspace, as well as any others you have access to.

To learn more, check out the What is Fabric User data functions (Preview)? documentation.

Using User Data Functions in your Data pipeline

Step 1: Create a workspace

First, you’ll need a Fabric workspace to get started. You can follow the steps in this tutorial to Create a workspace in Microsoft Fabric.

Step 2: Create a User Data Function Item

Once you have a workspace, select the ‘+ New Item’ button at the top left of your workspace.

New item button in Fabric workspace.

Select the ‘User data functions (Preview)’ item under the Develop data group towards the bottom of the New item list.

User data functions (preview) item in Fabric's new item list.

Name your User Data Functions item and select the ‘Create’ button.

Create a new User data functions item with an option to name your item and either Create or Cancel.

Step 3: Create your functions in the User Data Functions item

Once your User Data Functions item loads, select the ‘New function’ button at the center of the screen to start creating your functions.

Create a new sample function button.

When you select the ‘New function’ button, a sample “hello_fabric” function is automatically populated. For the purpose of this tutorial, we’ll use this sample function, which takes a single parameter, name, and returns a welcome message that combines the name with the current date and time.

User data functions Functions explorer + Functions pane that displays the hello_fabric function.
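The sample function’s logic can be sketched in plain Python. Note that this is an approximation: the exact message text is inferred from the output shown later in this tutorial, and in the Fabric editor the function is additionally registered with the User Data Functions decorator rather than defined as a bare function.

```python
import datetime


def hello_fabric(name: str) -> str:
    # Greet the caller by name and include the current timestamp,
    # mirroring the sample function's behavior.
    return f"Welcome to Fabric Functions, {name}, at {datetime.datetime.now()}!"
```

Because the function simply formats a string from its parameter, you can preview its behavior locally before publishing, e.g. `hello_fabric("Connie")`.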

Optional

You can create/edit/remove any additional functions you’d like in the functions pane. If you are interested in learning more about User Data Functions, check out this previous post on Getting started with the User Data Functions item.

Step 4: Publish your function(s)

Select the ‘Publish’ button at the top right of the navigation bar to publish your User Data Functions so that the functions you just created can be referenced in your Fabric workspace.

Publish button inside of the User Data Functions editor.

After a moment, you should see a banner appear at the top right of your screen confirming that the publish succeeded.

A banner that confirms the User Data Function published successfully.

Step 5: Create a Data pipeline with a functions activity

Navigate back to your workspace and select the ‘+ New Item’ button at the top left of your workspace.

New item button in Fabric workspace.

Select the ‘Data pipeline’ item under the Get data group towards the top of the New item list.

Data pipeline item in Fabric New item pane.

Name your data pipeline and select the ‘Create’ button.

Create a new data pipeline item with an option to name your item and either Create or Cancel.

When your data pipeline has been created, you will have an option to Start with a blank canvas. Select the ‘Pipeline activity’ button and navigate to the Orchestrate section of the dropdown towards the bottom of the activity list.

Select the ‘Functions’ button to create your functions activity.

Data pipeline canvas upon initial creation. When clicking the Pipeline activity option, a dropdown with activities populates on the screen.

You’ll now see that a functions activity has been added to your data pipeline’s canvas. When you select the activity, you can view its General tab and Settings tab.

Functions activity in pipelines canvas with a General tab followed by a Settings tab.

Step 6: Utilize User Data Functions inside your Functions activity

With your Functions activity selected, navigate to the General tab. You’ll find a required Name field that comes pre-filled with a default name. You can rename the activity to better describe what it does. In this tutorial, we’ve named our functions activity “UDF hello_world”.

Renaming the functions activity through the General settings.

Navigate to the Settings tab in your Functions activity. Here you can select the Type of function you’d like to use. In today’s demo, we’ll keep the ‘Fabric user data functions’ type selected.

Next, select the Connection dropdown to configure the connection to your User Data Functions item.

Settings panel of the functions activity that has options for Type, Connection, Workspace, User data functions, and Function.

When you select the Connection dropdown, the Get data experience opens, where you choose a data source to get started. Select ‘User Data Functions’ as your source under the New sources section towards the top of the panel.

First part of the Get data experience where you can Choose a data source to get started.

Next in the Get data experience, you’ll need to connect to your data source. To do so, choose to create a new connection in the Connection dropdown.

Then, enter any Connection name you’d like. In this tutorial, we’ve named the connection ‘My UDF Connection’.

Then, sign into your Fabric account. You can ignore the other Connection credentials fields for this tutorial.

Once you complete these steps, the ‘Connect’ button becomes enabled. Select it at the bottom right of the Get data experience to proceed.

Second part of the Get data experience where you can Connect to a data source.

You have now configured your connection; the Connection section of Settings is now filled out with your connection details.

From the Workspace dropdown in Settings, select the Fabric workspace that contains your User Data Functions item.

Find the name of your User Data Functions item in the dropdown of the User data functions section of Settings. In today’s tutorial, we named our User Data Functions item ‘My User Data Function’.

Select the function you’d like to use from the Function dropdown in Settings. In today’s tutorial, we use the sample function named “hello_fabric”.

Configuring functions activity settings by filling out Type, Connection, Workspace, User data functions, and Function

Since we are using the sample “hello_fabric” function, it requires a single string parameter, name. Input your name in the Value section of Parameters.

Input example name Connie in Value section of Parameters within your function.

Step 7: Save and run your Data pipeline to confirm your User Data Function works in the Functions activity

At the top left of the pipeline builder navigation bar, select the ‘Save’ button.

Purple save button from pipelines editor navigation bar.

Once your data pipeline is saved, select the ‘Run’ button just a few items to the right of the Save button.

Run button from pipelines editor navigation bar.

When your data pipeline runs successfully, you can check the Input and Output of your pipeline to make sure they align with your User Data Function. You can do this by selecting the input icon or output icon of your Functions activity.

Pipelines run outputs sorted by activities in your pipeline.

Let’s select the input icon for the Functions activity “UDF hello_world” that we created in this tutorial. The expected input is the value of the name you typed into the Parameters section of your Functions activity Settings.

When you are done checking the input for the Functions activity, select the ‘X’ at the top right of the input pop-out to exit the view.

Input of functions activity with parameter of name and name input.

Now, let’s select the output icon for your Functions activity. The expected output in this tutorial is the welcome message: ‘Welcome to Fabric Functions, {the name value you input}, at {current date and time}!’.

When you are done checking the output for the Functions activity, select the ‘X’ at the top right of the output pop-out to exit the view.

Output of functions activity that displayed a welcome message with the name input and current time.
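If you later retrieve the activity output programmatically and want to sanity-check it, a simple pattern match on the welcome message works. The sample output string below is illustrative only; the actual timestamp formatting may differ.

```python
import re

# Illustrative output string in the shape this tutorial describes
# (the actual timestamp formatting in Fabric may differ).
sample_output = "Welcome to Fabric Functions, Connie, at 2025-05-22 10:15:30!"

# Capture the name and timestamp portions of the welcome message.
pattern = r"^Welcome to Fabric Functions, (?P<name>.+), at (?P<when>.+)!$"
match = re.match(pattern, sample_output)
assert match is not None
print(match.group("name"))  # the name parameter you configured
```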

Congratulations! You’ve now learned how to use User Data Functions inside your data pipeline!

Next steps

Check out our documentation to learn more about the Functions activity in Data pipelines today. Submit your feedback on Fabric Ideas and join the conversation in the Fabric Community.
