Microsoft Fabric Updates Blog

Service principal support to connect to data in Dataflow, Datamart, Dataset and Dataflow Gen 2

Today I am very excited to announce that Azure service principal has been added as an authentication type for a set of data sources that can be used in Dataset, Dataflow, Dataflow Gen2 and Datamart. 

Azure service principal is a security identity that is application based and can be assigned permissions to access your data sources. Service principals are used to safely connect to data, without a user identity. Learn more about service principals.
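Under the hood, a service principal signs in with the OAuth 2.0 client-credentials flow against the Microsoft Entra token endpoint, exchanging its client ID and secret for an access token. As a minimal sketch (all names and values below are placeholders; in practice a library such as MSAL handles this for you):

```python
from urllib.parse import urlencode

def build_token_request(tenant_id: str, client_id: str, client_secret: str, scope: str):
    """Sketch of the OAuth 2.0 client-credentials request a service
    principal uses to obtain an access token. Builds the request
    without sending it."""
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    body = urlencode({
        "grant_type": "client_credentials",  # app-only sign-in, no user identity
        "client_id": client_id,              # the application (client) ID
        "client_secret": client_secret,      # the service principal key
        "scope": scope,                      # e.g. https://storage.azure.com/.default
    })
    return url, body
```

The returned token is what authorizes access to the data source; no user ever signs in.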


Supported data sources

  • Azure Synapse Analytics
  • Azure SQL Database
  • Azure Data Lake Storage Gen2
  • Azure Data Lake Storage
  • Azure Blob Storage
  • Web
  • Dataverse
  • SharePoint Online

Note: Service principal authentication is not supported for SQL data sources with DirectQuery in datasets.

How to use service principals to connect to your data in Dataflow Gen2

In this example, we will show how you can use a service principal to connect to Azure Data Lake Storage Gen2 through Dataflow Gen2. Later in the article, we show how you can use service principals in datasets.

Prerequisites

  1. Create a service principal using the Azure portal.
  1. Grant the application read access on the data source. In the data lake example, make sure the application has the Storage Blob Data Reader role.
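If you prefer the command line, the two prerequisite steps can be sketched with the Azure CLI. All names and IDs below are placeholders; adjust the scope to your own subscription and storage account:

```shell
# 1. Create a service principal (app registration plus client secret).
#    The output includes the appId (service principal ID) and password (key).
az ad sp create-for-rbac --name my-fabric-sp

# 2. Grant it Storage Blob Data Reader on the storage account.
az role assignment create \
  --assignee "<appId from the previous step>" \
  --role "Storage Blob Data Reader" \
  --scope "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/<account>"
```

Keep the generated secret somewhere safe; you will enter it as the service principal key in the connection dialog.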

Connect to your data using a service principal within Dataflow Gen2

  1. Navigate to https://app.fabric.microsoft.com/ 
  1. Create a Dataflow Gen2 
  1. Select Azure Data Lake Storage Gen2 as the source 
  1. Fill in the URL and select Create new connection 
  1. Change the Authentication kind to Service principal 
  1. Fill in Tenant ID. You can find the tenant ID in the Azure Portal. 
  1. Fill in the Service principal ID (the application's client ID) 

  1. Fill in the Service principal key 
  1. Click Next 
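Once the connection is saved, Fabric uses the service principal's access token to call the ADLS Gen2 REST API on your behalf. As an illustrative sketch, this builds (but does not send) the "Path - List" request that such a connection authorizes; the account name, filesystem, and API version are assumptions for illustration:

```python
def adls_list_paths_request(account: str, filesystem: str, bearer_token: str):
    """Build the ADLS Gen2 'Path - List' REST call that a service-principal
    connection authorizes. Sketch only; nothing is sent over the network."""
    url = (f"https://{account}.dfs.core.windows.net/{filesystem}"
           "?resource=filesystem&recursive=false")
    headers = {
        "Authorization": f"Bearer {bearer_token}",  # token from the client-credentials flow
        "x-ms-version": "2021-08-06",               # assumed storage API version
    }
    return url, headers
```

If the role assignment from the prerequisites is missing, this call is where you would see a 403 Forbidden response.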

How to use service principals to connect to your data in datasets 

Prerequisites

  1. Create a service principal using the Azure portal. See Create an Azure AD app and service principal in the portal – Microsoft Entra | Microsoft Learn. 
  1. Grant permission for the application to have read access on the data source. 
  1. Have a dataset published to the service. 

Connect to your data using a service principal within datasets 

  1. Navigate to https://app.fabric.microsoft.com/ 
  1. Navigate to the dataset settings page 
  1. Navigate to the data source credentials and click edit credentials 
  1. Fill in Tenant ID. You can find the tenant ID in the Azure Portal. 
  1. Fill in the Service principal ID (the application's client ID) 
  1. Fill in the Service principal key 
  1. Click Sign in 
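If sign-in fails, it can help to confirm which tenant and application a token was actually issued for. Microsoft Entra access tokens are JWTs whose payload carries `tid` (tenant) and `appid` (application) claims; the small sketch below decodes the payload without verifying the signature, so it is suitable for troubleshooting only:

```python
import base64
import json

def jwt_claims(token: str) -> dict:
    """Decode the (unverified) payload segment of a JWT access token.
    Troubleshooting aid only; never use this in place of validation."""
    payload = token.split(".")[1]
    payload += "=" * (-len(payload) % 4)  # restore stripped base64 padding
    return json.loads(base64.urlsafe_b64decode(payload))
```

Check that `claims["tid"]` matches the Tenant ID you entered and `claims["appid"]` matches the service principal ID.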

Other resources

  • Join the Fabric community to post your questions, share your feedback, and learn from others.
  • Visit Microsoft Fabric Ideas to submit feedback and suggestions for improvements and vote on your peers’ ideas!
  • Check our Known Issues page to stay up to date on product fixes!

Have any questions or feedback? Leave a comment below!

Related blog posts

Recap of Data Factory Announcements at Fabric Conference US 2025

April 10, 2025 by Miguel Llopis

We had such an exciting week for Fabric during the Fabric Conference US, filled with several product announcements and sneak previews of upcoming new features. Thanks to all of you who participated in the conference, either in person or by being part of the many virtual conversations through blogs, Community forums, social media and other … Continue reading “Recap of Data Factory Announcements at Fabric Conference US 2025”

Utilize User Data Functions in Data pipelines with the Functions activity (Preview)

April 7, 2025 by Connie Xu

User Data Functions are now available in preview within data pipeline’s functions activity! This exciting new feature is designed to significantly enhance your data processing capabilities by allowing you to create and manage custom functions tailored to your specific needs. What is a functions activity? The functions activity in data pipelines is a powerful tool … Continue reading “Utilize User Data Functions in Data pipelines with the Functions activity (Preview)”