Microsoft Fabric Updates Blog

Introducing support for Workspace Identity Authentication in Fabric Connectors

Co-author: Meenal Srivastva

Managing secure, seamless access to data sources is a top priority for organizations using Microsoft Fabric. With workspace identity authentication, teams can simplify credential management, enhance security, and streamline data access across their enterprise.

Workspace identity in Fabric is an automatically managed service principal associated with workspaces (excluding My Workspaces). When you create a workspace identity, Fabric generates a service principal in Microsoft Entra ID, enabling seamless authentication and trusted access to firewall-enabled storage accounts.

Workspace Identity Authentication in connections leverages Microsoft Entra ID (formerly Azure Active Directory) to provide seamless, secure access to data sources using your workspace’s managed identity. This modern authentication approach in Microsoft Fabric eliminates the need for storing credentials while providing fine-grained access control and comprehensive audit capabilities.

For more details, refer to the Workspace identity documentation.

Recently, we announced Fabric Workspace Identity: Removing Default Contributor Access for Workspace Identity, a change to the default contributor role previously assigned to a workspace identity at creation. As of July 27, 2025, a new workspace identity created in your workspace no longer receives any default role on the workspace, and existing workspace identities no longer hold the contributor role. Admins can always assign roles to a workspace identity explicitly.
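Since workspace identities no longer receive a default role, an admin can grant one explicitly, either in the workspace settings UI or through the Fabric REST API's Add Workspace Role Assignment endpoint. A minimal sketch in Python, assuming placeholder IDs and a valid bearer token (the workspace ID, principal object ID, and token below are illustrative, not real values):

```python
import json
import urllib.request

FABRIC_API = "https://api.fabric.microsoft.com/v1"

def build_role_assignment_request(workspace_id: str, principal_id: str,
                                  role: str = "Contributor"):
    """Build the URL and body for the Add Workspace Role Assignment call.

    `principal_id` is the Entra object ID of the workspace identity's
    service principal; `role` is Admin, Member, Contributor, or Viewer.
    """
    url = f"{FABRIC_API}/workspaces/{workspace_id}/roleAssignments"
    body = {
        "principal": {"id": principal_id, "type": "ServicePrincipal"},
        "role": role,
    }
    return url, body

def assign_role(token: str, workspace_id: str, principal_id: str,
                role: str = "Contributor") -> dict:
    """POST the role assignment; requires a token with workspace admin rights."""
    url, body = build_role_assignment_request(workspace_id, principal_id, role)
    req = urllib.request.Request(
        url,
        data=json.dumps(body).encode(),
        method="POST",
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```

Separating the request builder from the sender keeps the payload easy to inspect before granting any role.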

Now available!

Expanded support for workspace identity in new connectors for Data pipelines, semantic models, and Dataflow Gen2 (CI/CD).

With workspace identity authentication, Data pipelines, Copy job, semantic models, and Dataflows Gen2 can connect to data sources without stored credentials, enabling centralized, secure access control.

Connectors that support Workspace Identity authentication

The following table shows workspace identity authentication availability in connectors across different Fabric items:

Legend:
– ‘x’ – Available
– ‘N/A’ – Not applicable because the connector is not supported by that runtime

| Connector | Copy Job | Data Pipeline | Dataflow Gen2 (CI/CD), Semantic Models |
| --- | --- | --- | --- |
| Azure Analysis Services | N/A | N/A | x |
| Azure Blobs | x | x | x |
| Azure Cosmos DB (SQL API) | x | x | x |
| Azure Data Explorer (Kusto) | x | x | x |
| Azure Data Lake Storage Gen1 | N/A | x | N/A |
| Azure Data Lake Storage Gen2 | x | x | x |
| Azure Synapse Analytics | x | x | x |
| Azure Synapse Workspace | N/A | x | N/A |
| Azure Tables | x | x | x |
| Dataverse | x | x | x |
| Dynamics 365 | N/A | x | N/A |
| Dynamics AX | N/A | x | N/A |
| Dynamics CRM | N/A | x | N/A |
| SharePoint | x | x | x |
| SQL Server | x | x | x |
| Viva Insights | N/A | N/A | x |
| Web | N/A | N/A | x |
Connectors with Workspace Identity support

How to Use a Workspace Identity to Connect to Azure Blob Storage with Dataflow Gen2 (CI/CD)

In this example, we will demonstrate how to use workspace identity in Microsoft Fabric Dataflows Gen2 for authentication to Azure Blob Storage. While the steps for using workspace identity remain the same across different connectors, the procedures for granting permissions to data sources may vary.

Step 1: Create Workspace Identity

Creating a workspace identity is straightforward and can be done in the workspace settings of any workspace except personal workspaces (My Workspace):

  1. Navigate to your workspace and open the workspace settings.
  2. Select the Workspace identity tab.
  3. Select the + Workspace identity button.

You can also create the workspace identity using the Workspaces – Provision Identity REST API. Workspace admins can create and delete the workspace identity. Admins, members, and contributors can configure workspace identity as an authentication method in supported items, such as Dataflows Gen2.
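The REST route mentioned above can be sketched as follows; this is a hedged illustration, with the workspace ID and token as placeholders, following the Workspaces – Provision Identity endpoint shape:

```python
import json
import urllib.request

FABRIC_API = "https://api.fabric.microsoft.com/v1"

def provision_identity_url(workspace_id: str) -> str:
    """URL for the Workspaces - Provision Identity endpoint."""
    return f"{FABRIC_API}/workspaces/{workspace_id}/provisionIdentity"

def provision_workspace_identity(token: str, workspace_id: str) -> dict:
    """POST with no body; Fabric creates the service principal in Entra ID.

    Requires a token for a workspace admin, since only admins can create
    (or delete) the workspace identity.
    """
    req = urllib.request.Request(
        provision_identity_url(workspace_id),
        method="POST",
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```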

Step 2: Grant Permissions to the Storage Account

To enable the workspace identity to access Azure Blob storage accounts:

  1. Sign in to the Azure portal and navigate to the storage account.
  2. Select the Access control (IAM) tab and click on Role assignments.
  3. Select the Add button and choose Add role assignment.
  4. Select the appropriate role (e.g., Storage Blob Data Reader) and assign it to the workspace identity.
  5. Complete the assignment by selecting Review + assign.
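The portal steps above can also be scripted against the Azure Resource Manager role assignments API. A hedged sketch (subscription, resource group, storage account, and principal IDs are placeholders; the GUID is the well-known built-in role-definition ID for Storage Blob Data Reader):

```python
import uuid

# Built-in role definition GUID for Storage Blob Data Reader
STORAGE_BLOB_DATA_READER = "2a2b9908-6ea1-4ae2-8e65-a410df84e7d1"

def build_storage_role_assignment(subscription_id: str, resource_group: str,
                                  storage_account: str, principal_id: str):
    """Build the PUT URL and body that grant a principal
    Storage Blob Data Reader on one storage account."""
    scope = (f"/subscriptions/{subscription_id}"
             f"/resourceGroups/{resource_group}"
             f"/providers/Microsoft.Storage/storageAccounts/{storage_account}")
    role_definition_id = (f"/subscriptions/{subscription_id}"
                          f"/providers/Microsoft.Authorization"
                          f"/roleDefinitions/{STORAGE_BLOB_DATA_READER}")
    assignment_name = str(uuid.uuid4())  # role assignment names are new GUIDs
    url = (f"https://management.azure.com{scope}"
           f"/providers/Microsoft.Authorization/roleAssignments/{assignment_name}"
           f"?api-version=2022-04-01")
    body = {"properties": {
        "roleDefinitionId": role_definition_id,
        "principalId": principal_id,  # object ID of the workspace identity
        "principalType": "ServicePrincipal",
    }}
    return url, body
```

PUT the body to the URL with an ARM-scoped bearer token; equivalently, `az role assignment create --assignee <principal-id> --role "Storage Blob Data Reader" --scope <storage-account-resource-id>` performs the same grant from the Azure CLI.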

Step 3: Create a Dataflow Gen2 and bind it to the Azure Blob Storage connection

  1. Create the Dataflow Gen2 in Fabric. Select ‘Enable Git integration, deployment pipelines, and Public API scenarios’.
    • Follow the steps listed for creating Azure Blob Storage connection using Get Data experience: https://learn.microsoft.com/power-query/connectors/azure-blob-storage#connect-to-azure-blob-storage-from-power-query-online.
      Note: Alternatively, you can create Azure Blob connection through ‘Manage Connections and Gateways’ and choose the auth type as workspace identity. Then reference the connection in Get Data.
  2. Select Workspace Identity as the authentication method.
Workspace Identity auth selection in Connection creation in Dataflow Gen2

Coming Soon

We will continue to add support for workspace identity authentication in more connectors and data sources. Stay tuned for product announcements and updates.

To learn more about this feature, refer to the Authenticate with workspace identity documentation. Ready to try it out? Explore the new connectors, provide your feedback through comments on this post or on Fabric Ideas, and help shape the future of Microsoft Fabric!

Related blog posts

Introducing support for Workspace Identity Authentication in Fabric Connectors

October 30, 2025 by Leo Li
