Microsoft Fabric Updates Blog

Service Principal Support in Semantic Link: Enabling Scalable, Secure Automation

Microsoft Fabric continues to evolve as a unified platform for data professionals, and the latest update to Semantic Link introduces Service Principal support—a key enhancement for teams looking to automate and scale their data workflows securely. 

Semantic Link enables seamless integration between notebooks and semantic models, allowing users to query and analyze data directly from their models. With the introduction of Service Principal authentication, organizations can now run notebooks and pipelines without relying on user credentials, unlocking new levels of reliability and security. 

Automating Workflows with Confidence

Service Principals are Azure identities designed for applications and automation tools. By supporting these identities, Semantic Link now allows notebooks to be triggered by Fabric Pipelines or the Job Scheduler API using non-interactive authentication. This means scheduled jobs can run consistently even when no user is signed in, which is ideal for production-grade deployments and enterprise-scale operations.
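As a rough sketch of the trigger side, the Job Scheduler API can start a notebook run on demand with a token acquired for the service principal (for example via the client-credentials flow). The endpoint path and `jobType` below follow the public Fabric REST API; the helper function names, workspace ID, and notebook ID are illustrative placeholders, not part of Semantic Link itself.

```python
import json
import urllib.request

FABRIC_API = "https://api.fabric.microsoft.com/v1"

def notebook_job_url(workspace_id: str, notebook_id: str) -> str:
    # On-demand run endpoint of the Fabric Job Scheduler API.
    return (f"{FABRIC_API}/workspaces/{workspace_id}"
            f"/items/{notebook_id}/jobs/instances?jobType=RunNotebook")

def trigger_notebook(workspace_id: str, notebook_id: str, token: str) -> None:
    # `token` is an Entra ID access token issued to the service principal;
    # no interactive user sign-in is involved.
    req = urllib.request.Request(
        notebook_job_url(workspace_id, notebook_id),
        data=json.dumps({}).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    urllib.request.urlopen(req)  # the service responds 202 Accepted on success
```

Because the identity is the service principal rather than a user, the same call works identically from a pipeline activity, a scheduler, or any backend service.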

For more advanced scenarios, teams can also manually configure Service Principal credentials to access the full feature set of Semantic Link. This flexibility ensures that both automated and interactive workflows are supported, while maintaining strict access controls.

You can configure service principal authentication with credentials stored in Azure Key Vault:

import sempy.fabric as fabric
from sempy.fabric import set_service_principal

dataset = "<replace-with-your-dataset-name>"
workspace = "<replace-with-your-workspace-id>"

# Each credential is passed as a (Key Vault URL, secret name) pair.
tenant_kv = ("<replace-with-your-tenant-vault-url>", "<replace-with-your-tenant-secret-name>")
client_kv = ("<replace-with-your-client-vault-url>", "<replace-with-your-client-secret-name>")
client_cert_kv = ("<replace-with-your-client-certificate-vault-url>", "<replace-with-your-client-certificate-secret-name>")

# All Semantic Link calls inside the context run as the service principal.
with set_service_principal(tenant_kv, client_kv, client_certificate=client_cert_kv):
    fabric.run_model_bpa(dataset, workspace=workspace)

Benefits of Service Principal support for Semantic Link 

Service Principal support in Semantic Link is a game-changer for organizations operating at scale. This enhancement is especially valuable for enterprises that: 

  • Schedule or trigger notebooks using Semantic Link as an integral part of their broader data pipelines. 
  • Demand secure, credential-free connections to semantic models to uphold stringent security policies.
  • Face the challenge of scaling automation and data processes across multiple teams and complex environments.

Service Principal support streamlines authentication for Semantic Link, reducing manual credential management and enabling secure, automated workflows across teams and environments. 

Learn More

To get started with Service Principal authentication in Semantic Link, including version requirements and setup instructions, refer to the official documentation: Semantic Link Service Principal Support
