Microsoft Fabric Updates Blog

Orchestrate your Databricks Jobs with Fabric Data pipelines

We’re excited to announce that you can now orchestrate Azure Databricks Jobs from your Microsoft Fabric data pipelines!

Databricks Jobs let you schedule and orchestrate one or more tasks in a workflow in your Databricks workspace. Since any operation in Databricks can be a task, this means you can now run anything in Databricks via Fabric Data Factory, such as serverless jobs, SQL tasks, Delta Live Tables, batch inferencing with model serving endpoints, or automatically publishing and refreshing semantic models in the Power BI service.

With this new update, you can trigger these workflows from your Fabric data pipeline. In your Azure Databricks activity, you'll find a new Type option called Job under the Settings tab.

Once you've selected the Job type, a drop-down list shows all the Databricks jobs you've created, letting you choose which job to run from your data pipeline.
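Under the hood, this Job drop-down corresponds to a lookup against the Databricks Jobs REST API. As a minimal, hypothetical sketch (the workspace URL and token below are placeholders, and the Jobs API 2.1 `jobs/list` endpoint is assumed), you could perform a similar lookup yourself:

```python
import urllib.request

# Hypothetical placeholders -- substitute your workspace URL and a real token.
DATABRICKS_HOST = "https://adb-1234567890123456.7.azuredatabricks.net"
TOKEN = "dapi-example-token"

def build_list_jobs_request(host: str, token: str) -> urllib.request.Request:
    """Build a GET request against the Jobs API 2.1 list endpoint,
    similar to the lookup that populates the activity's Job drop-down."""
    return urllib.request.Request(
        url=f"{host}/api/2.1/jobs/list",
        headers={"Authorization": f"Bearer {token}"},
        method="GET",
    )

req = build_list_jobs_request(DATABRICKS_HOST, TOKEN)
# urllib.request.urlopen(req) would return a JSON document listing your jobs.
```

In the pipeline itself none of this is needed, of course; the activity handles authentication and the job lookup for you.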

We also know that being able to parameterize your data pipelines is important, as it allows you to build generic, reusable pipelines.

Fabric Data Factory continues to provide end-to-end support for these patterns and is excited to extend this capability to the Azure Databricks activity.

After you select the Job type, you’ll also be able to send parameters to your Databricks job, allowing maximum flexibility and power for your orchestration jobs.

To learn more, read Azure Databricks activity – Microsoft Fabric | Microsoft Learn or watch this demo.

Have any questions or feedback? Leave a comment below!

July 10, 2025, by Matthew Hicks
