Microsoft Fabric Updates Blog

Efficiently build and maintain your Data pipelines with Copilot for Data Factory: new capabilities and experiences

The new Data pipeline capabilities in Copilot for Data Factory are now available in preview. These features function as an AI expert to help users build, troubleshoot, and maintain data pipelines.

What can new capabilities in Copilot for Data Factory do for you?

  • Understand your business intent and effortlessly translate it into data pipeline activities to build your data integration solutions.
  • Provide clear summaries and explanations so you can better understand complex data pipelines created by other team members.
  • Troubleshoot Data pipeline error messages with clear, actionable summaries and recommendations.

Let’s start the AI-powered journey with Copilot for Data Factory. After creating a new data pipeline or opening an existing one, select the Copilot button on the Home tab to get started easily with three starter options.

Effortlessly create your data pipelines

Copilot for Data Factory can intuitively understand your business intent and help you effortlessly create your data integration solutions. You can build data pipelines with Copilot either by providing a single comprehensive prompt or by engaging with it step by step, following the clear guidance Copilot provides at each stage. Copilot understands user intent well; for example, it can set up a metadata-driven pipeline that copies a different set of tables from source to destination on each run by using Lookup, ForEach, and Copy activities.
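For context, a metadata-driven pipeline of the kind described above is typically expressed in pipeline JSON along these lines. This is a hand-written sketch, not Copilot output: the activity names, and the assumption that the Lookup returns the table list consumed by the ForEach, are illustrative placeholders, and the source/sink dataset details are omitted.

```json
{
  "name": "MetadataDrivenCopyPipeline",
  "properties": {
    "activities": [
      {
        "name": "LookupTableList",
        "type": "Lookup",
        "typeProperties": {
          "firstRowOnly": false
        }
      },
      {
        "name": "ForEachTable",
        "type": "ForEach",
        "dependsOn": [
          { "activity": "LookupTableList", "dependencyConditions": [ "Succeeded" ] }
        ],
        "typeProperties": {
          "items": {
            "value": "@activity('LookupTableList').output.value",
            "type": "Expression"
          },
          "activities": [
            { "name": "CopyTable", "type": "Copy" }
          ]
        }
      }
    ]
  }
}
```

The Lookup activity reads the table list from a metadata store, and the ForEach activity iterates over that list, running a Copy activity per table, which is the pattern Copilot can assemble from a single natural-language prompt.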

Pipeline generation skill

Easily understand your complex data pipelines

With Copilot for Data Factory, you don’t need to worry about maintaining a large number of complex data pipelines. Copilot provides clear, useful summaries and explanations so you can better understand complex data pipelines created by other team members.

You can either click ‘Summarize this pipeline’ among the starter options or send it as a prompt; Copilot will quickly generate an explanatory summary to help you understand the complex pipeline.

Summarize pipeline skill

Efficiently troubleshoot error messages in your Data pipelines

Copilot for Data Factory can interpret Data pipeline error messages and provide clear summaries and actionable recommendations to help you troubleshoot and resolve the errors.

Click the icon beside the failed Data pipeline activity, and you immediately get an easy-to-understand summary and recommendations.

Copilot for troubleshooting error messages

Learn more about Copilot for Data Factory
