
Easily Move Your Data Across Workspaces Using Modern Get Data of Fabric Data Pipeline

We are excited to share that the new modern get data experience in Data pipeline now supports copying to a Lakehouse or Data Warehouse in a different workspace, through an intuitive, guided experience.

When you are building a medallion architecture, you can easily leverage Data pipeline to copy your data into a Bronze Lakehouse or Warehouse that lives in a different workspace. This feature was developed in response to valuable customer feedback, and we’re eager to hear your thoughts on it.

Let’s open the Copy Assistant inside a Data Pipeline to get started.

How to start the Pipeline Copy Assistant

When you choose your source or destination, the “OneLake data hub” tab offers a user-friendly way to locate your data across workspaces by letting you search by workspace name and filter by connection type. The screenshot shows how easily users can find another workspace by searching keywords and filtering on connection types before continuing.
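Behind the scenes, the choices you make in the Copy Assistant end up in the pipeline’s Copy activity definition. The sketch below is a minimal, illustrative Python view of roughly what a cross-workspace Lakehouse destination looks like in that JSON; all the IDs, names, and even some property names here are placeholders and may differ from what the assistant actually generates.

```python
# Illustrative sketch only: a simplified view of the JSON a Copy activity
# might carry for a Lakehouse destination in a *different* workspace.
# All IDs and names below are placeholders, not real values.
copy_activity = {
    "name": "CopyToBronzeLakehouse",
    "type": "Copy",
    "typeProperties": {
        "source": {"type": "DelimitedTextSource"},
        "sink": {"type": "LakehouseTableSink"},
    },
    "outputs": [
        {
            "type": "DatasetReference",  # the destination dataset
            "datasetSettings": {
                "type": "LakehouseTable",
                "typeProperties": {"table": "bronze_sales"},
                "linkedService": {
                    "name": "BronzeLakehouse",
                    "properties": {
                        "type": "Lakehouse",
                        "typeProperties": {
                            # Pointing at another workspace amounts to
                            # referencing that workspace's ID here.
                            "workspaceId": "<target-workspace-guid>",
                            "artifactId": "<lakehouse-guid>",
                            "rootFolder": "Tables",
                        },
                    },
                },
            },
        }
    ],
}
```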

When choosing your destination, you can also create new Fabric artifacts in other workspaces by choosing a different workspace name under the “Workspace” dropdown.

Create a Lakehouse in another workspace
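If you prefer to provision the destination programmatically rather than through the dropdown, the Fabric REST API exposes an endpoint for creating a Lakehouse in a given workspace. Here is a minimal sketch, assuming you already have a Microsoft Entra access token scoped to the Fabric API and the GUID of the target workspace (both placeholders below):

```python
import requests

# Placeholders: supply a real Microsoft Entra token for the Fabric API
# and the GUID of the workspace where the Lakehouse should be created.
FABRIC_TOKEN = "<entra-access-token>"
TARGET_WORKSPACE_ID = "<target-workspace-guid>"

# Create a Lakehouse in the target workspace via the Fabric REST API.
resp = requests.post(
    f"https://api.fabric.microsoft.com/v1/workspaces/{TARGET_WORKSPACE_ID}/lakehouses",
    headers={"Authorization": f"Bearer {FABRIC_TOKEN}"},
    json={"displayName": "BronzeLakehouse"},
)
resp.raise_for_status()
print(resp.json())  # response includes the new Lakehouse's id and metadata
```

The returned Lakehouse ID can then be referenced as the copy destination, just as if the artifact had been created from the “Workspace” dropdown.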

Have any questions or feedback? Leave a comment below!
