Microsoft Fabric Updates Blog

Easily Move Your Data Across Workspaces Using Modern Get Data of Fabric Data Pipeline

We are excited to share that the new modern get data experience in Data Pipeline now supports copying to a Lakehouse or Data Warehouse in a different workspace, with an intuitive experience.

When you are building a medallion architecture, you can easily leverage Data Pipeline to copy your data into Bronze Lakehouse/Warehouse across different workspaces. This feature was developed in response to valuable customer feedback, and we’re eager to hear your thoughts on it.

Let’s open the Copy Assistant inside a Data Pipeline to get started.

How to start the Pipeline Copy assistant

When you choose your source or destination, the “OneLake data hub” tab offers a user-friendly way to locate your data across workspaces: you can search by workspace name and filter on connection type. This screenshot shows how users can find another workspace by searching for keywords and filtering on connection type before continuing.

When choosing your destination, you can also create new Fabric artifacts in other workspaces by selecting a different workspace name under the “Workspace” dropdown.

Create a Lakehouse in another workspace
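The same cross-workspace creation can also be done programmatically. Below is a minimal sketch that builds the REST call to create a Lakehouse in a target workspace, assuming the public Fabric REST API endpoint `POST /v1/workspaces/{workspaceId}/lakehouses`; the workspace ID, item name, and token shown are placeholders, and you should verify the endpoint shape against the official Fabric REST API docs before use.

```python
import json
from urllib import request

FABRIC_API = "https://api.fabric.microsoft.com/v1"

def create_lakehouse_request(workspace_id: str, name: str, token: str) -> request.Request:
    """Build (but do not send) a POST request that creates a Lakehouse
    in the given workspace.

    The endpoint and payload shape are assumed from the public Fabric
    REST API; send the returned request with urllib.request.urlopen
    once you have a valid Microsoft Entra ID token.
    """
    url = f"{FABRIC_API}/workspaces/{workspace_id}/lakehouses"
    body = json.dumps({"displayName": name}).encode()
    return request.Request(
        url,
        data=body,
        method="POST",
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )

# Placeholder workspace ID and token for illustration only.
req = create_lakehouse_request(
    "00000000-0000-0000-0000-000000000000",
    "BronzeLakehouse",
    "<entra-id-token>",
)
```

Separating request construction from sending keeps the call easy to inspect (and to point at a different target workspace) before any data moves.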

Have any questions or feedback? Leave a comment below!
