Microsoft Fabric Updates Blog

Easily connect your data with the new modern get data experience for data pipeline

We are excited to share that the new modern get data experience in data pipelines provides an extremely simple way to connect to your data: intuitively browse different Fabric artifacts through the OneLake data hub and get to your data as quickly as possible.

We listen to customer feedback from various channels to keep improving the user experience. This new experience empowers you to easily move your data from various sources to your preferred destinations.

To see this in action, let’s begin by creating a new pipeline and then clicking the Copy data assistant card or the Use copy assistant button to get started with the modern get data experience.


The home page of the modern get data experience helps you discover the right data and connection information, letting you quickly search for and find any supported connection.

It also provides an efficient way to connect to recently used Fabric items (e.g. Lakehouse, Data Warehouse) in OneLake.

Recently used Fabric items in the modern get data experience.

OneLake data hub

With the new modern get data experience, you can easily choose artifacts from the OneLake data hub.

OneLake Data hub page
Select Fabric artifacts in the OneLake data hub.

Sample data

Sample data provides a convenient way to quickly build a test pipeline.

Sample data
Choose from a variety of sample datasets.

New Fabric item as destination

You can easily create a new Fabric item as your data destination in just two clicks.

New Fabric item as destination.

Modern connection experience on the pipeline editing page

You can easily choose an existing connection or create a new one with the modern get data experience on the pipeline editing page. Select an existing connection from the dropdown, or click More to use the intuitive get data experience for additional connections.

New connection experience
Modern get data experience on the pipeline editing page.
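Whichever path you take, the assistant ultimately configures a Copy activity in the pipeline definition. As a rough illustration, a Copy activity in pipeline JSON can look like the following minimal sketch; the activity name, source and sink types, and settings shown here are hypothetical placeholders following Data Factory Copy activity conventions, not output produced by the assistant:

```json
{
  "name": "Copy sample data to Lakehouse",
  "type": "Copy",
  "typeProperties": {
    "source": {
      "type": "DelimitedTextSource"
    },
    "sink": {
      "type": "LakehouseTableSink",
      "tableActionOption": "Append"
    }
  }
}
```

In practice you rarely edit this JSON by hand; the modern get data experience fills in the source, destination, and connection details for you.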

Share your feedback

Please continue to share your feedback and feature ideas with us via our official Community channels, and stay tuned to our public roadmap page for updates on upcoming features.
