Microsoft Fabric Updates Blog

Integrating On-Premises Data into Microsoft Fabric Using Data Pipelines in Data Factory

We are thrilled to announce the public preview of on-premises connectivity for Data pipelines in Microsoft Fabric.

Using the on-premises data gateway, customers can connect to on-premises data sources from dataflows and data pipelines with Data Factory in Microsoft Fabric. This significantly broadens the scope of data integration capabilities: organizations can keep databases and other data sources on their on-premises networks while securely integrating them with Microsoft Fabric in the cloud.

Thank you to all the customers and members of the Microsoft Fabric community who shared product feedback and worked closely with us to deliver this new capability!

Let’s help you get started!

Create an on-premises data gateway

  1. An on-premises data gateway is software that you install on a machine within your local network; it acts as a secure bridge between your on-premises data sources and cloud services. For detailed instructions on how to download and install it, refer to Install an on-premises data gateway.
  2. Sign in with your user account to register the on-premises data gateway, after which it is ready to use.

Create a connection for your on-premises data source

  1. Navigate to the admin portal and select the settings (gear) icon at the top right of the page, then choose Manage connections and gateways from the dropdown menu.

  2. In the New connection dialog, select On-premises, then provide your gateway cluster along with the associated resource type and the relevant connection details.
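Conceptually, the connection you define in this dialog is a small configuration record that tells Fabric which gateway cluster to route through and how to reach the data source. The Python sketch below illustrates that shape; the field names (`gatewayClusterId`, `dataSourceType`, `connectionDetails`, and so on) are illustrative assumptions for this post, not the exact schema Fabric stores.

```python
# Sketch of the information captured by the New connection dialog.
# Field names are illustrative assumptions, not Fabric's exact schema.

def build_onprem_connection(gateway_cluster_id: str, server: str, database: str) -> dict:
    """Assemble a hypothetical on-premises SQL Server connection definition."""
    return {
        "connectivityType": "OnPremisesGateway",   # traffic is routed through the gateway
        "gatewayClusterId": gateway_cluster_id,    # the gateway cluster registered earlier
        "dataSourceType": "SQLServer",             # the resource type chosen in the dialog
        "connectionDetails": {
            "server": server,
            "database": database,
        },
        # Credentials stay with the gateway on your network; they are encrypted
        # rather than sent to the cloud in clear text.
        "credentialType": "Windows",
    }

conn = build_onprem_connection("my-gateway-cluster", "onprem-sql01", "SalesDB")
print(conn["connectivityType"])
```

The key design point is the `gatewayClusterId` reference: the cloud service never opens an inbound connection to your network; instead, the gateway polls outbound and relays requests to the data source on Fabric's behalf.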

Using on-premises data in a pipeline

  1. Go to your workspace and create a Data Pipeline.

  2. Add a copy activity to the pipeline and, for its source, select the connection established in the previous step.

  3. Select a destination for the data coming from the on-premises source.

  4. Run the pipeline.
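Under the hood, the steps above boil down to a copy activity with a source/sink pair. The Python sketch below models that structure loosely on Data Factory pipeline JSON; the type names (`SqlServerSource`, `LakehouseTableSink`) and property names are illustrative assumptions, not Fabric's exact activity schema.

```python
# Sketch of the pipeline built in the steps above, modeled loosely on
# Data Factory pipeline JSON. Type and property names are assumptions.

copy_activity = {
    "name": "CopyOnPremToLakehouse",
    "type": "Copy",
    "typeProperties": {
        "source": {
            "type": "SqlServerSource",              # on-premises source, reached via the gateway
            "connection": "onprem-sql-connection",  # the connection created earlier
            "query": "SELECT * FROM dbo.Sales",
        },
        "sink": {
            "type": "LakehouseTableSink",           # cloud destination inside Fabric
            "table": "Sales",
        },
    },
}

pipeline = {"name": "LoadOnPremSales", "activities": [copy_activity]}
print(pipeline["activities"][0]["name"])
```

Swapping the destination (step 3) only changes the `sink` block; the gateway-backed `source` stays the same, which is what makes the on-premises connection reusable across pipelines.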

You have now created a pipeline to load data from an on-premises data source into a cloud destination.

Resources to help you get started

Have any questions or feedback? Leave a comment below!
