Microsoft Fabric Updates Blog

Integrating On-Premises Data into Microsoft Fabric Using Data Pipelines in Data Factory

We are thrilled to announce the public preview of on-premises connectivity for Data pipelines in Microsoft Fabric.

Using the on-premises data gateway, customers can connect to on-premises data sources from dataflows and data pipelines in Data Factory for Microsoft Fabric. This enhancement significantly broadens the scope of Fabric's data integration capabilities. In essence, the gateway lets organizations keep databases and other data sources on their on-premises networks while securely integrating them with Microsoft Fabric in the cloud.

Thank you for all the product feedback from customers and the Microsoft Fabric community who have worked closely with us to deliver this new capability!

Let’s help you get started!

Create an on-premises data gateway

  1. The on-premises data gateway is software that you install within your local network. It acts as a secure bridge between your on-premises data sources and Microsoft Fabric. For detailed instructions on how to download and install it, refer to Install an on-premises data gateway.
  2. Sign in with your user account to register the gateway, after which it is ready to use. (A quick way to confirm the registration through the REST API is sketched below.)
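If you'd like to confirm from a script that the gateway is now visible in your tenant, the Fabric REST API exposes a gateways listing endpoint. The following is a minimal sketch, assuming you can obtain a Fabric bearer token with azure-identity; the token scope shown is an assumption, and any valid Fabric access token works.

```python
# Minimal sketch: list the gateways visible to your account via the Fabric REST API
# to confirm the newly installed gateway is registered.
# Assumptions: azure-identity is installed and the scope below is valid for your tenant.
import requests
from azure.identity import InteractiveBrowserCredential

credential = InteractiveBrowserCredential()
token = credential.get_token("https://api.fabric.microsoft.com/.default").token

response = requests.get(
    "https://api.fabric.microsoft.com/v1/gateways",
    headers={"Authorization": f"Bearer {token}"},
)
response.raise_for_status()

# Print the display name, type, and ID of each registered gateway.
for gateway in response.json().get("value", []):
    print(gateway.get("displayName"), gateway.get("type"), gateway.get("id"))
```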

Create a connection for your on-premises data source

  1. Navigate to the admin portal and select the settings button (the gear icon) at the top right of the page, then choose Manage connections and gateways from the dropdown menu that appears.

  2. In the New connection dialog that appears, select On-premises, then provide your gateway cluster along with the associated resource type and the relevant connection details. (A script-based lookup of the saved connection is sketched after these steps.)
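Once the connection is saved, you can also look up its ID programmatically, which is handy when automating pipeline deployments. This is a minimal sketch against the Fabric REST API's connections listing endpoint; the connection name is a hypothetical placeholder, and token acquisition mirrors the earlier sketch.

```python
# Minimal sketch: find the ID of the on-premises connection created above by
# listing connections through the Fabric REST API.
# Assumptions: azure-identity is installed; "MyOnPremSqlConnection" is a
# hypothetical display name - replace it with the name you chose in the dialog.
import requests
from azure.identity import InteractiveBrowserCredential

credential = InteractiveBrowserCredential()
token = credential.get_token("https://api.fabric.microsoft.com/.default").token

response = requests.get(
    "https://api.fabric.microsoft.com/v1/connections",
    headers={"Authorization": f"Bearer {token}"},
)
response.raise_for_status()

# Match on the display name entered in the New connection dialog.
for connection in response.json().get("value", []):
    if connection.get("displayName") == "MyOnPremSqlConnection":
        print("Connection ID:", connection.get("id"))
```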

Using on-premises data in a pipeline

  1. Go to your workspace and create a Data Pipeline.

  2. Add a new source to the pipeline copy activity and select the connection established in the previous step.

  3. Select a destination to load your on-premises data into.

  4. Run the pipeline. (A sketch for triggering the run programmatically follows below.)

You have now created a pipeline to load data from an on-premises data source into a cloud destination.
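If you prefer to trigger the run from a script rather than the portal, the Fabric REST API's job scheduler can start a pipeline on demand. The sketch below is a minimal example; the workspace and pipeline IDs are placeholders, and token acquisition follows the same assumption as the earlier sketches.

```python
# Minimal sketch: trigger the pipeline on demand via the Fabric REST API job scheduler.
# Assumptions: azure-identity is installed; WORKSPACE_ID and PIPELINE_ID are
# placeholders for the GUIDs of your workspace and Data pipeline item.
import requests
from azure.identity import InteractiveBrowserCredential

WORKSPACE_ID = "<workspace-guid>"
PIPELINE_ID = "<pipeline-item-guid>"

credential = InteractiveBrowserCredential()
token = credential.get_token("https://api.fabric.microsoft.com/.default").token

response = requests.post(
    f"https://api.fabric.microsoft.com/v1/workspaces/{WORKSPACE_ID}"
    f"/items/{PIPELINE_ID}/jobs/instances?jobType=Pipeline",
    headers={"Authorization": f"Bearer {token}"},
)
response.raise_for_status()

# A successful request returns 202 Accepted with a Location header pointing
# at the job instance, which can be polled for run status.
print(response.status_code, response.headers.get("Location"))
```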

Resources to help you get started

Have any questions or feedback? Leave a comment below!
