Microsoft Fabric Updates Blog

Simplifying Data Ingestion with Copy job – Copy data across tenants using Copy job in Fabric Data Factory 

Copy job is the go-to solution in Microsoft Fabric Data Factory for simplified data movement, whether you’re moving data across clouds, from on-premises systems, or between services. With native support for multiple delivery styles, including bulk copy, incremental copy, and change data capture (CDC) replication, Copy job offers the flexibility to handle a wide range of data movement scenarios—all through an intuitive, easy-to-use experience. Learn more in What is Copy job in Data Factory – Microsoft Fabric | Microsoft Learn.

With Copy job, you can also perform cross-tenant data movement between Fabric and other clouds, such as Azure. It also enables cross-tenant data sharing within OneLake, allowing you to copy data across Fabric Lakehouse, Warehouse, and SQL database in Fabric between tenants with service principal (SPN) support. This blog provides step-by-step guidance on using Copy job to copy data across different tenants.

Scenario

In this scenario, Tenant A has a Fabric Data Warehouse and Copy job, while Tenant B has an Azure Data Lake Gen2 account. A user from Tenant A will use Copy job to copy data from the Azure Data Lake Gen2 account with service principal authentication (owned by Tenant B) into Fabric Data Warehouse (owned by Tenant A).

How it works

Prerequisites 

An Azure Data Lake Gen2 account is available in Tenant B, with a service principal that is granted access to the account.

How to Create an Azure Data Lake Gen2 account with service principal enabled in Tenant B

1. Go to https://portal.azure.com/ and sign in with your user account and credentials from Tenant B.

2. After signing in, create a Storage Account in the Azure Portal.

3. Open your Storage Account, create a container, and ensure that it contains data.

4. In the Azure Portal, search for App registrations, then click New registration to create a new app.

5. After registering your app, you’ll see the Application (client) ID and Directory (tenant) ID on the Overview page. Copy these values, as you’ll need them for authentication later.

6. In the left menu, go to Certificates & secrets, then click New client secret to add a client secret.

7. Copy the secret Value immediately; it won't be shown again. You'll need it for authentication later.

8. In the Azure Portal, go to your Azure Data Lake Gen2 account. Click Access Control (IAM), then select Add role assignment.

9. Choose a role, for example, Storage Blob Data Contributor (read/write).

10. In Members, select User, group, or service principal, then choose the app you just registered.

11. Click Assign. You now have access to this storage account via the service principal.

You can learn more in Register a Microsoft Entra app and create a service principal – Microsoft identity platform | Microsoft Learn and Access storage using a service principal & Microsoft Entra ID (Azure Active Directory) – Azure Databricks | Microsoft Learn.
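Before configuring the Copy job connection, you can optionally sanity-check the service principal yourself. The sketch below (standard library only; all placeholder values, function names, and the REST API version header are illustrative assumptions, not part of the official walkthrough) requests a token via the OAuth 2.0 client-credentials flow and calls the ADLS Gen2 List Paths REST API:

```python
import json
import urllib.parse
import urllib.request

def token_request(tenant_id, client_id, client_secret):
    """Build the OAuth2 client-credentials token request (URL + form body)."""
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    body = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        # Resource scope for Azure Storage data-plane access
        "scope": "https://storage.azure.com/.default",
    }).encode()
    return url, body

def list_paths_url(account, container):
    """ADLS Gen2 'Path - List' REST endpoint for a container (filesystem)."""
    return (f"https://{account}.dfs.core.windows.net/{container}"
            "?resource=filesystem&recursive=true")

def list_container_files(tenant_id, client_id, client_secret, account, container):
    """Fetch a token as the service principal, then list paths it can see."""
    url, body = token_request(tenant_id, client_id, client_secret)
    token = json.load(
        urllib.request.urlopen(urllib.request.Request(url, data=body))
    )["access_token"]
    req = urllib.request.Request(
        list_paths_url(account, container),
        # x-ms-version is an assumed recent Storage REST API version
        headers={"Authorization": f"Bearer {token}", "x-ms-version": "2021-08-06"},
    )
    return [p["name"] for p in json.load(urllib.request.urlopen(req))["paths"]]
```

Calling `list_container_files(...)` with the Tenant ID, Client ID, and secret Value from steps 5–7 should return the files in your container; a 403 response usually means the role assignment from steps 8–11 hasn't propagated yet, which can take a few minutes.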

Get started: copy data across tenants using Copy job

1. Go to Sign in | Microsoft Power BI and log in to Fabric using your user account and credentials from Tenant A.

2. Go to your workspace and create a Copy job to move your data.

3. After naming the Copy job, select Azure Data Lake Gen2 as the source from which to copy data.

4. Enter the URL of your Azure Data Lake Gen2 account, and select service principal as the authentication type.

Provide the Tenant ID, Client ID, and Client Secret that you collected in the prerequisite steps, then click Next.

5. After connecting to your source store, you should be able to see your container and its files. Select the files you want to copy from your ADLS Gen2 account.

6. Select your destination store for the data. In this case, choose Fabric Data Warehouse.

7. [Optional] Configure table mapping or column mapping if needed.

8. You can choose either full copy or incremental copy. If you select incremental copy, Copy job will first perform a full copy of all files, and on subsequent runs copy only the new or changed files from the Azure Data Lake Gen2 account.

9. Review the job summary, then save and run the job.

10. In the job panel, you’ll see that your Copy job has successfully completed the initial full snapshot transfer to the destination.

11. Open your Fabric Data Warehouse, and you will see that the data from the Azure Data Lake Gen2 account in Tenant B has been successfully loaded into the Fabric Data Warehouse.
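Conceptually, the incremental mode chosen in step 8 behaves like a watermark-based copy: the first run transfers everything, and each later run transfers only files modified since the previous run. A minimal stand-alone sketch of that idea (not Copy job's actual implementation) using local directories:

```python
import shutil
from pathlib import Path

def incremental_copy(src: Path, dst: Path, watermark: float) -> float:
    """Copy files from src modified after `watermark`; return the new watermark.

    A watermark of 0.0 means "copy everything" (the initial full snapshot).
    """
    latest = watermark
    for f in src.rglob("*"):
        if f.is_file() and f.stat().st_mtime > watermark:
            target = dst / f.relative_to(src)
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(f, target)  # copy2 preserves the modification time
            latest = max(latest, f.stat().st_mtime)
    return latest
```

The first call with watermark 0.0 mirrors the full snapshot; persisting the returned watermark between runs is what lets subsequent runs skip already-copied files, which is the same pattern Copy job automates for you.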

Summary

You can easily use Copy job to copy data across different tenants. In this scenario, a user from Tenant A used Copy job to copy data from an Azure Data Lake Gen2 account (owned by Tenant B) with service principal authentication into a Fabric Data Warehouse (owned by Tenant A).

Additional Resources

To learn more, explore Microsoft Fabric Copy job documentation.

Submit your feedback on Fabric Ideas and join the conversation in the Fabric Community.

To get into the technical details, check out the Fabric documentation.  

If you have a question or want to share your feedback, please leave us a comment.

Related blog posts

Announcing Copy Job Activity in Data Factory Pipeline (Generally Available)

November 24, 2025 by Jianlei Shen

This milestone marks a major step forward in unifying and simplifying data movement experiences across Data Factory. With Copy Job Activity, users can now enjoy the simplicity and speed of Copy Job while leveraging the orchestration power and flexibility of Data Factory pipelines. …

AI-powered troubleshooting for Fabric pipeline error messages

November 21, 2025 by Penny Zhou

Troubleshooting pipeline failures can be overwhelming, especially when a single run throws dozens or even hundreds of errors. The new Error Insights Copilot in Fabric makes this process faster, smarter, and easier. Powered by AI, Copilot provides clear explanations, root cause analysis, and actionable recommendations, so you can resolve issues without getting lost in technical …