Microsoft Fabric Updates Blog

OneLake as a Source for COPY INTO and OPENROWSET (Preview)

Simplified, workspace-governed ingestion without external dependencies

COPY INTO and OPENROWSET from OneLake are now available in Preview for Microsoft Fabric Data Warehouse.

With this release, you can load and query files directly from Lakehouse File folders, without relying on external staging storage, SAS tokens, or complex IAM configurations. This improvement reinforces Fabric’s vision of a fully SaaS-native platform, where data movement, access control, and analytics all live within the same governed environment.

What’s New

You can now use both COPY INTO and OPENROWSET directly with OneLake paths, enabling SQL-based read and write access to files stored in Lakehouse folders—without Spark, pipelines, or external staging.

  • COPY INTO supports ingesting CSV and Parquet files.
  • OPENROWSET enables ad hoc querying without loading data into tables.
  • Both use Fabric workspace permissions — no storage IAM, SAS tokens, or firewall rules required.

This update removes the need for external services and manual setup, delivering a true SaaS-native experience that is secured by Entra ID and ready for scale.

Scenarios and Capabilities Unlocked

With OneLake as a source for COPY INTO and OPENROWSET, you can now:

  • Ingest from Lakehouse to Warehouse – Use COPY INTO directly on files stored in Lakehouse folders (CSV, Parquet).
  • Perform cross-workspace data loads – Load or query data from Lakehouses located in other workspaces (within the same tenant).
  • Operate in Private Link environments – Execute COPY operations entirely within trusted Fabric workspaces — no external storage needed.
  • Automate pipelines using Service Principals (SPN) – Securely trigger data movement using Entra ID–based service principals for scheduled or automated ingestion.
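As a sketch of the cross-workspace scenario above, a COPY INTO statement can target a Lakehouse in another workspace in the same tenant simply by changing the workspace segment of the OneLake path. The workspace, Lakehouse, and folder names below are illustrative placeholders, not real objects:

```sql
-- Hypothetical cross-workspace load: <other-workspace> and <other-lakehouse>
-- are placeholders for a Lakehouse that lives outside the Warehouse's own
-- workspace. Access is governed by Fabric workspace permissions.
COPY INTO dbo.Orders
FROM 'https://onelake.dfs.fabric.microsoft.com/<other-workspace>/<other-lakehouse>/Files/Orders.parquet'
WITH (
    FILE_TYPE = 'PARQUET'
);
```

The same path shape works for a service-principal identity, since authentication is Entra ID-based rather than relying on SAS tokens or storage keys.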

Example:

COPY INTO

COPY INTO dbo.Sales
FROM 'https://onelake.dfs.fabric.microsoft.com/<workspace>/<lakehouse>/Files/Sales.csv'
WITH (
    FILE_TYPE = 'CSV',
    FIRSTROW = 2,
    FIELDTERMINATOR = ',',
    ERRORFILE = 'https://onelake.dfs.fabric.microsoft.com/<workspace>/<lakehouse>/Files/Sales_Errors.csv'
);

OPENROWSET

SELECT *
FROM OPENROWSET(
    'https://onelake.dfs.fabric.microsoft.com/<workspace>/<lakehouse>/Files/Sales.csv'
);
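Because OPENROWSET acts as a table source, you can also filter and aggregate a file in place before deciding whether to load it into a table. A minimal sketch, assuming the CSV contains Region and Amount columns (placeholder names, not from the source):

```sql
-- Hypothetical ad hoc aggregation over the same file; Region and Amount
-- are illustrative column names for whatever the CSV actually contains.
SELECT Region, SUM(Amount) AS TotalAmount
FROM OPENROWSET(
    'https://onelake.dfs.fabric.microsoft.com/<workspace>/<lakehouse>/Files/Sales.csv'
) AS sales
GROUP BY Region;
```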

Try it now (Preview)

This preview is available to all Microsoft Fabric users. Simply upload files into any Lakehouse’s Files folder and use COPY INTO or OPENROWSET from your Data Warehouse—with no external storage required.

What’s Coming Next

We’re working on simplifying the authoring experience even further. In upcoming updates, users will be able to reference workspaces and Lakehouses by friendly names rather than full path URLs—making scripts more readable, maintainable, and collaboration-friendly.

Learn More

Explore our documentation for a more in-depth look.
