Microsoft Fabric Updates Blog

OneLake as a Source for COPY INTO and OPENROWSET (Preview)

Simplified, workspace-governed ingestion without external dependencies

COPY INTO and OPENROWSET from OneLake are now available in Preview for Microsoft Fabric Data Warehouse.

With this release, you can load and query files directly from Lakehouse File folders, without relying on external staging storage, SAS tokens, or complex IAM configurations. This improvement reinforces Fabric’s vision of a fully SaaS-native platform, where data movement, access control, and analytics all live within the same governed environment.

What’s New

You can now use both COPY INTO and OPENROWSET directly with OneLake paths, enabling SQL-based read and write access to files stored in Lakehouse folders—without Spark, pipelines, or external staging.

  • COPY INTO supports ingesting CSV and Parquet files.
  • OPENROWSET enables ad hoc querying without loading data into tables.
  • Both use Fabric workspace permissions — no storage IAM, SAS tokens, or firewall rules required.

This update removes the need for external services and manual setup, delivering a true SaaS-native experience secured by Entra ID, and ready for scale.

Scenarios and Capabilities Unlocked

With OneLake as a source for COPY INTO and OPENROWSET, you can now:

  • Ingest from Lakehouse to Warehouse – Use COPY INTO directly on files stored in Lakehouse folders (CSV, Parquet).
  • Perform cross-workspace data loads – Load or query data from Lakehouses located in other workspaces (within the same tenant).
  • Operate in Private Link environments – Execute COPY operations entirely within trusted Fabric workspaces — no external storage needed.
  • Automate pipelines using Service Principals (SPN) – Securely trigger data movement using Entra ID–based service principals for scheduled or automated ingestion.
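As a sketch of the cross-workspace scenario above: a COPY INTO statement can target a Lakehouse in another workspace simply by using that workspace's OneLake path. The workspace, Lakehouse, folder, and table names below are placeholders, and the caller (user or service principal) is assumed to have read access to the source workspace:

-- Hypothetical cross-workspace load: the source Lakehouse lives in a
-- different workspace than the target Warehouse (same tenant).
COPY INTO dbo.RegionalSales
FROM 'https://onelake.dfs.fabric.microsoft.com/<other-workspace>/<lakehouse>/Files/regional/*.csv'
WITH (
    FILE_TYPE = 'CSV',
    FIRSTROW = 2  -- skip the header row
);

Because access is governed by Fabric workspace permissions, no SAS token or storage IAM setup is involved; granting the identity access to the source workspace is sufficient.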

Example:

Copy Into

COPY INTO dbo.Sales
FROM 'https://onelake.dfs.fabric.microsoft.com/<workspace>/<lakehouse>/Files/Sales.csv'
WITH (
    FILE_TYPE = 'CSV',
    FIRSTROW = 2,
    FIELDTERMINATOR = ',',
    ERRORFILE = 'https://onelake.dfs.fabric.microsoft.com/<workspace>/<lakehouse>/Files/Sales_Errors.csv'
);
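The same pattern applies to Parquet. A minimal sketch, assuming a folder of Parquet files under the Lakehouse Files area (path and table name are placeholders):

-- Hypothetical Parquet ingestion from a Lakehouse Files folder;
-- a wildcard picks up every matching file in the folder.
COPY INTO dbo.Sales
FROM 'https://onelake.dfs.fabric.microsoft.com/<workspace>/<lakehouse>/Files/sales/*.parquet'
WITH (
    FILE_TYPE = 'PARQUET'
);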

OPENROWSET

SELECT *
FROM OPENROWSET(
    'https://onelake.dfs.fabric.microsoft.com/<workspace>/<lakehouse>/Files/Sales.csv'
);
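For ad hoc exploration it often helps to limit the result set. A sketch in the same style as the example above, querying Parquet files instead of CSV (the path is a placeholder):

-- Hypothetical ad hoc query over Parquet files; the file format
-- is inferred from the file extension.
SELECT TOP 100 *
FROM OPENROWSET(
    'https://onelake.dfs.fabric.microsoft.com/<workspace>/<lakehouse>/Files/sales/*.parquet'
);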

Try it now (Preview)

This preview is available to all Microsoft Fabric users. Simply upload files into any Lakehouse’s Files folder and use COPY INTO or OPENROWSET from your Data Warehouse—with no external storage required.

What’s Coming Next

We’re working on simplifying the authoring experience even further. In upcoming updates, you will be able to reference workspaces and Lakehouses by name rather than by full URL paths—making scripts more readable, maintainable, and collaboration-friendly.

Learn More

Explore our documentation for a more in-depth look.
