Microsoft Fabric Updates Blog

Fabric Data Factory now supports writing data in Iceberg format via Azure Data Lake Storage Gen2 Connector in Data pipeline

We’ve made a significant enhancement in Fabric Data Factory: Data pipeline can now write data in Iceberg format via the Azure Data Lake Storage (ADLS) Gen2 connector! This addition provides a powerful new option for users who need to manage and optimize large datasets with a high level of flexibility, reliability, and performance. Iceberg format support brings new efficiencies in how data is handled, transformed, and stored, enabling better performance and future scalability.

What is Iceberg Format and Why Does It Matter?

Apache Iceberg is a high-performance table format designed specifically for large analytical datasets, enabling more reliable data management and faster querying. It’s optimized for handling petabytes of data while supporting fast incremental reads, schema evolution, and ACID transactions, making it especially valuable in data engineering and analytics workflows. Iceberg is increasingly favored in big data ecosystems, enabling organizations to keep up with their ever-growing data demands while maintaining flexibility and control.
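To make these guarantees concrete: an Iceberg table keeps immutable data files separate from versioned metadata, and every commit writes a new metadata file rather than mutating files in place. A typical table layout on ADLS Gen2 looks roughly like the sketch below (container, table, and file names are illustrative, not prescribed):

```
mycontainer/mytable/
├── metadata/
│   ├── v1.metadata.json        schema, partition spec, snapshot history
│   ├── v2.metadata.json        written by the next commit
│   ├── snap-<id>.avro          manifest list for one snapshot
│   └── <id>-m0.avro            manifest: data file paths + column stats
└── data/
    └── part-00000-<id>.parquet immutable data files
```

Because readers always resolve the table through a single current metadata pointer, commits are atomic; this file-level versioning is what enables ACID transactions, schema evolution, and fast incremental reads over object storage.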

Getting Started: Writing Data in Iceberg Format via ADLS Gen2 Connector

With this new feature, Fabric Data Factory users can start writing their data in Iceberg format directly through the Azure Data Lake Storage Gen2 connector.

Here’s how it works:

1. Enable the Iceberg format: When setting up the copy activity in your data pipeline, select the option to write in Iceberg format in the ADLS Gen2 connector settings under the destination section.

2. Customize and optimize: Configure additional settings to tailor the data output to your specific needs.

3. Execute the pipeline: Once configured, the pipeline automatically handles the Iceberg table format, allowing you to focus on higher-level tasks rather than the nuances of large-scale data management.
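For readers who author pipelines as JSON rather than in the designer, the steps above roughly correspond to a copy activity whose sink targets ADLS Gen2 with Iceberg as the output format. The snippet below is an illustrative sketch only; the property names (for example `IcebergSink`) are assumptions and may differ from the published schema, so consult the documentation for the exact definition:

```json
{
  "name": "CopyToIceberg",
  "type": "Copy",
  "typeProperties": {
    "source": {
      "type": "DelimitedTextSource"
    },
    "sink": {
      "type": "IcebergSink",
      "storeSettings": {
        "type": "AzureBlobFSWriteSettings"
      }
    }
  }
}
```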

Planned expansions: read capability and additional connectors

While this release focuses on enabling write capability in Iceberg format, we are already working on expanding this functionality. Future updates will introduce read capability for Iceberg format in Fabric Data Factory, making it easier to both write to and read from Iceberg tables. Additionally, we aim to support more file-type connectors across Data Factory, further enhancing data integration flexibility and usability.

Get started with Iceberg Format today

The capability to write data in Iceberg format is available now in Fabric Data Factory’s Data pipeline via the ADLS Gen2 connector. We encourage you to explore this feature and experience the benefits of a high-performance, scalable data format designed for modern data workloads. To learn more about configuring Iceberg format in your pipelines, visit our documentation page.

We look forward to seeing how this new feature will empower your data workflows and look forward to continuing to innovate with you. Stay tuned for more updates as we expand our capabilities and bring new possibilities to Fabric Data Factory!

Fabric Data Factory Team
