Microsoft Fabric Updates Blog

Announcing New Storage Integration Support in Snowflake Connector for Fabric Data Factory

We are thrilled to announce the release of a highly anticipated feature in Fabric Data Factory: storage integration support for the Snowflake connector in data pipelines. This new capability enhances the security of your data pipelines, enabling seamless and secure integration between Snowflake and external cloud storage providers.

What is Storage Integration?

Storage integration in Snowflake allows data engineers to connect Snowflake with external storage solutions (such as Azure Blob Storage) using a secure and centralized approach. It simplifies the process of reading and writing data between Snowflake and these storage systems while maintaining strict security protocols. Previously, users had to manually configure access roles and manage keys, which could be both cumbersome and error prone. Learn more here.
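As a sketch, a storage integration for Azure Blob Storage is defined on the Snowflake side with DDL like the following; the integration name, tenant ID, and storage location shown here are placeholders to replace with your own values:

```sql
-- Hypothetical example: define a storage integration for Azure Blob Storage.
-- Snowflake provisions a service principal for this integration, so no
-- account keys or SAS tokens need to be stored in the pipeline.
CREATE STORAGE INTEGRATION azure_blob_int
  TYPE = EXTERNAL_STAGE
  STORAGE_PROVIDER = 'AZURE'
  ENABLED = TRUE
  AZURE_TENANT_ID = '<your-tenant-id>'
  STORAGE_ALLOWED_LOCATIONS = ('azure://<account>.blob.core.windows.net/<container>/');
```

After creating the integration, an Azure administrator grants the Snowflake-provisioned service principal access to the storage account, which is what removes the need to expose credentials in the pipeline itself.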

What are the Key Benefits?

  • Enhanced Security: The feature leverages Snowflake’s storage integration framework, ensuring access to external storage is managed through a Snowflake-assigned role, eliminating the need to expose sensitive credentials. This centralized control enhances security by governing data access and permissions through a single source.
  • Secure Authentication: Users can now implement more secure authentication methods when connecting to Azure Blob Storage, whether it is used as a source, destination, or staging storage.
  • Fine-Grained Control: The feature also respects the PREVENT_UNLOAD_TO_INLINE_URL parameter setting, giving users the option to prevent ad hoc data unloads to external cloud storage locations. This provides an extra layer of control for managing data flows. Learn more here.
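For example, a Snowflake account administrator can enable this safeguard with a single parameter change (a sketch; setting account-level parameters requires the ACCOUNTADMIN role):

```sql
-- Block ad hoc unloads to inline storage URLs, so that COPY INTO <location>
-- can only target pre-defined (named) stages governed by storage integrations.
ALTER ACCOUNT SET PREVENT_UNLOAD_TO_INLINE_URL = TRUE;
```

With this parameter in effect, unload paths that bypass the centrally governed storage integrations are rejected by Snowflake, and the connector honors that restriction.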

How to use the new feature

To use storage integration in your existing or new data pipelines, simply configure the storage integration settings directly within your data pipeline in Fabric Data Factory.

  1. Create a connection to your Snowflake instance in Fabric.
  2. Set up your copy activity in the data pipeline by choosing the Snowflake connection as source or destination.
  3. Specify the storage integration configured in your Snowflake instance under the source or destination settings tab.
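For step 3 to work, the integration must already exist in Snowflake and be usable by the role your Fabric connection authenticates as. A minimal sketch of that Snowflake-side prerequisite, with placeholder integration and role names:

```sql
-- Hypothetical names: grant the role used by the Fabric connection
-- permission to use the storage integration in COPY operations.
GRANT USAGE ON INTEGRATION azure_blob_int TO ROLE fabric_pipeline_role;

-- Inspect the integration's properties, including its allowed storage
-- locations and the Snowflake-provisioned identity to authorize in Azure.
DESC STORAGE INTEGRATION azure_blob_int;
```

The name you enter in the copy activity's settings must match the integration name exactly as it appears in Snowflake.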

For detailed steps on how to utilize this feature and its prerequisites, you can refer to our documentation.

Looking Ahead

At Fabric Data Factory, we are committed to continually improving security and user experience, and to enhancing the capabilities of our data integration services. We look forward to seeing how this new capability helps you accelerate your data integration projects and achieve even greater outcomes. Stay tuned for more exciting features and updates!
