Microsoft Fabric Updates Blog

Environment is now generally available

Exciting news! The environment has officially become a generally available feature within Microsoft Fabric.

What is the environment in Microsoft Fabric?

The environment serves as a consolidated container for the hardware and software settings of your Spark workloads. In this unified interface, you can select the desired Spark runtime, install libraries, and configure Spark compute settings and properties. It simplifies managing, maintaining, and sharing these configurations.

Environment authoring

Libraries and Spark compute

Managing libraries and Spark compute configurations has been an integral part of the environment since its private preview. With these core capabilities, you can tailor distinct configuration sets. In Spark compute configurations, two Spark runtime versions are currently available, and administrators can select and fine-tune compute values for the designated pool. For libraries, we currently support public libraries from PyPI and conda.

Resources folder (new feature)


The Resources folder facilitates the management of small files during the development phase. Files uploaded to the environment become accessible from any notebook that is attached to that environment.

The beauty of this feature lies in its real-time manipulation: regardless of the environment’s current state, you can seamlessly add, edit, or remove files and folders. Any changes made in one notebook are instantly reflected across the other attached notebooks and the environment item.
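As a rough sketch of how an uploaded resource might be consumed from an attached notebook (the mount point and file name below are assumptions for illustration, not confirmed paths; check the Fabric documentation for the exact location in your runtime):

```python
from pathlib import Path

# Hypothetical mount point where environment resource files appear
# inside a notebook attached to the environment (assumption).
ENV_RESOURCES = Path("/env")

def resource_path(name: str) -> Path:
    """Build the notebook-side path of a file uploaded to the Resources folder."""
    return ENV_RESOURCES / name

# Inside the notebook you could then read the file directly, e.g.:
#   with open(resource_path("lookup_table.csv")) as f:
#       ...
```

Because every attached notebook sees the same folder, an edit to `lookup_table.csv` from one notebook is immediately visible to the others.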

Sharing (new feature)


Environment sharing is now available, allowing you to collaborate seamlessly. When you share an environment item, recipients automatically receive read permission. With this permission, they can explore the environment’s configurations and attach it to notebooks or Spark jobs. To ensure smooth code execution, remember to grant ‘Read’ permission on the attached environment when sharing your notebooks and Spark job definitions.

Furthermore, you have the option to share the environment with ‘Share’ and ‘Edit’ permissions. Users with ‘Share’ permission can share the environment onward with others, passing on their existing permissions. Recipients with ‘Edit’ permission can update the environment’s content.

Environment CI/CD

Git support (new feature)


Fabric environments now offer Git integration with Azure DevOps for seamless source control. Currently, libraries and Spark compute are supported. Within the item root folder, an environment is structured as a ‘Libraries’ folder, containing ‘PublicLibraries’ and ‘CustomLibraries’ subfolders, alongside a ‘Setting’ folder.

Libraries


When you commit an environment to a Git repository, the public library section is transformed into its YAML representation, and each custom library is committed along with its source file. You can manage the public libraries by editing the existing YAML file or uploading your own, and you can control which custom libraries affect the environment by uploading or deleting the corresponding files in the designated folder.
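As a rough illustration, the committed public-library YAML might take a shape like the following (the field names, channel syntax, and package pins are assumptions for illustration; inspect a committed environment for the exact schema):

```yaml
# Hypothetical sketch of the public-library YAML committed to Git.
dependencies:
  - matplotlib==3.8.0        # a PyPI package pinned to a version (example)
  - conda-forge::scipy       # a conda package from a named channel (example)
```

Editing this file in the repository and syncing the workspace would then update the environment’s public libraries, mirroring what you would do in the authoring UI.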

Setting


The Spark compute section is seamlessly transformed into its YAML representation. Within this YAML file, you have the flexibility to switch the attached pool, fine-tune compute configurations, manage Spark properties, and select the desired Spark runtime.
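A minimal sketch of what that Spark compute YAML could look like (the field names and values are illustrative assumptions, not the documented schema; check a committed environment for the real file):

```yaml
# Hypothetical shape of the Spark compute settings committed to Git.
runtime_version: "1.2"           # selected Spark runtime
instance_pool_name: Starter Pool # attached pool
driver_cores: 4
driver_memory: 28g
executor_cores: 4
executor_memory: 28g
spark_conf:
  spark.sql.shuffle.partitions: "200"  # example Spark property
```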

Deployment pipeline (new feature)


Fabric’s deployment pipelines simplify the process of delivering modified content across different phases, such as moving from development to test. Excitingly, the deployment pipelines now support environment items, allowing you to efficiently manage environment deployments by configuring workspaces with phases.

Public APIs (new feature)

Public APIs have consistently ranked among the most requested features for our environment, and now they’re finally here. Libraries and Spark compute can be managed through public APIs. If you’re interested in learning how to utilize APIs for environment management, I recommend reading this article [Using public APIs for environment].
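To give a feel for the shape of such a call, here is a minimal standard-library sketch; the endpoint path and response structure are assumptions, so consult the official Fabric REST API reference (and the linked article) for the exact contract:

```python
import json
import urllib.request

# Base URL of the Fabric REST API.
FABRIC_API = "https://api.fabric.microsoft.com/v1"

def environments_url(workspace_id: str) -> str:
    """Build the (assumed) list-environments endpoint for a workspace."""
    return f"{FABRIC_API}/workspaces/{workspace_id}/environments"

def list_environments(workspace_id: str, token: str) -> list:
    """GET the environment items in a workspace, using an AAD bearer token.

    The 'value' key in the response is an assumption about the payload shape.
    """
    req = urllib.request.Request(
        environments_url(workspace_id),
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.load(resp).get("value", [])
```

From here, analogous calls could update the environment’s libraries or Spark compute settings as part of an automated workflow.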
