Microsoft Fabric Updates Blog

Introducing the end-to-end scenarios in Microsoft Fabric

Our customers struggle to translate their data into competitive advantage, facing challenges such as technology complexity, fragmented data, securing and governing their data lake, and a shortage of skilled talent in the market. Microsoft Fabric is a complete data platform for the age of AI and addresses every aspect of your data estate. Fabric not only provides a unified data stack with shared experiences, architecture, security, governance, and compliance, but it spans all workloads and personas that touch data.
 
In this blog article, we will explore four end-to-end scenarios that represent typical paths our customers take to extract value and insights from their data using Microsoft Fabric. We will show how Fabric addresses common architectural patterns with a single service that encompasses data ingestion, transformation, storage, and exposure through reports or SQL endpoints. Whether you are a citizen developer, professional developer, data scientist, or business analyst, Fabric empowers you to accomplish your analytics tasks. It’s important to note that these scenarios are not meant to be prescriptive guidance but to highlight the possibilities that Fabric offers. Fabric fosters collaboration among various roles within a single unified Software-as-a-Service (SaaS) experience that is secure and governed by default.
  

Lakehouse end-to-end scenario

This Lakehouse-centric model highlights how professional developers and analysts can collaborate on a single set of data. With the Lakehouse as a central repository, data can be ingested once, then transformed and shared, allowing multiple personas to collaborate without creating multiple copies of the data. Professional developers will be impressed by the capabilities of the Synapse Data Engineering experience in Fabric, and Power BI users will be amazed by the power and speed of Direct Lake.
 
To learn how you can ingest, transform, and load the data of a fictional retail company, Wide World Importers, please read the Lakehouse tutorial.
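To make the "ingest once, share with everyone" idea concrete, here is a minimal sketch of what a notebook cell in a Fabric Lakehouse might look like. The file path, column names, and table name are illustrative assumptions, not taken from the tutorial:

```python
# Hypothetical sketch of an ingest-once flow in a Fabric notebook.
# "Files/wwi/raw/orders.csv", "order_id", and "wwi_orders" are placeholders.

def snake_case(name: str) -> str:
    """Normalize a raw column header like 'Order Date' to 'order_date'."""
    return name.strip().lower().replace(" ", "_")

def load_orders_to_lakehouse(spark, raw_path="Files/wwi/raw/orders.csv",
                             table="wwi_orders"):
    # pyspark is preinstalled in Fabric notebooks; imported here so the
    # sketch (and the helper above) can be read without Spark installed.
    from pyspark.sql import functions as F

    df = spark.read.option("header", True).csv(raw_path)
    df = df.toDF(*[snake_case(c) for c in df.columns])  # tidy the headers
    df = df.dropna(subset=["order_id"])                 # basic cleanup
    df = df.withColumn("ingested_at", F.current_timestamp())
    # Saving as a Delta table makes the same data visible to the SQL
    # endpoint and to Power BI via Direct Lake -- no extra copies needed.
    df.write.mode("overwrite").format("delta").saveAsTable(table)
```

Because the result is a Delta table in OneLake, analysts and report builders work against the same stored copy the engineer wrote.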


  

Data Warehouse end-to-end scenario

The Data Warehouse-centric model caters to SQL developers, who can use the Data Warehouse experience in Fabric through the Power BI service and through familiar tools such as SQL Server Management Studio. This scenario likewise eliminates data duplication and provides the same Direct Lake capabilities that Power BI users will come to love. Furthermore, in the future, users will have the option to link their existing Data Warehouse environments to Fabric data warehouse environments.
 
To learn how you can build an end-to-end data warehouse for the fictional Wide World Importers company, please read the Data Warehouse tutorial.
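Because the warehouse exposes a standard SQL endpoint, any TDS-capable client can query it. Below is a hedged sketch from Python using pyodbc; the server address, database, table, and the Azure AD interactive authentication keyword are placeholder assumptions about your environment:

```python
# Hypothetical sketch: querying a Fabric warehouse's SQL endpoint from Python.
# Server, database, and table names below are placeholders.

TOP_CUSTOMERS_SQL = """
SELECT TOP (10) CustomerName, SUM(TotalAmount) AS Revenue
FROM dbo.FactSales
GROUP BY CustomerName
ORDER BY Revenue DESC;
"""

def fetch_top_customers(server="<workspace>.datawarehouse.fabric.microsoft.com",
                        database="WideWorldImportersDW"):
    # pyodbc is imported lazily so the sketch can be read without the driver.
    import pyodbc
    conn = pyodbc.connect(
        "Driver={ODBC Driver 18 for SQL Server};"
        f"Server={server};Database={database};"
        "Authentication=ActiveDirectoryInteractive;"
    )
    with conn:
        return conn.cursor().execute(TOP_CUSTOMERS_SQL).fetchall()
```

The same `TOP_CUSTOMERS_SQL` query can be pasted unchanged into SQL Server Management Studio or the Fabric SQL editor.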


  

Data Science end-to-end scenario

The Data Science scenario welcomes data scientists into Fabric, where they can leverage data already available in a Lakehouse or Data Warehouse, clean and prepare it, and iterate on, build, and track machine learning experiments and models using MLflow, all within the Fabric environment. Moreover, the results from models and experiments can be stored in the Fabric Lakehouse for further collaboration or for visualization with Power BI.
 
To learn how you can explore, clean, and transform data, and build a machine learning model to predict trip duration at scale on the NYC Yellow Taxi dataset, please read the Data Science tutorial.
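The MLflow tracking loop described above can be sketched in a few lines. This is a minimal illustration, not the tutorial's actual code: the experiment name, features, and hyperparameters are assumptions, and a random-forest regressor stands in for whatever model you choose:

```python
# Hypothetical sketch of tracking a trip-duration model with MLflow in Fabric.
# Experiment name and hyperparameters below are illustrative placeholders.

PARAMS = {"n_estimators": 100, "max_depth": 8}

def train_and_log(X_train, y_train, experiment="nyc-taxi-trip-duration"):
    # mlflow and scikit-learn ship with the Fabric Data Science runtime;
    # imported here so the sketch parses without them installed locally.
    import mlflow
    import mlflow.sklearn
    from sklearn.ensemble import RandomForestRegressor

    mlflow.set_experiment(experiment)          # groups related runs in Fabric
    with mlflow.start_run():
        model = RandomForestRegressor(**PARAMS).fit(X_train, y_train)
        mlflow.log_params(PARAMS)              # record what was tried
        mlflow.sklearn.log_model(model, "model")  # store the artifact
    return model
```

Each run then appears under the experiment item in the workspace, where parameters and models can be compared across iterations.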


  

Real-Time Analytics end-to-end scenario

The Real-Time Analytics scenario enables organizations to focus on and scale up their analytics solutions while democratizing data for both citizen and professional developers. By reducing complexity and streamlining data integration, Real-Time Analytics turns real-time data into an interactive, on-demand resource accessible to all. This simplified experience maintains robust analytical capabilities and integrates seamlessly across the entire suite of Fabric experiences. With one logical copy of the data, everything ingested into the KQL Database is automatically available in OneLake, supporting data loading, data transformation, and advanced visualization scenarios.
 
To learn how you can use the streaming and query capabilities of Real-Time Analytics to analyze the NYC Taxi yellow trip dataset, please read the Real-Time Analytics tutorial.
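A KQL Database can also be queried programmatically. The sketch below uses the Azure Kusto Python SDK; the cluster URI, database name, table name, and column names are placeholder assumptions about your environment, and the KQL itself simply buckets trips per hour:

```python
# Hypothetical sketch: querying a Fabric KQL Database with azure-kusto-data.
# The cluster URI, database, table, and column names are placeholders.

TRIPS_PER_HOUR_KQL = """
nyctaxitrips
| summarize trips = count() by bin(tpep_pickup_datetime, 1h)
| order by tpep_pickup_datetime asc
"""

def query_trips(cluster="https://<eventhouse>.kusto.fabric.microsoft.com",
                database="NycTaxiDB"):
    # The SDK is imported lazily so the sketch parses without it installed.
    from azure.kusto.data import KustoClient, KustoConnectionStringBuilder

    kcsb = KustoConnectionStringBuilder.with_az_cli_authentication(cluster)
    client = KustoClient(kcsb)
    response = client.execute(database, TRIPS_PER_HOUR_KQL)
    return response.primary_results[0]   # rows of (hour bucket, trip count)
```

The same KQL runs unchanged in the KQL Queryset editor in Fabric, and because the data also lands in OneLake it remains available to the other experiences.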



  

Get started with Microsoft Fabric

Microsoft Fabric is currently in preview. Try out everything Fabric has to offer by signing up for the free trial—no credit card information required. Everyone who signs up gets a fixed Fabric trial capacity, which may be used for any feature or capability from integrating data to creating machine learning models. Existing Power BI Premium customers can simply turn on Fabric through the Power BI admin portal. After July 1, 2023, Fabric will be enabled for all Power BI tenants.
 
Sign up for the free trial. For more information, read the Fabric trial documentation.
  

Other resources

If you want to learn more about Microsoft Fabric, consider:
