Microsoft Fabric Updates Blog

Using Microsoft Fabric for Generative AI: A Guide to Building and Improving RAG Systems

In the previous blog, we explained how to use Microsoft Fabric to build custom AI applications, focusing on transforming your data into valuable knowledge for Generative AI applications. Specifically, we demonstrated how to implement a Retrieval-Augmented Generation (RAG) system using Microsoft Fabric, integrating Azure OpenAI and Azure AI Search.

We are excited to share a new series of tutorials that will build on this foundation, helping you turn your data into actionable insights for Generative AI applications using Microsoft Fabric and leveraging the capabilities of Azure OpenAI and Azure AI Search.

By the end of the series, you will have a deeper understanding of how to adapt your AI solutions to your specific datasets and improve the effectiveness and relevance of your applications. You can also access and run the tutorials here.

Series Overview

This tutorial series consists of three main notebooks, each covering a crucial aspect of building and optimizing RAG systems:

1. Building a RAG System in Microsoft Fabric

The first notebook offers a detailed, step-by-step guide to constructing a RAG system within Microsoft Fabric. It will walk you through the core components, including setting up data sources and configuring AI Skill integrations, ensuring you can get your RAG system operational quickly and efficiently.
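To make the retrieve-then-generate pattern concrete before you open the notebook, here is a minimal, self-contained sketch. A toy keyword-overlap retriever stands in for Azure AI Search, and the assembled prompt is what you would pass to an Azure OpenAI chat-completion call; the documents and query are invented for illustration.

```python
# Minimal sketch of the RAG pattern: retrieve relevant context, then
# assemble an augmented prompt for the generator. The toy retriever below
# is a stand-in for Azure AI Search.

DOCUMENTS = [
    "Microsoft Fabric is a unified data and analytics platform.",
    "Retrieval-Augmented Generation grounds LLM answers in retrieved documents.",
    "Azure AI Search supports keyword, vector, and hybrid queries.",
]

def retrieve(query: str, docs: list[str], k: int = 1) -> list[str]:
    """Rank documents by naive word overlap with the query (stand-in for a real index)."""
    q = set(query.lower().split())
    scored = sorted(docs, key=lambda d: len(q & set(d.lower().split())), reverse=True)
    return scored[:k]

def build_prompt(query: str, context: list[str]) -> str:
    """Augment the user question with retrieved context before generation."""
    joined = "\n".join(f"- {c}" for c in context)
    return f"Answer using only this context:\n{joined}\n\nQuestion: {query}"

question = "What query types does Azure AI Search support?"
context = retrieve(question, DOCUMENTS)
prompt = build_prompt(question, context)
print(prompt)
```

In the notebook, the retrieval step is backed by a real Azure AI Search index and the prompt is sent to an Azure OpenAI deployment, but the overall shape of the pipeline is the same.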

2. Evaluating RAG Performance

This notebook focuses on evaluating the performance of your RAG application by introducing key metrics such as Groundedness, Relevance, and Similarity. These metrics help you assess how well your system is retrieving and generating accurate, contextually appropriate responses. You’ll learn how to fine-tune your system based on these metrics to ensure it meets user expectations and delivers high-quality results.
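The evaluation loop itself is simple: score each generated answer against a reference, then aggregate. The sketch below uses a lexical token-level F1 as an illustrative similarity-style score; the notebook's Groundedness, Relevance, and Similarity metrics are typically LLM-judged rather than lexical, and the example pairs here are invented.

```python
# Illustrative evaluate-over-a-dataset loop with a simple lexical
# similarity score (token-level F1 between prediction and reference).
# This is a stand-in for the LLM-judged metrics used in the tutorial.

def token_f1(prediction: str, reference: str) -> float:
    """Harmonic mean of token precision and recall between two strings."""
    pred = set(prediction.lower().split())
    ref = set(reference.lower().split())
    common = pred & ref
    if not common:
        return 0.0
    precision = len(common) / len(pred)
    recall = len(common) / len(ref)
    return 2 * precision * recall / (precision + recall)

# (prediction, reference) pairs — invented examples for illustration.
eval_set = [
    ("Fabric is a unified analytics platform.", "Fabric is a unified data analytics platform."),
    ("It uses vector search.", "Azure AI Search supports vector search."),
]

scores = [token_f1(pred, ref) for pred, ref in eval_set]
avg = sum(scores) / len(scores)
print(f"average similarity: {avg:.2f}")
```

Whatever metric you use, the pattern is the same: run your RAG system over a labeled evaluation set, score each response, and track the aggregate as you tune retrieval and prompting.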

3. Exploring Azure Search Types for RAG Systems

In the final notebook, we explore the various Azure AI Search types available and examine their impact on the performance of your RAG system. Understanding how to select the most appropriate search type for your use case is essential for optimizing the accuracy and efficiency of your application. This notebook will guide you through different search methods and how they can be applied to improve your system’s response generation.
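The intuition behind the search types can be sketched locally: keyword search scores exact term overlap, vector search scores embedding similarity (so it can catch synonyms like "car" vs. "automobile"), and hybrid blends both. The tiny hand-made "embeddings" and the 50/50 fusion below are illustrative simplifications; Azure AI Search performs hybrid ranking with Reciprocal Rank Fusion over real embeddings.

```python
# Toy comparison of keyword, vector, and hybrid retrieval scoring.
import math

def lexical_score(query: str, doc: str) -> float:
    """Keyword-style score: fraction of query terms appearing in the document."""
    q, d = set(query.lower().split()), set(doc.lower().split())
    return len(q & d) / max(len(q), 1)

def cosine(a: list[float], b: list[float]) -> float:
    """Vector-style score: cosine similarity between two embeddings."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Tiny corpus with hand-made 2-d "embeddings" (placeholders for real ones).
corpus = {
    "doc1": ("car maintenance tips", [0.90, 0.10]),
    "doc2": ("automobile repair guide", [0.85, 0.20]),
}
query_text = "car servicing"
query_vec = [0.88, 0.15]  # pretend embedding of the query

for name, (text, vec) in corpus.items():
    kw = lexical_score(query_text, text)
    vs = cosine(query_vec, vec)
    hybrid = 0.5 * kw + 0.5 * vs  # naive fusion; Azure AI Search uses RRF
    print(f"{name}: keyword={kw:.2f} vector={vs:.2f} hybrid={hybrid:.2f}")
```

Note how "automobile repair guide" scores zero on keywords for the query "car servicing" but high on vector similarity, which is exactly the gap hybrid search is designed to close.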

In this tutorial series, we will be using a modified version of the CMU Question/Answer Dataset as a demonstration dataset. This dataset has been adjusted to comply with the different licenses associated with its subsets. Specifically, only rows from S08/S09 are included in the modified dataset, with a reference to the ExtractedPath. For simplicity, the data has been cleaned and structured into a single table.

However, it’s important to note that this choice of dataset is purely for demonstration and learning purposes. You can apply the same concepts and techniques covered in these tutorials to your own datasets, making it flexible for your specific needs and use cases.

This approach will help you understand how to implement and fine-tune your RAG system for a wide variety of data sources, allowing you to tailor it to real-world applications within your organization.
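As a concrete illustration of the single-table shape described above, the sketch below reads a one-row sample with the standard library. The "question" and "answer" column names are hypothetical placeholders (the tutorials define the actual schema); only ExtractedPath is mentioned in the dataset notes, and the sample values are invented.

```python
# Illustrative peek at a single-table question/answer layout.
# Column names other than ExtractedPath are hypothetical placeholders.
import csv
import io

sample = io.StringIO(
    "question,answer,ExtractedPath\n"
    '"Who wrote the article?","An example author.",S08/data/set1/a1\n'
)
rows = list(csv.DictReader(sample))
print(rows[0]["ExtractedPath"])
```

In the tutorials themselves, this table lives in your Fabric Lakehouse, so the same rows would be read as a DataFrame rather than from an in-memory string.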

To build your own RAG application, you can choose between two options Microsoft Fabric offers for integrating Azure OpenAI capabilities:

  1. Native Integration: Fabric integrates seamlessly with Azure AI services, allowing you to use prebuilt AI models without prerequisites. This option is recommended, as you can use your Fabric authentication to access AI services, and all usage is billed against your Fabric capacity. There’s no need for subscription keys or resource IDs. This integration simplifies setup, allowing you to focus on application development rather than configuration management.
  2. Bring-Your-Own-Key: You can provision your AI services on Azure and bring your own API keys to Fabric. If certain AI services aren’t supported in the prebuilt models, this option remains available. It provides more control over your usage and allows for custom integration based on your needs.
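The practical difference between the two options is what connection settings your code must supply. The sketch below contrasts them; the environment-variable names and endpoint placeholder are illustrative conventions, not values from the tutorials, and with native integration Fabric handles authentication for you.

```python
# Hedged sketch of the settings each integration option requires.
# Endpoint and variable names are placeholders, not real resources.
import os

def openai_settings(use_byok: bool) -> dict:
    """Return the connection settings each option would need."""
    if not use_byok:
        # Native integration: Fabric authentication, billed to Fabric
        # capacity — no subscription key or resource ID to manage.
        return {"auth": "fabric-identity", "api_key": None}
    # Bring-your-own-key: your own Azure OpenAI resource and API key.
    return {
        "auth": "api-key",
        "endpoint": os.environ.get(
            "AZURE_OPENAI_ENDPOINT", "https://<your-resource>.openai.azure.com/"
        ),
        "api_key": os.environ.get("AZURE_OPENAI_API_KEY"),
    }

print(openai_settings(use_byok=False))
```

If you start with native integration and later need a model or service outside the prebuilt set, switching to bring-your-own-key is a matter of supplying your own endpoint and key rather than restructuring the application.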

What You’ll Gain

By working through these notebooks, you will gain hands-on experience with building, evaluating, and optimizing RAG systems in Microsoft Fabric. Each tutorial is designed to provide you with the technical insights needed to adapt these systems to your specific data and use cases, enhancing the overall performance and reliability of your AI applications.

Stay tuned for more tutorials in this series, where we will continue to dive deeper into advanced topics and configurations to help you get the most out of your Generative AI experience with Microsoft Fabric!

Post Author(s):
Amir Jafari – Senior Product Manager in Azure Data.
Alexandra Savelieva – Principal AI Engineer in Azure Data.

