Microsoft Fabric Updates Blog

Introducing MCP Support for Real-Time Intelligence (RTI) 

Co-author: Alexei Robsky, Data Scientist Manager

Overview 

As organizations increasingly rely on real-time data to drive decisions, the need for intelligent, responsive systems has never been greater. At the heart of this transformation is Fabric Real-Time Intelligence (RTI), a platform that empowers users to act on data as it arrives. Today, we’re excited to announce a major leap forward: Model Context Protocol (MCP) support for Real-Time Intelligence (RTI). 

What is Model Context Protocol (MCP)? 

MCP is a protocol designed to enable AI models, such as Azure OpenAI models, to interact seamlessly with external tools and resources. Originally developed by Anthropic, MCP simplifies how agents discover, connect to, and reason over enterprise data. With this integration, RTI becomes even more powerful, enabling AI-driven insights and actions in real time.

MCP Support for Real-Time Intelligence (RTI) 

We are excited to announce a fully open-source MCP server (https://aka.ms/rti.mcp.repo) implementation for Microsoft Fabric Real-Time Intelligence (RTI). This server enables AI agents and AI applications to interact with Fabric RTI by providing tools through the MCP interface, allowing for seamless data querying and analysis.

Supported items 

Eventhouse: Execute KQL queries against Microsoft Fabric RTI Eventhouse and Azure Data Explorer (ADX) backends, offering a unified interface for AI agents to query, reason, and act on real-time data.
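To make this concrete, here is a minimal sketch of the kind of KQL an agent might generate and hand to the server. The tool name, database, table, and column names below are illustrative assumptions, not the server's actual API:

```python
# Illustrative only: tool name, database, table, and columns are
# assumptions for this sketch, not the fabric-rti-mcp API.
kql = """
ProcessEvents
| where Timestamp > ago(5m)
| summarize Count = count() by CommandLine
| top 5 by Count
""".strip()

# A simplified shape for the tool call the agent would make:
request = {
    "tool": "execute_kql",  # hypothetical tool name
    "arguments": {"database": "SecurityDb", "query": kql},
}
print(request["arguments"]["query"].splitlines()[0])  # -> ProcessEvents
```

The agent never writes connection code; it only names a tool and passes arguments, and the MCP server handles execution against the Eventhouse backend.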

Coming soon 

  • Richer, real-time visualization tools. 
  • Activator integration for proactive insights. 
  • Additional RTI components for comprehensive analytics. 

Key Features 

Here’s what makes RTI MCP a game-changer: 

  • Real-Time Data Access: Agents can access up-to-the-second data from Eventhouse, enabling timely decisions. 
  • Natural Language Interfaces: Users (or Agents) can ask questions in plain English (or other languages), which are translated into optimized queries (NL2KQL). 
  • Schema Discovery: MCP servers expose schema and metadata, allowing agents to dynamically understand data structures. 
  • Plug-and-Play Integration: MCP clients such as GitHub Copilot, Claude, and Cline can connect to RTI with minimal configuration, thanks to standardized APIs and discovery mechanisms. 
  • Extensibility: Support for custom actions, anomaly detection, and vector search is built in. 
  • Local Language Inference: Interact with your data in your preferred language, with automatic query translation (depending on the LLM you use with the MCP server). 

How It Works 

MCP follows a client-server architecture that allows AI models to interact with external tools efficiently. Here’s how it works: 

Components of MCP 

  1. MCP Host – The AI model (e.g., GPT-4, Claude, Gemini) requesting data or actions. 
  2. MCP Client – An intermediary service that forwards the AI model's requests to MCP servers, e.g. GitHub Copilot, Cline, or Claude Desktop. 
  3. MCP Server – A lightweight application that exposes specific capabilities (APIs, databases, files, etc.); for RTI, it translates requests into KQL queries for real-time data retrieval. 

Architecture 

At the core of the system is the RTI MCP Server, which acts as a bridge between AI agents and data sources. Agents send requests to the MCP server, which translates them into queries against Eventhouse. 

This architecture enables a modular, scalable, and secure way to build intelligent applications that respond to real-time signals. 

The server supports: 

  • Listing databases in Eventhouse 
  • Executing natural language queries 
  • Intelligent schema discovery 
  • Sampling data 
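A toy sketch of how a few of the capabilities above could be registered as MCP-exposed tools. The function names and return shapes here are assumptions for illustration, not the real fabric-rti-mcp implementation:

```python
# Toy tool registry mirroring some of the server's listed capabilities.
# Names and return shapes are illustrative, not the real implementation.
from typing import Callable, Dict

TOOLS: Dict[str, Callable[..., object]] = {}

def tool(fn: Callable[..., object]) -> Callable[..., object]:
    """Register a function as an MCP-exposed tool."""
    TOOLS[fn.__name__] = fn
    return fn

@tool
def list_databases() -> list:
    return ["SecurityDb", "TelemetryDb"]  # stubbed response

@tool
def get_schema(table: str) -> dict:
    return {"table": table, "columns": ["Timestamp", "CommandLine"]}

@tool
def sample_data(table: str, n: int = 5) -> str:
    return f"{table} | take {n}"  # the KQL the server would run

# Dispatch, as the MCP layer would do on a tools/call request:
print(TOOLS["sample_data"]("ProcessEvents", 3))  # -> ProcessEvents | take 3
```

Because each capability is just a named tool with typed arguments, clients can discover and invoke them without any RTI-specific code.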

Getting Started 

Prerequisites 

  1. Install either the stable or Insiders release of VS Code. 
  2. Install the GitHub Copilot and GitHub Copilot Chat extensions. 
  3. Install uv: 

powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex" 

or see the uv documentation for install options on non-Windows operating systems. 

  4. Open VS Code in an empty folder. 

Install from PyPI (Pip) 

The Fabric RTI MCP Server is available on PyPI, so you can install it with pip. This is the easiest way to install the server. 

From VS Code 

1. Open the command palette (Ctrl+Shift+P) and run the command `MCP: Add Server` 

2. Select install from Pip 

3. When prompted, enter the package name `microsoft-fabric-rti-mcp` 

4. Follow the prompts to install the package and add it to your settings.json file 

The process should end with the following settings in your settings.json file: 

settings.json 

    "mcp": { 
        "servers": { 
            "fabric-rti-mcp": { 
                "command": "uvx", 
                "args": [ 
                    "microsoft-fabric-rti-mcp" 
                ] 
            } 
        } 
    } 

Start Analyzing your data 

Example prompt: 

I have data about user-executed commands in the ProcessEvents table. Can you sample a few rows and classify the executed commands with a threat tolerance of low/med/high? Provide a tabular view of the overall summary. 
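As a toy illustration of the classification step in that prompt, here is a sketch that labels sampled commands with a low/med/high threat level. The keyword heuristic and sample rows are purely illustrative; in practice the agent reasons over the sampled data with an LLM:

```python
# Toy sketch of the classification step: sample rows and keyword rules
# are illustrative only; the real agent uses an LLM to classify.
SAMPLED = ["whoami", "net user admin /add", "powershell -enc SQBFAFgA"]

def threat_level(cmd: str) -> str:
    if "-enc" in cmd or "iex" in cmd.lower():
        return "high"  # encoded/obfuscated execution
    if "net user" in cmd or "/add" in cmd:
        return "med"   # account manipulation
    return "low"       # benign reconnaissance

summary = {cmd: threat_level(cmd) for cmd in SAMPLED}
for cmd, level in summary.items():
    print(f"{level:<5} {cmd}")
```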

[Screenshot: VS Code Copilot] 

Dive in today. 

We look forward to seeing what you build with MCP and Real-Time Intelligence.  

Please share your feedback, issues, and requests in the repo, or reach out to us at https://community.fabric.microsoft.com or r/MicrosoftFabric. 
