Microsoft Fabric Updates Blog

Introducing MCP Support for Real-Time Intelligence (RTI) 

Co-author: Alexei Robsky, Data Scientist Manager

Overview 

As organizations increasingly rely on real-time data to drive decisions, the need for intelligent, responsive systems has never been greater. At the heart of this transformation is Fabric Real-Time Intelligence (RTI), a platform that empowers users to act on data as it arrives. Today, we’re excited to announce a major leap forward: Model Context Protocol (MCP) support for Real-Time Intelligence (RTI). 

What is Model Context Protocol (MCP)? 

MCP is a protocol designed to enable AI models, such as Azure OpenAI models, to interact seamlessly with external tools and resources. Originally developed by Anthropic, MCP simplifies how agents discover, connect to, and reason over enterprise data. With this integration, RTI becomes even more powerful, enabling AI-driven insights and actions in real time. 

MCP Support for Real-Time Intelligence (RTI) 

We are excited to announce a fully open-source MCP server implementation (https://aka.ms/rti.mcp.repo) for Microsoft Fabric Real-Time Intelligence (RTI). This server enables AI agents and AI applications to interact with Fabric RTI by exposing tools through the MCP interface, allowing for seamless data querying and analysis. 

Supported items 

Eventhouse: Execute KQL queries against Microsoft Fabric RTI Eventhouse and Azure Data Explorer (ADX) backends, offering a unified interface for AI agents to query, reason, and act on real-time data. 

Coming soon 

  • Richer, real-time visualization tools. 
  • Activator integration for proactive insights. 
  • Additional RTI components for comprehensive analytics. 

Key Features 

Here’s what makes RTI MCP a game-changer: 

  • Real-Time Data Access: Agents can access up-to-the-second data from Eventhouse, enabling timely decisions. 
  • Natural Language Interfaces: Users (or Agents) can ask questions in plain English (or other languages), which are translated into optimized queries (NL2KQL). 
  • Schema Discovery: MCP servers expose schema and metadata, allowing agents to dynamically understand data structures. 
  • Plug-and-Play Integration: MCP clients such as GitHub Copilot, Claude, and Cline can connect to RTI with minimal configuration, thanks to standardized APIs and discovery mechanisms. 
  • Extensibility: Support for custom actions, anomaly detection, and vector search is built in. 
  • Local Language Inference: Interact with your data in your preferred language, with automatic query translation (depending on the LLM you use with the MCP server). 
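To make the natural-language features above concrete, here is a small sketch of the kind of NL-to-KQL translation (NL2KQL) involved. The `ProcessEvents` table and its column names are placeholders, not part of the MCP server; real translations depend on your schema and the LLM behind your MCP client.

```python
# Illustrative natural-language -> KQL pairs of the kind NL2KQL produces.
# "ProcessEvents", "Timestamp", and "User" are placeholder names; real
# translations depend on your schema and the LLM you pair with the server.
examples = {
    "How many events arrived in the last hour?":
        "ProcessEvents | where Timestamp > ago(1h) | count",
    "Top 5 users by command count":
        "ProcessEvents | summarize n = count() by User | top 5 by n",
}

for question, kql in examples.items():
    print(f"Q: {question}\nKQL: {kql}\n")
```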

How It Works 

MCP follows a client-server architecture that allows AI models to interact with external tools efficiently. Here’s how it works: 

Components of MCP 

  1. LLM – The AI model (e.g., GPT-5, Claude, Gemini) requesting data or actions. 
  2. MCP Client – An intermediary service that forwards the AI model’s requests to MCP servers, e.g., GitHub Copilot, Cline, or Claude Desktop. 
  3. MCP Server – A lightweight application that exposes specific capabilities (APIs, databases, files, etc.); in this case, it translates requests into KQL queries for real-time data retrieval.
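Under the hood, MCP clients and servers exchange JSON-RPC 2.0 messages. Below is a minimal sketch of the `tools/call` request a client might send to the RTI MCP server; the tool name `kusto_query` and its arguments are illustrative assumptions, so list the server's tools to see the real names.

```python
import json

# A sketch of the JSON-RPC 2.0 "tools/call" request an MCP client
# (e.g. GitHub Copilot) might send to the RTI MCP server over stdio.
# The tool name and argument keys below are assumptions for illustration.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "kusto_query",                   # assumed tool name
        "arguments": {
            "query": "ProcessEvents | take 5",   # KQL to execute
            "database": "MyEventhouseDb",        # placeholder database
        },
    },
}

# With the stdio transport, messages travel as JSON text between the
# client and server processes.
wire_message = json.dumps(request)
print(wire_message)
```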

Architecture 

At the core of the system is the RTI MCP Server, which acts as a bridge between AI agents and data sources. Agents send requests to the MCP server, which translates them into queries against Eventhouse. 

This architecture enables a modular, scalable, and secure way to build intelligent applications that respond to real-time signals. 

The server supports: 

  • Listing databases in Eventhouse 
  • Executing natural language queries 
  • Intelligent schema discovery 
  • Sampling data 
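Each capability above maps naturally onto a Kusto statement. The KQL snippets below are illustrative stand-ins for what the server's tools run, not the server's actual source (which lives in the open-source repo); `ProcessEvents` is a placeholder table name.

```python
# Illustrative KQL behind each server capability; statements and the
# "ProcessEvents" table name are assumptions for demonstration only.
capability_kql = {
    "list databases":   ".show databases",
    "natural language": "ProcessEvents | where Timestamp > ago(5m)",  # emitted by NL2KQL
    "schema discovery": "ProcessEvents | getschema",
    "sample data":      "ProcessEvents | take 10",
}

for capability, kql in capability_kql.items():
    print(f"{capability}: {kql}")
```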

Getting Started 

Prerequisites 

  1. Install either the stable or Insiders release of VS Code 
  2. Install the GitHub Copilot and GitHub Copilot Chat extensions 
  3. Install uv: 

powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex" 

or see the uv documentation for install options on non-Windows operating systems 

  4. Open VS Code in an empty folder 

Install from PyPI (Pip) 

The Fabric RTI MCP Server is available on PyPI, so you can install it using pip. This is the easiest way to install the server. 

From VS Code 

1. Open the command palette (Ctrl+Shift+P) and run the command `MCP: Add Server` 

2. Select `Pip` as the install source 

3. When prompted, enter the package name `microsoft-fabric-rti-mcp` 

4. Follow the prompts to install the package and add it to your settings.json file 

The process should end with the following settings in your settings.json file. 

settings.json 

    "mcp": { 
        "servers": { 
            "fabric-rti-mcp": { 
                "command": "uvx", 
                "args": [ 
                    "microsoft-fabric-rti-mcp" 
                ] 
            } 
        } 
    } 

Start Analyzing Your Data 

Example prompt: 

I have data about user-executed commands in the ProcessEvents table. Can you sample a few rows and classify the executed commands with a threat tolerance of low/med/high? Provide a tabular view of the overall summary. 
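For illustration, here is a toy version of the workflow an agent might follow for this prompt: sample rows, then classify each command. A real agent would let the LLM do the classification; the keyword heuristic and the sample rows below are purely assumptions for demonstration.

```python
# A toy version of the agent workflow for the prompt above: sample rows
# from ProcessEvents, then classify each command line. The keyword lists
# and sample rows are illustrative assumptions, not real detection rules.
def classify(command: str) -> str:
    """Assign a toy threat tolerance of low/med/high to a command line."""
    high = ("mimikatz", "invoke-mimikatz", "certutil -urlcache")
    med = ("powershell -enc", "wget", "curl", "reg add")
    lowered = command.lower()
    if any(marker in lowered for marker in high):
        return "high"
    if any(marker in lowered for marker in med):
        return "med"
    return "low"

# Stand-in rows an agent might get back from a sampling query.
sample_rows = [
    "notepad.exe report.txt",
    "powershell -enc SQBFAFgA...",
    "mimikatz.exe sekurlsa::logonpasswords",
]

# Tabular summary, as the prompt requests.
for row in sample_rows:
    print(f"{classify(row):<5} | {row}")
```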

VS Code Copilot 

Dive in today: 

We look forward to seeing what you build with MCP and Real-Time Intelligence.  

Please share your feedback, issues, and requests in the repo, or reach out to us at https://community.fabric.microsoft.com or r/MicrosoftFabric  
