Microsoft Fabric Updates Blog

Integrating Azure API Management with Fabric API for GraphQL

Introduction

Integrating Azure API Management (APIM) with Microsoft Fabric’s API for GraphQL can significantly enhance your API’s capabilities by providing robust scalability and security features such as identity management, rate limiting, and caching. This post will guide you through the process of setting up and configuring these features.

You may not be familiar with API Management. If that’s the case, it’s worth taking a few minutes to familiarize yourself with the product and its capabilities. This video can help you get started:

Intro to Azure API Management, by Julia Kasper

Let’s take a look at how we can integrate a Fabric GraphQL API with Azure APIM:

Add a Fabric GraphQL API to Azure API Management

For this section, we assume that you already have a GraphQL API in Fabric and an APIM instance up and running. If not, you can follow the instructions in the documentation to create an API in Fabric, or you can click ‘Start with sample SQL database’ in the API for GraphQL portal to create a sample API in minutes.

To get started, retrieve your API endpoint from the Fabric portal by opening your GraphQL item and clicking the ‘Copy endpoint’ button in the ribbon. You will also need to save your GraphQL schema to a file, which you can do by clicking the ‘Export schema’ button and saving the file to your local device:

Now navigate to your API Management instance in the Azure portal and select APIs > + Add API.

Choose the GraphQL icon and, in the APIM ‘Create from GraphQL schema’ screen, fill in the required fields such as Display name, Name, and GraphQL API endpoint. Select ‘Upload schema’ and use the schema file you downloaded previously:

New GraphQL API settings in APIM

Next, we need to configure a policy for authentication. In this case, we’ll show you how to let a managed identity handle authentication for this API. We assume that you have already created a managed identity, whether in the Azure portal or with any of the other tools available for doing so.

Using Managed Identities with APIM and API for GraphQL in Fabric

Now that we have a credential we can use for authentication, we need to grant that managed identity permissions on the GraphQL item in Fabric. For the sake of simplicity, we add the managed identity (in this example, ‘apim-id’) as a member of the workspace where both the GraphQL API and the API data source are located:

Manage Fabric workspace access

If you prefer to grant access directly to individual Fabric items, such as the API itself and the data sources attached to it (for example, a Lakehouse or SQL database), you need to grant the managed identity the appropriate permissions on each item. This is especially important if the data sources were attached to the API using Single Sign-On (SSO) authentication. You can find more information in the documentation.

Once you have granted your credential permissions on the workspace, the Fabric GraphQL API, and/or the data sources attached to it, you need to tell APIM to use that credential for authentication. This is straightforward: back in the APIM console, go to Security > Managed identities and add the same user-assigned managed identity you’re using to access the Fabric GraphQL API.

Next, go to the ‘API Policies’ tab of the GraphQL API you created earlier, then edit the inbound processing policy by adding the following entries below <inbound><base/>:

<authentication-managed-identity 
    resource="https://analysis.windows.net/powerbi/api" 
    client-id="MANAGED IDENTITY CLIENT ID GOES HERE" 
    output-token-variable-name="token-variable" 
    ignore-error="false" />
<set-header name="Authorization" exists-action="override">
    <value>@("Bearer " + (string)context.Variables["token-variable"])</value>
</set-header>

Make sure to replace the client ID in the snippet above with your actual managed identity’s client ID. Save your policy, and you’re almost good to go.
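Outside of APIM, you can sanity-check that your managed identity is able to obtain a token for the Fabric resource at all. Here is a minimal Python sketch of the same token flow the policy performs; it assumes the azure-identity package is installed and that the code runs on an Azure host with the identity assigned (the `bearer_header` helper mirrors what the set-header policy produces):

```python
def bearer_header(token: str) -> dict:
    """Build the Authorization header that the set-header policy produces."""
    return {"Authorization": "Bearer " + token}


def fetch_fabric_token(client_id: str) -> str:
    """Acquire a token for the Fabric/Power BI resource with a user-assigned
    managed identity. Only works on an Azure host that has the identity
    assigned; assumes the azure-identity package is installed."""
    from azure.identity import ManagedIdentityCredential  # lazy import
    credential = ManagedIdentityCredential(client_id=client_id)
    # Same resource as in the policy, expressed as an Entra ID scope
    return credential.get_token(
        "https://analysis.windows.net/powerbi/api/.default"
    ).token


if __name__ == "__main__":
    token = fetch_fabric_token("MANAGED IDENTITY CLIENT ID GOES HERE")
    print(bearer_header(token))
```

If the token call succeeds here but APIM requests still fail, the issue is most likely the Fabric-side permissions rather than the policy.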

Now, back to the API, head to the ‘Test’ tab and confirm you can issue queries and/or mutations to your Fabric data via GraphQL:

Testing the successful connection between APIM and Fabric GraphQL

Caching

APIs and operations in API Management can be configured with response caching. Response caching can significantly reduce latency for API callers and backend load for API providers. APIM has support for built-in caching, or you can choose to use your own Redis instance. In either case, you need to define your caching policy. Here we have the previous policy amended with a simple caching configuration that would work for most scenarios:

<policies>
    <inbound>
        <base />
        <authentication-managed-identity 
            resource="https://analysis.windows.net/powerbi/api" 
            client-id="MANAGED IDENTITY CLIENT ID GOES HERE" 
            output-token-variable-name="token-variable" 
            ignore-error="false" />
        <set-header name="Authorization" exists-action="override">
            <value>@("Bearer " + (string)context.Variables["token-variable"])</value>
        </set-header>
        <cache-lookup-value 
            key="@(context.Request.Body.As<String>(preserveContent: true))" 
            variable-name="cachedResponse" 
            default-value="not_exists" />
    </inbound>
    <!-- Control if and how the requests are forwarded to services  -->
    <backend>
        <choose>
            <when condition="@(context.Variables.GetValueOrDefault<string>("cachedResponse") == "not_exists")">
                <forward-request />
            </when>
        </choose>
    </backend>
    <!-- Customize the responses -->
    <outbound>
        <base />
        <choose>
            <when condition="@(context.Variables.GetValueOrDefault<string>("cachedResponse") != "not_exists")">
                <set-body>@(context.Variables.GetValueOrDefault<string>("cachedResponse"))</set-body>
            </when>
            <when condition="@((context.Response.StatusCode == 200) && (context.Variables.GetValueOrDefault<string>("cachedResponse") == "not_exists"))">
                <cache-store-value key="@(context.Request.Body.As<String>(preserveContent: true))" value="@(context.Response.Body.As<string>(preserveContent: true))" duration="60" />
            </when>
        </choose>
    </outbound>
    <!-- Handle exceptions and customize error responses  -->
    <on-error>
        <base />
    </on-error>
</policies>
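Conceptually, the policy uses the raw request body as the cache key: a lookup before the backend call, a store after a successful (200) response, and a 60-second TTL. A small Python sketch of that control flow, with the backend call as a stand-in function (APIM's real cache is distributed, so this is only the logic, not the implementation):

```python
import time

# request body -> (expiry timestamp, cached response)
CACHE: dict[str, tuple[float, str]] = {}
TTL_SECONDS = 60


def handle_request(body: str, forward_request, now=time.monotonic) -> str:
    """Mirror the inbound/backend/outbound policy: look up by request body,
    forward only on a miss, store successful responses for TTL_SECONDS."""
    entry = CACHE.get(body)
    if entry and entry[0] > now():            # cache-lookup-value hit
        return entry[1]                       # outbound set-body from cache
    status, response = forward_request(body)  # backend forward-request
    if status == 200:                         # cache-store-value on success
        CACHE[body] = (now() + TTL_SECONDS, response)
    return response
```

Because the key is the whole request body, two queries that differ only in whitespace or variable values are cached separately; that is usually the safe default for GraphQL, where the body fully determines the response.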

You can confirm the requests are getting cached by tracing a GraphQL API query or mutation in the APIM portal:

Tracing API cache

For more advanced caching scenarios, please refer to the APIM documentation on caching.

Rate Limiting

Another great feature of APIM! You can limit the number of API calls a client can make in a specific time period. Here’s a sample rate limiting policy entry you can add below <inbound><base/> that enforces no more than 2 calls every 60 seconds for a given user:

<rate-limit-by-key 
    calls="2" 
    renewal-period="60" 
    counter-key="@(context.Request.Headers.GetValueOrDefault("Authorization"))" 
    increment-condition="@(context.Response.StatusCode == 200)" 
    remaining-calls-variable-name="remainingCallsPerUser" />

After sending more than 2 API calls in a minute, you’ll receive an error message:

{
  "statusCode": 429,
  "message": "Rate limit is exceeded. Try again in 58 seconds."
}
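The policy implements a per-key fixed window: each distinct `Authorization` header value gets its own counter, the counter resets every `renewal-period` seconds, and (because of `increment-condition`) only successful calls count against the limit. A rough Python sketch of that bookkeeping, simplified to assume every allowed call returns 200 (APIM's real counters are shared across gateway instances):

```python
import time

# counter-key -> (window start, calls used in this window)
WINDOWS: dict[str, tuple[float, int]] = {}
CALLS = 2            # matches calls="2" in the policy
RENEWAL_PERIOD = 60  # matches renewal-period="60"


def check_rate_limit(counter_key: str, now=time.monotonic) -> tuple[bool, int]:
    """Return (allowed, remaining calls) for this key's current window."""
    start, used = WINDOWS.get(counter_key, (now(), 0))
    if now() - start >= RENEWAL_PERIOD:       # window expired: start a new one
        start, used = now(), 0
    if used >= CALLS:                          # over the limit: caller gets a 429
        return False, 0
    # increment-condition: we assume the forwarded call succeeds with 200
    WINDOWS[counter_key] = (start, used + 1)
    return True, CALLS - (used + 1)
```

Keying on the `Authorization` header means each token holder is throttled independently; keying on `context.Request.IpAddress` instead would throttle per client IP.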

For more details on how to configure rate limiting policies in APIM, please refer to the documentation.

Conclusion

Integrating Microsoft Fabric API for GraphQL with Azure API Management brings together the best of both worlds: the rich data capabilities of Fabric and the enterprise-grade gateway features of APIM. By configuring managed identities, you enable secure authentication to Fabric. With custom caching and rate limiting policies, you gain fine-grained control over performance, cost, and user experience—tailored for the unique characteristics of GraphQL APIs.

This setup not only provides more options to secure your Fabric data but also provides the scalability and observability required to support production workloads across teams and tenants.

Give the integration between APIM and Fabric API for GraphQL a try and let us know what you think! Submit your feedback to Fabric Ideas.
