Microsoft Fabric Updates Blog

Automating Real-Time Intelligence Eventstream deployment using PowerShell

Real-Time Intelligence is generally available now! Along with it, we have released capabilities around Application Lifecycle Management (ALM) & REST APIs for its items.

This blog post is part of a series that helps us deploy Real-Time Intelligence items automatically using PowerShell. The previous blog covered automated deployment of Eventhouse using PowerShell. Now that Eventstream APIs are released, we can use them to create the full Eventstream topology in any Fabric workspace.

Let’s build a PowerShell script that automates the deployment of an Eventstream, complete with its source, processing, and destination definitions, into a workspace in Microsoft Fabric. Once the end-to-end script is ready, we can simply call it with parameters.

All the files, example script and instructions are available at: SuryaTejJosyula/FabricRTI_Accelerator

Before creating the PowerShell script, make sure the Eventstream definition is ready.

Step 1: Prepare the Eventstream definition file

Create a JSON payload that will be converted to base64 in the next step. The JSON below defines the Eventstream with:

  • Sample data source that uses Bicycles sample.
  • Eventhouse as destination that uses ‘Stream_Bikepoint_IOT’ database and ‘APICreatedTable2’ table.
  • The default stream node is named ‘AutoDeployES2-stream’. Note that AutoDeployES2 is the name of the Eventstream item we will create next; the stream node in the topology should use a name that matches the Eventstream item’s name.
  • Provide the workspace Id and the Eventhouse item Id in the placeholders below.
{
    "sources": [
         {
          "name": "SampleDataSource",
          "type": "SampleData",
          "properties": {
              "type": "Bicycles"
              }
          }
    ],
    "destinations": [
      {
        "name": "EventhouseDestination",
        "type": "Eventhouse",
        "properties":
        {
          "dataIngestionMode": "ProcessedIngestion",
          "workspaceId": "<Workspace Id>",
          "itemId": "<Eventhouse item Id>",
          "databaseName": "Stream_Bikepoint_IOT",
          "tableName": "APICreatedTable2",
          "inputSerialization":
          {
            "type": "Json",
            "properties":
            {
              "encoding": "UTF8"
            }
          }
        },
        "inputNodes": [
          {
            "name": "AutoDeployES2-stream"
          }
        ]
      }
    ],
    "streams": [
    {
        "name": "AutoDeployES2-stream",
        "type": "DefaultStream",
        "properties": {},
        "inputNodes": [
          {
            "name": "SampleDataSource"
          }
        ]
      }
    ],
    "operators": [],
    "compatibilityLevel": "1.0"
  }

Step 2: Create a base64 string of the above JSON

For this, let’s use a website like Base64 Encode and Decode – Online.

Paste the JSON into the input box to get a base64-encoded string.
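If you prefer to stay in PowerShell, the encoding can also be done locally. The sketch below assumes the definition from Step 1 was saved to a file named eventstream-definition.json (an illustrative file name) next to the script:

```powershell
# Read the Step 1 definition as a single string
# (eventstream-definition.json is a hypothetical file name chosen for this example)
$definitionJson = Get-Content -Path ".\eventstream-definition.json" -Raw

# Encode the UTF-8 bytes of the JSON as a base64 string
$definitionBase64 = [Convert]::ToBase64String([Text.Encoding]::UTF8.GetBytes($definitionJson))
$definitionBase64
```

The resulting string is what goes into the payload field in the next step.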

Step 3: Create the body for API request

This is the actual JSON payload that will be used as the body in the API call. Let’s call this file EventstreamCreate.json.

Note the following points:

  • The displayName ‘AutoDeployES2’ is the same name that was used in the stream definition.
  • payload is where we use the base64-encoded string of the Eventstream definition from Step 2.
{
    "displayName": "AutoDeployES2",
    "description": "This Eventstream is created using APIs",
    "type": "Eventstream",
    "definition": {
        "parts": [
            {
                "path": "eventstream.json",
                "payload": "<Base64 converted string of entire Eventstream Definition from Step 2>",
                "payloadType": "InlineBase64"
            }
        ]
    }
}
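As an alternative to hand-editing the base64 string into the file, the whole request body can be assembled in PowerShell. This is a sketch, again assuming the Step 1 definition lives in eventstream-definition.json (an illustrative file name):

```powershell
# Encode the Eventstream definition from Step 1
$definitionJson = Get-Content -Path ".\eventstream-definition.json" -Raw
$payload = [Convert]::ToBase64String([Text.Encoding]::UTF8.GetBytes($definitionJson))

# Build the same request body shown above as a hashtable, then serialize it
$body = @{
    displayName = "AutoDeployES2"
    description = "This Eventstream is created using APIs"
    type        = "Eventstream"
    definition  = @{
        parts = @(
            @{
                path        = "eventstream.json"
                payload     = $payload
                payloadType = "InlineBase64"
            }
        )
    }
} | ConvertTo-Json -Depth 5
```

Building the body this way avoids a manual copy-and-paste of the base64 string every time the definition changes.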

Step 4: Begin creating the PowerShell script

Make sure the Azure PowerShell (Az) module is installed:

Install-Module Az

Step 5: Log in to Fabric

$tenantId = "1234-11111-2222-123141"
Connect-AzAccount -TenantId $tenantId | Out-Null

Step 6: Get the token to authenticate to Fabric

$baseFabricUrl = "https://api.fabric.microsoft.com"

# Get authentication token
$fabricToken = (Get-AzAccessToken -ResourceUrl $baseFabricUrl).Token

# Set up headers for the API call
$headerParams = @{'Authorization' = "Bearer {0}" -f $fabricToken}
$contentType = @{'Content-Type' = "application/json"}

Optional Step: Get body contents of Eventstream API from a file

We can store the API request body in a json file in the same directory of the PowerShell script and get the contents. This way, any changes can be made at a file level instead of making changes directly to the script.

function Get-ScriptDirectory {
    # In the ISE, resolve the directory of the open file; otherwise use $PSScriptRoot
    if ($psise) {
        Split-Path $psise.CurrentFile.FullPath
    }
    else {
        $PSScriptRoot
    }
}

# File name from Step 3
$eventstreamCreateFileName = "EventstreamCreate.json"
$DBScriptPath = Get-ScriptDirectory
$DBScriptloc  = Join-Path $DBScriptPath $eventstreamCreateFileName
# -Raw returns the file as a single string, suitable for use as a request body
$body = Get-Content -Path $DBScriptloc -Raw

Step 7: Create the Eventstream

# $workspaceId is the Id of the target workspace (the same value used in the definition)
$eventstreamAPI = "https://api.fabric.microsoft.com/v1/workspaces/$workspaceId/items"

## Invoke the API to create the Eventstream
Invoke-RestMethod -Headers $headerParams -Method POST -Uri $eventstreamAPI -Body $body -ContentType "application/json"

Step 8: Eventstream is created!
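To confirm the deployment, we can list the workspace’s items and check that the new Eventstream appears. A quick sanity check, assuming the variables from the earlier steps are still in scope and using the list-items endpoint’s type filter:

```powershell
# List only Eventstream items in the workspace
$itemsApi = "https://api.fabric.microsoft.com/v1/workspaces/$workspaceId/items?type=Eventstream"
$items = Invoke-RestMethod -Headers $headerParams -Method GET -Uri $itemsApi

# 'AutoDeployES2' should show up in the output
$items.value | Select-Object displayName, id
```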

Learn More
