Microsoft Fabric Updates Blog

Automating Real-Time Intelligence Eventstream deployment using PowerShell

Real-Time Intelligence is now generally available! Along with it, we have released Application Lifecycle Management (ALM) capabilities and REST APIs for its items.

This blog post is part of a series on deploying Real-Time Intelligence items automatically using PowerShell. The previous post covered automated deployment of an Eventhouse using PowerShell. Now that the Eventstream APIs are released, we can use them to create a full Eventstream topology in any Fabric workspace.

Let’s build a PowerShell script that automates deployment of an Eventstream, complete with source, processing, and destination definitions, into a workspace in Microsoft Fabric. Once the end-to-end script is ready, we can simply call it with parameters.

All the files, example script and instructions are available at: SuryaTejJosyula/FabricRTI_Accelerator

Before creating the PowerShell script, make sure the Eventstream definition is ready.

Step 1: Prepare the Eventstream definition file

Create a JSON payload that will be converted to base64 in the next step. The JSON below defines the Eventstream with:

  • Sample data source that uses Bicycles sample.
  • Eventhouse as destination that uses ‘Stream_Bikepoint_IOT’ database and ‘APICreatedTable2’ table.
  • The Eventstream node is named ‘AutoDeployES2-stream’. Note that AutoDeployES2 is the name of the Eventstream item we will create next; the default stream node name must match the Eventstream item name, with a ‘-stream’ suffix.
  • Replace the workspace Id and Eventhouse item Id placeholders below with your own values.
{
    "sources": [
         {
          "name": "SampleDataSource",
          "type": "SampleData",
          "properties": {
              "type": "Bicycles"
              }
          }
    ],
    "destinations": [
      {
        "name": "EventhouseDestination",
        "type": "Eventhouse",
        "properties":
        {
          "dataIngestionMode": "ProcessedIngestion",
          "workspaceId": "<Workspace Id>",
          "itemId": "<Eventhouse item Id>",
          "databaseName": "Stream_Bikepoint_IOT",
          "tableName": "APICreatedTable2",
          "inputSerialization":
          {
            "type": "Json",
            "properties":
            {
              "encoding": "UTF8"
            }
          }
        },
        "inputNodes": [
          {
            "name": "AutoDeployES2-stream"
          }
        ]
      }
    ],
    "streams": [
    {
        "name": "AutoDeployES2-stream",
        "type": "DefaultStream",
        "properties": {},
        "inputNodes": [
          {
            "name": "SampleDataSource"
          }
        ]
      }
    ],
    "operators": [],
    "compatibilityLevel": "1.0"
  }

Step 2: Create a base64 string of the above JSON

For this, we can use a website such as Base64 Encode and Decode – Online.

Paste the JSON into the input box to get a base64-encoded string.
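Alternatively, the encoding can be done directly in PowerShell without leaving the script. A minimal sketch (the short stand-in JSON here is for illustration only; in practice `$definitionJson` would hold the full definition from Step 1):

```powershell
# Sketch: UTF-8 encode the Eventstream definition JSON, then convert to base64.
# A tiny stand-in definition is used here for illustration.
$definitionJson = '{ "sources": [], "destinations": [], "streams": [], "operators": [], "compatibilityLevel": "1.0" }'
$definitionBytes = [System.Text.Encoding]::UTF8.GetBytes($definitionJson)
$definitionBase64 = [System.Convert]::ToBase64String($definitionBytes)
Write-Output $definitionBase64
```

The resulting string is what goes into the `payload` field in Step 3.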

Step 3: Create the body for the API request

This is the actual JSON payload that will be used as the body of the API call. Let’s call this file EventstreamCreate.json.

Note the following points:

  • displayName ‘AutoDeployES2’ is the same name that was used in the stream definition.
  • payload is where the base64-encoded string of the Eventstream definition from Step 2 goes.
{
    "displayName": "AutoDeployES2",
    "description": "This Eventstream is created using APIs",
    "type": "Eventstream",
    "definition": {
        "parts": [
            {
                "path": "eventstream.json",
                "payload": "<Base64 converted string of entire Eventstream Definition from Step 2>",
                "payloadType": "InlineBase64"
            }
        ]
    }
}

Step 4: Begin creating the PowerShell script

Make sure the Azure PowerShell (Az) modules are installed:

Install-Module Az
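If installing the full Az rollup module is too heavy, the only cmdlets this script uses (Connect-AzAccount and Get-AzAccessToken) ship in the Az.Accounts module, so installing just that module should be enough:

```powershell
# Lighter alternative: install only Az.Accounts, which provides
# Connect-AzAccount and Get-AzAccessToken.
Install-Module -Name Az.Accounts -Scope CurrentUser
```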

Step 5: To begin, log in to Fabric

$tenantId = "1234-11111-2222-123141"
Connect-AzAccount -TenantId $tenantId | Out-Null

Step 6: Get a token to authenticate to Fabric

$baseFabricUrl = "https://api.fabric.microsoft.com"

# Get authentication token
$fabricToken = (Get-AzAccessToken -ResourceUrl $baseFabricUrl).Token

# Set up headers for the API call
$headerParams = @{'Authorization' = "Bearer {0}" -f $fabricToken}
$contentType = @{'Content-Type' = "application/json"}

Optional step: Read the API request body from a file

We can store the API request body in a JSON file in the same directory as the PowerShell script and read its contents. This way, any changes can be made at the file level instead of editing the script directly.

function Get-ScriptDirectory {
    if ($psise) {
        Split-Path $psise.CurrentFile.FullPath
    }
    else {
        $PSScriptRoot
    }
}

# Name of the request-body file created in Step 3
$eventstreamCreateFileName = "EventstreamCreate.json"
$DBScriptPath = Get-ScriptDirectory
$DBScriptloc = Join-Path $DBScriptPath $eventstreamCreateFileName
$body = Get-Content -Path $DBScriptloc -Raw

Step 7: Create the Eventstream

# <Workspace Id> is a placeholder; use the same workspace Id as in Step 1
$workspaceId = "<Workspace Id>"
$eventstreamAPI = "https://api.fabric.microsoft.com/v1/workspaces/$workspaceId/items"

## Invoke the API to create the Eventstream
Invoke-RestMethod -Headers $headerParams -Method POST -Uri $eventstreamAPI -Body $body -ContentType "application/json"
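Fabric item-creation calls can return 202 Accepted when the service treats them as long-running operations; in that case the Location response header points at an operation-status endpoint to poll. A hedged sketch of handling that case (it assumes the $headerParams, $eventstreamAPI, and $body variables from the steps above, and the documented NotStarted/Running/Succeeded/Failed status convention):

```powershell
# Sketch: handle a possible 202 Accepted (long-running operation) response.
$response = Invoke-WebRequest -Headers $headerParams -Method POST -Uri $eventstreamAPI `
    -Body $body -ContentType "application/json"

if ($response.StatusCode -eq 202) {
    # Poll the operation URL from the Location header until it finishes.
    $operationUrl = [string]$response.Headers["Location"]
    do {
        Start-Sleep -Seconds 5
        $operation = Invoke-RestMethod -Headers $headerParams -Method GET -Uri $operationUrl
    } while ($operation.status -in @("NotStarted", "Running"))
    Write-Output "Operation completed with status: $($operation.status)"
}
```

If the call returns 201 Created instead, the Eventstream was created synchronously and no polling is needed.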

Step 8: Eventstream is created!
