Microsoft Fabric Updates Blog

Introducing upgrades to AI functions for better performance—and lower costs

Earlier this year, we released AI functions in public preview, allowing Fabric customers to apply LLM-powered transformations to OneLake data simply and seamlessly, in a single line of code. Since then, we’ve continued iterating on AI functions in response to your feedback. Let’s explore the latest updates, which make AI functions more powerful, more cost-effective, and easier to use than ever before.

As a reminder, AI functions are available on all paid Fabric SKUs—and they turbocharge analytics workflows, whether you’re transforming Spark or pandas DataFrames. Combined with Copilot, AI functions can accelerate work for low-code users and experienced developers alike.

Enhanced intelligence—and lower costs—with GPT-4o-mini

We’ve upgraded the default model that powers AI functions to GPT-4o-mini, boosting the feature’s intelligence while reducing its cost. The update brings a larger context window (from 16k to 128k input tokens) and significant price reductions for both input and output tokens. Thanks to new optimizations in the system prompts, you can expect better performance and more accurate results from each of the eight AI functions, all while consuming less Fabric capacity.
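To put the larger context window in perspective, here’s a minimal sketch of batching rows so that each request stays within a token budget. The helper names are hypothetical, and the ~4-characters-per-token figure is a common rule of thumb rather than an exact tokenizer:

```python
# Hypothetical sketch: group rows into batches that fit within a model's
# input context window, using a rough characters-per-token estimate.
CHARS_PER_TOKEN = 4          # rule-of-thumb estimate, not an exact tokenizer
CONTEXT_WINDOW = 128_000     # GPT-4o-mini input-token window (up from 16k)

def estimate_tokens(text: str) -> int:
    """Crude token estimate from character count."""
    return max(1, len(text) // CHARS_PER_TOKEN)

def batch_rows(texts, budget=CONTEXT_WINDOW):
    """Yield lists of texts whose combined token estimate stays under budget."""
    batch, used = [], 0
    for text in texts:
        tokens = estimate_tokens(text)
        if batch and used + tokens > budget:
            yield batch
            batch, used = [], 0
        batch.append(text)
        used += tokens
    if batch:
        yield batch
```

With an eightfold-larger window, far fewer batches are needed for the same data, which is where much of the practical speedup comes from.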

No more manual library installations

The library that provides AI functions now comes preinstalled on the Fabric 1.3 runtime. That means you can get started with AI-powered data engineering even faster, skipping the previously required installation steps. The code below shows a simple example of AI-powered translation in PySpark. For more, check out our documentation: Transform and enrich data seamlessly with AI functions.

# This code uses AI. Always review output for mistakes. 
# Read terms: https://azure.microsoft.com/support/legal/preview-supplemental-terms/

from synapse.ml.spark.aifunc.DataFrameExtensions import AIFunctions
from synapse.ml.services.openai import OpenAIDefaults
defaults = OpenAIDefaults()
defaults.set_deployment_name("gpt-4o-mini")

df = spark.createDataFrame([
        ("Hello! How are you doing today?",),
        ("Tell me what you'd like to know, and I'll do my best to help.",),
        ("The only thing we have to fear is fear itself.",),
    ], ["text"])

translations = df.ai.translate(to_lang="spanish", input_col="text", output_col="translations")
display(translations)

AI function support in pure Python notebooks

If you don’t need the weight of Spark, you can now use AI functions on pandas DataFrames in Fabric’s pure Python notebooks. The functionality is the same, letting you harness GenAI for data preparation with even more flexibility. Just remember to install the required OpenAI and SynapseML dependencies using the code below.

# Install fixed version of packages
%pip install -q --force-reinstall openai==1.30

# Install latest version of SynapseML-core
%pip install -q --force-reinstall https://mmlspark.blob.core.windows.net/pip/1.0.11-spark3.5/synapseml_core-1.0.11.dev1-py2.py3-none-any.whl

# Install SynapseML-Internal .whl with AI functions library from blob storage:
%pip install -q --force-reinstall https://mmlspark.blob.core.windows.net/pip/1.0.11.1-spark3.5/synapseml_internal-1.0.11.1.dev1-py2.py3-none-any.whl
# This code uses AI. Always review output for mistakes. 
# Read terms: https://azure.microsoft.com/support/legal/preview-supplemental-terms/

import synapse.ml.aifunc as aifunc
import pandas as pd

df = pd.DataFrame([
        ("Scarves",),
        ("Snow pants",),
        ("Ski goggles",),
    ], columns=["product"])

df["response"] = df.ai.generate_response("Write a short, punchy email subject line for a winter sale.")
display(df)
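The `df.ai` syntax follows pandas’ standard extension mechanism. Below is a minimal sketch of that general pattern—an illustrative `demo` accessor with an `upper` method, not Fabric’s actual implementation—using `pd.api.extensions.register_dataframe_accessor`:

```python
import pandas as pd

# Illustrative only: registers a custom ".demo" accessor on every DataFrame,
# mimicking the general pattern behind extensions like "df.ai".
@pd.api.extensions.register_dataframe_accessor("demo")
class DemoAccessor:
    def __init__(self, pandas_obj):
        self._df = pandas_obj

    def upper(self, input_col, output_col):
        """Return a copy of the DataFrame with input_col upper-cased."""
        out = self._df.copy()
        out[output_col] = out[input_col].str.upper()
        return out

df = pd.DataFrame({"product": ["Scarves", "Snow pants"]})
result = df.demo.upper(input_col="product", output_col="loud")
print(result)
```

The accessor leaves the original DataFrame untouched and returns a transformed copy, which is the same column-in, column-out shape the AI functions expose.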

Low-code flows directly from your notebook

The syntax for AI functions is so straightforward that you may not need documentation for long. But Fabric notebooks now include a simple interface that generates the code for you. Just select a function, choose an active pandas or Spark DataFrame, and fill in the required inputs.

A new tab in the Fabric notebook ribbon allows you to generate code to enrich and transform your OneLake data with AI functions.

Next steps

You can learn more about AI functions—and explore starter code samples—using our documentation: Transform and enrich data seamlessly with AI functions. Before getting started, please make note of the prerequisites outlined there.

We’re eager for you to incorporate AI functions in your own data-science and data-engineering workflows. Let us know what you think by submitting feedback on Fabric Ideas or joining the conversation on the Fabric Community. The features that you request may end up in a future blog post.
