Microsoft Fabric Updates Blog

Fabric AI Functions Enhancements (Generally Available)

Transform your data with AI in an instant by using Fabric AI functions. These powerful functions allow you to turn data into insights with just a single line of code.

AI functions have been saving customers hours, even weeks, of work. Read on to learn about the latest updates!

What’s New?

Since releasing AI functions in preview earlier this year, we have expanded the functionality with new optional parameters for additional control, a new ai.embed() function, support for advanced configurations when using gpt-5, faster default execution, and support for additional models beyond OpenAI.

ai.analyze_sentiment() parameters

ai.analyze_sentiment() now has configurable labels. You can continue using the function with the default parameters of ‘positive’, ‘negative’, ‘neutral’, and ‘mixed’, or you can pass in your own labels.

This image shows an example using default and custom labels to analyze sentiment.
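As a rough sketch of the two calling styles (the ai.analyze_sentiment() call itself runs only inside a Fabric notebook, and the keyword name for custom labels is an assumption for illustration):

```python
# Sketch only: the AI calls below are commented out because they require
# the Fabric runtime; the custom-label keyword name is an assumption.
reviews = ["Loved it!", "It was fine.", "Never again."]

# Default labels:
# df["sentiment"] = df["review"].ai.analyze_sentiment()
# -> one of 'positive', 'negative', 'neutral', or 'mixed' per row

# Custom labels (keyword name assumed):
labels = ["enthusiastic", "lukewarm", "dissatisfied"]
# df["sentiment"] = df["review"].ai.analyze_sentiment(labels=labels)
```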

ai.extract() parameters 

ai.extract() now has ExtractLabel, a custom object that lets users set detailed instructions about the information to be extracted. It includes the following parameters: 

  • label – The name of the extracted column. 
  • description – Natural language description to add extra context for the AI model. It can include requirements, context, or instructions for the AI to consider while performing the extraction.
  • max_items – Maximum number of items to extract for this label.
  • type – JSON schema type for the extracted value. Supported types for this class include ‘string’, ‘number’, ‘integer’, ‘boolean’, ‘object’, and ‘array’.
  • properties – Additional JSON schema properties for the type as a dictionary. It can include supported properties like ‘items’ for arrays, ‘properties’ for objects, ‘enum’ for enum types, and more. For example usage, refer to the documentation on Supported Schemas.

This image shows an example using additional parameters to specify how to extract the number of goals.
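A minimal sketch of how those parameters fit together, using the goal-extraction example above (the field names mirror the ExtractLabel parameters listed here, but the import path for ExtractLabel and the exact call shape are assumptions):

```python
# Sketch: the fields below mirror the ExtractLabel parameters described
# above; the import path for ExtractLabel is omitted as an assumption.
goals_label = {
    "label": "goals",                  # name of the extracted column
    "description": "Total number of goals the player scored in the match.",
    "max_items": 1,                    # at most one value per row
    "type": "integer",                 # JSON schema type of the value
}

# In a Fabric notebook (illustrative):
# df = df["match_report"].ai.extract(ExtractLabel(**goals_label))
```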

ai.generate_response() parameters

ai.generate_response() now has a response_format parameter for more control over the output format. You can choose plain text, JSON, or a specific JSON schema, including one defined with a Pydantic model.

This image shows an example using the response_format parameter to specify the output in JSON.
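To make the schema option concrete, here is a sketch of a structured response_format. The envelope below assumes the OpenAI-style json_schema shape; the exact structure ai.generate_response() accepts may differ:

```python
# Assumed OpenAI-style json_schema envelope; the schema names and fields
# here are illustrative, not from the Fabric documentation.
response_format = {
    "type": "json_schema",
    "json_schema": {
        "name": "player_stats",
        "schema": {
            "type": "object",
            "properties": {
                "name": {"type": "string"},
                "goals": {"type": "integer"},
            },
            "required": ["name", "goals"],
        },
    },
}

# In a Fabric notebook (illustrative):
# df["stats"] = df["report"].ai.generate_response(
#     "Return the player's name and goal count.",
#     response_format=response_format,
# )
```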

ai.summarize() parameters

ai.summarize() has a new instructions parameter that allows you to provide additional context to the AI model, such as specifying output length.

This image shows an example using the instructions parameter to specify output length for the summary.
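The instructions parameter takes free-form natural language, so steering the summary is as simple as describing what you want (the call itself runs only inside a Fabric notebook):

```python
# Illustrative only: instructions are plain natural language.
instructions = "Summarize in at most two sentences, for a non-technical reader."

# In a Fabric notebook:
# df["summary"] = df["article"].ai.summarize(instructions=instructions)
```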

New function: ai.embed()

This brand-new function allows you to convert text into numeric vectors that capture its meaning and context — a process known as embedding. These vectors let AI understand relationships between texts, so you can search, group, and compare content based on meaning rather than exact wording.

This image shows an example using the ai.embed() function to generate embeddings.
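Once each text is embedded, semantic comparison reduces to vector math, most commonly cosine similarity. A minimal illustration of that comparison step, with hand-made toy vectors standing in for real embeddings (since ai.embed() itself runs only inside Fabric):

```python
import math

# In a Fabric notebook: df["vec"] = df["text"].ai.embed()

def cosine_similarity(a, b):
    """Cosine of the angle between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Toy vectors standing in for real embeddings:
v_cat = [0.9, 0.1, 0.0]
v_kitten = [0.8, 0.2, 0.1]
v_invoice = [0.0, 0.1, 0.95]

# Texts with related meaning land closer together:
assert cosine_similarity(v_cat, v_kitten) > cosine_similarity(v_cat, v_invoice)
```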

Advanced configurations for gpt-5

For more complicated tasks, AI functions can be configured to use the gpt-5 model, which has a configurable reasoning_effort parameter, to regulate the number of reasoning tokens the AI model uses, and a verbosity parameter, to regulate the length and detail of its final response.

To learn more, refer to the custom configurations documentation.
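As a sketch, the two knobs look like this. The values follow the OpenAI gpt-5 conventions; exactly how they are passed to AI functions (a configuration object versus keyword arguments) is an assumption, so check the custom configurations documentation for the real surface:

```python
# Assumed parameter values following OpenAI gpt-5 conventions; how Fabric
# AI functions accept them is not shown here and may differ.
gpt5_config = {
    "reasoning_effort": "medium",  # budget of reasoning tokens to spend
    "verbosity": "low",            # how long and detailed the reply should be
}
```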

Faster execution

The concurrency parameter allows you to configure the maximum number of rows to process in parallel with asynchronous requests to the model. We have increased the default concurrency to 200 so you can enjoy faster execution out-of-the-box.

For more details, refer to the custom configurations documentation.

Support for additional models

AI functions give you more flexibility than ever when choosing models. You can use the default model or any of the models supported in Fabric, bring your own Azure OpenAI LLM resource, or now bring your own AI Foundry resource. This allows you to access models beyond OpenAI, such as Claude, LLaMA, and more, with both PySpark and pandas.

For more details, refer to the custom configurations documentation.

Next Steps

  • These updates will be generally available across all geographies in the coming weeks.
  • Want to dive deeper? Explore the AI functions documentation to learn more.
  • Submit your feedback on Fabric Ideas and join the conversation in the Fabric Community.
