Microsoft Fabric Updates Blog

Fabric AI Functions Enhancements (Generally Available)

Transform your data with AI in an instant by using Fabric AI functions. These powerful functions allow you to turn data into insights with just a single line of code.

AI functions have been saving customers hours, and even weeks’ worth of work. Read on to learn about the latest updates!

What’s New?

Since releasing AI functions in preview earlier this year, we have expanded the functionality to include new optional parameters for additional control, added a new ai.embed() function, implemented support for advanced configurations when using gpt-5, sped up default execution, and added support for additional models beyond OpenAI.

ai.analyze_sentiment() parameters

ai.analyze_sentiment() now has configurable labels. You can continue using the function with the default parameters of ‘positive’, ‘negative’, ‘neutral’, and ‘mixed’, or you can pass in your own labels.

This image shows an example using default and custom labels to analyze sentiment.
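The label constraint can be pictured locally. In this sketch, the names in `custom_labels` are hypothetical, and the real classification happens inside Fabric; the check simply mirrors the contract the function enforces:

```python
# Default label set for ai.analyze_sentiment(), plus a hypothetical custom set.
# The model's answer is constrained to whichever set you pass in.
default_labels = {"positive", "negative", "neutral", "mixed"}
custom_labels = {"urgent", "routine", "spam"}  # hypothetical example labels

def check_label(label, allowed):
    # Reject anything outside the configured label set.
    if label not in allowed:
        raise ValueError(f"unexpected label: {label!r}")
    return label

print(check_label("negative", default_labels))
print(check_label("urgent", custom_labels))
```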

ai.extract() parameters 

ai.extract() now has ExtractLabel, a custom object that lets users set detailed instructions about the information to be extracted. It includes the following parameters: 

  • label – The name of the extracted column. 
  • description – Natural language description to add extra context for the AI model. It can include requirements, context, or instructions for the AI to consider while performing the extraction.
  • max_items – Maximum number of items to extract for this label.
  • type – JSON schema type for the extracted value. Supported types for this class include ‘string’, ‘number’, ‘integer’, ‘boolean’, ‘object’, and ‘array’.
  • properties – Additional JSON schema properties for the type as a dictionary. It can include supported properties like ‘items’ for arrays, ‘properties’ for objects, ‘enum’ for enum types, and more. For example usage, refer to the documentation on Supported Schemas.

This image shows an example using additional parameters to specify how to extract the number of goals.
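The fields above map naturally onto a JSON schema fragment. Here is a minimal sketch with an assumed "goals" label; the helper and the exact schema layout are illustrative, not the object Fabric builds internally:

```python
# Build the JSON schema fragment implied by an ExtractLabel-style definition.
# Field names (label, description, type, max_items, properties) follow the
# list above; the schema layout itself is an assumption for illustration.
def extract_label_schema(label, description, type_, max_items=None, properties=None):
    schema = {"type": type_, "description": description}
    if properties:
        schema.update(properties)  # e.g. 'items' for arrays, 'enum' for enums
    if type_ == "array" and max_items is not None:
        schema["maxItems"] = max_items
    return {label: schema}

schema = extract_label_schema(
    label="goals",
    description="Number of goals scored by the home team",
    type_="integer",
)
print(schema)
```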

ai.generate_response() parameters

ai.generate_response() now has a response_format parameter for more control over the output format. You can choose text or json, or specify a json_schema, including via a Pydantic model.

This image shows an example using the response_format parameter to specify the output in JSON.
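On the consuming side, a json_schema response is just JSON you can parse and check. A small sketch with an assumed schema and a hard-coded sample output; the response_format parameter itself is passed to ai.generate_response() inside Fabric:

```python
import json

# Hypothetical JSON schema for a structured response.
response_schema = {
    "type": "object",
    "properties": {
        "title": {"type": "string"},
        "score": {"type": "number"},
    },
    "required": ["title", "score"],
}

raw = '{"title": "Q3 summary", "score": 0.92}'  # stand-in for model output
parsed = json.loads(raw)

# Verify the required keys the schema promises are actually present.
missing = [k for k in response_schema["required"] if k not in parsed]
assert not missing, f"missing keys: {missing}"
print(parsed["title"])
```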

ai.summarize() parameters

ai.summarize() has a new instructions parameter that allows you to provide additional context to the AI model, such as specifying output length.

This image shows an example using the instructions parameter to specify output length for the summary.

New function: ai.embed()

This brand-new function allows you to convert text into numeric vectors that capture its meaning and context — a process known as embedding. These vectors let AI understand relationships between texts, so you can search, group, and compare content based on meaning rather than exact wording.

This image shows an example using the ai.embed() function to generate embeddings.
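Embeddings become useful once you compare them. This self-contained sketch uses toy 3-dimensional vectors (real vectors from ai.embed() have far more dimensions) and cosine similarity, the usual measure of semantic closeness:

```python
import math

# Cosine similarity: 1.0 means same direction (similar meaning),
# values near 0 mean unrelated content.
def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Toy stand-ins for embedding vectors of three texts.
v_cat = [0.9, 0.1, 0.2]
v_kitten = [0.85, 0.15, 0.25]
v_invoice = [0.05, 0.9, 0.1]

print(round(cosine_similarity(v_cat, v_kitten), 3))   # close in meaning
print(round(cosine_similarity(v_cat, v_invoice), 3))  # far apart
```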

Advanced configurations for gpt-5

For more complicated tasks, AI functions can be configured to use the gpt-5 model, which has a configurable reasoning_effort parameter, to regulate the number of reasoning tokens the AI model should use, and a verbosity parameter, to control the length of its responses.

To learn more, refer to the custom configurations documentation.
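As a rough sketch, the two knobs might be set like this. The configuration object and accepted values here are illustrative assumptions; the custom configurations documentation has the authoritative names and shape:

```python
# Hypothetical configuration sketch for gpt-5 with AI functions.
# Treat the key names and values as placeholders, not the exact Fabric API.
gpt5_conf = {
    "model": "gpt-5",
    "reasoning_effort": "medium",  # budget of reasoning tokens before answering
    "verbosity": "low",            # length of the final response
}
print(gpt5_conf["reasoning_effort"])
```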

Faster execution

The concurrency parameter allows you to configure the maximum number of rows to process in parallel with asynchronous requests to the model. We have increased the default concurrency to 200 so you can enjoy faster execution out-of-the-box.

Refer to the custom configurations documentation.
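The mechanism behind the concurrency cap can be sketched with a semaphore: at most `concurrency` requests are in flight at once. Fabric manages this for you; the mock below only illustrates the idea with a stand-in for the model call:

```python
import asyncio

# Process rows with at most `concurrency` simultaneous "requests".
async def process_rows(rows, concurrency=200):
    sem = asyncio.Semaphore(concurrency)

    async def one(row):
        async with sem:
            await asyncio.sleep(0)  # stand-in for an async model request
            return row.upper()      # stand-in for the model's answer

    # gather preserves input order even though work runs concurrently.
    return await asyncio.gather(*(one(r) for r in rows))

results = asyncio.run(process_rows(["a", "b", "c"], concurrency=2))
print(results)
```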

Support for additional models

AI functions give you more flexibility than ever when choosing models. You can use the default model or any of the models supported in Fabric, bring your own Azure OpenAI LLM resource, and now also bring an AI Foundry resource. This gives you access to models beyond OpenAI, such as Claude, LLaMA, and more, with both PySpark and pandas.

For more details, refer to the custom configurations documentation.
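A bring-your-own-resource setup ultimately comes down to pointing the functions at your endpoint and deployment. The key names below are assumptions for illustration only; the custom configurations documentation defines the real shape:

```python
# Hypothetical shape of a bring-your-own-resource configuration.
byo_conf = {
    "endpoint": "https://<your-resource>.services.ai.azure.com/",
    "deployment": "<your-model-deployment>",  # e.g. a Claude or LLaMA deployment
}
print(sorted(byo_conf))
```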

Next Steps

  • These updates will be generally available across all geographies in the coming weeks.
  • Want to dive deeper? Explore the AI functions documentation to learn more.
  • Submit your feedback on Fabric Ideas and join the conversation in the Fabric Community.

March 26, 2026, by Jene Zhang