Announcing AI functions for seamless data engineering with GenAI
If you saw the keynote at FabCon last fall, you may remember an early demo of AI functions in Fabric, a new feature that makes it easy to apply LLM-powered transformations to your OneLake data. We’re thrilled to announce that AI functions are now in preview.
With AI functions, you can harness the power of GenAI for summarization, classification, text generation, and so much more—all with a single line of code. It’s seamless to incorporate AI functions as part of data-science and data-engineering workflows, whether you’re transforming Spark or pandas DataFrames. There is no complex setup, no tricky syntax, and, hopefully, no hassle.
Getting started with AI functions
AI functions turbocharge your analytics workflow, whether you want to classify customer reviews by product category or generate action items with custom prompts.
To get started, just install and import the relevant libraries for AI functions with Python or PySpark code in your Fabric notebook. You can copy and paste this code from our documentation, Transform and enrich data seamlessly with AI functions, under “Getting started with AI functions.”
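As a rough sketch, the import step looks like the following, assuming the synapse.ml.aifunc module name used in the preview documentation; the exact %pip install commands, with their pinned package versions and SynapseML wheels, are listed in that same article, so copy them from there rather than from this snippet.

```python
# Run the %pip install commands from the documentation first (pinned openai/httpx
# versions plus the SynapseML wheels); they are omitted here on purpose.

# Importing the AI functions library registers the .ai accessor on pandas Series
# (and, in PySpark sessions, on Spark DataFrames).
import synapse.ml.aifunc as aifunc
import pandas as pd
import openai

# Optional: progress bars while AI functions run over pandas data.
from tqdm.auto import tqdm
tqdm.pandas()
```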

Once those libraries are installed and imported, you can call any of the following AI functions to transform and enrich your data with simple, lightweight logic:
| AI function | Description |
| --- | --- |
| ai.similarity | Compare the meaning of input text with a single common text value, or with corresponding text values in another column. |
| ai.classify | Classify input text values according to labels you choose. |
| ai.analyze_sentiment | Identify the emotional state expressed by input text. |
| ai.extract | Find and extract specific types of information from input text, for example locations or names. |
| ai.fix_grammar | Correct the spelling, grammar, and punctuation of input text. |
| ai.summarize | Get summaries of input text. |
| ai.translate | Translate input text into another language. |
| ai.generate_response | Generate responses based on your own instructions. |

The syntax is so straightforward that you may not even need our documentation. For example, check out how simple it is to translate foreign-language customer-service call transcripts with the ai.translate() function:
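Here is a minimal pandas sketch. The transcripts are made up, and the single-argument form of ai.translate follows the preview documentation; the Spark DataFrame variant is similar but takes input and output column names, as described in the docs.

```python
import pandas as pd
import synapse.ml.aifunc as aifunc  # registers the .ai accessor on pandas Series

# Hypothetical customer-service call transcripts in several languages.
df = pd.DataFrame({
    "transcript": [
        "Hola, llamo porque mi pedido llegó dañado.",
        "Bonjour, je voudrais annuler mon abonnement.",
        "Guten Tag, ich habe eine Frage zu meiner letzten Rechnung.",
    ]
})

# Translate every transcript into English with a single line of code.
df["transcript_en"] = df["transcript"].ai.translate("english")

print(df[["transcript", "transcript_en"]])
```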

Customizing AI functions
AI functions are designed to work out of the box, with the underlying model and settings configured by default. Users who want more flexible configurations, however, can customize their solutions with a few extra lines of code.
If you want to use your own Azure OpenAI resources instead of the Fabric defaults, or if you want to experiment with the underlying language model’s properties, follow the instructions in our documentation: Customize the configuration of AI functions.
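As a rough illustration of the idea: the Conf object and the conf keyword appear in the preview documentation, but treat the specific property names below as assumptions and defer to the Customize the configuration of AI functions article for the exact settings.

```python
import synapse.ml.aifunc as aifunc
from synapse.ml.aifunc import Conf  # configuration object from the preview docs

# Continuing with the DataFrame from the translation example above:
# override model settings for a single call by passing a Conf object.
# Properties such as temperature mirror the underlying Azure OpenAI parameters.
df["summary"] = df["transcript_en"].ai.summarize(conf=Conf(temperature=0.2))

# Pointing AI functions at your own Azure OpenAI resource follows a similar
# pattern; the documentation lists which values (endpoint, deployment name,
# API key) to supply before calling the functions.
```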
Next steps
We can’t wait for you to try out AI functions on your own data and let us know what you think. Submit your feedback on Fabric Ideas and join the conversation on the Fabric Community.
Before trying out the feature yourself, please make note of a few prerequisites:
- AI functions currently require an F64 or higher SKU, or a P SKU, to use Fabric’s built-in AI endpoint. You can still use AI functions on a smaller capacity, but you’ll need to provide your own Azure OpenAI resource using custom settings.
- Your tenant administrator must enable the tenant switch for Copilot and other features powered by Azure OpenAI. (Depending on your location, you may also need to enable the tenant setting for cross-geo processing.)
- You must be on Fabric Runtime 1.3 or higher to use AI functions.
We’ll be showing off AI functions—and more LLM-powered features on our roadmap—at the upcoming Microsoft Fabric Community Conference in Las Vegas.
Register now to secure a spot. We look forward to hearing from you soon!