Expanded Data Agent Support for Large Data Sources
We are continuously enhancing data agents in Fabric to deliver more powerful and flexible data experiences. In February of this year, we introduced a host of improvements to the AI Skill, including support for additional data sources such as Eventhouse KQL databases and Semantic Models. Initially, integration was limited to data sources with fewer than 1,000 tables and fewer than 100 columns plus measures, which kept many users from fully leveraging LLMs for data analysis and reasoning.
We are excited to announce that these schema size restrictions have now been lifted. You can now seamlessly integrate large-scale data sources, including Kusto, Semantic Model, Lakehouse, and Warehouse datasets with over 1,000 tables and more than 100 columns and measures, into a Fabric data agent. This update significantly broadens what users can achieve with Fabric data agents, enabling deeper insights, richer semantic modeling, and more robust AI-powered data experiences.
The following example demonstrates how to add a data source with a large schema to a data agent:


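If you prefer to configure the agent programmatically rather than through the Fabric portal, the sketch below shows the general flow, assuming the fabric-data-agent-sdk Python package running in a Fabric notebook. The agent, lakehouse, schema, and table names are placeholders, and the method names follow that SDK but may differ slightly in your environment.

```python
# Minimal sketch, assuming the fabric-data-agent-sdk package is installed
# (e.g. %pip install fabric-data-agent-sdk) in a Fabric notebook.
# All names below (agent, lakehouse, schema, tables) are placeholders.
from fabric.dataagent.client import create_data_agent

# Create a data agent in the current workspace.
data_agent = create_data_agent("sales-data-agent")

# Attach a Lakehouse with a large schema; sources with 1,000+ tables
# and 100+ columns and measures are now supported.
data_agent.add_datasource("LargeSalesLakehouse", type="lakehouse")

# Select the tables the agent should reason over.
datasource = data_agent.get_datasources()[0]
datasource.select("dbo", "FactSales")
datasource.select("dbo", "DimCustomer")

# Publish so the agent picks up the new data source and table selections.
data_agent.publish()
```

Selecting only the tables you need keeps the agent focused, which matters more as schemas grow.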
While this expansion unlocks exciting new possibilities, we want to be transparent about performance expectations. With larger schema sizes, the reliability of results may vary. We are actively working to enhance reliability across Fabric data agent, and targeted improvements for handling larger schemas are well under way. Stay tuned for more updates and new feature releases!
Next Steps
To learn more about Fabric data agents, please explore our Data Agent Documentation.
For guidance on improving your data agent’s reliability, please refer to the Best Practices for Configuring your Data Agent guide.
Give us your feedback on Fabric Ideas.