Building a Smart Chatbot with SQL Database in Microsoft Fabric, LangChain and Chainlit
Microsoft’s recent advancements in the Data & AI space, particularly with the introduction of SQL Database in Microsoft Fabric, have significantly lowered the barrier for developers to create innovative AI applications. These innovations bring AI closer to data, empowering every developer to build an AI app natively and in a more secure manner.
Imagine you’re the founder of Contoso, a rapidly growing e-commerce startup. As your online store grows, you realize that many customer inquiries are about basic product information: price, availability, and specific features. To automate these routine questions, you decide to build a chatbot.
In this post, we’ll guide you through the process of creating a chatbot that can efficiently handle these product-related queries, freeing up your customer support team to focus on more complex issues.
Smart Chatbot Architecture
Let’s break down the building blocks of our chatbot. The core components that power our AI assistant are:
- LangChain: An open-source framework for building AI applications powered by Large Language Models (LLMs). It serves as the orchestrator in the Retrieval Augmented Generation (RAG) pattern.
- Chainlit: An open-source Python library for building scalable conversational AI applications. We’ll leverage Chainlit’s native integration with LangChain and its intuitive chat interface to provide seamless user interface for our chatbot.
- SQL Database in Microsoft Fabric: SQL database in Fabric offers several key features that make it an excellent choice for developing new AI applications, including native support for vectors and RAG, the ability to call models via APIs from within the database, and integration with popular AI platforms and services such as Azure AI services and Azure AI Foundry. We’ll use SQL database in Fabric to chunk text, generate embeddings, store the data, and perform similarity searches to retrieve relevant information.
- Azure Functions: A serverless computing platform that allows us to build and deploy event-driven applications without managing infrastructure. We’ll leverage Azure Functions to create a scalable and efficient pipeline for our chatbot, automating tasks like data ingestion and storing embeddings.
- Azure OpenAI: A platform providing advanced AI models and tools. Our chatbot will use Azure OpenAI’s language models for tasks like generating text responses and creating embeddings for semantic search.
With a solid understanding of these core components, we’re ready to dive into the practical implementation. In the next section, we’ll walk through a step-by-step guide to build our chatbot.
Prerequisites
- Azure Subscription: Create a free account.
- Microsoft Fabric Subscription: Create a free trial account.
- Azure OpenAI Resource: Deploy models for chat completion (GPT-4o) and embedding (text-embedding-3-small).
- Azure OpenAI Access: Apply for access in the desired Azure subscription as needed.
- Visual Studio Code: Download Visual Studio Code for free and optionally install Azure Functions Core Tools if you plan to use Azure Functions.
- Download Git and clone the Chatbot repository.
- Download and install Python 3.12 for optimal performance. While Python 3.13 is supported, additional tool installation may be required to build NumPy from source. Refer to the NumPy documentation for detailed instructions.
- Download and install ODBC driver.
- Download and install Azure CLI.
Step 1: Create and Deploy Database
In this step, we will create a SQL database in Microsoft Fabric and execute the database deployment project, which will:
- Create Product and Product Embeddings tables.
- Create three stored procedures that will:
  - Generate embeddings for any given text.
  - Find products using similarity search.
  - Automatically update the product embeddings whenever products are created or updated.
- Execute data creation scripts to populate both tables.
The database project leverages DbUp, a .NET library that tracks and manages database schema changes throughout the deployment process.
Creating a SQL DB in Microsoft Fabric
- Log in to the Microsoft Fabric portal and create a workspace.
- From your workspace, click ‘+ New item’ and search for SQL in the search text box on the ‘New item’ page, then select SQL database (preview).
Create SQL DB in Microsoft Fabric Workflow
- When prompted for Database name, enter ContosoDB.
- After the database is created, click the settings icon, copy the Connection String information, and store it separately in a notepad. You will need it later to configure the environment.

Database Connection String Information
Deploying the Database Project
- Open the cloned project in Visual Studio Code and navigate to the database folder.
- Create a ‘.env’ file based on the ‘.env.example’ template.
- To obtain the Azure OpenAI values (OPENAI_URL and OPENAI_KEY), navigate to Azure OpenAI Studio and log in using your Azure credentials. On the Home page, copy the ‘Azure OpenAI Endpoint’ and ‘API Key 1’ (API Key 2 works as well) values, respectively.

- Fill in the required configuration values, including the database connection string (MSSQL). From the connection string saved earlier, use the Data Source and Initial Catalog values for the Server and Database settings, respectively. The ENCRYPTION_PASSWORD value is used to encrypt the master key in the database and must meet the Windows password policy requirements.
- Open a new terminal window, navigate to the database folder.
- Execute the following commands to deploy the database scripts:
dotnet build .\Database.Deploy.csproj
dotnet run
- Once the deployment is complete, navigate to the database explorer on the Microsoft Fabric portal and verify that the tables and stored procedures were deployed correctly.
Verify database deployment
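The mapping from the Fabric connection string to the ‘.env’ settings can be sketched in a few lines of Python. The key names (Data Source, Initial Catalog) follow the standard ADO.NET connection-string format; the sample server name below is made up for illustration.

```python
def parse_connection_string(conn_str: str) -> dict:
    """Split an ADO.NET-style connection string into key/value pairs."""
    pairs = (part.split("=", 1) for part in conn_str.split(";") if "=" in part)
    return {key.strip(): value.strip() for key, value in pairs}

# Hypothetical example; your actual Data Source comes from the Fabric portal.
sample = "Data Source=myserver.database.fabric.microsoft.com;Initial Catalog=ContosoDB;Encrypt=True"
parts = parse_connection_string(sample)
server = parts["Data Source"]        # value for the Server setting
database = parts["Initial Catalog"]  # value for the Database setting
print(server, database)
```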
With our database now fully populated, let’s move on to building the Python-based chatbot interface using LangChain and Chainlit.
Step 2: Create a Chatbot interface
This section dives into the core capabilities of LangChain and Chainlit that make building a chatbot a smooth experience. You’ll find the code in the app.py file.
LangChain’s Capabilities
- Effortless Prompt Construction: LangChain simplifies crafting prompts for your chatbot, guiding the LLM towards the desired response.
- Agentic Retrievers: Define specialized agents within LangChain that can perform tasks on your behalf, like retrieving relevant contextual information from your database. This enriches the data available for the chatbot to generate more informed responses.
- LangChain Expression Language (LCEL): LCEL acts as the bridge, combining the constructed prompt with the retrieved context and passing the result to the configured Large Language Model (LLM) for generating responses.
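The pipe-based composition LCEL provides can be illustrated with a toy stand-in. The classes below are not the real LangChain API, just a minimal sketch of the `prompt | llm | parser` idea; the lambdas are fake placeholders for a prompt template, a model call, and an output parser.

```python
class Runnable:
    """Toy version of the composable unit behind LCEL's `|` operator."""
    def __init__(self, fn):
        self.fn = fn
    def invoke(self, value):
        return self.fn(value)
    def __or__(self, other):
        # Chaining produces a new Runnable that feeds this output into the next step.
        return Runnable(lambda value: other.invoke(self.invoke(value)))

# Hypothetical pipeline pieces standing in for prompt template, LLM, and parser.
prompt = Runnable(lambda q: f"Answer using the product catalog: {q}")
llm = Runnable(lambda p: {"content": p.upper()})  # fake model call
parser = Runnable(lambda r: r["content"])

chain = prompt | llm | parser
print(chain.invoke("price of wireless charger?"))
```

In the real app, the same `|` composition wires LangChain’s actual prompt, model, and parser objects together.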
ChainLit’s Native Integration
Chainlit streamlines chatbot development through its built-in support for LangChain, specifically via the LangchainCallbackHandler. This integration happens at two key points:
- ‘on_chat_start’: When a user initiates a chat session, LangChain’s chain is initialized, preparing it to handle future interactions.
- ‘on_message’: As users send messages, Chainlit relays them to LangChain’s chain for processing and response generation. This ensures a smooth conversation flow, allowing the chatbot to understand and respond accurately.
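The two hooks can be sketched as below. To keep the sketch self-contained, `cl` here is a tiny stand-in for the real chainlit module and the chain is faked; in the actual app.py you would `import chainlit as cl` and build the LangChain chain instead.

```python
class _ChainlitStub:
    """Minimal stand-in for the chainlit module, just to show the handler shapes."""
    def __init__(self):
        self.handlers = {}
    def on_chat_start(self, fn):
        self.handlers["on_chat_start"] = fn
        return fn
    def on_message(self, fn):
        self.handlers["on_message"] = fn
        return fn

cl = _ChainlitStub()  # in app.py: import chainlit as cl
session = {}

@cl.on_chat_start
def start():
    # Initialize the chain once per chat session; here a fake chain callable.
    session["chain"] = lambda text: f"(answer for: {text})"

@cl.on_message
def handle(message: str) -> str:
    # Relay each user message to the chain and return its response.
    return session["chain"](message)

cl.handlers["on_chat_start"]()
print(cl.handlers["on_message"]("Do you stock wireless chargers?"))
```

The real Chainlit hooks are `async` and receive a `cl.Message` object; the stub above only mirrors the registration and relay flow.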
In addition to this, the file ‘utilities.py’ contains the code required to connect and interact with the database.
Now that we’ve explored the core functionalities, let’s see it in action.
Create a Python Environment and Install Requirements
- Navigate to the chainlit folder of the cloned project.
- Create a ‘.env’ file based on the ‘.env.example’ template.
- To obtain the Azure OpenAI values (AZURE_OPENAI_ENDPOINT and AZURE_OPENAI_API_KEY), navigate to Azure OpenAI Studio and log in using your Azure credentials. On the Home page, copy the ‘Azure OpenAI Endpoint’ and ‘API Key 1’ (API Key 2 works as well) values, respectively.

- Fill in the required configuration values, including the database connection string (FABRIC_SQL_CONNECTION_STRING). From the connection string saved earlier, use the Data Source and Initial Catalog values for the Server and Database settings, respectively.
- Open a new terminal window, navigate to the chainlit folder, create a virtual environment and install the requirements:
python -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
For Windows:
python -m venv .venv
.venv/Scripts/activate
pip install -r requirements.txt
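Before launching, it can help to sanity-check the ‘.env’ file with only the standard library. The sketch below parses KEY=VALUE lines and reports any missing required keys; the key names match ‘.env.example’, while the sample contents are made up.

```python
REQUIRED_KEYS = {
    "AZURE_OPENAI_ENDPOINT",
    "AZURE_OPENAI_API_KEY",
    "FABRIC_SQL_CONNECTION_STRING",
}

def load_env(text: str) -> dict:
    """Parse KEY=VALUE lines, ignoring blanks and comments."""
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, value = line.split("=", 1)
        env[key.strip()] = value.strip()
    return env

# Hypothetical .env contents for illustration only.
sample = """
# placeholder values
AZURE_OPENAI_ENDPOINT=https://example.openai.azure.com/
AZURE_OPENAI_API_KEY=placeholder
FABRIC_SQL_CONNECTION_STRING=Driver=...;Server=...;Database=ContosoDB
"""
missing = REQUIRED_KEYS - load_env(sample).keys()
print("missing keys:", sorted(missing))
```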
Launch Chatbot
Fire it up! Within your Python environment, execute the following command.
chainlit run app.py
Your chatbot will launch automatically at http://localhost:8000/. Now you can start interacting with it, for example by asking:
What are some recommendations for a wireless charger?

Chatbot In Action
LangChain calls the GetRelevantProducts function, which executes the dbo.find_relevant_products stored procedure in the database to perform a vector similarity search.
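Under the hood this boils down to a parameterized stored-procedure call. The sketch below shows the shape of that call with a duck-typed fake cursor so it runs without a live database; the `@text` parameter name is an assumption for illustration, so check utilities.py and the stored procedure definition for the real signature.

```python
# NOTE: the parameter name @text is assumed here; the repository's
# utilities.py defines the actual call against dbo.find_relevant_products.
FIND_PRODUCTS_SQL = "EXEC dbo.find_relevant_products @text = ?"

def get_relevant_products(cursor, question: str):
    """Run the similarity-search procedure and return all matching rows."""
    cursor.execute(FIND_PRODUCTS_SQL, question)
    return cursor.fetchall()

# Tiny fake cursor so the flow can be exercised without a connection;
# in the app this would be a pyodbc cursor on ContosoDB.
class FakeCursor:
    def execute(self, sql, *params):
        self.last = (sql, params)
    def fetchall(self):
        return [("Contoso Wireless Charger", 29.99)]

rows = get_relevant_products(FakeCursor(), "wireless charger")
print(rows)
```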
SQL DB Connection – User Permissions
Occasionally, you may encounter a SQL authentication “user login failure” error due to an invalid token. In this case, you will need to configure your .env file to use the CONN_TOKEN setting for authentication. To generate a token, follow these steps:
Open a terminal window and execute the following Azure CLI command to log in to your Azure account:
az login
Select the subscription, then execute the following Azure CLI command to generate a token:
az account get-access-token --resource https://database.windows.net
Note: the token expires one hour after it is generated and will need to be regenerated.
Step 3 (Optional): Automate Data Ingestion
This step is optional, for cases where you would like to ingest data in real time and automatically generate and store embeddings in the Product Embeddings table. For this we will leverage the Azure SQL trigger binding, which invokes an Azure Function whenever a row is created, updated, or deleted in the Products table.
- Open the cloned project in Visual Studio Code and navigate to the azure-function folder.
- Create a ‘local.settings.json’ file, based on ‘local.settings.json.example’ template.
- Fill in the required configuration values, including the database connection string (FABRIC_SQL_CONNECTION_STRING). From the connection string saved earlier, use the Data Source and Initial Catalog values for the Server and Database settings, respectively.
- Open a new terminal window, navigate to the azure-function folder.
- Execute the following commands to build and start the Azure Function:
dotnet build .\FunctionTrigger.csproj
func start
Now, every time you insert or update a record in the Products table, the Azure Function ‘Product Trigger’ will be invoked. This function will call the ‘update_product_embeddings’ stored procedure to generate embeddings and store them in the Product Embeddings table.
We hope this guide has provided you with a solid foundation for building your own AI-powered chatbots. We’re excited to see what you create. Feel free to share your projects and experiences in the comments below.
Happy Programming!!!
Acknowledgement
We would like to express our sincere gratitude to Muazma Zahid and Davide Mauri for their invaluable contributions to this blog post. Their insightful reviews and significant code contributions were instrumental in enhancing its quality and functionality.