Recently, I took a break from writing to focus on my exams. During this time, I had an interesting experience: I had the chance to explain SQL (Structured Query Language) to my peers. While exploring SQL in-depth, I encountered a common frustration: writing SQL queries to fetch specific data from a database.
This sparked an idea. What if I could build a tool where I didn't have to write SQL queries manually? Instead, I could type in plain, natural English and let the database do the heavy lifting for me.
Given that we live in the era of AI, leveraging artificial intelligence was the only way to turn this vision into reality.
In this tutorial, I'll walk you through creating an AI-powered SQL query data extractor. This tool will enable you to fetch data from a database effortlessly, without writing a single line of SQL code.
This tool will let us interact with a database using natural language, plain English, and get the same results as if we had written the SQL queries ourselves.
Here's an overview of the tools we'll use to create this cool app:
The database is a critical component where we'll store data and later extract it for our AI model to use when performing NLP operations. Instead of hosting a database locally, I chose a cloud-based free database that allows data extraction via REST APIs. For this project, I opted for restdb.io because it offers seamless SQL database provisioning and supports REST APIs.
An AI Agent will act as the intermediary between the database and the AI model. This agent will manage the AI model's operations and facilitate seamless communication. For this, I am using CopilotKit, which simplifies the integration process.
The AI model translates plain English queries into SQL queries. For this, I am using GroqAI, which supports various popular AI models and provides the flexibility needed for this project.
To develop a web application that supports both frontend and backend functionalities, I chose Next.js. It's an ideal framework for building robust, scalable web apps with server-side rendering capabilities.
For deployment, you can choose any service. I prefer Vercel, as it integrates seamlessly with Next.js and is free for hobby projects.
By combining these tools, we'll build a powerful, user-friendly application that effortlessly bridges natural language and SQL databases.
These are the steps we'll follow in this tutorial to build our app:
Step 1 - Set Up the Database: Either set up the database locally, deploy it, and access it, or use an online database tool that allows data access and extraction via REST APIs.
Step 2 - Obtain Cloud API Keys: Get the necessary API keys for your AI model to enable seamless integration.
Step 3 - Build a Web App: Create a web application and set up the backend to integrate CopilotKit. Configure it within the app for optimal functionality.
Step 4 - Train CopilotKit on Your Database: Provide your database's data to CopilotKit. It will read and understand the data to facilitate natural language processing.
Step 5 - Integrate CopilotKit Chat: Add the CopilotKit chat interface into your application and configure it to ensure smooth operation.
Step 6 - Test Locally: Test the app on your local machine to identify and fix any issues.
Step 7 - Deploy the App: Once everything is working as expected, deploy the application to a hosting platform.
Have you ever wondered how writing plain English could allow you to fetch data from a SQL database?
The magic lies in CopilotKit. It lets you create AI-powered copilots that can perform operations on your applications. Think of CopilotKit as your personal AI assistant or chatbot. So how does it work?
Well, first we have CopilotKit, which serves as our chatbot powered by advanced AI models.
Then when you provide data to the chatbot, it uses that data to train itself, building an understanding of your database structure and content.
Finally, when a natural language query (like "Who is using this email address?") is entered, the AI model processes it, translates it into a corresponding SQL query, and retrieves the desired data from the database.
With CopilotKit's powerful AI capabilities, your application can seamlessly bridge natural language and SQL, making database interactions more intuitive.
Now we'll go through everything you need to set up the project.
First, you'll need to create a Next.js app. Open your terminal and run the following command:
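The command itself isn't shown above; with current Next.js tooling it would look something like this (the app name is just a placeholder):

```shell
# Scaffold a new Next.js app; "ai-sql-extractor" is a placeholder name.
# The interactive prompts let you pick TypeScript, the App Router, etc.
npx create-next-app@latest ai-sql-extractor
```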
Navigate to the project's root folder in the terminal and run the command below. It installs the core CopilotKit dependencies along with other important packages like dotenv and Axios.
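A plausible install command, assuming CopilotKit's standard package names plus the Groq SDK used later in the backend:

```shell
# CopilotKit React SDK + runtime, the Groq SDK, and utility packages
npm install @copilotkit/react-core @copilotkit/react-ui @copilotkit/runtime groq-sdk dotenv axios
```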
Above, you can see the login page for RestDB.io. You can either log in if you already have an account or create a new one.
Once logged in, you'll be redirected to this page. There, you'll see the button to create a new database.
When you click the Create New button, a pop-up will appear. There, you'll have to enter the database name, as shown in the image below:
Enter the database name and click "Go". I used demosql as the database name. At this point, you'll get your newly created database link, as shown in the image below:
Now click on the database URL. It will take you to the page shown in the image below:
Now it's time to create an API key for accessing the database. To do this, click on Settings, which will take you to a new page, shown below:
Here you can configure your API actions, like GET, POST, PUT, and DELETE. Name the key whatever you want and save it. Your database is now ready to interact with via the REST API.
Copy the database URL and API key and put them into your .env file.
You can add tables, define the schema with columns and data types (for example, VARCHAR, INTEGER), and populate data manually or via uploads (Excel, CSV, or JSON). For this project, we've added 21 records.
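To sanity-check the setup, you can fetch those records over REST. Here's a minimal sketch: the collection name (`people`), the URL, and the helper function are all illustrative placeholders, but the `x-apikey` header is how restdb.io authenticates REST calls.

```typescript
// Hypothetical helper: builds the fetch options restdb.io expects.
// restdb.io authenticates requests via the `x-apikey` header.
export function restdbOptions(apiKey: string) {
  return {
    method: "GET" as const,
    headers: {
      "Content-Type": "application/json",
      "x-apikey": apiKey, // restdb.io's auth header
    },
  };
}

// Usage (URL and collection are placeholders -- use your .env values):
// const res = await fetch(
//   "https://demosql-xxxx.restdb.io/rest/people",
//   restdbOptions(process.env.RESTDB_API_KEY!)
// );
// const rows = await res.json();
```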
This part is pivotal for the project, as we're setting up the LLM (Large Language Model) to handle the conversion of NLP (plain English) queries into SQL queries.
Numerous LLMs are available in the market, each with its strengths. While some are free, others are paid, which made selecting the right one for this project a challenge.
After extensive experimentation, I chose the Groq Adapter because:
To get started with Groq Cloud, visit its website and either log in if you already have an account or create a new one if you're new. Once logged in, navigate to the Groq Dashboard.
Once logged in, a new page will open that'll look like this:
As you can see, the sidebar has an API Keys link. Click it, and a new page will open, as shown in the image below. You can also select any LLM of your choice from the selector at the top right, just before the View code option.
Here, click the Create API Key button. It will open a pop-up like the one you see below. Just enter a name for your API key and click Submit, and a new API key will be created for you. Copy this key and paste it into your .env file. This single key gives your app seamless access to the various LLMs on Groq Cloud, so make sure it's properly configured.
With the LLM set up and all components ready, you're now prepared to build the project.
We will approach this project in a straightforward way, focusing on simplicity and functionality. The primary goal is to create a basic webpage that allows us to:
Before we start building the backend, you'll need to put all the important credentials into your .env file, which will look something like this:
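Here's a sketch of what the file might contain. The variable names are assumptions; use whatever names your code actually reads, and replace the values with your own credentials:

```
# restdb.io credentials (placeholder names and values)
RESTDB_API_URL=https://demosql-xxxx.restdb.io/rest/your-collection
RESTDB_API_KEY=your-restdb-api-key

# Groq Cloud API key for the LLM
GROQ_API_KEY=your-groq-api-key
```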
So what are all these? Let's go through them one by one:
We have successfully added the environment variables to our project. Now it's time to configure the CopilotKit API backend.
Open your Next.js app in any code editor - I prefer VSCode - and go to the root folder, which looks like this:
This code defines a server-side handler for a Next.js API route using CopilotKit and Groq SDKs. It sets up a runtime environment to process requests to a specified endpoint.
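Based on CopilotKit's Next.js App Router quickstart, the handler would look roughly like the sketch below. The model name, route path, and file location are assumptions, so check the current CopilotKit and Groq documentation before relying on them.

```typescript
// app/api/copilotkit/route.ts -- a sketch, not the article's exact code.
// Assumes @copilotkit/runtime and groq-sdk are installed; the model
// name and endpoint path are placeholders.
import {
  CopilotRuntime,
  GroqAdapter,
  copilotRuntimeNextJSAppRouterEndpoint,
} from "@copilotkit/runtime";
import Groq from "groq-sdk";
import { NextRequest } from "next/server";

// The Groq client reads the API key we stored in .env.
const groq = new Groq({ apiKey: process.env.GROQ_API_KEY });

// The adapter routes CopilotKit's chat completions through Groq Cloud.
const serviceAdapter = new GroqAdapter({ groq, model: "llama-3.3-70b-versatile" });

const runtime = new CopilotRuntime();

export const POST = async (req: NextRequest) => {
  const { handleRequest } = copilotRuntimeNextJSAppRouterEndpoint({
    runtime,
    serviceAdapter,
    endpoint: "/api/copilotkit",
  });
  return handleRequest(req);
};
```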
With this setup, you have successfully integrated CopilotKit into your Next.js application. The backend is now fully functional, enabling seamless communication with the database via REST APIs and the CopilotKit interface.
For the front end, we'll keep it as simple as we can. We just need a few things to get this project done: we need a Header component and a Table component.
To achieve this, we'll use ShadCN, a popular frontend component library known for its clean design and ease of use.
ShadCN provides pre-built components that help speed up development without compromising on quality. By leveraging this library, we can focus on functionality while ensuring the UI looks polished and professional.
Run the following command to install ShadCN components:
To add specific components, use the following command:
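Assuming the current shadcn CLI (older versions published it as `shadcn-ui`), the two commands might look like this:

```shell
# Initialize ShadCN in the Next.js project (creates config and base styles)
npx shadcn@latest init

# Add the specific component we need -- the Table component
npx shadcn@latest add table
```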
In the frontend, we have a folder that contains the Table component. This component is responsible for displaying the database data in a structured tabular format.
Apart from the component, there are two additional files in the front end. These files serve different purposes and will be integrated later in the project for specific functionalities.
This modular structure ensures the front end remains clean and organized, making it easier to manage and expand as needed.
This code renders a styled, dynamic table with data passed from a database or API.
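A minimal sketch of such a component, assuming ShadCN's Table primitives and a simple array-of-objects shape for the rows (the component and prop names are illustrative, not the article's exact code):

```typescript
// components/DataTable.tsx -- sketch; assumes `npx shadcn add table`
// has generated the primitives under @/components/ui/table.
"use client";
import {
  Table, TableBody, TableCell, TableHead, TableHeader, TableRow,
} from "@/components/ui/table";

type Row = Record<string, string | number>;

export function DataTable({ rows }: { rows: Row[] }) {
  if (rows.length === 0) return <p>No data yet.</p>;

  // Derive the column headers from the keys of the first row.
  const columns = Object.keys(rows[0]);

  return (
    <Table>
      <TableHeader>
        <TableRow>
          {columns.map((col) => (
            <TableHead key={col}>{col}</TableHead>
          ))}
        </TableRow>
      </TableHeader>
      <TableBody>
        {rows.map((row, i) => (
          <TableRow key={i}>
            {columns.map((col) => (
              <TableCell key={col}>{row[col]}</TableCell>
            ))}
          </TableRow>
        ))}
      </TableBody>
    </Table>
  );
}
```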
In this file, we handle the API calls, define CopilotKit actions, and pass the fetched data to the Table component. This file acts as the central logic hub for connecting the backend API, AI actions, and the frontend display.
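A rough sketch of that central file, using CopilotKit's React hooks. The hook names (`useCopilotReadable`, `CopilotPopup`) come from CopilotKit's React SDK, but the env variable names, the readable description, and the rendering are assumptions:

```typescript
// app/page.tsx -- sketch of the central logic: fetch data from the
// database, expose it to CopilotKit, and render it.
"use client";
import { useEffect, useState } from "react";
import { useCopilotReadable } from "@copilotkit/react-core";
import { CopilotPopup } from "@copilotkit/react-ui";
import "@copilotkit/react-ui/styles.css";

export default function Home() {
  const [rows, setRows] = useState<Record<string, string>[]>([]);

  // Fetch the records from restdb.io once on mount.
  // NEXT_PUBLIC_* names are placeholders for client-readable env vars.
  useEffect(() => {
    fetch(process.env.NEXT_PUBLIC_RESTDB_API_URL!, {
      headers: { "x-apikey": process.env.NEXT_PUBLIC_RESTDB_API_KEY! },
    })
      .then((res) => res.json())
      .then(setRows);
  }, []);

  // Make the table data readable by the copilot so it can answer
  // questions like "Who is using this email address?"
  useCopilotReadable({
    description: "All records from the SQL database",
    value: rows,
  });

  return (
    <main>
      {/* The Table component from the frontend folder renders here. */}
      <pre>{JSON.stringify(rows, null, 2)}</pre>
      <CopilotPopup labels={{ title: "SQL Data Extractor" }} />
    </main>
  );
}
```

Note that the app also needs to be wrapped in CopilotKit's provider, e.g. `<CopilotKit runtimeUrl="/api/copilotkit">` in the root layout, so the popup can reach the backend route we configured earlier.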
Designing and deploying a database can take various forms, depending on the tools and requirements. For this project, I have chosen the simplest and most accessible approach.
CopilotKit is a powerful tool that converts NLP queries into actionable backend code. If you have an alternative that works similarly, feel free to use it. It bridges the gap between natural language input and technical execution, making it ideal for projects like this.
I selected GroqCloud because it's free and provides access to multiple LLMs with a single API key. While you can opt for alternatives like ChatGPT, note that they may require paid plans. GroqCloud's versatility and affordability make it perfect for this tutorial.
The size of your database can vary from very small to enormous. However, interacting with the database depends on the token limits of the LLM you're using.
Since I'm working with free-tier tools, my focus is on a small database to ensure seamless interactions.
Never expose your credentials publicly. Always store sensitive information like API keys in a .env file to keep your project secure.
While this tutorial focuses on setting up and querying a database, the potential of CopilotKit extends to CRUD operations (Create, Read, Update, Delete). In my next tutorial, I will demonstrate how to implement full CRUD operations using CopilotKit for a more dynamic and functional application.
You can explore the live project via the following link and ask any questions related to the database data: live link.
For a deeper understanding of the code, here's the GitHub repository link: github.
Also, here's a screenshot demonstrating its practical use. In this example, instead of writing a plain SQL query to extract the person's name, we used an NLP query to achieve the exact same result.
I hope you've enjoyed building this simple AI chatbot to interact with the database. In this project, we've used a simple SQL database, but you can apply this approach to any database as long as you can retrieve the data.
In the future, I plan to implement many new projects involving AI and other tools. AI tools are truly game-changing in the IT field, and I look forward to providing you with more detailed insights and practical implementations of the latest tools emerging in the space.
So this is the end from my side. If you found this article useful, then do share it and connect with me. I'm open to opportunities: