Flashcards have long been used as an effective tool for learning by providing quick, repeatable questions that help users memorize facts or concepts. Traditionally, flashcards contain a question on one side and the answer on the other. The concept is simple, yet powerful for retention, whether you're learning languages, mathematics, or any subject.
An AI-powered flashcard game takes this learning method to the next level. Rather than relying on static content, AI dynamically generates new questions and answers based on user input, learning patterns, and performance over time. This personalization makes the learning process more interactive and adaptive, providing questions that target specific areas where the user needs improvement.
In this tutorial, we'll use LLaMA 3.1, a powerful open-source large language model, to create dynamic flashcards. The AI engine will generate new questions and answers in real time based on the subject matter or keywords the user provides. This enhances the learning experience by making the flashcards more versatile, personalized, and efficient.
We need to set up our working environment before we start writing code for our flashcard app.
The first step is to install Node.js and npm. Go to the Node.js website and download the Long-Term Support (LTS) version for your computer's operating system, then follow the installation steps provided.
Open your terminal and navigate to the location where you want to create your project. Then run the following commands:
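The exact commands may differ slightly, but a typical setup looks like this (the project name flashcard-app is just a placeholder; use whatever name you prefer):

```bash
npx create-next-app@latest flashcard-app
cd flashcard-app
```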
This creates a new Next.js project and moves you into its directory. During setup you'll be prompted with a number of configuration choices; set them as shown below:
In the directory of your project, execute the following command:
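Assuming the standard Firebase client SDK from npm is what's being installed here, the command would be:

```bash
npm install firebase
```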
Make a new file called firebase.js in the root directory of your project and add the following code, replacing the placeholders with the real Firebase settings for your project:
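A minimal sketch of what firebase.js typically looks like with the modular Firebase SDK; the placeholder values must be replaced with the configuration from your Firebase console:

```js
import { initializeApp } from "firebase/app";
import { getFirestore } from "firebase/firestore";

// Replace these placeholders with your project's Firebase configuration.
const firebaseConfig = {
  apiKey: "YOUR_API_KEY",
  authDomain: "YOUR_PROJECT_ID.firebaseapp.com",
  projectId: "YOUR_PROJECT_ID",
  storageBucket: "YOUR_PROJECT_ID.appspot.com",
  messagingSenderId: "YOUR_MESSAGING_SENDER_ID",
  appId: "YOUR_APP_ID",
};

// Initialize Firebase and export the Firestore instance used elsewhere in the app.
const app = initializeApp(firebaseConfig);
export const db = getFirestore(app);
```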
We will use the free version of LLaMA 3.1 through OpenRouter, and for that we need an API key. Below are the steps to get one:
The code works by receiving a request from the client and extracting the raw text input from the request body. It then sends a request to the OpenRouter API with a system prompt that outlines how LLaMA 3.1 should generate the flashcards. The response, containing the flashcards in JSON format, is parsed and returned to the client. If an error occurs during the API call or processing, it is logged and a 500 response is returned to the client.
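As a rough sketch of that route handler: the app/api/generate/route.js path, the OPENROUTER_API_KEY variable name, the exact model ID, and the prompt wording are all assumptions for illustration, not necessarily what the original project uses.

```js
// app/api/generate/route.js (hypothetical path; adjust to match your project)
import { NextResponse } from "next/server";

// The exact system prompt is up to you; this is only an illustration.
const systemPrompt = `You are a flashcard creator. Given some text, generate concise flashcards.
Return only valid JSON in the form: { "flashcards": [{ "front": "...", "back": "..." }] }`;

export async function POST(req) {
  try {
    // Read the raw text the user typed on the client.
    const text = await req.text();

    // Call OpenRouter's OpenAI-compatible chat completions endpoint.
    const response = await fetch("https://openrouter.ai/api/v1/chat/completions", {
      method: "POST",
      headers: {
        Authorization: `Bearer ${process.env.OPENROUTER_API_KEY}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({
        model: "meta-llama/llama-3.1-8b-instruct:free", // assumed free LLaMA 3.1 model ID
        messages: [
          { role: "system", content: systemPrompt },
          { role: "user", content: text },
        ],
      }),
    });

    const data = await response.json();
    // The model is instructed to reply with JSON, so parse the message content.
    const flashcards = JSON.parse(data.choices[0].message.content);

    return NextResponse.json(flashcards.flashcards);
  } catch (error) {
    // Log the failure and signal a server-side error to the client.
    console.error("Error generating flashcards:", error);
    return NextResponse.json({ error: "Failed to generate flashcards" }, { status: 500 });
  }
}
```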
Run the following command to install Clerk's Next.js SDK:
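Clerk's SDK for Next.js is published on npm as @clerk/nextjs, so the command should be:

```bash
npm install @clerk/nextjs
```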
To securely store your Clerk credentials, add them to your .env.local file. Create this file if it doesn't exist:
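A sketch of the expected contents, using Clerk's standard environment variable names; the values shown are placeholders for the keys from your Clerk dashboard:

```
NEXT_PUBLIC_CLERK_PUBLISHABLE_KEY=pk_test_your_publishable_key
CLERK_SECRET_KEY=sk_test_your_secret_key
```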
In this part, we use Clerk's useUser hook to manage user authentication. It tells us whether the user is logged in and provides access to the user's data, which is crucial for associating flashcards with the correct user.
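Inside the page component, that typically looks like the snippet below; the destructured field names follow Clerk's documented useUser return value.

```js
import { useUser } from "@clerk/nextjs";

// Inside the component: isSignedIn tells us whether someone is logged in,
// and user carries the signed-in user's data (e.g. user.id).
const { isLoaded, isSignedIn, user } = useUser();
```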
Here, we define the state variables using React's useState hook to handle the flashcards, their flipped state, the user's input, and the dialog for saving the flashcards.
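A sketch of those state declarations; the variable names are illustrative rather than taken verbatim from the original code.

```js
import { useState } from "react";

// Component state for the flashcard workflow.
const [flashcards, setFlashcards] = useState([]);   // generated cards
const [flipped, setFlipped] = useState({});          // which cards are flipped, keyed by index
const [text, setText] = useState("");                // the text the user wants flashcards for
const [setName, setSetName] = useState("");          // name under which a set is saved
const [dialogOpen, setDialogOpen] = useState(false); // whether the save dialog is visible
```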
This function handles sending the input text to an API to generate flashcards and updates the state based on the API response.
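A sketch of such a handler, assuming the /api/generate route from earlier and the state variables defined above:

```js
// Sends the raw input text to the generation endpoint and stores the returned cards in state.
const handleSubmit = async () => {
  try {
    const response = await fetch("/api/generate", {
      method: "POST",
      body: text, // sent as raw text, matching what the route handler reads
    });
    const data = await response.json();
    setFlashcards(data);
  } catch (error) {
    console.error("Failed to generate flashcards:", error);
  }
};
```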
This function allows users to click on a flashcard to "flip" it, revealing either the front or back of the card.
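One simple way to implement that flip behavior:

```js
// Toggles the flipped state of the clicked card, keyed by its index.
const handleCardClick = (index) => {
  setFlipped((prev) => ({ ...prev, [index]: !prev[index] }));
};
```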
These handler functions manage the dialog's visibility: the user can open the dialog to save flashcards and close it when finished.
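Assuming the dialogOpen state from above, the handlers can be as small as:

```js
// Show or hide the save dialog.
const handleOpenDialog = () => setDialogOpen(true);
const handleCloseDialog = () => setDialogOpen(false);
```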
This function saves the generated flashcards into Firebase Firestore under the current user's collection, ensuring that each flashcard set is uniquely associated with the user.
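A sketch of how that save might look with the Firestore batch API; the users/{userId}/{setName} layout and the import path for db are assumptions, not necessarily the structure used in the original code.

```js
import { collection, doc, writeBatch } from "firebase/firestore";
import { db } from "../firebase"; // the Firestore instance exported from firebase.js

// Inside the component: writes every generated card to the signed-in user's collection.
const saveFlashcards = async () => {
  if (!user || !setName) return;

  const batch = writeBatch(db);
  const userDocRef = doc(collection(db, "users"), user.id);

  // Each card becomes a document in a subcollection named after the set.
  flashcards.forEach((card, index) => {
    const cardRef = doc(collection(userDocRef, setName), String(index));
    batch.set(cardRef, card);
  });

  await batch.commit();
  handleCloseDialog();
};
```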
This is the main part of the JSX, which handles the form for entering text, displays the flashcards, and renders the save dialog.
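Stripped of styling, the structure roughly follows the shape below; plain HTML elements are used here to keep the sketch self-contained, whereas the original likely uses a UI component library.

```jsx
return (
  <div>
    {/* Input form: the user pastes or types the material to study */}
    <textarea
      value={text}
      onChange={(e) => setText(e.target.value)}
      placeholder="Enter text to generate flashcards from"
    />
    <button onClick={handleSubmit}>Generate Flashcards</button>

    {/* Generated cards: clicking a card flips it between front and back */}
    {flashcards.map((card, index) => (
      <div key={index} onClick={() => handleCardClick(index)}>
        {flipped[index] ? card.back : card.front}
      </div>
    ))}

    {/* Save dialog: only offered once there are cards to save */}
    {flashcards.length > 0 && (
      <button onClick={handleOpenDialog}>Save Flashcards</button>
    )}
    {dialogOpen && (
      <div>
        <input
          value={setName}
          onChange={(e) => setSetName(e.target.value)}
          placeholder="Flashcard set name"
        />
        <button onClick={saveFlashcards}>Save</button>
        <button onClick={handleCloseDialog}>Cancel</button>
      </div>
    )}
  </div>
);
```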
This wraps up the creation of our flashcard application. In this example, I have utilized the LLaMA 3.1 language model, but feel free to experiment with any other model of your choice.