Building a Simple FAQ Bot Using AISuite
A Step-by-Step Guide to Leveraging models like GPT and Claude for AI Integration

Hello everyone! Welcome to another article from Victoria. I have recently been dabbling a lot in different AI integration tools. After all, in the world of AI, the ability to integrate advanced language models quickly and easily into your applications is crucial.
Recently, I came across AISuite, an open-source library designed to make it easy to build AI systems and integrate advanced language models, like OpenAI’s GPT and Anthropic’s Claude, into your applications.
In this article, I will show you how to use this library to create a FAQ bot that can provide responses to predefined questions.
We will cover the following steps:
Setting up the AISuite library
Adding FAQ data
Implementing a basic FAQ bot
Running the bot
What is AISuite?
AISuite is a Python library that provides a unified interface to various Generative AI providers. It abstracts away the complexity of interacting with multiple LLMs, enabling you to seamlessly switch between different models without modifying your code.
Currently, AISuite supports major providers like OpenAI, Anthropic, Google, Azure, AWS, Mistral, and HuggingFace.
It supports chat completions through a single interface, making it easy to test and compare different AI models with minimal effort.
Prerequisites
Before we start coding, here are the things you need:
Python: A recent Python installation (the AISuite project targets Python 3.10 and higher).
API Keys: You need API keys from the providers you plan to use (e.g., OpenAI, Anthropic).
AISuite Installation: We'll install the base AISuite package and the provider-specific libraries.
Step 1a: Install AISuite
To install AISuite, you can use the following commands depending on your needs:
Install just the base AISuite package:
pip install aisuite
Install AISuite along with the Anthropic provider:
pip install 'aisuite[anthropic]'
Install all the available providers:
pip install 'aisuite[all]'
Step 1b: Set Up API Keys
For AISuite to work with providers like OpenAI and Anthropic, you need to set up API keys for those services.
- OpenAI: You'll need to create an account at OpenAI and generate an API key.

- Anthropic: Similarly, create an account at Anthropic and generate an API key from its console.

Once AISuite is installed, set up your API keys. You can use environment variables to store your keys:
export OPENAI_API_KEY="your-openai-api-key"
export ANTHROPIC_API_KEY="your-anthropic-api-key"
Alternatively, you can pass the keys directly when creating the AISuite client.
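If you would rather manage the keys from Python than from your shell, one simple option (a sketch, not the only way) is to populate the environment before creating the client, since the underlying provider SDKs read these variables at request time:

```python
import os

# Placeholder values — substitute your real keys before running.
# setdefault keeps any key already exported in your shell.
os.environ.setdefault("OPENAI_API_KEY", "your-openai-api-key")
os.environ.setdefault("ANTHROPIC_API_KEY", "your-anthropic-api-key")
```

Avoid committing real keys to source control; loading them from a local, untracked file is a safer habit.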
Step 2: Set Up the FAQ Knowledge Base
In a typical FAQ bot, we need a set of predefined questions and answers that the bot will reference. For simplicity, let’s store the FAQ data in a dictionary.
faq_data = {
    "What is AI?": "AI (Artificial Intelligence) refers to the simulation of human intelligence in machines.",
    "What is Machine Learning?": "Machine Learning is a subset of AI that involves training models to learn from data and make predictions.",
    "What is AISuite?": "AISuite is an open-source Python library that simplifies the process of building AI systems.",
}
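The dictionary on its own isn't connected to the model calls yet. One simple way to wire it in (a minimal sketch; build_system_prompt is a helper name I'm introducing for illustration) is to fold the question/answer pairs into the system message so the models answer from your FAQ rather than from general knowledge:

```python
def build_system_prompt(faq):
    """Fold FAQ question/answer pairs into a single system message."""
    lines = [
        "Answer using only the FAQ below.",
        "If a question is not covered, say you don't know.",
        "",
    ]
    for question, answer in faq.items():
        lines.append(f"Q: {question}")
        lines.append(f"A: {answer}")
    return "\n".join(lines)
```

You could then use {"role": "system", "content": build_system_prompt(faq_data)} as the first entry in the messages list shown later.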
Step 3: Initialize the AISuite Client
We’ll begin by creating a client for AISuite. This client allows us to interact with multiple LLMs such as GPT-4 and Claude.
import aisuite as ai
# Initialize the AISuite client
client = ai.Client()
Step 4: Build the FAQ Bot Logic
Now that we have our knowledge base and client set up, let's implement the core FAQ bot logic. The bot will use chat completions to generate responses based on the user's input and compare the results from different language models.
This code sample from the documentation shows how to use it.
# Define the models to use
models = ["openai:gpt-4", "anthropic:claude-3-5-sonnet-20240620"]

# Set the system message that instructs the bot to respond in a specific manner
messages = [
    {"role": "system", "content": "Respond to only AI-related questions."},
    {"role": "user", "content": "What is AI?"},  # This is where the FAQ query will be passed
]

# Iterate over the models and query each one
for model in models:
    response = client.chat.completions.create(
        model=model,
        messages=messages,
        temperature=0.75,
    )
    print(f"Response from {model}: {response.choices[0].message.content}")
Explanation of the Code:
Messages: The messages array contains two entries: the first is a system message that instructs the bot to respond within a specific scope, and the second is the user message, which contains the FAQ query.
Models: The list of models includes openai:gpt-4 for OpenAI’s GPT-4 and anthropic:claude-3-5-sonnet-20240620 for Anthropic’s Claude.
Temperature: The temperature parameter controls the randomness of the response. A higher value makes the response more creative, while a lower value keeps it more focused and deterministic.
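One practical wrinkle: if a single provider's key is missing or a request fails, the loop stops at the first exception. A small sketch of per-model error handling (query_models is a helper name of my own, assuming the same chat.completions.create interface) that keeps going and collects whatever succeeds:

```python
def query_models(client, models, messages, temperature=0.75):
    """Query each model in turn; one provider failing doesn't stop the rest."""
    results = {}
    for model in models:
        try:
            response = client.chat.completions.create(
                model=model,
                messages=messages,
                temperature=temperature,
            )
            results[model] = response.choices[0].message.content
        except Exception as exc:  # e.g. a bad API key or a network error
            results[model] = f"[error: {exc}]"
    return results
```

This is especially handy when comparing several providers and you only have keys for some of them.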
Step 5: Add Interactivity for Real-Time Queries
To make the bot interactive, let’s create a simple loop where the user can ask questions in real-time. We’ll modify the messages array dynamically based on the user's input and get answers from the LLMs.
def start_faq_bot():
    print("Welcome to the FAQ bot! Ask your questions or type 'exit' to quit.")
    while True:
        user_query = input("You: ")
        if user_query.lower() == 'exit':
            print("Goodbye!")
            break
        # Set the user query dynamically
        messages[1] = {"role": "user", "content": user_query}
        # Query the models and print each response
        for model in models:
            response = client.chat.completions.create(
                model=model,
                messages=messages,
                temperature=0.75,
            )
            print(f"Response from {model}: {response.choices[0].message.content}")

# Start the FAQ bot
start_faq_bot()
How It Works:
The start_faq_bot() function creates a simple interactive loop.
Users can type their queries, and the bot will use the models to respond.
You can exit the loop by typing 'exit'.
Sample Interaction:
Welcome to the FAQ bot! Ask your questions or type 'exit' to quit.
You: What is AI?
Response from openai:gpt-4: AI is the simulation of human intelligence by machines...
Response from anthropic:claude-3-5-sonnet-20240620: Artificial Intelligence (AI) refers to the ability of a machine to perform tasks that would normally require human intelligence...
You: What is AISuite?
Response from openai:gpt-4: AISuite is a Python library designed to simplify the process of building AI systems...
Response from anthropic:claude-3-5-sonnet-20240620: AISuite is an open-source Python library that helps developers build AI applications...
Step 6: Enhancements and Extensions
Once your FAQ bot is working, there are many ways to improve and extend it:
Expand the Knowledge Base: Instead of hardcoding FAQ data, you could connect the bot to a database or upload a PDF as a knowledge bank.
Improve Retrieval: Integrate more advanced document retrieval systems (e.g., using FAISS or BM25) to retrieve answers from a large corpus.
Use Memory: Implement a memory system to make the bot more conversational, remembering past interactions for context.
Add More Providers: AISuite supports multiple LLM providers, so you can easily add more models, such as Google’s Gemini or models hosted on Hugging Face, to your bot.
The environment variables for the supported providers are formatted as follows:
# OpenAI API Key
OPENAI_API_KEY=
# Anthropic API Key
ANTHROPIC_API_KEY=
# AWS SDK credentials
AWS_ACCESS_KEY_ID=
AWS_SECRET_ACCESS_KEY=
AWS_REGION=
# Azure
AZURE_API_KEY=
# Google Cloud
GOOGLE_APPLICATION_CREDENTIALS=./google-adc
GOOGLE_REGION=
GOOGLE_PROJECT_ID=
# Hugging Face token
HUGGINGFACE_TOKEN=
# Fireworks
FIREWORKS_API_KEY=
# Together AI
TOGETHER_API_KEY=
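As a first step toward the "Improve Retrieval" idea above, you don't need FAISS or BM25 for a handful of FAQ entries: Python's standard difflib can match a user query against the FAQ keys before you fall back to the LLMs. This is only a sketch; retrieve_answer is a helper name I'm introducing, and the 0.6 cutoff is an arbitrary starting point you would tune:

```python
import difflib

def retrieve_answer(query, faq, cutoff=0.6):
    """Return the canned answer for the closest FAQ question, or None."""
    lowered = {question.lower(): question for question in faq}
    matches = difflib.get_close_matches(
        query.lower(), list(lowered), n=1, cutoff=cutoff
    )
    return faq[lowered[matches[0]]] if matches else None
```

In the interactive loop, you could try retrieve_answer(user_query, faq_data) first and only call the models when it returns None, saving API calls for questions you have already answered.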
Conclusion
AISuite provides a simple and unified interface for interacting with various language models, making it an excellent choice for developers looking to build a multi-LLM chatbot or FAQ bot.
By leveraging existing models, you can create a bot that responds to user queries with natural language generation, while comparing how each model answers. The setup is straightforward, and the code is minimal—allowing you to focus on your application's logic rather than managing multiple API integrations.
I hope this guide helped you create your first FAQ bot using AISuite. If you have any questions, feel free to ask in the comments and share your thoughts! Also, do check out the GitHub repo for more demos that you can follow! Cheers!