Build Your Own Chat System Using HuggingChat

Hello everyone! I’m back! Today we are going to talk about a super hot topic – how to build your own chat system using HuggingChat. This tool provides us with a “building blocks” platform, allowing us to easily create chatbots similar to ChatGPT. Alright, let’s begin today’s Python journey!
What is HuggingChat?
HuggingChat is a powerful conversational system framework launched by Hugging Face, which allows us to easily build chat applications based on various language models. Simply put, it’s like a “chatbot toolbox” filled with various ready-made tools that enable us to quickly create our own AI assistants.
Tip: HuggingChat is based on the Transformers library and supports multiple open-source models, not just GPT!
Installing Necessary Packages
Before we start, we need to install some necessary packages:
pip install huggingchat
pip install gradio  # For building web interface
pip install torch transformers  # Basic dependencies
Basic Usage
Let’s first take a look at how to create the simplest chatbot:
from huggingchat import ChatBot

# Create a chatbot instance
bot = ChatBot()

# Send a message and get a response
response = bot.chat("Hello, please introduce yourself!")
print(response)
This simple example demonstrates the basic usage of HuggingChat. It’s as easy as chatting with a friend: you send a message, and it replies.
Building a Web Interface
Now, let’s create a cooler web-based chat interface:
import gradio as gr
from huggingchat import ChatBot

# Create the chatbot once, so it isn't re-initialized on every message
bot = ChatBot()

def chat_response(message, history):
    response = bot.chat(message)
    return response

# Create the web interface
demo = gr.ChatInterface(
    chat_response,
    title="My AI Assistant",
    description="Welcome to chat with me!",
    theme="soft",
)

# Launch the service
demo.launch()
This code creates a clean, good-looking web chat interface that anyone can open in a browser to chat with your AI assistant.
Customizing Models
Want to make your chatbot smarter? We can use different language models:
from huggingchat import ChatBot
from transformers import AutoTokenizer, AutoModelForCausalLM

# Load a custom model
model_name = "your-preferred-model"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Create a chatbot with the custom model
bot = ChatBot(model=model, tokenizer=tokenizer)
Note: Be mindful of your hardware configuration when selecting models, as some large models may require significant GPU memory.
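A useful rule of thumb: the memory needed just to hold a model's weights is roughly parameter count × bytes per parameter (2 bytes in float16, 4 in float32), before activations and the KV cache are added on top. The helper below (`estimate_weight_memory_gb` is a name introduced here for illustration) turns that into a quick sanity check:

```python
def estimate_weight_memory_gb(num_params, bytes_per_param=2):
    """Rough GPU memory needed just to hold the model weights.

    2 bytes/param for float16, 4 for float32. Activations and the
    KV cache need extra memory, so treat this as a lower bound.
    """
    return num_params * bytes_per_param / (1024 ** 3)

# A 7-billion-parameter model in float16 needs roughly 13 GB for weights alone
print(f"{estimate_weight_memory_gb(7e9):.1f} GB")   # → 13.0 GB

# The same model in float32 roughly doubles that
print(f"{estimate_weight_memory_gb(7e9, 4):.1f} GB")  # → 26.1 GB
```

If the estimate exceeds your GPU's memory, pick a smaller model or a lower-precision variant.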
Adding Memory Functionality
Allow the chatbot to remember conversation history for a more coherent dialogue:
class ChatBotWithMemory:
    def __init__(self):
        self.bot = ChatBot()
        self.history = []

    def chat(self, message):
        # Use recent conversation history as context
        # (each round adds two lines, so 10 lines = the last 5 rounds)
        context = "\n".join(self.history[-10:])
        full_message = f"{context}\n{message}" if context else message
        response = self.bot.chat(full_message)

        # Update history
        self.history.append(f"User: {message}")
        self.history.append(f"Bot: {response}")
        return response

# Use the chatbot with memory
memory_bot = ChatBotWithMemory()
print(memory_bot.chat("Hello"))
print(memory_bot.chat("What did we talk about earlier?"))
Advanced Feature: Sentiment Analysis
Let’s add sentiment analysis capability to the chatbot:
from transformers import pipeline
from huggingchat import ChatBot

class EmotionalChatBot:
    def __init__(self):
        self.bot = ChatBot()
        self.sentiment_analyzer = pipeline("sentiment-analysis")

    def chat(self, message):
        # Analyze the sentiment of the user's message
        sentiment = self.sentiment_analyzer(message)[0]

        # Adjust the response style based on sentiment
        if sentiment["label"] == "POSITIVE":
            prompt = f"Respond in a positive tone: {message}"
        else:
            prompt = f"Respond in a comforting tone: {message}"
        return self.bot.chat(prompt)

# Create a chatbot with sentiment analysis
emotional_bot = EmotionalChatBot()
Tip: The model’s response may take a few seconds, which is normal. If it feels too slow, consider using a smaller model.
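One way to keep this logic easy to test is to pull the tone-selection branch out into a plain function, so it can be checked without loading any model (`choose_prompt` is a name introduced here for illustration; the label strings match what the default sentiment pipeline returns):

```python
def choose_prompt(label, message):
    """Pick a response style from a sentiment label.

    POSITIVE messages get an upbeat prompt; anything else
    (NEGATIVE, or an unknown label) gets a comforting one.
    """
    if label == "POSITIVE":
        return f"Respond in a positive tone: {message}"
    return f"Respond in a comforting tone: {message}"

print(choose_prompt("POSITIVE", "I passed my exam!"))
# → Respond in a positive tone: I passed my exam!
print(choose_prompt("NEGATIVE", "I failed my exam."))
# → Respond in a comforting tone: I failed my exam.
```

The chat method then only has to glue the pipeline output to this function, which keeps the model-dependent and model-free parts separate.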
Deploying to the Cloud
Want more people to use your chatbot? We can deploy it to the cloud:
import gradio as gr
from huggingchat import ChatBot

def create_app():
    bot = ChatBot()

    def chat(message, history):
        response = bot.chat(message)
        return response

    app = gr.ChatInterface(
        chat,
        title="AI Assistant",
        description="Available 24/7 to serve you",
        theme="soft",
    )
    return app

# Create the app
app = create_app()

# Launch the service (binding to 0.0.0.0 makes it reachable from other machines)
app.launch(server_name="0.0.0.0", server_port=7860)
Notes
  1. Please ensure your Python version is ≥3.7
  2. The first run will download the model, which may take some time
  3. Using a GPU can significantly improve response speed
  4. Remember to save conversation history in a timely manner to avoid data loss
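For point 4, a minimal sketch of persisting the history list (the same list of "User: …" / "Bot: …" lines built by ChatBotWithMemory above) to a JSON file; the filename is arbitrary:

```python
import json
from pathlib import Path

def save_history(history, path="chat_history.json"):
    # Write the conversation lines to disk as JSON (UTF-8)
    text = json.dumps(history, ensure_ascii=False, indent=2)
    Path(path).write_text(text, encoding="utf-8")

def load_history(path="chat_history.json"):
    # Return the saved history, or an empty list if nothing was saved yet
    p = Path(path)
    return json.loads(p.read_text(encoding="utf-8")) if p.exists() else []

save_history(["User: Hello", "Bot: Hi there!"])
print(load_history())  # → ['User: Hello', 'Bot: Hi there!']
```

Calling save_history after each round (or on shutdown) is enough for a toy project; for anything serious, use a proper database.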
Friends, that’s the end of today’s Python learning journey! We learned how to use HuggingChat to build our own chat system, from basic usage through to advanced features. Remember to code along, and feel free to ask questions in the comments. Happy learning, and may your Python skills soar!
