AutoGen is a multi-agent LLM framework released by Microsoft, designed to facilitate the creation of LLM applications through multiple agents that converse with each other to solve tasks. AutoGen agents can be customized to specific needs, engage in conversations, and seamlessly integrate human participation. They can operate in various modes that combine LLMs, human input, and tools.
Flowise is an open-source no-code tool for building customized large language model workflows. Users can build backends for LLM applications used for question answering, summarization, and data analysis through a visual interface.
In the past two sessions, we introduced AutoGen + LangChain and AutoGen + LangChain + PlayHT:
No longer just a chatbot! AutoGen + LangChain = Super AI Assistant
AutoGen + LangChain + PlayHT = Speaking Super AI Assistant
Flowise has long been a popular no-code platform for building LLM workflows on top of LangChain, and many of you may already have built impressive features with it.
In this tutorial, I will show you how to integrate Flowise workflows into AutoGen. You will be able to empower AutoGen’s Agents with Flowise’s AI capabilities at minimal cost.
We use Flowise to simplify the AI agent built in the earlier tutorial No longer just a chatbot! AutoGen + LangChain = Super AI Assistant, which can perform tasks that require knowledge of the Uniswap protocol.
Note: A link to the complete code in a Python Notebook will be provided at the end as usual.
Creating Flowise Workflows
You can refer to the following link to learn how to use Flowise:
https://github.com/FlowiseAI/Flowise
Let’s start by creating a workflow: a PDF document chatbot named Uniswap Chatbot. The Uniswap V3 protocol white paper used in this example can be found at https://uniswap.org/whitepaper-v3.pdf.
The required components include:

- Recursive Character Text Splitter
- PDF File
- OpenAI Embedding
- In-Memory Vector Store
- ChatOpenAI
- Buffer Memory
- Conversational Retrieval QA Chain
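For readers who know LangChain, the workflow above corresponds roughly to the Python sketch below. This is purely illustrative and not part of the Flowise setup: the FAISS store stands in for Flowise's In-Memory Vector Store, the chunk sizes and the local whitepaper-v3.pdf filename are assumptions, and exact import paths depend on your LangChain version.

# Rough LangChain equivalent of the Flowise workflow above (illustrative only).
from langchain.document_loaders import PyPDFLoader
from langchain.text_splitter import RecursiveCharacterTextSplitter
from langchain.embeddings import OpenAIEmbeddings
from langchain.vectorstores import FAISS
from langchain.chat_models import ChatOpenAI
from langchain.memory import ConversationBufferMemory
from langchain.chains import ConversationalRetrievalChain

# Load and split the Uniswap V3 white paper
documents = PyPDFLoader("whitepaper-v3.pdf").load()
chunks = RecursiveCharacterTextSplitter(chunk_size=1000, chunk_overlap=100).split_documents(documents)

# Embed the chunks into an in-memory vector store (FAISS here)
vector_store = FAISS.from_documents(chunks, OpenAIEmbeddings())

# Conversational Retrieval QA Chain with buffer memory
memory = ConversationBufferMemory(memory_key="chat_history", return_messages=True)
qa_chain = ConversationalRetrievalChain.from_llm(
    llm=ChatOpenAI(temperature=0),
    retriever=vector_store.as_retriever(),
    memory=memory,
)

print(qa_chain({"question": "What are the main changes in Uniswap v3?"})["answer"])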
Click the Save button in the upper right corner. Then you can click the Chat button to start chatting.
To access the workflow programmatically, click the API Endpoint button in the upper right corner to get the integration code. You will likely see an interface similar to the one below:
Each workflow in Flowise supports API access, making it very suitable for programmatic integration. This is the Python code snippet we will use shortly.
Modifying the AutoGen Application
Now that we have implemented the QA chain as a Flowise workflow, we can complete the AutoGen application in just a few steps:
1. Get the Python code snippet for Flowise integration.
2. Define a function named answer_flowise_uniswap_question.
3. Validate the answer_flowise_uniswap_question function.
4. Set up the AutoGen UserProxyAgent and AssistantAgent, and enable function calling.
Get the Python code snippet for Flowise integration
Obtain a code snippet similar to the one below from the Flowise workflow.
import requests

API_URL = "http://localhost:4000/api/v1/prediction/433ed37e-9546-4e73-a688-7352b78bf852"

def query(payload):
    response = requests.post(API_URL, json=payload)
    return response.json()

output = query({
    "question": "Hey, how are you?",
})
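You can run this snippet as-is to confirm that the endpoint responds; it assumes Flowise is running locally on port 4000 with the chatflow ID shown above. For example:

# Quick smoke test of the Flowise endpoint.
# In recent Flowise versions the generated answer is usually in the "text" field
# of the returned JSON, but print the full payload to check for your version.
result = query({"question": "What is Uniswap?"})
print(result)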
Define the answer_flowise_uniswap_question function
import requests

API_URL = "http://localhost:4000/api/v1/prediction/433ed37e-9546-4e73-a688-7352b78bf852"

def answer_flowise_uniswap_question(question):
    response = requests.post(API_URL, json={"question": question})
    return response.json()
Validate the answer_flowise_uniswap_question function
answer_flowise_uniswap_question("What are the main changes in Uniswap v3?")
Set up the AutoGen UserProxyAgent and AssistantAgent
llm_config = {
    "config_list": config_list,
    "temperature": 0,
    "functions": [
        {
            "name": "answer_flowise_uniswap_question",
            "description": "Answer any Uniswap related questions",
            "parameters": {
                "type": "object",
                "properties": {
                    "question": {
                        "type": "string",
                        "description": "The question to ask in relation to Uniswap protocol",
                    }
                },
                "required": ["question"],
            },
        }
    ],
}

assistant = autogen.AssistantAgent(
    name="assistant",
    llm_config=llm_config,
)

user_proxy = autogen.UserProxyAgent(
    name="user_proxy",
    human_input_mode="NEVER",
    max_consecutive_auto_reply=10,
    code_execution_config={"work_dir": "."},
    llm_config=llm_config,
    system_message="""Reply TERMINATE if the task has been solved at full satisfaction.
Otherwise, reply CONTINUE, or the reason why the task is not solved yet.""",
    function_map={"answer_flowise_uniswap_question": answer_flowise_uniswap_question}
)
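With this configuration, the assistant never calls Flowise directly: the functions entry in llm_config lets the LLM propose a call to answer_flowise_uniswap_question, while the user_proxy, whose function_map points to our Python function, executes the call and feeds the Flowise answer back into the conversation.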
Now you can submit tasks using user_proxy. The code is as follows:
user_proxy.initiate_chat(
    assistant,
    message="""I'm writing a blog to introduce the version 3 of Uniswap protocol. Find the answers to the 3 questions below and write an introduction based on them.

1. What is Uniswap?
2. What are the main changes in Uniswap version 3?
3. How to use Uniswap?

Start the work now."""
)
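When this runs, you should see the assistant request one answer_flowise_uniswap_question call per question, the user_proxy execute each call against the Flowise workflow, and the assistant then draft the introduction from the returned answers before the conversation terminates.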
The complete example code will be published as a Python Notebook on GitHub. If you are interested, you can access it via the link below:
https://github.com/sugarforever/LangChain-Advanced/blob/main/Integrations/AutoGen/autogen_flowise_ai_agent.ipynb
Wishing everyone a happy life!