Flowise – Visual Building of LLM Applications

Flowise is a low-code/no-code drag-and-drop tool that helps people easily visualize and build LLM applications.

LLM applications are used across many industries, from finance and healthcare to retail and logistics. With Flowise, even people without programming experience can build such applications without writing any code. This also benefits organizations that want to prototype and develop LLM applications quickly and in an agile way.

Let’s take a look at some of Flowise’s standout features:

  • Drag-and-drop UI: Flowise makes designing your own custom LLM workflows simple.
  • Open-source: As an open-source project, Flowise can be freely used and modified.
  • User-friendly: Flowise is easy to get started with, even for those without coding experience.
  • Versatile: Flowise can be used to build a wide range of LLM applications.

Installation and Setup

To install and start using Flowise, follow these steps:

  1. Download and install NodeJS >= 18.15.0 (check your installed version with node -v; if it is too old, download a newer release from https://nodejs.org/en/download)
  2. Install Flowise with the following command: npm install -g flowise
  3. Start Flowise with npx flowise start; to require a username and password, run npx flowise start --FLOWISE_USERNAME=user --FLOWISE_PASSWORD=1234
  4. Open http://localhost:3000 in your browser

If you have a Docker environment, it’s more convenient to start with Docker.

Docker Compose

  1. Navigate to the docker folder in the project root directory
  2. Create a .env file and specify PORT (refer to .env.example)
  3. Run docker-compose up -d
  4. Open http://localhost:3000
  5. You can stop the container using docker-compose stop

Docker Image

  1. Build the image locally: docker build --no-cache -t flowise .
  2. Run the image: docker run -d --name flowise -p 3000:3000 flowise
  3. Stop the container: docker stop flowise

Flowise supports a rich set of components.

Example 1: Building a Basic LLM Chain

  1. On a blank canvas, click the + Add New button to bring up the left-side Add Nodes panel.
  2. Select the following components from the Add Nodes panel; they will appear on the canvas:
  • Drag OpenAI from LLMs to the panel
  • Drag LLM chain from the Chains category
  • Drag Prompt Template from the Prompts category

Now the canvas contains all three components.
  3. Connect the components
  • Link the output of OpenAI to the language model of LLM Chain
  • Link the output of Prompt Template to the Prompt of LLM Chain
  4. Enter the necessary information
  • Enter your OpenAI key in the OpenAI field
  • Write the following prompt template in the Prompt Template Template field: What is a good name for a company that makes {product}?
  • Give a name to the LLM Chain
  • Click the save icon in the top right corner to save
  • Click the chat icon in the top right corner and try sending a product name; the chain responds with the expected company-name suggestions
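Under the hood, the Prompt Template node simply substitutes the chat input into the {product} placeholder before the text is sent to OpenAI. A minimal Python sketch of that substitution (the function name build_prompt is illustrative, not part of Flowise):

```python
# The template used in the Prompt Template node above.
template = "What is a good name for a company that makes {product}?"

def build_prompt(product: str) -> str:
    """Fill the {product} placeholder, as the Prompt Template node does."""
    return template.format(product=product)

print(build_prompt("colorful socks"))
# What is a good name for a company that makes colorful socks?
```

The resulting string is what the LLM Chain actually sends to the OpenAI node as the prompt.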

Example 2: Building a PDF Reader Bot

Now, let’s create a PDF reading bot using Flowise.

  1. Add the following components to the blank canvas:
  • Select Recursive Character Text Splitter from Text Splitters
  • Select PDF file from Document Loaders
  • Select OpenAI Embeddings from Embeddings
  • Select In-memory Vector Store from Vector Stores
  • Select OpenAI from LLMs
  • Select Conversational Retrieval QA Chain from Chains

Now we have all the necessary components on the canvas.
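Conceptually, the Recursive Character Text Splitter cuts the loaded PDF text into overlapping chunks before they are embedded. A much-simplified sketch of fixed-size chunking with overlap (Flowise's actual splitter is smarter, recursively splitting on separators such as paragraphs and sentences):

```python
def split_text(text: str, chunk_size: int = 100, overlap: int = 20) -> list[str]:
    """Cut text into chunks of at most chunk_size characters,
    with each chunk overlapping the previous one by `overlap` characters."""
    step = chunk_size - overlap
    return [text[i:i + chunk_size] for i in range(0, len(text), step)]

chunks = split_text("a" * 250, chunk_size=100, overlap=20)
print(len(chunks))  # 4 chunks, starting at offsets 0, 80, 160, 240
```

Overlap matters because a fact split across a chunk boundary would otherwise be invisible to retrieval.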

  2. Connect the components
  • Link the output of Recursive Character Text Splitter to the input of PDF file
  • Link the output of PDF file to the input of In-memory Vector Store
  • Link the output of OpenAI Embeddings to the input of In-memory Vector Store
  • Link the output of In-memory Vector Store to the input of Conversational Retrieval QA Chain
  • Link the output of OpenAI to the input of Conversational Retrieval QA Chain
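The In-memory Vector Store holds one embedding vector per chunk and, at question time, returns the chunks closest to the question's embedding; the Conversational Retrieval QA Chain then passes those chunks plus the question to OpenAI. A toy sketch of that nearest-neighbour lookup, using hand-made 2-dimensional "embeddings" instead of real OpenAI ones:

```python
import math

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

# Toy store: chunk text -> made-up 2-d vector (real ones come from OpenAI Embeddings).
store = {
    "IAM controls access to AWS resources": [0.9, 0.1],
    "S3 stores objects in buckets": [0.1, 0.9],
}

def retrieve(query_vec, k=1):
    """Return the k chunks whose vectors are most similar to the query vector."""
    ranked = sorted(store, key=lambda c: cosine(store[c], query_vec), reverse=True)
    return ranked[:k]

print(retrieve([0.8, 0.2]))  # ['IAM controls access to AWS resources']
```

Real embeddings have hundreds or thousands of dimensions, but the ranking idea is the same.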
  3. Enter the necessary information
  • Click Upload File in the PDF File node and upload the sample PDF titled Introduction to AWS Security.
  • Enter your OpenAI key in the OpenAI and OpenAI Embeddings fields
  • Click the save button and then click the chat button to start sending requests.

The response should be as expected, and the bot can now answer questions about this PDF document.
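A saved chatflow is not limited to the built-in chat window: Flowise also exposes it over HTTP at /api/v1/prediction/<chatflow-id> (the id is shown in the flow's API endpoint dialog). A sketch using only the Python standard library — the URL below contains a placeholder id you must replace with your own:

```python
import json
import urllib.request

# Placeholder: substitute the chatflow id from your Flowise UI.
API_URL = "http://localhost:3000/api/v1/prediction/<your-chatflow-id>"

def query(payload: dict) -> dict:
    """POST a question to the chatflow and return the parsed JSON reply."""
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))

# Example (requires a running Flowise instance):
# answer = query({"question": "What does this PDF say about IAM?"})
```

This makes it easy to embed a Flowise-built bot into an existing application or script.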

For more uses of Flowise, refer to the official documentation: https://docs.flowiseai.com/

Reference Documentation

GitHub repository: https://github.com/FlowiseAI/Flowise
